WorldWideScience

Sample records for greater computational cost

  1. Cost-effectiveness analysis of computer-based assessment

    Directory of Open Access Journals (Sweden)

    Pauline Loewenberger

    2003-12-01

    Full Text Available The need for more cost-effective and pedagogically acceptable combinations of teaching and learning methods to sustain increasing student numbers means that the use of innovative methods, using technology, is accelerating. There is an expectation that economies of scale might provide greater cost-effectiveness whilst also enhancing student learning. The difficulties and complexities of these expectations are considered in this paper, which explores the challenges faced by those wishing to evaluate the cost-effectiveness of computer-based assessment (CBA). The paper outlines the outcomes of a survey which attempted to gather information about the costs and benefits of CBA.

  2. A cost modelling system for cloud computing

    OpenAIRE

    Ajeh, Daniel; Ellman, Jeremy; Keogh, Shelagh

    2014-01-01

    An advance in technology unlocks new opportunities for organizations to increase their productivity, efficiency and process automation while also reducing the cost of doing business. The emergence of cloud computing addresses these prospects through the provision of agile systems that are scalable, flexible and reliable as well as cost effective. Cloud computing has made hosting and deployment of computing resources cheaper and easier, with no up-front charges but pay-per-use flexible payme...

  3. Cloud Computing-An Ultimate Technique to Minimize Computing cost for Developing Countries

    OpenAIRE

    Narendra Kumar; Shikha Jain

    2012-01-01

    The presented paper deals with how remotely managed computing and IT resources can be beneficial in developing countries such as India and other countries of the Asian subcontinent. This paper not only defines the architectures and functionalities of cloud computing but also strongly indicates the current demand for cloud computing to achieve organizational and personal levels of IT support at very minimal cost and with high flexibility. The power of cloud can be used to reduce the cost of IT - r...

  4. Cost/Benefit Analysis of Leasing Versus Purchasing Computers

    National Research Council Canada - National Science Library

    Arceneaux, Alan

    1997-01-01

    .... In constructing this model, several factors were considered, including the purchase cost of computer equipment, annual lease payments, depreciation costs, the opportunity cost of purchasing, tax revenue implications and various leasing terms...
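
    The heart of such a model is a discounted cash-flow comparison between the two acquisition paths. A minimal sketch; every figure below is a hypothetical placeholder, none is taken from the report.

```python
# Hypothetical lease-versus-purchase comparison by net present value (NPV).
# Every figure below is an illustrative placeholder, not from the report.

def npv(cash_flows, rate):
    """Discount yearly cash flows (year 0 first) to present value."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

years = 4
discount_rate = 0.07      # opportunity cost of purchasing (assumed)
purchase_price = 100_000  # up-front cost of buying the equipment
salvage_value = 10_000    # resale value after `years` (assumed)
annual_lease = 30_000     # lease payment at the start of each year

# Purchase: pay everything now, recover the salvage value at the end.
buy_flows = [-purchase_price] + [0] * (years - 1) + [salvage_value]
# Lease: equal payments, nothing to recover.
lease_flows = [-annual_lease] * years

print(f"NPV of buying:  {npv(buy_flows, discount_rate):11,.0f}")
print(f"NPV of leasing: {npv(lease_flows, discount_rate):11,.0f}")
```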

  5. On efficiency of fire simulation realization: parallelization with greater number of computational meshes

    Science.gov (United States)

    Valasek, Lukas; Glasa, Jan

    2017-12-01

    Current fire simulation systems are capable of utilizing the advantages of available high-performance computing (HPC) platforms and of modelling fires efficiently in parallel. In this paper, the efficiency of a corridor fire simulation on an HPC cluster is discussed. The parallel MPI version of the Fire Dynamics Simulator is used to test the efficiency of selected strategies for allocating the cluster's computational resources when a greater number of computational cores is used. Simulation results indicate that if the number of cores used is not a multiple of the total number of cores per cluster node, there are allocation strategies which provide more efficient calculations.

  6. Incremental ALARA cost/benefit computer analysis

    International Nuclear Information System (INIS)

    Hamby, P.

    1987-01-01

    Commonwealth Edison Company has developed and is testing an enhanced Fortran computer program to be used for cost/benefit analysis of radiation reduction projects at its six nuclear power facilities and corporate technical support groups. This paper describes a macro-driven IBM mainframe program comprising two different types of analyses: an abbreviated program with fixed costs and base values, and an extended engineering version for a detailed, more thorough and time-consuming approach. The extended engineering version breaks radiation exposure costs down into two components: health-related costs and replacement labor costs. According to user input, the program automatically adjusts these two cost components and applies the derived values to company economic analyses such as replacement power costs, carrying charges, debt interest, and capital investment cost. The results from one or more program runs using different parameters may be compared in order to determine the most appropriate ALARA dose reduction technique. Benefits of this particular cost/benefit analysis technique include the flexibility to accommodate a wide range of user data and pre-job preparation, as well as the use of proven and standardized company economic equations.
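
    The comparison the program automates, monetized value of averted dose versus total project cost for each candidate technique, can be sketched briefly. The dollars-per-person-rem value and all project figures below are assumptions, not Commonwealth Edison data.

```python
# Illustrative ALARA cost/benefit comparison between candidate dose
# reduction techniques. All values here are invented placeholders.

DOLLARS_PER_PERSON_REM = 2_000  # assumed monetary value of averted dose

def net_benefit(dose_averted_rem, project_cost, replacement_labor=0.0):
    """Value of averted exposure minus all project-related costs."""
    benefit = dose_averted_rem * DOLLARS_PER_PERSON_REM
    return benefit - (project_cost + replacement_labor)

options = {
    "temporary shielding": net_benefit(25.0, project_cost=30_000,
                                       replacement_labor=5_000),
    "remote tooling":      net_benefit(40.0, project_cost=90_000,
                                       replacement_labor=2_000),
}
for name, value in options.items():
    print(f"{name}: net benefit ${value:,.0f}")
print("preferred technique:", max(options, key=options.get))
```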

  7. Towards Greater Harmonisation of Decommissioning Cost Estimates

    International Nuclear Information System (INIS)

    O'Sullivan, Patrick; Laraia, Michele; LaGuardia, Thomas S.

    2010-01-01

    The NEA Decommissioning Cost Estimation Group (DCEG), in collaboration with the IAEA Waste Technology Section and the EC Directorate-General for Energy and Transport, has recently studied cost estimation practices in 12 countries - Belgium, Canada, France, Germany, Italy, Japan, the Netherlands, Slovakia, Spain, Sweden, the United Kingdom and the United States. Its findings are to be published in an OECD/NEA report entitled Cost Estimation for Decommissioning: An International Overview of Cost Elements, Estimation Practices and Reporting Requirements. This booklet highlights the findings contained in the full report. (authors)

  8. Personal computers in high energy physics

    International Nuclear Information System (INIS)

    Quarrie, D.R.

    1987-01-01

    The role of personal computers within HEP is expanding as their capabilities increase and their cost decreases. They already offer greater flexibility than many low-cost graphics terminals at a comparable price, and in addition they can significantly increase the productivity of physicists and programmers. This talk will discuss existing uses for personal computers and explore possible future directions for their integration into the overall computing environment. (orig.)

  9. Estimating pressurized water reactor decommissioning costs: A user's manual for the PWR Cost Estimating Computer Program (CECP) software

    International Nuclear Information System (INIS)

    Bierschbach, M.C.; Mencinsky, G.J.

    1993-10-01

    With the issuance of the Decommissioning Rule (July 27, 1988), nuclear power plant licensees are required to submit decommissioning plans and cost estimates to the US Nuclear Regulatory Commission (NRC) for review. This user's manual and the accompanying Cost Estimating Computer Program (CECP) software provide a cost-calculating methodology to the NRC staff that will assist them in assessing the adequacy of the licensee submittals. The CECP, designed to be used on a personal computer, provides estimates for the cost of decommissioning PWR power stations to the point of license termination. Such cost estimates include component, piping, and equipment removal costs; packaging costs; decontamination costs; transportation costs; burial costs; and manpower costs. In addition to costs, the CECP also calculates burial volumes, person-hours, crew-hours, and exposure person-hours associated with decommissioning.

  10. Fixed-point image orthorectification algorithms for reduced computational cost

    Science.gov (United States)

    French, Joseph Clinton

    Imaging systems have been applied to many new applications in recent years. With the advent of low-cost, low-power focal planes and more powerful, lower cost computers, remote sensing applications have become more widespread. Many of these applications require some form of geolocation, especially when relative distances are desired. However, when greater global positional accuracy is needed, orthorectification becomes necessary. Orthorectification is the process of projecting an image onto a Digital Elevation Map (DEM), which removes terrain distortions and corrects the perspective distortion by changing the viewing angle to be perpendicular to the projection plane. Orthorectification is used in disaster tracking, landscape management, wildlife monitoring and many other applications. However, orthorectification is a computationally expensive process due to floating point operations and divisions in the algorithm. To reduce the computational cost of on-board processing, two novel algorithm modifications are proposed. One modification is projection utilizing fixed-point arithmetic. Fixed-point arithmetic removes the floating point operations and reduces the processing time by operating only on integers. The second modification is replacement of the division inherent in projection with a multiplication by the inverse. Computing the inverse is itself an iterative operation, so the inverse is replaced with a linear approximation. As a result of these modifications, the processing time of projection is reduced by a factor of 1.3x with an average pixel position error of 0.2% of a pixel size for 128-bit integer processing and over 4x with an average pixel position error of less than 13% of a pixel size for 64-bit integer processing. A secondary inverse function approximation is also developed that replaces the linear approximation with a quadratic. The quadratic approximation produces a more accurate approximation of the inverse, allowing for an integer multiplication calculation
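
    The two modifications can be illustrated in plain Python: Q16.16 fixed-point arithmetic, and division replaced by multiplication with a linearly approximated inverse. The format, interval midpoint and test values are assumptions for illustration, not the thesis's parameters.

```python
# Sketch of fixed-point projection arithmetic with division replaced by
# multiplication with a linear approximation of the inverse.

FRAC_BITS = 16
ONE = 1 << FRAC_BITS            # 1.0 in Q16.16

def to_fixed(x: float) -> int:
    return int(round(x * ONE))

def fx_mul(a: int, b: int) -> int:
    """Fixed-point multiply: shift the double-width product back down."""
    return (a * b) >> FRAC_BITS

def fx_inv_linear(d: int, mid: float) -> int:
    """Linear approximation 1/d ~ 2/m - d/m**2 (tangent to 1/x at m),
    accurate while d stays near the calibrated midpoint m."""
    a = to_fixed(2.0 / mid)
    b = to_fixed(1.0 / (mid * mid))
    return a - fx_mul(b, d)

# Replace z = x / d with z = x * inv(d); here d is assumed near mid = 4.
x, d, mid = 10.0, 3.9, 4.0
z = fx_mul(to_fixed(x), fx_inv_linear(to_fixed(d), mid)) / ONE
print(f"approx {z:.4f} vs exact {x / d:.4f}")   # ~2.5625 vs 2.5641
```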

  11. Computer programs for capital cost estimation, lifetime economic performance simulation, and computation of cost indexes for laser fusion and other advanced technology facilities

    International Nuclear Information System (INIS)

    Pendergrass, J.H.

    1978-01-01

    Three FORTRAN programs, CAPITAL, VENTURE, and INDEXER, have been developed to automate computations used in assessing the economic viability of proposed or conceptual laser fusion and other advanced-technology facilities, as well as conventional projects. The types of calculations performed by these programs are, respectively, capital cost estimation, lifetime economic performance simulation, and computation of cost indexes. The codes permit these three topics to be addressed with considerable sophistication commensurate with user requirements and available data

  12. Computer Software for Life Cycle Cost.

    Science.gov (United States)

    1987-04-01

    [OCR header residue: Air Command and Staff College student report, "Computer Software for Life Cycle Cost"] ...obsolete), physical life (utility before physically wearing out), or application life (utility in a given function)." (7:5) The costs are usually

  13. Greater healthcare utilization and costs among Black persons compared to White persons with aphasia in the North Carolina stroke belt.

    Science.gov (United States)

    Ellis, Charles; Hardy, Rose Y; Lindrooth, Richard C

    2017-05-15

    To examine racial differences in healthcare utilization and costs for persons with aphasia (PWA) being treated in acute care hospitals in North Carolina (NC). NC Healthcare Cost and Utilization Project State Inpatient Database (HCUP-SID) data from 2011-2012 were analyzed to examine healthcare utilization and costs of care for stroke patients with aphasia. Analyses emphasized length of stay, charges and cost of general hospital services. Generalized linear models (GLM) were constructed to determine the impact of demographic characteristics, stroke/illness severity, and observed hospital characteristics on utilization and costs. Hospital fixed effects were included to yield within-hospital estimates of disparities. GLM models demonstrated that Blacks with aphasia experienced 1.9 days longer lengths of stay compared to Whites with aphasia after controlling for demographic characteristics, 1.4 days controlling for stroke/illness severity, 1.2 days controlling for observed hospital characteristics, and ~1 extra day controlling for unobserved hospital characteristics. Similarly, Blacks accrued ~$2047 greater total costs compared to Whites after controlling for demographic characteristics, $1659 controlling for stroke/illness severity, $1338 controlling for observed hospital characteristics, and ~$1311 greater total costs after controlling for unobserved hospital characteristics. In the acute hospital setting, Blacks with aphasia utilize greater hospital services during longer hospitalizations and at substantially higher costs in the state of NC. A substantial portion of the adjusted difference was related to the hospital treating the patient. However, even after controlling for the hospital, the differences remained clinically and statistically significant. Copyright © 2017 Elsevier B.V. All rights reserved.
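
    The staged adjustment reads directly as a sequence of regressions; a minimal sketch on synthetic data with hypothetical column names (the study's actual GLM family and specification may differ).

```python
# Staged models for the Black-White length-of-stay gap, ending with
# hospital fixed effects. Synthetic data, hypothetical column names.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "black": rng.integers(0, 2, n),
    "age": rng.normal(70, 10, n),
    "female": rng.integers(0, 2, n),
    "severity": rng.normal(0.0, 1.0, n),
    "hospital_id": rng.integers(0, 20, n),
})
hospital_effect = rng.normal(0.0, 1.0, 20)   # unobserved hospital quality
df["los"] = (5.0 + 1.5 * df["black"] + 0.6 * df["severity"]
             + hospital_effect[df["hospital_id"]] + rng.normal(0.0, 2.0, n))

formulas = {
    "demographics":  "los ~ black + age + female",
    "plus severity": "los ~ black + age + female + severity",
    "hospital FE":   "los ~ black + age + female + severity + C(hospital_id)",
}
for name, f in formulas.items():
    gap = smf.ols(f, data=df).fit().params["black"]
    print(f"{name:14s} Black-White LOS gap: {gap:+.2f} days")
```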

  14. Cloud Computing and Information Technology Resource Cost Management for SMEs

    DEFF Research Database (Denmark)

    Kuada, Eric; Adanu, Kwame; Olesen, Henning

    2013-01-01

    This paper analyzes the decision-making problem confronting SMEs considering the adoption of cloud computing as an alternative to in-house computing services provision. The economics of choosing between in-house computing and a cloud alternative is analyzed by comparing the total economic costs...... in determining the relative value of cloud computing....

  15. Cost-effective cloud computing: a case study using the comparative genomics tool, roundup.

    Science.gov (United States)

    Kudtarkar, Parul; Deluca, Todd F; Fusaro, Vincent A; Tonellato, Peter J; Wall, Dennis P

    2010-12-22

    Comparative genomics resources, such as ortholog detection tools and repositories, are rapidly increasing in scale and complexity. Cloud computing is an emerging technological paradigm that enables researchers to dynamically build a dedicated virtual cluster and may represent a valuable alternative for large computational tools in bioinformatics. In the present manuscript, we optimize the computation of a large-scale comparative genomics resource, Roundup, using cloud computing, describe the proper operating principles required to achieve computational efficiency on the cloud, and detail important procedures for improving cost-effectiveness to ensure maximal computation at minimal costs. Utilizing the comparative genomics tool, Roundup, as a case study, we computed orthologs among 902 fully sequenced genomes on Amazon's Elastic Compute Cloud. For managing the ortholog processes, we designed a strategy to deploy the web service, Elastic MapReduce, and maximize the use of the cloud while simultaneously minimizing costs. Specifically, we created a model to estimate cloud runtime based on the size and complexity of the genomes being compared that determines in advance the optimal order of the jobs to be submitted. We computed orthologous relationships for 245,323 genome-to-genome comparisons on Amazon's computing cloud, a computation that required just over 200 hours and cost $8,000 USD, at least 40% less than expected under a strategy in which genome comparisons were submitted to the cloud randomly with respect to runtime. Our cost savings projections were based on a model that not only demonstrates the optimal strategy for deploying RSD to the cloud, but also finds the optimal cluster size to minimize waste and maximize usage. Our cost-reduction model is readily adaptable for other comparative genomics tools and potentially of significant benefit to labs seeking to take advantage of the cloud as an alternative to local computing infrastructure.
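
    A toy version of the deployment strategy: predict each comparison's runtime from genome sizes, then assign the longest jobs first to the least-loaded instance (longest-processing-time-first scheduling) so hourly-billed nodes idle less. The runtime coefficient and genome sizes are invented; this is not the paper's actual model.

```python
# LPT scheduling of runtime-estimated jobs onto cloud instances.
import heapq

def est_runtime_hours(size_a, size_b, coeff=1e-16):
    """Toy model: runtime proportional to the product of genome sizes."""
    return coeff * size_a * size_b

def schedule(job_hours, n_instances):
    """Greedy LPT assignment; returns the busy hours of each instance."""
    loads = [(0.0, i) for i in range(n_instances)]
    heapq.heapify(loads)
    for hours in sorted(job_hours, reverse=True):
        load, i = heapq.heappop(loads)
        heapq.heappush(loads, (load + hours, i))
    return sorted(load for load, _ in loads)

genomes = [4.6e6, 1.2e7, 2.7e8, 3.2e9]     # assumed genome sizes (bp)
jobs = [est_runtime_hours(a, b) for a in genomes for b in genomes if a < b]
print("busy hours per instance:", schedule(jobs, n_instances=2))
```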

  16. An approximate fractional Gaussian noise model with O(n) computational cost

    KAUST Repository

    Sørbye, Sigrunn H.

    2017-09-18

    Fractional Gaussian noise (fGn) is a stationary time series model with long memory properties applied in various fields like econometrics, hydrology and climatology. The computational cost in fitting an fGn model of length $n$ using a likelihood-based approach is $\mathcal{O}(n^2)$, exploiting the Toeplitz structure of the covariance matrix. In most realistic cases, we do not observe the fGn process directly but only through indirect Gaussian observations, so the Toeplitz structure is easily lost and the computational cost increases to $\mathcal{O}(n^3)$. This paper presents an approximate fGn model of $\mathcal{O}(n)$ computational cost, both with direct or indirect Gaussian observations, with or without conditioning. This is achieved by approximating fGn with a weighted sum of independent first-order autoregressive processes, fitting the parameters of the approximation to match the autocorrelation function of the fGn model. The resulting approximation is stationary despite being Markov and gives a remarkably accurate fit using only four components. The performance of the approximate fGn model is demonstrated in simulations and two real data examples.
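
    The construction can be reproduced numerically: compute the fGn autocorrelation and fit the weights of a few AR(1) components to it by least squares. The grid of AR(1) coefficients below is an assumption; the paper fits coefficients and weights jointly and under different constraints.

```python
# Approximating the fGn autocorrelation with a weighted sum of AR(1)
# autocorrelations w_i * phi_i**|k|.
import numpy as np

def fgn_acf(k, H):
    """Autocorrelation of fractional Gaussian noise at lag k >= 0."""
    k = np.asarray(k, dtype=float)
    return 0.5 * (np.abs(k + 1) ** (2 * H)
                  - 2 * np.abs(k) ** (2 * H)
                  + np.abs(k - 1) ** (2 * H))

H, lags = 0.8, np.arange(0, 200)
target = fgn_acf(lags, H)

phis = np.array([0.30, 0.74, 0.95, 0.997])   # assumed AR(1) memories
basis = phis[None, :] ** lags[:, None]       # phi_i**k for each lag k
weights, *_ = np.linalg.lstsq(basis, target, rcond=None)

approx = basis @ weights
print("weights:", np.round(weights, 3),
      "max abs error:", np.abs(approx - target).max())
```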

  17. Costs of cloud computing for a biometry department. A case study.

    Science.gov (United States)

    Knaus, J; Hieke, S; Binder, H; Schwarzer, G

    2013-01-01

    "Cloud" computing providers, such as the Amazon Web Services (AWS), offer stable and scalable computational resources based on hardware virtualization, with short, usually hourly, billing periods. The idea of pay-as-you-use seems appealing for biometry research units which have only limited access to university or corporate data center resources or grids. This case study compares the costs of an existing heterogeneous on-site hardware pool in a Medical Biometry and Statistics department to a comparable AWS offer. The "total cost of ownership", including all direct costs, is determined for the on-site hardware, and hourly prices are derived, based on actual system utilization during the year 2011. Indirect costs, which are difficult to quantify are not included in this comparison, but nevertheless some rough guidance from our experience is given. To indicate the scale of costs for a methodological research project, a simulation study of a permutation-based statistical approach is performed using AWS and on-site hardware. In the presented case, with a system utilization of 25-30 percent and 3-5-year amortization, on-site hardware can result in smaller costs, compared to hourly rental in the cloud dependent on the instance chosen. Renting cloud instances with sufficient main memory is a deciding factor in this comparison. Costs for on-site hardware may vary, depending on the specific infrastructure at a research unit, but have only moderate impact on the overall comparison and subsequent decision for obtaining affordable scientific computing resources. Overall utilization has a much stronger impact as it determines the actual computing hours needed per year. Taking this into ac count, cloud computing might still be a viable option for projects with limited maturity, or as a supplement for short peaks in demand.

  18. Cost-Benefit Analysis of Computer Resources for Machine Learning

    Science.gov (United States)

    Champion, Richard A.

    2007-01-01

    Machine learning describes pattern-recognition algorithms - in this case, probabilistic neural networks (PNNs). These can be computationally intensive, in part because of the nonlinear optimizer, a numerical process that calibrates the PNN by minimizing a sum of squared errors. This report suggests efficiencies that are expressed as cost and benefit. The cost is computer time needed to calibrate the PNN, and the benefit is goodness-of-fit, how well the PNN learns the pattern in the data. There may be a point of diminishing returns where a further expenditure of computer resources does not produce additional benefits. Sampling is suggested as a cost-reduction strategy. One consideration is how many points to select for calibration and another is the geometric distribution of the points. The data points may be nonuniformly distributed across space, so that sampling at some locations provides additional benefit while sampling at other locations does not. A stratified sampling strategy can be designed to select more points in regions where they reduce the calibration error and fewer points in regions where they do not. Goodness-of-fit tests ensure that the sampling does not introduce bias. This approach is illustrated by statistical experiments for computing correlations between measures of roadless area and population density for the San Francisco Bay Area. The alternative to training efficiencies is to rely on high-performance computer systems. These may require specialized programming and algorithms that are optimized for parallel performance.

  19. The Hidden Cost of Buying a Computer.

    Science.gov (United States)

    Johnson, Michael

    1983-01-01

    In order to process data in a computer, application software must be either developed or purchased. Costs for modifications of the software package and maintenance are often hidden. The decision to buy or develop software packages should be based upon factors of time and maintenance. (MLF)

  20. Synergy effects of fluoxetine and variability in temperature lead to proportionally greater fitness costs in Daphnia: A multigenerational test.

    Science.gov (United States)

    Barbosa, Miguel; Inocentes, Núrya; Soares, Amadeu M V M; Oliveira, Miguel

    2017-12-01

    Increased variability in water temperature is predicted to impose disproportionally greater fitness costs than a mean increase in temperature. Additionally, water contaminants are currently a major source of human-induced stress likely to produce fitness costs. Global change models forecast an increase in these two human-induced stressors. Yet, in spite of the growing interest in understanding how organisms respond to global change, the joint fitness effects of water pollution and increased variability in temperature remain unclear. Here, using a multigenerational design, we test the hypothesis that exposure to high concentrations of fluoxetine, a human medicine commonly found in freshwater systems, causes increased lifetime fitness costs when associated with increased variability in temperature. Although fluoxetine and variability in temperature elicited some fitness cost when tested alone, when both stressors acted together the costs were disproportionally greater. The combined effect of fluoxetine and variability in temperature led to a reduction of 37% in lifetime reproductive success and a 17.9% decrease in population growth rate. Interestingly, fluoxetine and variability in temperature had no effect on the probability of survival. Freshwater systems are among the most imperilled ecosystems, often exposed to multiple human-induced stressors. Our results indicate that organisms face greater fitness risk when exposed to multiple stressors at the same time than when each stressor acts alone. Our study highlights the importance of using a multi-generational approach to fully understand individual environmental tolerance and its responses to a global change scenario in aquatic systems. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Resource utilization and costs during the initial years of lung cancer screening with computed tomography in Canada.

    Science.gov (United States)

    Cressman, Sonya; Lam, Stephen; Tammemagi, Martin C; Evans, William K; Leighl, Natasha B; Regier, Dean A; Bolbocean, Corneliu; Shepherd, Frances A; Tsao, Ming-Sound; Manos, Daria; Liu, Geoffrey; Atkar-Khattra, Sukhinder; Cromwell, Ian; Johnston, Michael R; Mayo, John R; McWilliams, Annette; Couture, Christian; English, John C; Goffin, John; Hwang, David M; Puksa, Serge; Roberts, Heidi; Tremblay, Alain; MacEachern, Paul; Burrowes, Paul; Bhatia, Rick; Finley, Richard J; Goss, Glenwood D; Nicholas, Garth; Seely, Jean M; Sekhon, Harmanjatinder S; Yee, John; Amjadi, Kayvan; Cutz, Jean-Claude; Ionescu, Diana N; Yasufuku, Kazuhiro; Martel, Simon; Soghrati, Kamyar; Sin, Don D; Tan, Wan C; Urbanski, Stefan; Xu, Zhaolin; Peacock, Stuart J

    2014-10-01

    It is estimated that millions of North Americans would qualify for lung cancer screening and that billions of dollars of national health expenditures would be required to support population-based computed tomography lung cancer screening programs. The decision to implement such programs should be informed by data on resource utilization and costs. Resource utilization data were collected prospectively from 2059 participants in the Pan-Canadian Early Detection of Lung Cancer Study using low-dose computed tomography (LDCT). Participants who had 2% or greater lung cancer risk over 3 years using a risk prediction tool were recruited from seven major cities across Canada. A cost analysis was conducted from the Canadian public payer's perspective for resources that were used for the screening and treatment of lung cancer in the initial years of the study. The average per-person cost for screening individuals with LDCT was $453 (95% confidence interval [CI], $400-$505) for the initial 18-months of screening following a baseline scan. The screening costs were highly dependent on the detected lung nodule size, presence of cancer, screening intervention, and the screening center. The mean per-person cost of treating lung cancer with curative surgery was $33,344 (95% CI, $31,553-$34,935) over 2 years. This was lower than the cost of treating advanced-stage lung cancer with chemotherapy, radiotherapy, or supportive care alone, ($47,792; 95% CI, $43,254-$52,200; p = 0.061). In the Pan-Canadian study, the average cost to screen individuals with a high risk for developing lung cancer using LDCT and the average initial cost of curative intent treatment were lower than the average per-person cost of treating advanced stage lung cancer which infrequently results in a cure.

  2. Waste Management Facilities Cost Information report for Greater-Than-Class C and DOE equivalent special case waste

    Energy Technology Data Exchange (ETDEWEB)

    Feizollahi, F.; Shropshire, D.

    1993-07-01

    This Waste Management Facility Cost Information (WMFCI) report for Greater-Than-Class C low-level waste (GTCC LLW) and DOE equivalent special case waste contains preconceptual designs and planning level life-cycle cost (PLCC) estimates for treatment, storage, and disposal facilities needed for management of GTCC LLW and DOE equivalent waste. The report contains information on 16 facilities (referred to as cost modules). These facilities are treatment facility front-end and back-end support functions (administration support, and receiving, preparation, and shipping cost modules); seven treatment concepts (incineration, metal melting, shredding/compaction, solidification, vitrification, metal sizing and decontamination, and wet/air oxidation cost modules); two storage concepts (enclosed vault and silo); disposal facility front-end functions (disposal receiving and inspection cost module); and four disposal concepts (shallow-land, engineered shallow-land, intermediate depth, and deep geological cost modules). Data in this report allow the user to develop PLCC estimates for various waste management options. A procedure to guide the U.S. Department of Energy (DOE) and its contractor personnel in the use of estimating data is also included in this report.

  3. Waste Management Facilities Cost Information report for Greater-Than-Class C and DOE equivalent special case waste

    International Nuclear Information System (INIS)

    Feizollahi, F.; Shropshire, D.

    1993-07-01

    This Waste Management Facility Cost Information (WMFCI) report for Greater-Than-Class C low-level waste (GTCC LLW) and DOE equivalent special case waste contains preconceptual designs and planning level life-cycle cost (PLCC) estimates for treatment, storage, and disposal facilities needed for management of GTCC LLW and DOE equivalent waste. The report contains information on 16 facilities (referred to as cost modules). These facilities are treatment facility front-end and back-end support functions (administration support, and receiving, preparation, and shipping cost modules); seven treatment concepts (incineration, metal melting, shredding/compaction, solidification, vitrification, metal sizing and decontamination, and wet/air oxidation cost modules); two storage concepts (enclosed vault and silo); disposal facility front-end functions (disposal receiving and inspection cost module); and four disposal concepts (shallow-land, engineered shallow-land, intermediate depth, and deep geological cost modules). Data in this report allow the user to develop PLCC estimates for various waste management options. A procedure to guide the U.S. Department of Energy (DOE) and its contractor personnel in the use of estimating data is also included in this report

  4. Computing Cost Price for Cataract Surgery by the Activity-Based Costing (ABC) Method at Hazrat-e-Zahra Hospital, Isfahan University of Medical Sciences, 2014

    Directory of Open Access Journals (Sweden)

    Masuod Ferdosi

    2016-10-01

    Full Text Available Background: Hospital managers need to have accurate information about actual costs to make efficient and effective decisions. In the activity-based costing method, activities are first recognized, and then direct and indirect costs are computed based on allocation methods. The aim of this study was to compute the cost price of cataract surgery by the activity-based costing (ABC) method at Hazrat-e-Zahra Hospital, Isfahan University of Medical Sciences. Methods: This was a cross-sectional study computing the costs of cataract surgery by the activity-based costing technique at Hazrat-e-Zahra Hospital, Isfahan University of Medical Sciences, 2014. Data were collected through interview and direct observation and analyzed by Excel software. Results: According to the results of this study, the total cost of cataract surgery was 8,368,978 Rials. Personnel cost comprised 62.2% (5,213,574 Rials) of the total cost of cataract surgery, which is the highest share of surgery costs. The cost of consumables was 7.57% (1,992,852 Rials) of surgery costs. Conclusion: Based on the results, there was a difference between the cost price of the services and the public tariff, which poses financial risks to the hospital. Therefore, it is recommended to compute costs using appropriate methods such as activity-based costing. The cost price of cataract surgery can be reduced by strategies such as decreasing the cost of consumables.
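
    The allocation mechanics can be sketched in a few lines: direct costs are traced to the procedure, indirect pools are allocated through activity drivers. The two direct amounts echo the abstract; the indirect pools, drivers and usage figures are invented for illustration.

```python
# Minimal activity-based costing sketch for a single procedure.
# Direct amounts follow the abstract; indirect pools are invented.

direct = {"personnel": 5_213_574, "consumables": 1_992_852}  # Rials

# Indirect pools (total Rials, driver) allocated by the procedure's
# share of each driver's total volume.
indirect_pools = {
    "sterilization":  (900_000_000, "autoclave_hours"),
    "administration": (2_400_000_000, "patient_days"),
}
driver_totals = {"autoclave_hours": 4_000, "patient_days": 30_000}
procedure_usage = {"autoclave_hours": 2.0, "patient_days": 1.0}

indirect = sum(pool * procedure_usage[d] / driver_totals[d]
               for pool, d in indirect_pools.values())
total = sum(direct.values()) + indirect
print(f"cost price per cataract surgery: {total:,.0f} Rials")
```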

  5. Low cost spacecraft computers: Oxymoron or future trend?

    Science.gov (United States)

    Manning, Robert M.

    1993-01-01

    Over the last few decades, application of current terrestrial computer technology in embedded spacecraft control systems has been expensive and fraught with many technical challenges. These challenges have centered on overcoming the extreme environmental constraints (protons, neutrons, gamma radiation, cosmic rays, temperature, vibration, etc.) that often preclude direct use of commercial off-the-shelf computer technology. Reliability, fault tolerance and power have also greatly constrained the selection of spacecraft control system computers. More recently, new constraints are being felt, cost and mass in particular, that have again narrowed the degrees of freedom spacecraft designers once enjoyed. This paper discusses these challenges, how they were previously overcome, how future trends in commercial computer technology will simplify (or hinder) selection of computer technology for spacecraft control applications, and what spacecraft electronic system designers can do now to circumvent them.

  6. Cost-benefit analysis for invasive species control: the case of greater Canada goose Branta canadensis in Flanders (northern Belgium)

    Science.gov (United States)

    Casaer, Jim; De Smet, Lieven; Devos, Koen; Huysentruyt, Frank; Robertson, Peter A.; Verbeke, Tom

    2018-01-01

    Background Sound decisions on control actions for established invasive alien species (IAS) require information on the ecological as well as socio-economic impact of the species and of its management. Cost-benefit analysis provides part of this information, yet has received relatively little attention in the scientific literature on IAS. Methods We apply a bio-economic model in a cost-benefit analysis framework to greater Canada goose Branta canadensis, an IAS with documented social, economic and ecological impacts in Flanders (northern Belgium). We compared a business as usual (BAU) scenario which involved non-coordinated hunting and egg destruction with an enhanced scenario based on a continuation of these activities but supplemented with coordinated capture of moulting birds. To assess population growth under the BAU scenario we fitted a logistic growth model to the observed pre-moult capture population. Projected damage costs included water eutrophication and damage to cultivated grasslands and were calculated for all scenarios. Management costs of the moult captures were based on a representative average of the actual cost of planning and executing moult captures. Results Comparing the scenarios with different capture rates, different costs for eutrophication and various discount rates showed that avoided damage costs were in the range of 21.15 M€ to 45.82 M€ under the moult capture scenario. The lowest value for the avoided costs applied to the scenario where we lowered the capture rate by 10%. The highest value occurred in the scenario where we lowered the real discount rate from 4% to 2.5%. Discussion The reduction in damage costs always outweighed the additional management costs of moult captures. Therefore, additional coordinated moult captures could be applied to limit the negative economic impact of greater Canada goose at a regional scale. We further discuss the strengths and weaknesses of our approach and its potential application to other IAS.
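
    The scenario comparison rests on discounting yearly avoided damages and extra management costs to present value; a minimal sketch with invented yearly figures, which also shows why the lower discount rate produces the larger avoided-cost estimate.

```python
# Discounted net benefit of the moult-capture scenario versus business
# as usual. Yearly amounts are invented for illustration.

def present_value(amount_per_year, years, rate):
    """PV of a constant yearly amount over `years` at discount `rate`."""
    return sum(amount_per_year / (1 + rate) ** t for t in range(1, years + 1))

years = 20
avoided_damage_per_year = 2_500_000    # EUR, eutrophication + grasslands
extra_management_per_year = 250_000    # EUR, coordinated moult captures

for rate in (0.04, 0.025):             # the two discount rates compared
    net = (present_value(avoided_damage_per_year, years, rate)
           - present_value(extra_management_per_year, years, rate))
    print(f"discount rate {rate:.1%}: net benefit {net / 1e6:.2f} MEUR")
```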

  7. Cost-benefit analysis for invasive species control: the case of greater Canada goose Branta canadensis in Flanders (northern Belgium)

    Directory of Open Access Journals (Sweden)

    Nikolaas Reyns

    2018-01-01

    Full Text Available Background Sound decisions on control actions for established invasive alien species (IAS) require information on ecological as well as socio-economic impact of the species and of its management. Cost-benefit analysis provides part of this information, yet has received relatively little attention in the scientific literature on IAS. Methods We apply a bio-economic model in a cost-benefit analysis framework to greater Canada goose Branta canadensis, an IAS with documented social, economic and ecological impacts in Flanders (northern Belgium). We compared a business as usual (BAU) scenario which involved non-coordinated hunting and egg destruction with an enhanced scenario based on a continuation of these activities but supplemented with coordinated capture of moulting birds. To assess population growth under the BAU scenario we fitted a logistic growth model to the observed pre-moult capture population. Projected damage costs included water eutrophication and damage to cultivated grasslands and were calculated for all scenarios. Management costs of the moult captures were based on a representative average of the actual cost of planning and executing moult captures. Results Comparing the scenarios with different capture rates, different costs for eutrophication and various discount rates, showed avoided damage costs were in the range of 21.15 M€ to 45.82 M€ under the moult capture scenario. The lowest value for the avoided costs applied to the scenario where we lowered the capture rate by 10%. The highest value occurred in the scenario where we lowered the real discount rate from 4% to 2.5%. Discussion The reduction in damage costs always outweighed the additional management costs of moult captures. Therefore, additional coordinated moult captures could be applied to limit the negative economic impact of greater Canada goose at a regional scale. We further discuss the strengths and weaknesses of our approach and its potential application to other

  8. Estimating boiling water reactor decommissioning costs. A user's manual for the BWR Cost Estimating Computer Program (CECP) software: Draft report for comment

    International Nuclear Information System (INIS)

    Bierschbach, M.C.

    1994-12-01

    With the issuance of the Decommissioning Rule (July 27, 1988), nuclear power plant licensees are required to submit decommissioning plans and cost estimates to the U.S. Nuclear Regulatory Commission (NRC) for review. This user's manual and the accompanying Cost Estimating Computer Program (CECP) software provide a cost-calculating methodology to the NRC staff that will assist them in assessing the adequacy of the licensee submittals. The CECP, designed to be used on a personal computer, provides estimates for the cost of decommissioning BWR power stations to the point of license termination. Such cost estimates include component, piping, and equipment removal costs; packaging costs; decontamination costs; transportation costs; burial costs; and manpower costs. In addition to costs, the CECP also calculates burial volumes, person-hours, crew-hours, and exposure person-hours associated with decommissioning.

  9. Cost-effectiveness of PET and PET/Computed Tomography

    DEFF Research Database (Denmark)

    Gerke, Oke; Hermansson, Ronnie; Hess, Søren

    2015-01-01

    measure by means of incremental cost-effectiveness ratios when considering the replacement of the standard regimen by a new diagnostic procedure. This article discusses economic assessments of PET and PET/computed tomography reported until mid-July 2014. Forty-seven studies on cancer and noncancer...

  10. Cost-effectiveness of computer-assisted training in cognitive-behavioral therapy as an adjunct to standard care for addiction.

    Science.gov (United States)

    Olmstead, Todd A; Ostrow, Cary D; Carroll, Kathleen M

    2010-08-01

    To determine the cost-effectiveness, from clinic and patient perspectives, of a computer-based version of cognitive-behavioral therapy (CBT4CBT) as an addition to regular clinical practice for substance dependence. PARTICIPANTS, DESIGN AND MEASUREMENTS: This cost-effectiveness study is based on a randomized clinical trial in which 77 individuals seeking treatment for substance dependence at an outpatient community setting were randomly assigned to treatment as usual (TAU) or TAU plus biweekly access to computer-based training in CBT (TAU plus CBT4CBT). The primary patient outcome measure was the total number of drug-free specimens provided during treatment. Incremental cost-effectiveness ratios (ICERs) and cost-effectiveness acceptability curves (CEACs) were used to determine the cost-effectiveness of TAU plus CBT4CBT relative to TAU alone. Results are presented from both the clinic and patient perspectives and are shown to be robust to (i) sensitivity analyses and (ii) a secondary objective patient outcome measure. The per patient cost of adding CBT4CBT to standard care was $39 ($27) from the clinic (patient) perspective. From the clinic (patient) perspective, TAU plus CBT4CBT is likely to be cost-effective when the threshold value to decision makers of an additional drug-free specimen is greater than approximately $21 ($15), and TAU alone is likely to be cost-effective when the threshold value is less than approximately $21 ($15). The ICERs for TAU plus CBT4CBT also compare favorably to ICERs reported elsewhere for other empirically validated therapies, including contingency management. TAU plus CBT4CBT appears to be a good value from both the clinic and patient perspectives. Copyright (c) 2010 Elsevier Ireland Ltd. All rights reserved.
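
    The decision rule behind these statements can be written out directly: adopt the add-on when the threshold value of an additional drug-free specimen exceeds the incremental cost-effectiveness ratio (ICER). In the sketch below, the $39 added cost is from the abstract, while the specimen counts are invented so the ICER lands near the reported ~$21.

```python
# ICER and the adoption rule from the abstract; effect sizes assumed.

def icer(delta_cost, delta_effect):
    """Incremental cost per additional unit of effect."""
    return delta_cost / delta_effect

added_cost = 39.0        # clinic-perspective cost of adding CBT4CBT (USD)
added_specimens = 1.9    # assumed extra drug-free specimens per patient

ratio = icer(added_cost, added_specimens)      # ~ $20.5 per specimen
for threshold in (15.0, 21.0, 30.0):           # decision maker's valuation
    print(f"threshold ${threshold:.0f}: adopt CBT4CBT = {threshold > ratio}")
```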

  11. Operational technology for greater confinement disposal

    International Nuclear Information System (INIS)

    Dickman, P.T.; Vollmer, A.T.; Hunter, P.H.

    1984-12-01

    Procedures and methods for the design and operation of a greater confinement disposal facility using large-diameter boreholes are discussed. It is assumed that the facility would be located at an operating low-level waste disposal site and that only a small portion of the wastes received at the site would require greater confinement disposal. The document is organized into sections addressing: facility planning process; facility construction; waste loading and handling; radiological safety planning; operations procedures; and engineering cost studies. While primarily written for low-level waste management site operators and managers, a detailed economic assessment section is included that should assist planners in performing cost analyses. Economic assessments for both commercial and US government greater confinement disposal facilities are included. The estimated disposal costs range from $27 to $104 per cubic foot for a commercial facility and from $17 to $60 per cubic foot for a government facility. These costs are based on average site preparation, construction, and waste loading costs for both contact- and remote-handled wastes. 14 figures, 22 tables

  12. Low-cost computer mouse for the elderly or disabled in Taiwan.

    Science.gov (United States)

    Chen, C-C; Chen, W-L; Chen, B-N; Shih, Y-Y; Lai, J-S; Chen, Y-L

    2014-01-01

    A mouse is an important communication interface between a human and a computer, but it is still difficult for the elderly or disabled to use. To develop a low-cost computer mouse auxiliary tool. The principal structure of the low-cost mouse auxiliary tool is the IR (infrared ray) array module and the Wii icon sensor module, which combine with reflective tape and the SQL Server database. This has several benefits, including cheap hardware cost, fluent control, prompt response, adaptive adjustment and portability. It also carries a game module with training and evaluation functions, which helps trainees improve sensory responsiveness and concentration. In the intervention and maintenance phases, clicking accuracy and time of use reached the level of significance (p < 0.05). The development of the low-cost adaptive computer mouse auxiliary tool was completed during the study and was also verified as having the characteristics of low cost, easy operation and adaptability. For patients with physical disabilities who retain independent control over some part of their limbs, the mouse auxiliary tool is suitable: the user only needs to attach the reflective tape to an independently controllable part of the body to operate the tool.

  13. Plant process computer replacements - techniques to limit installation schedules and costs

    International Nuclear Information System (INIS)

    Baker, M.D.; Olson, J.L.

    1992-01-01

    Plant process computer systems, a standard fixture in all nuclear power plants, are used to monitor and display important plant process parameters. Scanning thousands of field sensors and alarming out-of-limit values, these computer systems are heavily relied on by control room operators. The original nuclear steam supply system (NSSS) vendor for the power plant often supplied the plant process computer. Designed using sixties and seventies technology, a plant's original process computer has been obsolete for some time. Driven by increased maintenance costs and new US Nuclear Regulatory Commission regulations such as NUREG-0737, Suppl. 1, many utilities have replaced their process computers with more modern computer systems. Given that computer systems are by their nature prone to rapid obsolescence, this replacement cycle will likely repeat. A process computer replacement project can be a significant capital expenditure and must be performed during a scheduled refueling outage. The object of the installation process is to install a working system on schedule. Experience gained by supervising several computer replacement installations has taught lessons that, if applied, will shorten the schedule and limit the risk of costly delays. Examples illustrating this technique are given. This paper and these examples deal only with the installation process and assume that the replacement computer system has been adequately designed, and development and factory tested

  14. Manual of phosphoric acid fuel cell power plant cost model and computer program

    Science.gov (United States)

    Lu, C. Y.; Alkasab, K. A.

    1984-01-01

    Cost analysis of a phosphoric acid fuel cell power plant includes two parts: a method for estimating system capital costs, and an economic analysis which determines the levelized annual cost of operating the system used in the capital cost estimation. A FORTRAN computer program has been developed for this cost analysis.
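
    A typical levelized-annual-cost step in such an analysis converts capital to an equivalent annual charge through a capital recovery factor (CRF) and adds yearly operating costs; the inputs below are placeholders, not the manual's figures.

```python
# Levelized annual cost via a capital recovery factor.

def capital_recovery_factor(rate, years):
    """CRF = r(1+r)**n / ((1+r)**n - 1)."""
    g = (1 + rate) ** years
    return rate * g / (g - 1)

capital_cost = 15_000_000   # USD, installed plant (assumed)
om_per_year = 600_000       # USD/yr operation and maintenance (assumed)
fuel_per_year = 900_000     # USD/yr (assumed)
rate, lifetime = 0.08, 20   # discount rate and plant life (assumed)

annual_cost = (capital_cost * capital_recovery_factor(rate, lifetime)
               + om_per_year + fuel_per_year)
print(f"levelized annual cost: {annual_cost / 1e6:.2f} MUSD/yr")
```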

  15. Cloud Computing and Security Issues

    OpenAIRE

    Rohan Jathanna; Dhanamma Jagli

    2017-01-01

    Cloud computing has become one of the most interesting topics in the IT world today. The cloud model of computing as a resource has changed the landscape of computing, as its promises of greater reliability, massive scalability, and decreased costs have attracted businesses and individuals alike. It adds capabilities to information technology. Over the last few years, cloud computing has grown considerably in Information Technology. As more and more information of individuals and compan...

  16. Clean air benefits and costs in the GVRD [Greater Vancouver Regional District]

    International Nuclear Information System (INIS)

    Gislason, G.; Martin, J.; Williams, D.; Caton, B.; Rich, J.; Rojak, S.; Robinson, J.; Stuermer, A. von

    1994-01-01

    Air pollution is a major concern in the Greater Vancouver Regional District in British Columbia. An analysis was conducted to assess the costs and benefits of an innovative plan to reduce the emissions of five primary pollutants in the GVRD: nitrogen oxides (NOx), sulfur oxides (SOx), volatile organic compounds (VOCs), particulates, and CO. The study adopts a damage function approach in which the benefits of reduced emissions are given by the averted damages to human health, crops, and so on. Under a base case scenario, motor vehicle emission controls and additional measures proposed in the region's air quality management plan (AQMP) are projected to lead to emission reductions of 873,000 tonnes in the GVRD by the year 2020, compared to the emission level projected without intervention. The AQMP is projected to avert over its life some 2,800 premature deaths, 33,000 emergency room visits, 13 million restricted activity days, and 5 million symptoms. Crop losses due to ozone are projected to decrease by 1-4%/y over the next several decades due to the AQMP. Damage averted to materials and property per tonne of pollutant reduced ranges from $30 for VOC to $180 for particulates. Under base-case conservative assumptions, the AQMP generates $5.4 billion in benefits and $3.8 billion in costs, nearly 2/3 of which are paid by the industrial and commercial sectors. 1 tab

  17. Development of computer program for estimating decommissioning cost - 59037

    International Nuclear Information System (INIS)

    Kim, Hak-Soo; Park, Jong-Kil

    2012-01-01

    The programs for estimating the decommissioning cost have been developed for many different purposes and applications. The estimation of decommissioning cost requires a large amount of data, such as unit cost factors, plant area and its inventory, waste treatment, etc. This makes it difficult to use manual calculation or typical spreadsheet software such as Microsoft Excel. The cost estimation for eventual decommissioning of nuclear power plants is a prerequisite for safe, timely and cost-effective decommissioning. To estimate the decommissioning cost more accurately and systematically, KHNP, Korea Hydro and Nuclear Power Co. Ltd, developed a decommissioning cost estimating computer program called 'DeCAT-Pro' (Decommissioning Cost Assessment Tool - Professional, hereinafter called 'DeCAT'). This program allows users to easily assess the decommissioning cost with various decommissioning options. Also, this program provides detailed reporting of decommissioning funding requirements as well as detailed project schedules, cash flow, staffing plans and levels, and waste volumes by waste classifications and types. KHNP is planning to implement functions for estimating the plant inventory using 3-D technology and for classifying the conditions of radwaste disposal and transportation automatically. (authors)

  18. Addressing the computational cost of large EIT solutions

    International Nuclear Information System (INIS)

    Boyle, Alistair; Adler, Andy; Borsic, Andrea

    2012-01-01

    Electrical impedance tomography (EIT) is a soft field tomography modality based on the application of electric current to a body and measurement of voltages through electrodes at the boundary. The interior conductivity is reconstructed on a discrete representation of the domain using a finite-element method (FEM) mesh and a parametrization of that domain. The reconstruction requires a sequence of numerically intensive calculations. There is strong interest in reducing the cost of these calculations. An improvement in the compute time for current problems would encourage further exploration of computationally challenging problems such as the incorporation of time series data, wide-spread adoption of three-dimensional simulations and correlation of other modalities such as CT and ultrasound. Multicore processors offer an opportunity to reduce EIT computation times but may require some restructuring of the underlying algorithms to maximize the use of available resources. This work profiles two EIT software packages (EIDORS and NDRM) to experimentally determine where the computational costs arise in EIT as problems scale. Sparse matrix solvers, a key component for the FEM forward problem and sensitivity estimates in the inverse problem, are shown to take a considerable portion of the total compute time in these packages. A sparse matrix solver performance measurement tool, Meagre-Crowd, is developed to interface with a variety of solvers and compare their performance over a range of two- and three-dimensional problems of increasing node density. Results show that distributed sparse matrix solvers that operate on multiple cores are advantageous up to a limit that increases as the node density increases. We recommend a selection procedure to find a solver and hardware arrangement matched to the problem and provide guidance and tools to perform that selection. (paper)

  19. Addressing the computational cost of large EIT solutions.

    Science.gov (United States)

    Boyle, Alistair; Borsic, Andrea; Adler, Andy

    2012-05-01

    Electrical impedance tomography (EIT) is a soft field tomography modality based on the application of electric current to a body and measurement of voltages through electrodes at the boundary. The interior conductivity is reconstructed on a discrete representation of the domain using a finite-element method (FEM) mesh and a parametrization of that domain. The reconstruction requires a sequence of numerically intensive calculations. There is strong interest in reducing the cost of these calculations. An improvement in the compute time for current problems would encourage further exploration of computationally challenging problems such as the incorporation of time series data, wide-spread adoption of three-dimensional simulations and correlation of other modalities such as CT and ultrasound. Multicore processors offer an opportunity to reduce EIT computation times but may require some restructuring of the underlying algorithms to maximize the use of available resources. This work profiles two EIT software packages (EIDORS and NDRM) to experimentally determine where the computational costs arise in EIT as problems scale. Sparse matrix solvers, a key component for the FEM forward problem and sensitivity estimates in the inverse problem, are shown to take a considerable portion of the total compute time in these packages. A sparse matrix solver performance measurement tool, Meagre-Crowd, is developed to interface with a variety of solvers and compare their performance over a range of two- and three-dimensional problems of increasing node density. Results show that distributed sparse matrix solvers that operate on multiple cores are advantageous up to a limit that increases as the node density increases. We recommend a selection procedure to find a solver and hardware arrangement matched to the problem and provide guidance and tools to perform that selection.
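
    The scaling behaviour described, sparse factor/solve costs growing with node density, is easy to observe directly; a small timing sketch on FEM-like 2-D Laplacians (arbitrary sizes, no claim about EIDORS, NDRM or Meagre-Crowd).

```python
# Timing a sparse solve on 5-point Laplacians of increasing node density.
import time
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def laplacian_2d(n):
    """Sparse 5-point Laplacian on an n x n grid (n**2 unknowns)."""
    m = n * n
    main = 4.0 * np.ones(m)
    side = -np.ones(m - 1)
    side[np.arange(1, m) % n == 0] = 0.0   # no coupling across grid rows
    updown = -np.ones(m - n)
    return sp.diags([main, side, side, updown, updown],
                    [0, 1, -1, n, -n], format="csc")

for n in (50, 100, 200):
    A = laplacian_2d(n)
    b = np.ones(A.shape[0])
    t0 = time.perf_counter()
    x = spla.spsolve(A, b)
    print(f"{n * n:6d} unknowns: {time.perf_counter() - t0:.3f} s, "
          f"residual {np.linalg.norm(A @ x - b):.1e}")
```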

  20. An approximate fractional Gaussian noise model with O(n) computational cost

    KAUST Repository

    Sørbye, Sigrunn H.; Myrvoll-Nilsen, Eirik; Rue, Haavard

    2017-01-01

    Fractional Gaussian noise (fGn) is a stationary time series model with long memory properties applied in various fields like econometrics, hydrology and climatology. The computational cost in fitting an fGn model of length $n$ using a likelihood

  1. Adaptive Radar Signal Processing-The Problem of Exponential Computational Cost

    National Research Council Canada - National Science Library

    Rangaswamy, Muralidhar

    2003-01-01

    .... Extensions to handle the case of non-Gaussian clutter statistics are presented. Current challenges of limited training data support, computational cost, and severely heterogeneous clutter backgrounds are outlined...

  2. Costs incurred by applying computer-aided design/computer-aided manufacturing techniques for the reconstruction of maxillofacial defects.

    Science.gov (United States)

    Rustemeyer, Jan; Melenberg, Alex; Sari-Rieger, Aynur

    2014-12-01

    This study aims to evaluate the additional costs incurred by using a computer-aided design/computer-aided manufacturing (CAD/CAM) technique for reconstructing maxillofacial defects by analyzing typical cases. The medical charts of 11 consecutive patients who were subjected to the CAD/CAM technique were considered, and invoices from the companies providing the CAD/CAM devices were reviewed for every case. The number of devices used was significantly correlated with cost (r = 0.880). Significant differences in costs were found between cases in which prebent reconstruction plates were used (€3346.00 ± €29.00) and cases in which they were not (€2534.22 ± €264.48). Significant differences were also found between the costs of two, three and four devices, even when ignoring the cost of reconstruction plates. Additional fees provided by statutory health insurance covered a mean of 171.5% ± 25.6% of the cost of the CAD/CAM devices. Since the additional fees provide financial compensation, we believe that the CAD/CAM technique is suited for wide application and not restricted to complex cases. Where additional fees/funds are not available, the CAD/CAM technique might be unprofitable, so the decision whether or not to use it remains a case-to-case decision with respect to cost versus benefit. Copyright © 2014 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.

  3. The performance of low-cost commercial cloud computing as an alternative in computational chemistry.

    Science.gov (United States)

    Thackston, Russell; Fortenberry, Ryan C

    2015-05-05

    The growth of commercial cloud computing (CCC) as a viable means of computational infrastructure is largely unexplored for the purposes of quantum chemistry. In this work, the PSI4 suite of computational chemistry programs is installed on five different types of Amazon Web Services CCC platforms. The performance for a set of electronically excited state single-point energies is compared between these CCC platforms and typical, "in-house" physical machines. Further considerations are made for the number of cores or virtual CPUs (vCPUs, for the CCC platforms), but no considerations are made for full parallelization of the program (even though parallelization of the BLAS library is implemented), complete high-performance computing cluster utilization, or steal time. Even with this most pessimistic view of the computations, CCC resources are shown to be more cost effective for significant numbers of typical quantum chemistry computations. Large numbers of large computations are still best utilized by more traditional means, but smaller-scale research may be more effectively undertaken through CCC services. © 2015 Wiley Periodicals, Inc.

  4. A survey of cost accounting in service-oriented computing

    NARCIS (Netherlands)

    de Medeiros, Robson W.A.; Rosa, Nelson S.; Campos, Glaucia M.M.; Ferreira Pires, Luis

    Nowadays, companies are increasingly offering their business services through computational services on the Internet in order to attract more customers and increase their revenues. However, these services have financial costs that need to be managed in order to maximize profit. Several models and

  5. Low-Budget Computer Programming in Your School (An Alternative to the Cost of Large Computers). Illinois Series on Educational Applications of Computers. No. 14.

    Science.gov (United States)

    Dennis, J. Richard; Thomson, David

    This paper is concerned with a low cost alternative for providing computer experience to secondary school students. The brief discussion covers the programmable calculator and its relevance for teaching the concepts and the rudiments of computer programming and for computer problem solving. A list of twenty-five programming activities related to…

  6. Software Requirements for a System to Compute Mean Failure Cost

    Energy Technology Data Exchange (ETDEWEB)

    Aissa, Anis Ben [University of Tunis, Belvedere, Tunisia; Abercrombie, Robert K [ORNL; Sheldon, Frederick T [ORNL; Mili, Ali [New Jersey Insitute of Technology

    2010-01-01

    In earlier works, we presented a computational infrastructure that allows an analyst to estimate the security of a system in terms of the loss that each stakeholder stands to incur as a result of security breakdowns. We also demonstrated this infrastructure with results for an e-commerce case. In this paper, we illustrate this infrastructure by an application that supports the computation of the Mean Failure Cost (MFC) for each stakeholder.
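    The MFC is commonly assembled as a chain of stakes, dependency, and impact matrices applied to a vector of threat probabilities; the sketch below assumes that structure and uses made-up values, since the record does not give the application's actual data.

        import numpy as np

        # Hypothetical MFC chain: stakes (stakeholder x requirement),
        # dependency (requirement x component), impact (component x threat),
        # and threat emergence probabilities. All values are invented.
        ST = np.array([[900.0, 300.0],     # $ lost per failed requirement,
                       [120.0,  80.0]])    # one row per stakeholder
        DP = np.array([[0.8, 0.2],         # P(requirement fails | component fails)
                       [0.3, 0.7]])
        IM = np.array([[0.5, 0.1, 0.0],    # P(component fails | threat occurs)
                       [0.0, 0.4, 0.2]])
        PT = np.array([0.01, 0.05, 0.02])  # P(threat occurs) per unit time

        MFC = ST @ DP @ IM @ PT            # expected loss per stakeholder
        print(MFC)                         # -> approx. [17.46  3.12]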

  7. A low-cost vector processor boosting compute-intensive image processing operations

    Science.gov (United States)

    Adorf, Hans-Martin

    1992-01-01

    Low-cost vector processing (VP) is within reach of everyone seriously engaged in scientific computing. The advent of affordable add-on VP-boards for standard workstations, complemented by mathematical/statistical libraries, is beginning to impact compute-intensive tasks such as image processing. A case in point is the restoration of distorted images from the Hubble Space Telescope. A low-cost implementation is presented of the standard Tarasko-Richardson-Lucy restoration algorithm on an Intel i860-based VP-board which is seamlessly interfaced to a commercial, interactive image processing system. First experience is reported (including some benchmarks for standalone FFTs) and some conclusions are drawn.

  8. Low cost highly available digital control computer

    International Nuclear Information System (INIS)

    Silvers, M.W.

    1986-01-01

    When designing digital controllers for critical plant control it is important to provide several features. Among these are reliability, availability, maintainability, environmental protection, and low cost. An examination of several applications has led to a design that can be produced for approximately $20,000 (1000 control points). This design is compatible with modern concepts in distributed and hierarchical control. The canonical controller element is a dual-redundant self-checking computer that communicates with a cross-strapped, electrically isolated input/output system. The input/output subsystem comprises multiple intelligent input/output cards. These cards accept commands from the primary processor which are validated, executed, and acknowledged. Each card may be hot replaced to facilitate sparing. The implementation of the dual-redundant computer architecture is discussed. Called the FS-86, this computer can be used for a variety of applications. It has most recently found application in the upgrade of San Francisco's Bay Area Rapid Transit (BART) train control currently in progress and has been proposed for feedwater control in a boiling water reactor

  9. Comparative cost analysis -- computed tomography vs. alternative diagnostic procedures, 1977-1980

    International Nuclear Information System (INIS)

    Gempel, P.A.; Harris, G.H.; Evans, R.G.

    1977-12-01

    In comparing the total national cost of utilizing computed tomography (CT) for medically indicated diagnoses with that of conventional x-ray, ultrasonography, nuclear medicine, and exploratory surgery, this investigation concludes that there was little, if any, added net cost from CT use in 1977, nor will there be in 1980. Computed tomography, generally recognized as a reliable and useful diagnostic modality, has the potential to reduce net costs provided that an optimal number of units can be made available to physicians and patients to achieve projected reductions in alternative procedures. This study examines the actual cost impact of CT on both cranial and body diagnostic procedures. For abdominal and mediastinal disorders, CT scanning is just beginning to emerge as a diagnostic modality. As such, clinical experience is somewhat limited and the authors assume that no significant reduction in conventional procedures took place in 1977. It is estimated that the approximately 375,000 CT body procedures performed in 1977 represent only a 5 percent cost increase over use of other diagnostic modalities. It is projected that 2,400,000 CT body procedures will be performed in 1980 and, depending on assumptions used, total body diagnostic costs will increase only slightly or be reduced. Thirty-one tables appear throughout the text presenting cost data broken down by types of diagnostic procedures used and projections by years. Appendixes present technical cost components for diagnostic procedures, the comparative efficacy of CT as revealed in abstracts of published literature, selected medical diagnoses, and references

  10. Computing, Information, and Communications Technology (CICT) Program Overview

    Science.gov (United States)

    VanDalsem, William R.

    2003-01-01

    The Computing, Information and Communications Technology (CICT) Program's goal is to enable NASA's Scientific Research, Space Exploration, and Aerospace Technology Missions with greater mission assurance, for less cost, with increased science return through the development and use of advanced computing, information and communication technologies.

  11. External costs of nuclear: Greater or less than the alternatives?

    International Nuclear Information System (INIS)

    Rabl, Ari; Rabl, Veronika A.

    2013-01-01

    Since Fukushima many are calling for a shutdown of nuclear power plants. To see whether such a shutdown would reduce the risks for health and environment, the external costs of nuclear electricity are compared with alternatives that could replace it. The frequency of catastrophic nuclear accidents is based on the historical record, about one in 25 years for the plants built to date, an order of magnitude higher than the safety goals of the U.S. Nuclear Regulatory Commission. Impacts similar to Chernobyl and Fukushima are assumed to estimate the cost. A detailed comparison is presented with wind as alternative with the lowest external cost. The variability of wind necessitates augmentation by other sources, primarily fossil fuels, because storage at the required scale is in most regions too expensive. The external costs of natural gas combined cycle are taken as 0.6 €cent/kWh due to health effects of air pollution and 1.25 €cent/kWh due to greenhouse gases (at 25 €/t CO₂-eq) for the central estimate, but a wide range of different parameters is also considered, both for nuclear and for the alternatives. Although the central estimate of external costs of the wind-based alternative is higher than that of nuclear, the uncertainty ranges overlap. - Highlights: ► The external costs of nuclear electricity are compared with the alternatives. ► Frequency and cost of nuclear accidents based on Chernobyl and Fukushima. ► Detailed comparison with wind as alternative with the lowest external costs. ► High external cost of wind because of natural gas backup (storage too limited). ► External costs of wind higher than nuclear but uncertainty ranges overlap

  12. Client-server computer architecture saves costs and eliminates bottlenecks

    International Nuclear Information System (INIS)

    Darukhanavala, P.P.; Davidson, M.C.; Tyler, T.N.; Blaskovich, F.T.; Smith, C.

    1992-01-01

    This paper reports that a workstation-based, client-server architecture saved costs and eliminated bottlenecks that BP Exploration (Alaska) Inc. experienced with mainframe computer systems. In 1991, BP embarked on an ambitious project to change technical computing for its Prudhoe Bay, Endicott, and Kuparuk operations on Alaska's North Slope. This project promised substantial rewards, but also involved considerable risk. The project plan called for reservoir simulations (which historically had run on a Cray Research Inc. X-MP supercomputer in the company's Houston data center) to be run on small computer workstations. Additionally, large Prudhoe Bay, Endicott, and Kuparuk production and reservoir engineering data bases and related applications also would be moved to workstations, replacing a Digital Equipment Corp. VAX cluster in Anchorage

  13. hPIN/hTAN: Low-Cost e-Banking Secure against Untrusted Computers

    Science.gov (United States)

    Li, Shujun; Sadeghi, Ahmad-Reza; Schmitz, Roland

    We propose hPIN/hTAN, a low-cost token-based e-banking protection scheme for the case when the adversary has full control over the user's computer. Compared with existing hardware-based solutions, hPIN/hTAN depends on neither a second trusted channel, nor a secure keypad, nor a computationally expensive encryption module.

  14. Travel costs associated with flood closures of state highways near Centralia/Chehalis, Washington.

    Science.gov (United States)

    2014-09-01

    This report discusses the travel costs associated with the closure of roads in the greater Centralia/Chehalis, Washington region due to 100-year flood conditions starting on the Chehalis River. The costs were computed for roadway closures on I-5,...

  15. Low-cost autonomous perceptron neural network inspired by quantum computation

    Science.gov (United States)

    Zidan, Mohammed; Abdel-Aty, Abdel-Haleem; El-Sadek, Alaa; Zanaty, E. A.; Abdel-Aty, Mahmoud

    2017-11-01

    Achieving low-cost learning with reliable accuracy is one of the important goals for intelligent machines: saving time and energy, and performing the learning process on machines with limited computational resources. In this paper, we propose an efficient algorithm for a perceptron neural network inspired by quantum computing, composed of a single neuron, that classifies linearly separable applications after a single training iteration, O(1). The algorithm is applied to a real-world data set, and the results outperform the other state-of-the-art algorithms.

  16. Limited risk assessment and some cost/benefit considerations for greater confinement disposal compared to shallow land burial

    International Nuclear Information System (INIS)

    Hunter, P.H.; Lester, D.H.; Robertson, L.D.; Spaeth, M.E.; Stoddard, J.A.; Dickman, P.T.

    1984-09-01

    A limited risk assessment and some cost/benefit considerations of greater confinement disposal (GCD) compared to shallow land burial (SLB) are presented. This study is limited to an analysis of the postclosure phase of hypothetical GCD and SLB facilities. Selected release scenarios are used which bound the range of risks to a maximally exposed individual and a hypothetical population. Based on the scenario assessments, GCD had a significant risk advantage over SLB for normal exposure pathways at both humid and arid sites, particularly for the human intrusion scenario. Since GCD costs are somewhat higher than SLB, it is necessary to weigh the higher costs of GCD against the higher risks of SLB. In this regard, GCD should be pursued as an alternative to SLB for certain types of low-level waste, and as an alternative to processing for wastes requiring improved stabilization or higher integrity packaging to be compatible with SLB. There are two reasons for this conclusion. First, GCD might diminish public apprehension regarding the disposal of wastes perceived to be too hazardous for SLB. Second, GCD may be a relatively cost-effective alternative to various stabilization and packaging schemes required to meet 10 CFR 61 near-surface requirements as well as being a cost-effective alternative to deep geologic disposal. Radionuclide transport through the biosphere and resultant dose consequences were determined using the RADTRAN radionuclide transport code. 19 references, 4 figures, 5 tables

  17. A low cost computer-controlled electrochemical measurement system for education and research

    International Nuclear Information System (INIS)

    Cottis, R.A.

    1989-01-01

    With the advent of low cost computers of significant processing power, it has become economically attractive, as well as offering practical advantages, to replace conventional electrochemical instrumentation with computer-based equipment. For example, the equipment to be described can perform all of the functions required for the measurement of a potentiodynamic polarization curve, replacing the conventional arrangement of sweep generator, potentiostat and chart recorder at a cost (based on the purchase cost of parts) which is less than that of most chart recorders alone. Additionally the use of computer control at a relatively low level provides a versatility (assuming the development of suitable software) which cannot easily be matched by conventional instruments. As a result of these considerations a simple computer-controlled electrochemical measurement system has been developed, with a primary aim being its use in teaching an MSc class in corrosion science and engineering, with additional applications in MSc and PhD research. For educational reasons, the user interface has been designed to make the internal operation of the unit as obvious as possible, and thereby minimize the tendency for students to treat the unit as a 'black box' with incomprehensible inner workings. This has resulted in a unit in which the three main components of function generator, potentiostat and recorder are presented as independent areas on the front panel, and can be configured by the user in exactly the same way as conventional instruments. (author) 11 figs

  18. A low cost computer-controlled electrochemical measurement system for education and research

    Energy Technology Data Exchange (ETDEWEB)

    Cottis, R A [Manchester Univ. (UK). Inst. of Science and Technology

    1989-01-01

    With the advent of low cost computers of significant processing power, it has become economically attractive, as well as offering practical advantages, to replace conventional electrochemical instrumentation with computer-based equipment. For example, the equipment to be described can perform all of the functions required for the measurement of a potentiodynamic polarization curve, replacing the conventional arrangement of sweep generator, potentiostat and chart recorder at a cost (based on the purchase cost of parts) which is less than that of most chart recorders alone. Additionally the use of computer control at a relatively low level provides a versatility (assuming the development of suitable software) which cannot easily be matched by conventional instruments. As a result of these considerations a simple computer-controlled electrochemical measurement system has been developed, with a primary aim being its use in teaching an MSc class in corrosion science and engineering, with additional applications in MSc and PhD research. For educational reasons, the user interface has been designed to make the internal operation of the unit as obvious as possible, and thereby minimize the tendency for students to treat the unit as a 'black box' with incomprehensible inner workings. This has resulted in a unit in which the three main components of function generator, potentiostat and recorder are presented as independent areas on the front panel, and can be configured by the user in exactly the same way as conventional instruments. (author) 11 figs.

  19. Computer-Aided Surgical Simulation in Head and Neck Reconstruction: A Cost Comparison among Traditional, In-House, and Commercial Options.

    Science.gov (United States)

    Li, Sean S; Copeland-Halperin, Libby R; Kaminsky, Alexander J; Li, Jihui; Lodhi, Fahad K; Miraliakbari, Reza

    2018-06-01

    Computer-aided surgical simulation (CASS) has redefined surgery, improved precision and reduced the reliance on intraoperative trial-and-error manipulations. CASS is provided by third-party services; however, it may be cost-effective for some hospitals to develop in-house programs. This study provides the first cost analysis comparison among traditional (no CASS), commercial CASS, and in-house CASS for head and neck reconstruction. The costs of three-dimensional (3D) pre-operative planning for mandibular and maxillary reconstructions were obtained from an in-house CASS program at our large tertiary care hospital in Northern Virginia, as well as a commercial provider (Synthes, Paoli, PA). A cost comparison was performed among these modalities and extrapolated in-house CASS costs were derived. The calculations were based on estimated CASS use with cost structures similar to our institution, and sunk costs were amortized over 10 years. Average operating room time was estimated at 10 hours, with an average of 2 hours saved with CASS. The hourly cost to the hospital for the operating room (including anesthesia and other ancillary costs) was estimated at $4,614/hour. Per case, traditional cases were $46,140, commercial CASS cases were $40,951, and in-house CASS cases were $38,212. Annual in-house CASS costs were $39,590. CASS reduced operating room time, likely due to improved efficiency and accuracy. Our data demonstrate that hospitals with a cost structure similar to ours, performing more than 27 cases of 3D head and neck reconstruction per year, can see a financial benefit from developing an in-house CASS program.
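    The arithmetic behind the per-case figures can be reconstructed from the abstract alone: ten OR hours at $4,614/hour for traditional cases, eight hours with CASS, plus an implied per-case planning fee. The sketch below backs those fees out of the quoted totals; it is illustrative and omits the amortization details of the article's model.

        OR_RATE = 4614.0                   # $/hour, incl. anesthesia/ancillary
        TRAD_H, CASS_H = 10.0, 8.0         # OR hours without / with CASS

        per_case = {"traditional": 46140.0, "commercial": 40951.0,
                    "in-house": 38212.0}   # totals quoted in the abstract

        # Implied non-OR (planning/device) cost per case, backed out of totals:
        for mode, total in per_case.items():
            hours = TRAD_H if mode == "traditional" else CASS_H
            planning = total - hours * OR_RATE
            print(f"{mode:11s}: OR ${hours * OR_RATE:,.0f} + planning ${planning:,.0f}")

    Backed out this way, the commercial service implies roughly $4,039 of planning cost per case against about $1,300 in-house, which is where the volume-dependent break-even against the $39,590 annual program cost arises.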

  20. Implementing and developing cloud computing applications

    CERN Document Server

    Sarna, David E Y

    2010-01-01

    From small start-ups to major corporations, companies of all sizes have embraced cloud computing for the scalability, reliability, and cost benefits it can provide. It has even been said that cloud computing may have a greater effect on our lives than the PC and dot-com revolutions combined. Filled with comparative charts and decision trees, Implementing and Developing Cloud Computing Applications explains exactly what it takes to build robust and highly scalable cloud computing applications in any organization. Covering the major commercial offerings available, it provides authoritative guidance.

  1. Modelling the Intention to Adopt Cloud Computing Services: A Transaction Cost Theory Perspective

    Directory of Open Access Journals (Sweden)

    Ogan Yigitbasioglu

    2014-11-01

    This paper uses transaction cost theory to study cloud computing adoption. A model is developed and tested with data from an Australian survey. According to the results, perceived vendor opportunism and perceived legislative uncertainty around cloud computing were significantly associated with perceived cloud computing security risk. There was also a significant negative relationship between perceived cloud computing security risk and the intention to adopt cloud services. This study also reports on adoption rates of cloud computing in terms of applications, as well as the types of services used.

  2. Decommissioning costing approach based on the standardised list of costing items. Lessons learnt by the OMEGA computer code

    International Nuclear Information System (INIS)

    Daniska, Vladimir; Rehak, Ivan; Vasko, Marek; Ondra, Frantisek; Bezak, Peter; Pritrsky, Jozef; Zachar, Matej; Necas, Vladimir

    2011-01-01

    The document 'A Proposed Standardised List of Items for Costing Purposes' was issued in 1999 by OECD/NEA, IAEA and the European Commission (EC) to promote harmonisation in decommissioning costing. It is a systematic list of decommissioning activities classified in chapters 01 to 11 with three numbered levels, and four cost groups are defined for the cost at each level. The document constitutes a standardised matrix of decommissioning activities and cost groups with definitions of the content of each item. Knowing what is behind the items makes the comparison of costs for decommissioning projects transparent. Two approaches are identified for use of the standardised cost structure. The first approach converts cost data from existing project-specific cost structures into the standardised cost structure for the purpose of cost presentation. The second approach uses the standardised cost structure as the basis for the cost calculation structure; the calculated cost data are then formatted in the standardised cost format directly, and several additional advantages may be identified in this approach. The paper presents the costing methodology based on the standardised cost structure and lessons learnt from the last ten years of implementing the standardised cost structure as the cost calculation structure in the computer code OMEGA. The code also includes on-line management of decommissioning waste, decay of radioactivity, evaluation of exposure, and generation and optimisation of the Gantt chart of a decommissioning project, which makes OMEGA an effective tool for planning and optimisation of decommissioning processes. (author)
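    As a data structure, the standardised matrix amounts to activities keyed by a hierarchical item code with a small fixed set of cost groups per item; a calculation organized this way emits standardised-format costs directly. A minimal sketch, with illustrative codes and group names rather than the ones defined in the 1999 document:

        from collections import defaultdict

        # Activities keyed by a three-level item code, with a fixed set of
        # cost groups per item; codes and group names here are illustrative,
        # not the ones defined in the 1999 document.
        COST_GROUPS = ("labour", "investment", "expenses", "contingency")
        matrix = defaultdict(lambda: dict.fromkeys(COST_GROUPS, 0.0))

        matrix["04.02.01"]["labour"] = 125_000.0       # e.g. a dismantling task
        matrix["04.02.01"]["contingency"] = 12_500.0

        def chapter_total(prefix):
            return sum(sum(groups.values())
                       for code, groups in matrix.items()
                       if code.startswith(prefix))

        print(chapter_total("04"))                     # roll-up by chapter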

  3. Synchronized Pair Configuration in Virtualization-Based Lab for Learning Computer Networks

    Science.gov (United States)

    Kongcharoen, Chaknarin; Hwang, Wu-Yuin; Ghinea, Gheorghita

    2017-01-01

    More studies are concentrating on using virtualization-based labs to facilitate computer or network learning concepts. Some benefits are lower hardware costs and greater flexibility in reconfiguring computer and network environments. However, few studies have investigated effective mechanisms for using virtualization fully for collaboration.…

  4. Cost-effective computational method for radiation heat transfer in semi-crystalline polymers

    Science.gov (United States)

    Boztepe, Sinan; Gilblas, Rémi; de Almeida, Olivier; Le Maoult, Yannick; Schmidt, Fabrice

    2018-05-01

    This paper introduces a cost-effective numerical model for infrared (IR) heating of semi-crystalline polymers. For the numerical and experimental studies presented here, semi-crystalline polyethylene (PE) was used. The optical properties of PE were experimentally analyzed under varying temperature, and the obtained results were used as input in the numerical studies. The model was built on an optically homogeneous medium assumption, while the strong variation in the thermo-optical properties of semi-crystalline PE under heating was taken into account. Thus, the change in the amount of radiative energy absorbed by the PE medium, induced by its temperature-dependent thermo-optical properties, was introduced in the model. The computational study was carried out as an iterative closed loop, where the absorbed radiation was computed using an in-house radiation heat transfer algorithm, RAYHEAT, and the computed results were transferred into the commercial software COMSOL Multiphysics to solve the transient heat transfer problem and predict the temperature field. The predicted temperature field was used to iterate the thermo-optical properties of PE that vary under heating. In order to analyze the accuracy of the numerical model, experimental analyses were carried out performing IR-thermographic measurements during the heating of the PE plate. The applicability of the model in terms of computational cost, number of numerical inputs and accuracy was highlighted.
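    The iteration described here alternates a radiation solve with a heat-transfer step, updating the optical properties from the new temperature field on each pass. The sketch below mimics only that loop structure; the property law, source term, and crude 1-D update are placeholders, not RAYHEAT or the COMSOL model.

        import numpy as np

        # Closed-loop structure only: recompute the absorbed radiation whenever
        # the temperature-dependent optical properties change, then advance the
        # heat equation. All constants and models below are placeholders.
        def coupled_heating(T, n_steps, dt=0.05, alpha=1e-3):
            for _ in range(n_steps):
                kappa = 100.0 * (1.0 + 0.002 * (T - 20.0))    # absorption vs. T
                q_abs = 5e4 * (1.0 - np.exp(-kappa * 1e-3))   # Beer-Lambert-like
                lap = np.gradient(np.gradient(T))             # crude 1-D Laplacian
                T = T + dt * (alpha * lap + 1e-3 * q_abs)     # explicit step
            return T

        T_final = coupled_heating(np.full(50, 20.0), n_steps=200)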

  5. Is computer aided detection (CAD) cost effective in screening mammography? A model based on the CADET II study

    Science.gov (United States)

    2011-01-01

    Background Single reading with computer aided detection (CAD) is an alternative to double reading for detecting cancer in screening mammograms. The aim of this study is to investigate whether the use of a single reader with CAD is more cost-effective than double reading. Methods Based on data from the CADET II study, the cost-effectiveness of single reading with CAD versus double reading was measured in terms of cost per cancer detected. The cost (pounds sterling (£), year 2007/08) of single reading with CAD versus double reading was estimated assuming a health and social service perspective and a 7 year time horizon. As the equipment cost varies according to the unit size, a separate analysis was conducted for high, average and low volume screening units. One-way sensitivity analyses were performed by varying the reading time, equipment and assessment cost, recall rate and reader qualification. Results CAD is cost increasing for all sizes of screening unit. The introduction of CAD is cost-increasing compared to double reading because the cost of CAD equipment, staff training and the higher assessment cost associated with CAD are greater than the saving in reading costs. The introduction of single reading with CAD, in place of double reading, would produce an additional cost of £227 and £253 per 1,000 women screened in high and average volume units respectively. In low volume screening units, the high cost of purchasing the equipment will result in an additional cost of £590 per 1,000 women screened. One-way sensitivity analysis showed that the factors having the greatest effect on the cost-effectiveness of CAD with single reading compared with double reading were the reading time and the reader's professional qualification (radiologist versus advanced practitioner). Conclusions Without improvements in CAD effectiveness (e.g. a decrease in the recall rate) CAD is unlikely to be a cost effective alternative to double reading for mammography screening in the UK. This study
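    The structure of the comparison is: CAD adds equipment (amortized over the unit's screening volume), training, and extra assessments from a higher recall rate, and saves the second reader's time. The toy model below has that shape; every number in it is a placeholder, not a CADET II figure.

        # Shape of the model: per 1,000 women screened, CAD adds amortized
        # equipment, training and extra assessments, and saves the second
        # reader's time. Every value below is a placeholder, not CADET II data.
        def cad_net_cost_per_1000(equipment_annual, women_per_year,
                                  extra_recalls_per_1000, assessment_cost,
                                  reading_min_saved, reader_cost_per_min):
            equipment = 1000 * equipment_annual / women_per_year
            assessments = extra_recalls_per_1000 * assessment_cost
            reading_saved = 1000 * reading_min_saved * reader_cost_per_min
            return equipment + assessments - reading_saved

        # A low-volume unit spreads the same equipment cost over fewer women:
        for women in (40_000, 80_000):
            print(women, cad_net_cost_per_1000(30_000, women, 4, 120.0, 0.5, 1.2))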

  6. Computational cost of isogeometric multi-frontal solvers on parallel distributed memory machines

    KAUST Repository

    Woźniak, Maciej; Paszyński, Maciej R.; Pardo, D.; Dalcin, Lisandro; Calo, Victor M.

    2015-01-01

    This paper derives theoretical estimates of the computational cost for isogeometric multi-frontal direct solver executed on parallel distributed memory machines. We show theoretically that for the C^(p-1) global continuity of the isogeometric solution

  7. Hybrid Cloud Computing Architecture Optimization by Total Cost of Ownership Criterion

    Directory of Open Access Journals (Sweden)

    Elena Valeryevna Makarenko

    2014-12-01

    Achieving the goals of information security is a key factor in the decision to outsource information technology and, in particular, in the decision to migrate organizational data, applications, and other resources to an infrastructure based on cloud computing. And the key issue in the selection of an optimal architecture and the subsequent migration of business applications and data to the cloud information environment is the total cost of ownership of the IT infrastructure. This paper focuses on solving the problem of minimizing the total cost of ownership of the cloud.

  8. Development of a computer program for the cost analysis of spent fuel management

    International Nuclear Information System (INIS)

    Choi, Heui Joo; Lee, Jong Youl; Choi, Jong Won; Cha, Jeong Hun; Whang, Joo Ho

    2009-01-01

    So far, a substantial amount of spent fuel has been generated from the PWR and CANDU reactors. It is being temporarily stored at the nuclear power plant sites. It is expected that the temporary storage facilities will be full by around 2016. The government plans to solve the problem by constructing an interim storage facility soon. The radioactive waste management act was enacted in 2008 to ensure the safe management of spent fuel in Korea. According to the act, the radioactive waste management fund, which will be used for the transportation, interim storage, and final disposal of spent fuel, has been established. The cost of spent fuel management is surprisingly high and involves considerable uncertainty. KAERI and Kyunghee University have developed cost estimation tools to evaluate the cost of spent fuel management based on engineering design and calculation. Developing such a tool is not easy while the national policy on spent fuel management remains unsettled; thus, the current version of the computer program is based on the current conceptual design of each management system. The main purpose of this paper is to introduce the computer program developed for the cost analysis of spent fuel management. In order to show the application of the program, a spent fuel management scenario is prepared, and the cost for the scenario is estimated

  9. Incentive motivation deficits in schizophrenia reflect effort computation impairments during cost-benefit decision-making.

    Science.gov (United States)

    Fervaha, Gagan; Graff-Guerrero, Ariel; Zakzanis, Konstantine K; Foussias, George; Agid, Ofer; Remington, Gary

    2013-11-01

    Motivational impairments are a core feature of schizophrenia and although there are numerous reports studying this feature using clinical rating scales, objective behavioural assessments are lacking. Here, we use a translational paradigm to measure incentive motivation in individuals with schizophrenia. Sixteen stable outpatients with schizophrenia and sixteen matched healthy controls completed a modified version of the Effort Expenditure for Rewards Task that accounts for differences in motoric ability. Briefly, subjects were presented with a series of trials where they may choose to expend a greater amount of effort for a larger monetary reward versus less effort for a smaller reward. Additionally, the probability of receiving money for a given trial was varied at 12%, 50% and 88%. Clinical and other reward-related variables were also evaluated. Patients opted to expend greater effort significantly less than controls for trials of high, but uncertain (i.e. 50% and 88% probability) incentive value, which was related to amotivation and neurocognitive deficits. Other abnormalities were also noted but were related to different clinical variables such as impulsivity (low reward and 12% probability). These motivational deficits were not due to group differences in reward learning, reward valuation or hedonic capacity. Our findings offer novel support for incentive motivation deficits in schizophrenia. Clinical amotivation is associated with impairments in the computation of effort during cost-benefit decision-making. This objective translational paradigm may guide future investigations of the neural circuitry underlying these motivational impairments.

  10. Socio-economic considerations of cleaning Greater Vancouver's air

    International Nuclear Information System (INIS)

    2005-08-01

    Socio-economic considerations of better air quality on the Greater Vancouver population and economy were discussed. The purpose of the study was to provide socio-economic information to staff and stakeholders of the Greater Vancouver Regional District (GVRD) who are participating in an Air Quality Management Plan (AQMP) development process and the Sustainable Region Initiative (SRI) process. The study incorporated the following methodologies: identification and review of Canadian, American, and European quantitative socio-economic, cost-benefit, cost effectiveness, competitiveness and health analyses of changes in air quality and measures to improve air quality; interviews with industry representatives in Greater Vancouver on competitiveness impacts of air quality changes and ways to improve air quality; and a qualitative analysis and discussion of secondary quantitative information that identifies and evaluates socio-economic impacts arising from changes in Greater Vancouver air quality. The study concluded that for the Greater Vancouver area, the qualitative analysis of an improvement in Greater Vancouver air quality shows positive socio-economic outcomes, as high positive economic efficiency impacts are expected along with good social quality of life impacts. 149 refs., 30 tabs., 6 appendices

  11. A Performance/Cost Evaluation for a GPU-Based Drug Discovery Application on Volunteer Computing

    Science.gov (United States)

    Guerrero, Ginés D.; Imbernón, Baldomero; García, José M.

    2014-01-01

    Bioinformatics is an interdisciplinary research field that develops tools for the analysis of large biological databases, and, thus, the use of high performance computing (HPC) platforms is mandatory for the generation of useful biological knowledge. The latest generation of graphics processing units (GPUs) has democratized the use of HPC as they push desktop computers to cluster-level performance. Many applications within this field have been developed to leverage these powerful and low-cost architectures. However, these applications still need to scale to larger GPU-based systems to enable remarkable advances in the fields of healthcare, drug discovery, genome research, etc. The inclusion of GPUs in HPC systems exacerbates power and temperature issues, increasing the total cost of ownership (TCO). This paper explores the benefits of volunteer computing to scale bioinformatics applications as an alternative to owning large GPU-based local infrastructures. We use as a benchmark a GPU-based drug discovery application called BINDSURF, whose computational requirements go beyond a single desktop machine. Volunteer computing is presented as a cheap and valid HPC system for those bioinformatics applications that need to process huge amounts of data and where the response time is not a critical factor. PMID:25025055

  12. Assessing Tax Form Distribution Costs: A Proposed Method for Computing the Dollar Value of Tax Form Distribution in a Public Library.

    Science.gov (United States)

    Casey, James B.

    1998-01-01

    Explains how a public library can compute the actual cost of distributing tax forms to the public by listing all direct and indirect costs and demonstrating the formulae and necessary computations. Supplies directions for calculating costs involved for all levels of staff as well as associated public relations efforts, space, and utility costs.…
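    The dollar-value computation described here amounts to summing loaded staff time with allocated space, utility, and publicity costs. A minimal sketch of that kind of formula, with hypothetical categories and rates:

        # Hypothetical fully loaded cost formula: staff time at each level,
        # plus allocated space, utilities, and publicity. All figures invented.
        def distribution_cost(staff_hours, hourly_rates, sqft, cost_per_sqft,
                              utilities, publicity):
            staff = sum(h * r for h, r in zip(staff_hours, hourly_rates))
            return staff + sqft * cost_per_sqft + utilities + publicity

        total = distribution_cost(
            staff_hours=[120, 40, 10],       # clerk, librarian, manager
            hourly_rates=[14.0, 24.0, 38.0], # loaded $/hour by level
            sqft=80, cost_per_sqft=18.0,     # floor space given to forms
            utilities=350.0, publicity=200.0)
        print(f"annual tax-form distribution cost: ${total:,.2f}")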

  13. Computational cost estimates for parallel shared memory isogeometric multi-frontal solvers

    KAUST Repository

    Woźniak, Maciej; Kuźnik, Krzysztof M.; Paszyński, Maciej R.; Calo, Victor M.; Pardo, D.

    2014-01-01

    In this paper we present computational cost estimates for parallel shared memory isogeometric multi-frontal solvers. The estimates show that the ideal isogeometric shared memory parallel direct solver scales as O(p^2 log(N/p)) for one dimensional problems, O(N p^2) for two dimensional problems, and O(N^(4/3) p^2) for three dimensional problems, where N is the number of degrees of freedom and p is the polynomial order of approximation. The computational costs of the shared memory parallel isogeometric direct solver are compared with those of the sequential isogeometric direct solver, the latter being equal to O(N p^2) for the one dimensional case, O(N^(1.5) p^3) for the two dimensional case, and O(N^2 p^3) for the three dimensional case. The shared memory version thus significantly reduces the cost dependence on both N and p. Theoretical estimates are compared with numerical experiments performed with linear, quadratic, cubic, quartic, and quintic B-splines, in one and two spatial dimensions.

  14. Computational cost estimates for parallel shared memory isogeometric multi-frontal solvers

    KAUST Repository

    Woźniak, Maciej

    2014-06-01

    In this paper we present computational cost estimates for parallel shared memory isogeometric multi-frontal solvers. The estimates show that the ideal isogeometric shared memory parallel direct solver scales as O(p^2 log(N/p)) for one dimensional problems, O(N p^2) for two dimensional problems, and O(N^(4/3) p^2) for three dimensional problems, where N is the number of degrees of freedom and p is the polynomial order of approximation. The computational costs of the shared memory parallel isogeometric direct solver are compared with those of the sequential isogeometric direct solver, the latter being equal to O(N p^2) for the one dimensional case, O(N^(1.5) p^3) for the two dimensional case, and O(N^2 p^3) for the three dimensional case. The shared memory version thus significantly reduces the cost dependence on both N and p. Theoretical estimates are compared with numerical experiments performed with linear, quadratic, cubic, quartic, and quintic B-splines, in one and two spatial dimensions.
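    Written out, the quoted estimates make the parallel gain easy to eyeball; the snippet below evaluates the ratios with all O(·) constants set to 1, which the estimates themselves leave unspecified.

        import math

        # The quoted estimates with all O(.) constants set to 1, so the
        # parallel speedup can be eyeballed for a given N and p.
        def sequential(N, p, dim):
            return {1: N * p**2, 2: N**1.5 * p**3, 3: N**2 * p**3}[dim]

        def shared_memory(N, p, dim):
            return {1: p**2 * math.log(N / p),
                    2: N * p**2,
                    3: N**(4/3) * p**2}[dim]

        N, p = 10**6, 3   # a million degrees of freedom, cubic B-splines
        for dim in (1, 2, 3):
            print(dim, sequential(N, p, dim) / shared_memory(N, p, dim))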

  15. The ability of land owners and their cooperatives to leverage payments greater than opportunity costs from conservation contracts.

    Science.gov (United States)

    Lennox, Gareth D; Armsworth, Paul R

    2013-06-01

    In negotiations over land-right acquisitions, landowners have an informational advantage over conservation groups because they know more about the opportunity costs of conservation measures on their sites. This advantage creates the possibility that landowners will demand payments greater than the required minimum, where this minimum required payment is known as the landowner's willingness to accept (WTA). However, in recent studies of conservation costs, researchers have assumed landowners will accept conservation with minimum payments. We investigated the ability of landowners to demand payments above their WTA when a conservation group has identified multiple sites for protection. First, we estimated the maximum payment landowners could potentially demand, which is set when groups of landowners act as a cooperative. Next, through the simulation of conservation auctions, we explored the amount of money above landowners' WTA (i.e., surplus) that conservation groups could cede to secure conservation agreements, again investigating the influence of landowner cooperatives. The simulations showed the informational advantage landowners held could make conservation investments up to 42% more expensive than suggested by the site WTAs. Moreover, all auctions resulted in landowners obtaining payments greater than their WTA; thus, it may be unrealistic to assume landowners will accept conservation contracts with minimum payments. Of particular significance for species conservation, conservation objectives focused on overall species richness, which therefore recognize site complementarity, create an incentive for landowners to form cooperatives to capture surplus. On the contrary, objectives in which sites are substitutes, such as the maximization of species occurrences, create a disincentive for cooperative formation.

  16. A nearly-linear computational-cost scheme for the forward dynamics of an N-body pendulum

    Science.gov (United States)

    Chou, Jack C. K.

    1989-01-01

    The dynamic equations of motion of an n-body pendulum with spherical joints are derived to be a mixed system of differential and algebraic equations (DAEs). The DAEs are kept in implicit form to save arithmetic and preserve the sparsity of the system, and are solved by the robust implicit integration method. At each solution point, the predicted solution is corrected to its exact solution within a given tolerance using Newton's iterative method. For each iteration, a linear system of the form J Δx = E has to be solved. The computational cost of solving this linear system directly by LU factorization is O(n^3), and it can be reduced significantly by exploiting the structure of J. It is shown that by recognizing the recursive patterns and exploiting the sparsity of the system, the multiplicative and additive computational costs for solving J Δx = E are O(n) and O(n^2), respectively. The formulation and solution method for an n-body pendulum is presented. The computational cost is shown to be nearly linearly proportional to the number of bodies.
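    The corrector described here is an ordinary Newton iteration in which the savings come entirely from how J Δx = E is solved. A sketch under the simplifying assumption of a chain-structured, sparse Jacobian, with a toy residual standing in for the pendulum DAE:

        import numpy as np
        from scipy.sparse import diags
        from scipy.sparse.linalg import spsolve

        # Newton corrector: the cost reduction comes from solving J dx = E
        # with a sparse factorization instead of dense O(n^3) LU.
        def newton_correct(x, residual, jacobian, tol=1e-10, max_iter=20):
            for _ in range(max_iter):
                E = residual(x)
                if np.linalg.norm(E) < tol:
                    break
                dx = spsolve(jacobian(x).tocsc(), -E)  # exploits sparsity of J
                x = x + dx
            return x

        # Toy chain-structured system standing in for the pendulum DAE residual.
        n = 100
        A = diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n))
        x = newton_correct(np.zeros(n),
                           residual=lambda x: A @ x + np.sin(x) - 1.0,
                           jacobian=lambda x: A + diags(np.cos(x)))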

  17. The cognitive dynamics of computer science cost-effective large scale software development

    CERN Document Server

    De Gyurky, Szabolcs Michael; John Wiley & Sons

    2006-01-01

    This book has three major objectives: To propose an ontology for computer software; To provide a methodology for development of large software systems to cost and schedule that is based on the ontology; To offer an alternative vision regarding the development of truly autonomous systems.

  18. Cost-effective computations with boundary interface operators in elliptic problems

    International Nuclear Information System (INIS)

    Khoromskij, B.N.; Mazurkevich, G.E.; Nikonov, E.G.

    1993-01-01

    The numerical algorithm for fast computations with interface operators associated with the elliptic boundary value problems (BVP) defined on step-type domains is presented. The algorithm is based on the asymptotically almost optimal technique developed for treatment of the discrete Poincare-Steklov (PS) operators associated with the finite-difference Laplacian on rectangles when using the uniform grid with a 'displacement by h/2'. The approach can be regarded as an extension of the method proposed for the partial solution of the finite-difference Laplace equation to the case of displaced grids and mixed boundary conditions. It is shown that the action of the PS operator for the Dirichlet problem and mixed BVP can be computed with expenses of the order of O(N log^2 N) both for arithmetical operations and computer memory needs, where N is the number of unknowns on the rectangle boundary. The single domain algorithm is applied to solving the multidomain elliptic interface problems with piecewise constant coefficients. The numerical experiments presented confirm almost linear growth of the computational costs and memory needs with respect to the dimension of the discrete interface problem. 14 refs., 3 figs., 4 tabs

  19. Matching Cost Filtering for Dense Stereo Correspondence

    Directory of Open Access Journals (Sweden)

    Yimin Lin

    2013-01-01

    Dense stereo correspondence, enabling reconstruction of depth information in a scene, is of great importance in the field of computer vision. Recently, some local solutions based on matching cost filtering with an edge-preserving filter have been proved capable of achieving higher accuracy than global approaches. Unfortunately, the computational complexity of these algorithms is quadratically related to the window size used to aggregate the matching costs. The recent trend has been to pursue higher accuracy with greater efficiency in execution. Therefore, this paper proposes a new cost-aggregation module that computes the matching responses for all the image pixels at a set of sampling points generated by a hierarchical clustering algorithm. The complexity of this implementation is linear both in the number of image pixels and the number of clusters. Experimental results demonstrate that the proposed algorithm outperforms state-of-the-art local methods in terms of both accuracy and speed. Moreover, performance tests indicate that parameters such as the height of the hierarchical binary tree and the spatial and range standard deviations have a significant influence on time consumption and the accuracy of disparity maps.
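    In its simplest local form, the pipeline this paper improves on is: per-disparity matching cost, window aggregation, winner-take-all. The sketch below uses a plain box filter for the aggregation step; the edge-preserving and hierarchically clustered variants discussed above replace exactly that call.

        import numpy as np
        from scipy.ndimage import uniform_filter

        # Simplest local pipeline: per-disparity absolute-difference costs,
        # box-filter aggregation, winner-take-all. Edge-preserving or
        # clustered aggregation would replace the uniform_filter call.
        def disparity_map(left, right, max_disp=16, win=7):
            h, w = left.shape                  # float grayscale, same shape
            cost = np.full((max_disp, h, w), np.inf)
            for d in range(max_disp):
                diff = np.abs(left[:, d:] - right[:, :w - d])
                cost[d, :, d:] = uniform_filter(diff, size=win)
            return cost.argmin(axis=0)         # disparity per pixel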

  20. 12 CFR 714.5 - What is required if you rely on an estimated residual value greater than 25% of the original cost...

    Science.gov (United States)

    2010-01-01

    12 CFR 714.5 — What is required if you rely on an estimated residual value greater than 25% of the original cost of the leased property? Title 12 (Banks and Banking), National Credit Union Administration Regulations Affecting Credit Unions, Leasing, Section 714.5. § 714.5 What is...

  1. A Performance/Cost Evaluation for a GPU-Based Drug Discovery Application on Volunteer Computing

    Directory of Open Access Journals (Sweden)

    Ginés D. Guerrero

    2014-01-01

    Bioinformatics is an interdisciplinary research field that develops tools for the analysis of large biological databases, and, thus, the use of high performance computing (HPC) platforms is mandatory for the generation of useful biological knowledge. The latest generation of graphics processing units (GPUs) has democratized the use of HPC as they push desktop computers to cluster-level performance. Many applications within this field have been developed to leverage these powerful and low-cost architectures. However, these applications still need to scale to larger GPU-based systems to enable remarkable advances in the fields of healthcare, drug discovery, genome research, etc. The inclusion of GPUs in HPC systems exacerbates power and temperature issues, increasing the total cost of ownership (TCO). This paper explores the benefits of volunteer computing to scale bioinformatics applications as an alternative to owning large GPU-based local infrastructures. We use as a benchmark a GPU-based drug discovery application called BINDSURF, whose computational requirements go beyond a single desktop machine. Volunteer computing is presented as a cheap and valid HPC system for those bioinformatics applications that need to process huge amounts of data and where the response time is not a critical factor.

  2. Low-cost addition-subtraction sequences for the final exponentiation computation in pairings

    DEFF Research Database (Denmark)

    Guzmán-Trampe, Juan E; Cruz-Cortéz, Nareli; Dominguez Perez, Luis

    2014-01-01

    In this paper, we address the problem of finding low-cost addition–subtraction sequences for situations where a doubling step is significantly cheaper than a non-doubling one. One application of this setting appears in the computation of the final exponentiation step of the reduced Tate pairing d...
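    The setting, cheap doublings versus expensive additions, is the one where signed-digit recodings pay off: allowing subtractions shortens the chain of non-doubling steps. A generic non-adjacent-form sketch (not the paper's sequences) with an assumed doubling/addition cost ratio:

        # Signed-digit (NAF) recoding: subtractions shorten the chain of
        # non-doubling steps, which pays off when doublings are cheap.
        # The dbl/add cost ratio below is an assumption for illustration.
        def naf(k):
            digits = []
            while k:
                d = 2 - (k % 4) if k & 1 else 0   # digit in {-1, 0, 1}
                digits.append(d)
                k = (k - d) // 2
            return digits                          # least-significant first

        def chain_cost(k, dbl=0.25, add=1.0):
            ds = naf(k)
            doublings = len(ds) - 1
            additions = sum(1 for d in ds if d) - 1
            return dbl * doublings + add * additions

        print(chain_cost(0xDEADBEEF))   # relative cost for a sample exponent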

  3. 12 CFR Appendix K to Part 226 - Total Annual Loan Cost Rate Computations for Reverse Mortgage Transactions

    Science.gov (United States)

    2010-01-01

    ... Appendix K to Part 226—Total Annual Loan Cost Rate Computations for Reverse Mortgage Transactions (a... loan cost rate for various transactions, as well as instructions, explanations, and examples for.... (2) Term of the transaction. For purposes of total annual loan cost disclosures, the term of a...

  4. Fog Computing and Edge Computing Architectures for Processing Data From Diabetes Devices Connected to the Medical Internet of Things.

    Science.gov (United States)

    Klonoff, David C

    2017-07-01

    The Internet of Things (IoT) is generating an immense volume of data. With cloud computing, medical sensor and actuator data can be stored and analyzed remotely by distributed servers. The results can then be delivered via the Internet. The number of devices in IoT includes such wireless diabetes devices as blood glucose monitors, continuous glucose monitors, insulin pens, insulin pumps, and closed-loop systems. The cloud model for data storage and analysis is increasingly unable to process the data avalanche, and processing is being pushed out to the edge of the network closer to where the data-generating devices are. Fog computing and edge computing are two architectures for data handling that can offload data from the cloud, process it near the patient, and transmit information machine-to-machine or machine-to-human in milliseconds or seconds. Sensor data can be processed near the sensing and actuating devices with fog computing (with local nodes) and with edge computing (within the sensing devices). Compared to cloud computing, fog computing and edge computing offer five advantages: (1) greater data transmission speed, (2) less dependence on limited bandwidths, (3) greater privacy and security, (4) greater control over data generated in foreign countries where laws may limit use or permit unwanted governmental access, and (5) lower costs because more sensor-derived data are used locally and less data are transmitted remotely. Connected diabetes devices almost all use fog computing or edge computing because diabetes patients require a very rapid response to sensor input and cannot tolerate delays for cloud computing.

  5. User manual for PACTOLUS: a code for computing power costs.

    Energy Technology Data Exchange (ETDEWEB)

    Huber, H.D.; Bloomster, C.H.

    1979-02-01

    PACTOLUS is a computer code for calculating the cost of generating electricity. Through appropriate definition of the input data, PACTOLUS can calculate the cost of generating electricity from a wide variety of power plants, including nuclear, fossil, geothermal, solar, and other types of advanced energy systems. The purpose of PACTOLUS is to develop cash flows and calculate the unit busbar power cost (mills/kWh) over the entire life of a power plant. The cash flow information is calculated by two principal models: the Fuel Model and the Discounted Cash Flow Model. The Fuel Model is an engineering cost model which calculates the cash flow for the fuel cycle costs over the project lifetime based on input data defining the fuel material requirements, the unit costs of fuel materials and processes, the process lead and lag times, and the schedule of the capacity factor for the plant. For nuclear plants, the Fuel Model calculates the cash flow for the entire nuclear fuel cycle. For fossil plants, the Fuel Model calculates the cash flow for the fossil fuel purchases. The Discounted Cash Flow Model combines the fuel costs generated by the Fuel Model with input data on the capital costs, capital structure, licensing time, construction time, rates of return on capital, tax rates, operating costs, and depreciation method of the plant to calculate the cash flow for the entire lifetime of the project. The financial and tax structure for both investor-owned utilities and municipal utilities can be simulated through varying the rates of return on equity and debt, the debt-equity ratios, and tax rates. The Discounted Cash Flow Model uses the principle that the present worth of the revenues will be equal to the present worth of the expenses including the return on investment over the economic life of the project. This manual explains how to prepare the input data, execute cases, and interpret the output results. (RWR)

  6. User manual for PACTOLUS: a code for computing power costs

    International Nuclear Information System (INIS)

    Huber, H.D.; Bloomster, C.H.

    1979-02-01

    PACTOLUS is a computer code for calculating the cost of generating electricity. Through appropriate definition of the input data, PACTOLUS can calculate the cost of generating electricity from a wide variety of power plants, including nuclear, fossil, geothermal, solar, and other types of advanced energy systems. The purpose of PACTOLUS is to develop cash flows and calculate the unit busbar power cost (mills/kWh) over the entire life of a power plant. The cash flow information is calculated by two principal models: the Fuel Model and the Discounted Cash Flow Model. The Fuel Model is an engineering cost model which calculates the cash flow for the fuel cycle costs over the project lifetime based on input data defining the fuel material requirements, the unit costs of fuel materials and processes, the process lead and lag times, and the schedule of the capacity factor for the plant. For nuclear plants, the Fuel Model calculates the cash flow for the entire nuclear fuel cycle. For fossil plants, the Fuel Model calculates the cash flow for the fossil fuel purchases. The Discounted Cash Flow Model combines the fuel costs generated by the Fuel Model with input data on the capital costs, capital structure, licensing time, construction time, rates of return on capital, tax rates, operating costs, and depreciation method of the plant to calculate the cash flow for the entire lifetime of the project. The financial and tax structure for both investor-owned utilities and municipal utilities can be simulated through varying the rates of return on equity and debt, the debt-equity ratios, and tax rates. The Discounted Cash Flow Model uses the principle that the present worth of the revenues will be equal to the present worth of the expenses including the return on investment over the economic life of the project. This manual explains how to prepare the input data, execute cases, and interpret the output results with the updated version of PACTOLUS. 11 figures, 2 tables
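    The present-worth principle stated in both records pins down the busbar cost as the ratio of discounted expenses to discounted energy. A minimal sketch with invented cash flows (the code's actual Fuel and Discounted Cash Flow Models are far more detailed):

        # Present-worth principle: the busbar cost is the constant unit price
        # that equates discounted revenues with discounted expenses. The cash
        # flows below are invented; PACTOLUS builds them in far more detail.
        def busbar_cost(expenses, energy_kwh, rate):
            pw = lambda s: sum(v / (1 + rate) ** t for t, v in enumerate(s))
            return pw(expenses) / pw(energy_kwh)   # $/kWh

        life = 30
        expenses = [250e6] + [80e6] * (life - 1)   # capital, then fuel + O&M
        energy = [0.0] + [6.1e9] * (life - 1)      # kWh/yr once operating
        print(busbar_cost(expenses, energy, 0.07) * 1000, "mills/kWh")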

  7. Computational cost for detecting inspiralling binaries using a network of laser interferometric detectors

    International Nuclear Information System (INIS)

    Pai, Archana; Bose, Sukanta; Dhurandhar, Sanjeev

    2002-01-01

    We extend a coherent network data-analysis strategy developed earlier for detecting Newtonian waveforms to the case of post-Newtonian (PN) waveforms. Since the PN waveform depends on the individual masses of the inspiralling binary, the parameter-space dimension increases by one from that of the Newtonian case. We obtain the number of templates and estimate the computational costs for PN waveforms: for a lower mass limit of 1 M☉, for LIGO-I noise and with 3% maximum mismatch, the online computational speed requirement for a single detector is a few Gflops; for a two-detector network it is hundreds of Gflops and for a three-detector network it is tens of Tflops. Apart from idealistic networks, we obtain results for realistic networks comprising LIGO and VIRGO. Finally, we compare costs incurred in a coincidence detection strategy with those incurred in the coherent strategy detailed above
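    Figures like "a few Gflops" for a single detector come from counting FFT-based matched filters across the template bank. An order-of-magnitude sketch; the sampling rate, segment length, and bank size below are assumed values, not the paper's:

        import math

        # FFT-based matched filtering over a template bank: a rough flop
        # count behind "a few Gflops". All parameter values are assumptions.
        def online_gflops(n_templates, f_samp=2048.0, seg_s=64.0):
            n = int(f_samp * seg_s)
            fft = 5 * n * math.log2(n)          # one FFT, rough count
            per_template = 3 * fft              # forward + multiply + inverse
            return n_templates * per_template / seg_s / 1e9

        print(online_gflops(10_000))            # ~5 Gflops for 10^4 templates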

  8. The Cost-Effectiveness of Dual Mobility Implants for Primary Total Hip Arthroplasty: A Computer-Based Cost-Utility Model.

    Science.gov (United States)

    Barlow, Brian T; McLawhorn, Alexander S; Westrich, Geoffrey H

    2017-05-03

    Dislocation remains a clinically important problem following primary total hip arthroplasty, and it is a common reason for revision total hip arthroplasty. Dual mobility (DM) implants decrease the risk of dislocation but can be more expensive than conventional implants and have idiosyncratic failure mechanisms. The purpose of this study was to investigate the cost-effectiveness of DM implants compared with conventional bearings for primary total hip arthroplasty. Markov model analysis was conducted from the societal perspective with use of direct and indirect costs. Costs, expressed in 2013 U.S. dollars, were derived from the literature, the National Inpatient Sample, and the Centers for Medicare & Medicaid Services. Effectiveness was expressed in quality-adjusted life years (QALYs). The model was populated with health state utilities and state transition probabilities derived from previously published literature. The analysis was performed for a patient's lifetime, and costs and effectiveness were discounted at 3% annually. The principal outcome was the incremental cost-effectiveness ratio (ICER), with a willingness-to-pay threshold of $100,000/QALY. Sensitivity analyses were performed to explore relevant uncertainty. In the base case, DM total hip arthroplasty showed absolute dominance over conventional total hip arthroplasty, with lower accrued costs ($39,008 versus $40,031 U.S. dollars) and higher accrued utility (13.18 versus 13.13 QALYs) indicating cost-savings. DM total hip arthroplasty ceased being cost-saving when its implant costs exceeded those of conventional total hip arthroplasty by $1,023, and the cost-effectiveness threshold for DM implants was $5,287 greater than that for conventional implants. DM was not cost-effective when the annualized incremental probability of revision from any unforeseen failure mechanism or mechanisms exceeded 0.29%. The probability of intraprosthetic dislocation exerted the most influence on model results. This model
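    With the base-case numbers quoted above, the standard incremental cost-effectiveness arithmetic never reaches the willingness-to-pay comparison, because DM is both cheaper and more effective ("absolute dominance"). A quick check:

        # Standard incremental cost-effectiveness arithmetic on the base-case
        # numbers quoted above; dominance means no ICER vs. the $100,000/QALY
        # willingness-to-pay threshold is ever needed.
        cost_dm, cost_conv = 39008.0, 40031.0    # lifetime costs, 2013 USD
        qaly_dm, qaly_conv = 13.18, 13.13        # lifetime QALYs

        d_cost = cost_dm - cost_conv
        d_qaly = qaly_dm - qaly_conv
        if d_cost <= 0 and d_qaly >= 0:
            print("DM dominates: cost-saving and more effective")
        else:
            print("ICER =", d_cost / d_qaly, "$/QALY")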

  9. Cost-Effectiveness of Computed Tomographic Colonography: A Prospective Comparison with Colonoscopy

    International Nuclear Information System (INIS)

    Arnesen, R.B.; Ginnerup-Pedersen, B.; Poulsen, P.B.; Benzon, K. von; Adamsen, S.; Laurberg, S.; Hart-Hansen, O.

    2007-01-01

    Purpose: To estimate the cost-effectiveness of detecting colorectal polyps with computed tomographic colonography (CTC) and subsequent polypectomy with primary colonoscopy (CC), using CC as the alternative strategy. Material and Methods: A marginal analysis was performed regarding 103 patients who had had CTC prior to same-day CC at two hospitals, H-I (n = 53) and H-II (n = 50). The patients were randomly chosen from surveillance and symptomatic study populations (148 at H-I and 231 at H-II). Populations, organizations, and procedures were compared. Cost data on time consumption, medication, and minor equipment were collected prospectively, while data on salaries and major equipment were collected retrospectively. The effect was the (previously published) sensitivities of CTC and CC for detection of colorectal polyps ≥6 mm (H-I, n = 148) or ≥5 mm (H-II, n = 231). Results: Thirteen patients at each center had at least one colorectal polyp ≥6 mm or ≥5 mm. CTC was the cost-effective alternative at H-I (Euro 187 vs. Euro 211), while CC was the cost-effective alternative at H-II (Euro 239 vs. Euro 192). The cost-effectiveness (costs per finding) mainly depended on the sensitivity of CTC and CC, but the depreciation of equipment and the staff's use of time were highly influential as well. Conclusion: Detection of colorectal polyps ≥6 mm or ≥5 mm with CTC, followed by polypectomy by CC, can be performed cost-effectively at some institutions with the appropriate hardware and organization.

  10. The Optimal Pricing of Computer Software and Other Products with High Switching Costs

    OpenAIRE

    Pekka Ahtiala

    2004-01-01

    The paper studies the determinants of the optimum prices of computer programs and their upgrades. It is based on the notion that because of the human capital invested in the use of a computer program by its user, this product has high switching costs, and on the finding that pirates are responsible for generating over 80 per cent of new software sales. A model to maximize the present value of the program to the program house is constructed to determine the optimal prices of initial programs a...

  11. Computational cost for detecting inspiralling binaries using a network of laser interferometric detectors

    CERN Document Server

    Pai, A; Dhurandhar, S V

    2002-01-01

    We extend a coherent network data-analysis strategy developed earlier for detecting Newtonian waveforms to the case of post-Newtonian (PN) waveforms. Since the PN waveform depends on the individual masses of the inspiralling binary, the parameter-space dimension increases by one from that of the Newtonian case. We obtain the number of templates and estimate the computational costs for PN waveforms: for a lower mass limit of 1 M☉, for LIGO-I noise and with 3% maximum mismatch, the online computational speed requirement for a single detector is a few Gflops; for a two-detector network it is hundreds of Gflops, and for a three-detector network it is tens of Tflops. Apart from idealistic networks, we obtain results for realistic networks comprising LIGO and VIRGO. Finally, we compare the costs incurred in a coincidence detection strategy with those incurred in the coherent strategy detailed above.

  12. Upgrade and benchmarking of the NIFS physics-engineering-cost code

    International Nuclear Information System (INIS)

    Dolan, T.J.; Yamazaki, K.

    2004-07-01

    The NIFS Physics-Engineering-Cost (PEC) code for helical and tokamak fusion reactors is upgraded by adding data from three blanket-shield designs, a new cost section based on the ARIES cost schedule, more recent unit costs, and improved algorithms for various computations. The PEC code is also benchmarked by modeling the ARIES-AT (advanced technology) tokamak and the ARIES-SPPS (stellarator power plant system). The PEC code succeeds in predicting many of the pertinent plasma parameters and reactor component masses within about 10%. There are cost differences greater than 10% for some fusion power core components, which may be attributed to differences of unit costs used by the codes. The COEs estimated by the PEC code differ from the COEs of the ARIES-AT and ARIES-SPPS studies by 5%. (author)

  13. Computation Directorate 2008 Annual Report

    Energy Technology Data Exchange (ETDEWEB)

    Crawford, D L

    2009-03-25

    Whether a computer is simulating the aging and performance of a nuclear weapon, the folding of a protein, or the probability of rainfall over a particular mountain range, the necessary calculations can be enormous. Our computers help researchers answer these and other complex problems, and each new generation of system hardware and software widens the realm of possibilities. Building on Livermore's historical excellence and leadership in high-performance computing, Computation added more than 331 trillion floating-point operations per second (teraFLOPS) of power to LLNL's computer room floors in 2008. In addition, Livermore's next big supercomputer, Sequoia, advanced ever closer to its 2011-2012 delivery date, as architecture plans and the procurement contract were finalized. Hyperion, an advanced technology cluster test bed that teams Livermore with 10 industry leaders, made a big splash when it was announced during Michael Dell's keynote speech at the 2008 Supercomputing Conference. The Wall Street Journal touted Hyperion as a 'bright spot amid turmoil' in the computer industry. Computation continues to measure and improve the costs of operating LLNL's high-performance computing systems by moving hardware support in-house, by measuring causes of outages to apply resources asymmetrically, and by automating most of the account and access authorization and management processes. These improvements enable more dollars to go toward fielding the best supercomputers for science, while operating them at less cost and greater responsiveness to the customers.

  14. Cost-effectiveness modeling of colorectal cancer: Computed tomography colonography vs colonoscopy or fecal occult blood tests

    International Nuclear Information System (INIS)

    Lucidarme, Olivier; Cadi, Mehdi; Berger, Genevieve; Taieb, Julien; Poynard, Thierry; Grenier, Philippe; Beresniak, Ariel

    2012-01-01

    Objectives: To assess the cost-effectiveness of three colorectal-cancer (CRC) screening strategies in France: fecal-occult-blood tests (FOBT), computed-tomography-colonography (CTC) and optical-colonoscopy (OC). Methods: Ten-year simulation modeling was used to assess a virtual asymptomatic, average-risk population 50–74 years old. Negative OC was repeated 10 years later, and OC positive for advanced or non-advanced adenoma was repeated 3 or 5 years later, respectively. FOBT was repeated biennially. Negative CTC was repeated 5 years later. Positive CTC and FOBT led to triennial OC. Total cost and CRC rate after 10 years for each screening strategy and 0–100% adherence rates with 10% increments were computed. Transition probabilities were programmed using distribution ranges to account for parameter uncertainty. Direct medical costs were estimated using the French national health insurance prices. Probabilistic sensitivity analyses used 5000 Monte Carlo simulations generating model outcomes and standard deviations. Results: For a given adherence rate, CTC screening was always the most effective but not the most cost-effective. FOBT was the least effective but most cost-effective strategy. OC was of intermediate efficacy and the least cost-effective strategy. Without screening, treatment of 123 CRC per 10,000 individuals would cost €3,444,000. For 60% adherence, the costs of preventing and treating, respectively, 49 and 74 FOBT-detected, 73 and 50 CTC-detected, and 63 and 60 OC-detected CRC would be €2,810,000, €6,450,000 and €9,340,000. Conclusion: Simulation modeling helped to identify the most effective (CTC) and the most cost-effective (FOBT) screening strategy in the setting of mass CRC screening in France.

  15. Computer Aided Design of a Low-Cost Painting Robot

    Directory of Open Access Journals (Sweden)

    SYEDA MARIA KHATOON ZAIDI

    2017-10-01

    The application of robots or robotic systems for painting parts is becoming increasingly conventional, to improve reliability, productivity and consistency and to decrease waste. However, in Pakistan only high-end industries are able to afford the luxury of a robotic system for various purposes. In this study we propose an economical painting robot that a small-scale industry can install in its plant with ease. The importance of this robot is that, being cost-effective, it can easily be replaced in small manufacturing industries and can thereby eliminate the health problems affecting the individual in charge of painting parts on an everyday basis. To achieve this aim, the robot is made with local parts, with only a few exceptions, to cut costs, and the programming language is kept at a mediocre level. Image processing is used to establish object recognition, and the robot can be programmed to paint various simple geometries. The robot is placed on a conveyor belt to maximize productivity. A four-DoF (Degree of Freedom) arm increases the working envelope and the accessibility for painting differently shaped parts with ease. This robot is capable of painting the top, front, back, left and right sides of the part with a single colour. Initially, CAD (Computer Aided Design) models of the robot were developed, which were analyzed, modified and improved to withstand loading conditions and perform the task efficiently. After design selection, appropriate motors and materials were selected and the robot was developed. Throughout the development phase, minor problems and errors were fixed as they arose. Lastly, the robot was integrated with the computer and image processing for autonomous control. The final results demonstrated that the robot is economical and reduces paint wastage.

  16. Computer aided design of a low-cost painting robot

    International Nuclear Information System (INIS)

    Zaidi, S.M.; Janejo, F.; Mujtaba, S.B.

    2017-01-01

    The application of robots or robotic systems for painting parts is becoming increasingly conventional, to improve reliability, productivity and consistency and to decrease waste. However, in Pakistan only high-end industries are able to afford the luxury of a robotic system for various purposes. In this study we propose an economical painting robot that a small-scale industry can install in its plant with ease. The importance of this robot is that, being cost-effective, it can easily be replaced in small manufacturing industries and can thereby eliminate the health problems affecting the individual in charge of painting parts on an everyday basis. To achieve this aim, the robot is made with local parts, with only a few exceptions, to cut costs, and the programming language is kept at a mediocre level. Image processing is used to establish object recognition, and the robot can be programmed to paint various simple geometries. The robot is placed on a conveyor belt to maximize productivity. A four-DoF (Degree of Freedom) arm increases the working envelope and the accessibility for painting differently shaped parts with ease. This robot is capable of painting the top, front, back, left and right sides of the part with a single colour. Initially, CAD (Computer Aided Design) models of the robot were developed, which were analyzed, modified and improved to withstand loading conditions and perform the task efficiently. After design selection, appropriate motors and materials were selected and the robot was developed. Throughout the development phase, minor problems and errors were fixed as they arose. Lastly, the robot was integrated with the computer and image processing for autonomous control. The final results demonstrated that the robot is economical and reduces paint wastage. (author)

  17. The concept of computer software designed to identify and analyse logistics costs in agricultural enterprises

    Directory of Open Access Journals (Sweden)

    Karol Wajszczyk

    2009-01-01

    The study comprised research, development and computer programming work concerning a concept for an IT tool to be used in the identification and analysis of logistics costs in agricultural enterprises in terms of the process-based approach. As a result of the research and programming work, an overall functional and IT concept of software for the identification and analysis of logistics costs in agricultural enterprises was developed.

  18. A feasibility study on direct methanol fuel cells for laptop computers based on a cost comparison with lithium-ion batteries

    International Nuclear Information System (INIS)

    Wee, Jung-Ho

    2007-01-01

    This paper compares the total cost of direct methanol fuel cell (DMFC) and lithium (Li)-ion battery systems when applied as the power supply for laptop computers in the Korean environment. The average power output and operational time of the laptop computers were assumed to be 20 W and 3000 h, respectively. Considering the status of their technologies and with certain conditions assumed, the total costs were calculated to be US$140 for the Li-ion battery and US$362 for the DMFC. The manufacturing costs of the DMFC and Li-ion battery systems were calculated to be $16.65 W⁻¹ and $0.77 (W h)⁻¹, and the energy consumption costs to be $0.00051 (W h)⁻¹ and $0.00032 (W h)⁻¹, respectively. The higher fuel consumption cost of the DMFC system was due to the methanol (MeOH) crossover loss. Therefore, the requirements for DMFCs to be able to compete with Li-ion batteries in terms of energy cost include reducing the crossover level to the order of 10⁻⁹ and the MeOH price to under $0.5 kg⁻¹. Under these conditions, if the DMFC manufacturing cost could be reduced to $6.30 W⁻¹, then the DMFC system would become at least as competitive as the Li-ion battery system for powering laptop computers in Korea. (author)

  19. A novel cost based model for energy consumption in cloud computing.

    Science.gov (United States)

    Horri, A; Dastghaibyfard, Gh

    2015-01-01

    Cloud data centers consume enormous amounts of electrical energy. To support green cloud computing, providers also need to minimize cloud infrastructure energy consumption while maintaining the QoS. In this study, an energy consumption model is proposed for the time-shared policy in the virtualization layer of cloud environments. The cost and energy usage of the time-shared policy were modeled in the CloudSim simulator based upon results obtained from a real system, and the proposed model was then evaluated in different scenarios. In the proposed model, cache interference costs were considered; these costs were based upon the size of the data. The proposed model was implemented in the CloudSim simulator, and the related simulation results indicate that the energy consumption may be considerable and that it can vary with different parameters such as the quantum parameter, the data size, and the number of VMs on a host. Measured results validate the model and demonstrate that there is a tradeoff between energy consumption and QoS in the cloud environment.
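
    The abstract's main ingredients (a base power draw that grows with utilization, plus a cache-interference overhead that grows with data size and with the number of co-scheduled VMs) can be sketched as follows. This is not the authors' CloudSim model; every coefficient and the utilization law are assumptions for illustration.

```python
# Toy time-shared energy model with a cache-interference term (assumed form).

def host_power(active_vms, data_size_mb, quantum_ms,
               p_idle=100.0, p_busy=250.0, cache_penalty=0.002):
    """Estimated host power draw in watts; all coefficients are illustrative."""
    utilization = min(1.0, 0.25 * active_vms)        # toy utilization law
    base = p_idle + (p_busy - p_idle) * utilization
    # More VMs per core mean more context switches per second, and each
    # switch refills caches in proportion to the VM's working-set size.
    switches_per_s = active_vms / (quantum_ms / 1000.0) if active_vms else 0.0
    interference = cache_penalty * data_size_mb * switches_per_s / 1000.0
    return base + interference

# Energy over one hour and its cost at an assumed flat tariff of $0.12/kWh.
power_w = host_power(active_vms=4, data_size_mb=512, quantum_ms=20)
energy_wh = power_w * 1.0
print(f"{power_w:.1f} W -> {energy_wh:.1f} Wh -> ${energy_wh / 1000 * 0.12:.4f}")
```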

  20. Omniscopes: Large area telescope arrays with only N log N computational cost

    International Nuclear Information System (INIS)

    Tegmark, Max; Zaldarriaga, Matias

    2010-01-01

    We show that the class of antenna layouts for telescope arrays allowing cheap analysis hardware (with correlator cost scaling as N log N rather than N² with the number of antennas N) is encouragingly large, including not only previously discussed rectangular grids but also arbitrary hierarchies of such grids, with arbitrary rotations and shears at each level. We show that all correlations for such a 2D array with an n-level hierarchy can be efficiently computed via a fast Fourier transform in not two but 2n dimensions. This can allow major correlator cost reductions for science applications requiring exquisite sensitivity at widely separated angular scales, for example, 21 cm tomography (where short baselines are needed to probe the cosmological signal and long baselines are needed for point source removal), helping enable future 21 cm experiments with thousands or millions of cheap dipole-like antennas. Such hierarchical grids combine the angular resolution advantage of traditional array layouts with the cost advantage of a rectangular fast Fourier transform telescope. We also describe an algorithm for how a subclass of hierarchical arrays can efficiently use rotation synthesis to produce global sky maps with minimal noise and a well-characterized synthesized beam.
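
    The core trick can be demonstrated in a few lines: for antennas on a regular grid, binning all pairwise correlations by baseline is equivalent to an autocorrelation of the gridded signals, which a zero-padded FFT computes in O(N log N) instead of O(N²). The sketch below checks this against brute force for a single snapshot on an assumed 8×8 grid.

```python
import numpy as np

rng = np.random.default_rng(0)
n1, n2 = 8, 8                       # antennas on a regular n1 x n2 grid
v = rng.normal(size=(n1, n2)) + 1j * rng.normal(size=(n1, n2))  # one snapshot

# Brute force, O(N^2): correlate every antenna pair, binned by baseline.
brute = np.zeros((2 * n1 - 1, 2 * n2 - 1), dtype=complex)
for x1 in range(n1):
    for y1 in range(n2):
        for x2 in range(n1):
            for y2 in range(n2):
                brute[x1 - x2 + n1 - 1, y1 - y2 + n2 - 1] += \
                    v[x1, y1] * np.conj(v[x2, y2])

# FFT method, O(N log N): the baseline-binned correlations are the linear
# autocorrelation of the grid, obtained via a zero-padded FFT.
V = np.fft.fft2(v, s=(2 * n1 - 1, 2 * n2 - 1))
fast = np.fft.fftshift(np.fft.ifft2(V * np.conj(V)))

print(np.allclose(brute, fast))     # True
```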

  1. Design and implementation of a reliable and cost-effective cloud computing infrastructure: the INFN Napoli experience

    International Nuclear Information System (INIS)

    Capone, V; Esposito, R; Pardi, S; Taurino, F; Tortone, G

    2012-01-01

    Over the last few years we have seen an increasing number of services and applications needed to manage and maintain cloud computing facilities. This is particularly true for computing in high energy physics, which often requires complex configurations and distributed infrastructures. In this scenario a cost effective rationalization and consolidation strategy is the key to success in terms of scalability and reliability. In this work we describe an IaaS (Infrastructure as a Service) cloud computing system, with high availability and redundancy features, which is currently in production at INFN-Naples and ATLAS Tier-2 data centre. The main goal we intended to achieve was a simplified method to manage our computing resources and deliver reliable user services, reusing existing hardware without incurring heavy costs. A combined usage of virtualization and clustering technologies allowed us to consolidate our services on a small number of physical machines, reducing electric power costs. As a result of our efforts we developed a complete solution for data and computing centres that can be easily replicated using commodity hardware. Our architecture consists of two main subsystems: a clustered storage solution, built on top of disk servers running the GlusterFS file system, and a virtual machines execution environment. GlusterFS is a network file system able to perform parallel writes on multiple disk servers, thereby providing live replication of data. High availability is also achieved via a network configuration using redundant switches and multiple paths between hypervisor hosts and disk servers. We also developed a set of management scripts to easily perform basic system administration tasks such as automatic deployment of new virtual machines, adaptive scheduling of virtual machines on hypervisor hosts, live migration and automated restart in case of hypervisor failures.

  2. Design and implementation of a reliable and cost-effective cloud computing infrastructure: the INFN Napoli experience

    Science.gov (United States)

    Capone, V.; Esposito, R.; Pardi, S.; Taurino, F.; Tortone, G.

    2012-12-01

    Over the last few years we have seen an increasing number of services and applications needed to manage and maintain cloud computing facilities. This is particularly true for computing in high energy physics, which often requires complex configurations and distributed infrastructures. In this scenario a cost effective rationalization and consolidation strategy is the key to success in terms of scalability and reliability. In this work we describe an IaaS (Infrastructure as a Service) cloud computing system, with high availability and redundancy features, which is currently in production at INFN-Naples and ATLAS Tier-2 data centre. The main goal we intended to achieve was a simplified method to manage our computing resources and deliver reliable user services, reusing existing hardware without incurring heavy costs. A combined usage of virtualization and clustering technologies allowed us to consolidate our services on a small number of physical machines, reducing electric power costs. As a result of our efforts we developed a complete solution for data and computing centres that can be easily replicated using commodity hardware. Our architecture consists of two main subsystems: a clustered storage solution, built on top of disk servers running the GlusterFS file system, and a virtual machines execution environment. GlusterFS is a network file system able to perform parallel writes on multiple disk servers, thereby providing live replication of data. High availability is also achieved via a network configuration using redundant switches and multiple paths between hypervisor hosts and disk servers. We also developed a set of management scripts to easily perform basic system administration tasks such as automatic deployment of new virtual machines, adaptive scheduling of virtual machines on hypervisor hosts, live migration and automated restart in case of hypervisor failures.
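
    As an illustration of the "adaptive scheduling of virtual machines on hypervisor hosts" mentioned above, the following sketch places a new VM on the least-loaded host with enough headroom. It is not the INFN-Naples management scripts; the host names, capacities, and load score are assumptions.

```python
# Toy least-loaded VM placement; names, sizes, and the load score are assumed.
from dataclasses import dataclass

@dataclass
class Hypervisor:
    name: str
    cpu_total: int        # logical cores
    ram_total_gb: int
    cpu_used: int = 0
    ram_used_gb: int = 0

    def load(self) -> float:
        # Combined score: the busier of CPU and RAM utilization.
        return max(self.cpu_used / self.cpu_total,
                   self.ram_used_gb / self.ram_total_gb)

    def fits(self, cpus: int, ram_gb: int) -> bool:
        return (self.cpu_used + cpus <= self.cpu_total
                and self.ram_used_gb + ram_gb <= self.ram_total_gb)

def place_vm(hosts, cpus, ram_gb):
    """Pick the least-loaded host with enough headroom, or None."""
    candidates = [h for h in hosts if h.fits(cpus, ram_gb)]
    if not candidates:
        return None
    best = min(candidates, key=lambda h: h.load())
    best.cpu_used += cpus
    best.ram_used_gb += ram_gb
    return best

hosts = [Hypervisor("hv01", 32, 128), Hypervisor("hv02", 32, 128, 16, 64)]
chosen = place_vm(hosts, cpus=4, ram_gb=8)
print(chosen.name if chosen else "no capacity")   # -> hv01
```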

  3. (CICT) Computing, Information, and Communications Technology Overview

    Science.gov (United States)

    VanDalsem, William R.

    2003-01-01

    The goal of the Computing, Information, and Communications Technology (CICT) program is to enable NASA's Scientific Research, Space Exploration, and Aerospace Technology Missions with greater mission assurance, for less cost, with increased science return through the development and use of advanced computing, information and communications technologies. This viewgraph presentation includes diagrams of how the political guidance behind CICT is structured. The presentation profiles each part of the NASA Mission in detail, and relates the Mission to the activities of CICT. CICT's Integrated Capability Goal is illustrated, and hypothetical missions which could be enabled by CICT are profiled. CICT technology development is profiled.

  4. On the role of cost-sensitive learning in multi-class brain-computer interfaces.

    Science.gov (United States)

    Devlaminck, Dieter; Waegeman, Willem; Wyns, Bart; Otte, Georges; Santens, Patrick

    2010-06-01

    Brain-computer interfaces (BCIs) present an alternative way of communication for people with severe disabilities. One of the shortcomings in current BCI systems, recently put forward in the fourth BCI competition, is the asynchronous detection of motor imagery versus resting state. We investigated this extension to the three-class case, in which the resting state is considered virtually lying between two motor classes, resulting in a large penalty when one motor task is misclassified into the other motor class. We particularly focus on the behavior of different machine-learning techniques and on the role of multi-class cost-sensitive learning in such a context. To this end, four different kernel methods are empirically compared, namely pairwise multi-class support vector machines (SVMs), two cost-sensitive multi-class SVMs and kernel-based ordinal regression. The experimental results illustrate that ordinal regression performs better than the other three approaches when a cost-sensitive performance measure such as the mean-squared error is considered. By contrast, multi-class cost-sensitive learning enables us to control the number of large errors made between two motor tasks.
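
    The asymmetric-cost idea can be sketched by treating the three classes as ordered (left motor = 0, rest = 1, right motor = 2), so that confusing the two motor classes incurs a squared error of 4 while confusing a motor class with rest incurs only 1. The snippet below, on synthetic data, contrasts a plain pairwise multi-class SVM with a crude stand-in for kernel ordinal regression (an SVR on the class index); it is illustrative only, not the paper's exact methods.

```python
import numpy as np
from sklearn.metrics import mean_squared_error
from sklearn.svm import SVC, SVR

rng = np.random.default_rng(1)
centers = np.array([[-2.0, 0.0], [0.0, 0.0], [2.0, 0.0]])  # left, rest, right
X = np.vstack([c + rng.normal(scale=1.2, size=(200, 2)) for c in centers])
y = np.repeat([0, 1, 2], 200)       # ordered class labels

# Plain pairwise multi-class SVM: every misclassification costs the same.
svm_pred = SVC(kernel="rbf", gamma="scale").fit(X, y).predict(X)

# Crude ordinal surrogate: kernel regression on the class index, then
# rounding, so predictions rarely jump across the middle "rest" class.
reg = SVR(kernel="rbf", gamma="scale").fit(X, y.astype(float))
ord_pred = np.clip(np.rint(reg.predict(X)), 0, 2)

print("pairwise SVM MSE:", mean_squared_error(y, svm_pred))
print("ordinal     MSE:", mean_squared_error(y, ord_pred))
```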

  5. Strategic Analysis of Autodesk and the Move to Cloud Computing

    OpenAIRE

    Kewley, Kathleen

    2012-01-01

    This paper provides an analysis of the opportunity for Autodesk to move its core technology to a cloud delivery model. Cloud computing offers clients a number of advantages, such as lower costs for computer hardware, increased access to technology and greater flexibility. With the IT industry embracing this transition, software companies need to plan for future change and lead with innovative solutions. Autodesk is in a unique position to capitalize on this market shift, as it is the leader i...

  6. Computational Fluid Dynamics of Whole-Body Aircraft

    Science.gov (United States)

    Agarwal, Ramesh

    1999-01-01

    The current state of the art in computational aerodynamics for whole-body aircraft flowfield simulations is described. Recent advances in geometry modeling, surface and volume grid generation, and flow simulation algorithms have led to accurate flowfield predictions for increasingly complex and realistic configurations. As a result, computational aerodynamics has emerged as a crucial enabling technology for the design and development of flight vehicles. Examples illustrating the current capability for the prediction of transport and fighter aircraft flowfields are presented. Unfortunately, accurate modeling of turbulence remains a major difficulty in the analysis of viscosity-dominated flows. In the future, inverse design methods, multidisciplinary design optimization methods, artificial intelligence technology, and massively parallel computer technology will be incorporated into computational aerodynamics, opening up greater opportunities for improved product design at substantially reduced costs.

  7. Computational cost of isogeometric multi-frontal solvers on parallel distributed memory machines

    KAUST Repository

    Woźniak, Maciej

    2015-02-01

    This paper derives theoretical estimates of the computational cost for an isogeometric multi-frontal direct solver executed on parallel distributed memory machines. We show theoretically that for C^(p-1) global continuity of the isogeometric solution, both the computational cost and the communication cost of a direct solver are of order O(log(N) p^2) for the one dimensional (1D) case, O(N p^2) for the two dimensional (2D) case, and O(N^(4/3) p^2) for the three dimensional (3D) case, where N is the number of degrees of freedom and p is the polynomial order of the B-spline basis functions. The theoretical estimates are verified by numerical experiments performed with three parallel multi-frontal direct solvers: MUMPS, PaStiX and SuperLU, available through the PETIGA toolkit built on top of PETSc. Numerical results confirm these theoretical estimates both in terms of p and N. For a given problem size, the strong efficiency rapidly decreases as the number of processors increases, becoming about 20% for 256 processors for a 3D example with 128^3 unknowns and linear B-splines with C^0 global continuity, and 15% for a 3D example with 64^3 unknowns and quartic B-splines with C^3 global continuity. At the same time, one cannot arbitrarily increase the problem size, since the memory required by higher order continuity spaces is large, quickly consuming all the available memory resources even in the parallel distributed memory version. Numerical results also suggest that the use of distributed parallel machines is highly beneficial when solving higher order continuity spaces, although the number of processors that one can efficiently employ is somewhat limited.
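
    The quoted asymptotic estimates are easy to evaluate numerically; the helper below simply encodes them (constants omitted, so only relative growth is meaningful).

```python
import math

def solver_cost(N: int, p: int, dim: int) -> float:
    """Asymptotic multi-frontal solver cost from the abstract (no constants)."""
    if dim == 1:
        return math.log(N) * p ** 2        # O(log(N) p^2)
    if dim == 2:
        return N * p ** 2                  # O(N p^2)
    if dim == 3:
        return N ** (4.0 / 3.0) * p ** 2   # O(N^(4/3) p^2)
    raise ValueError("dim must be 1, 2 or 3")

# Doubling N in 3D multiplies the cost by 2^(4/3) ~ 2.52, independent of p.
print(solver_cost(2 * 64**3, 4, 3) / solver_cost(64**3, 4, 3))
```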

  8. Cost effective distributed computing for Monte Carlo radiation dosimetry

    International Nuclear Information System (INIS)

    Wise, K.N.; Webb, D.V.

    2000-01-01

    An inexpensive computing facility has been established for performing repetitive Monte Carlo simulations with the BEAM and EGS4/EGSnrc codes of linear accelerator beams, for calculating effective dose from diagnostic imaging procedures, and of ion chambers and phantoms used for the Australian high energy absorbed dose standards. The facility currently consists of three dual-processor 450 MHz PCs linked by a high speed LAN. The three PCs can be accessed either locally from a single keyboard/monitor/mouse combination using a SwitchView controller or remotely via a computer network from PCs with suitable communications software (e.g. Telnet, Kermit etc). All three PCs are identically configured with the Red Hat Linux 6.0 operating system. A Fortran compiler and the BEAM and EGS4/EGSnrc codes are available on the three PCs. The preparation of sequences of jobs utilising the Monte Carlo codes is simplified using load-distributing software (enFuzion 6.0 marketed by TurboLinux Inc, formerly Cluster from Active Tools) which efficiently distributes the computing load amongst all six processors. We describe three applications of the system: (a) energy spectra from radiotherapy sources, (b) mean mass-energy absorption coefficients and stopping powers for absolute absorbed dose standards, and (c) dosimetry for diagnostic procedures; (a) and (b) are based on the transport codes BEAM and FLURZnrc while (c) is a Fortran/EGS code developed at ARPANSA. Efficiency gains ranged from 3 for (c) to close to the theoretical maximum of 6 for (a) and (b), with the gain depending on the amount of 'bookkeeping' to begin each task and the time taken to complete a single task. We have found the use of a load-balancing batch processing system with many PCs to be an economical way of achieving greater productivity for Monte Carlo calculations or of any computer-intensive task requiring many runs with different parameters. Copyright (2000) Australasian College of Physical Scientists and
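
    The speedup comes from farming out independent runs, which any load-distributing tool can do; the facility used the commercial enFuzion package, but the same pattern can be sketched with Python's multiprocessing (the "transport" kernel below is a toy Monte Carlo estimate, purely illustrative).

```python
import multiprocessing as mp
import random

def mc_run(args):
    """One independent Monte Carlo run; here a toy pi estimate by rejection."""
    seed, histories = args
    rng = random.Random(seed)
    hits = sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0
               for _ in range(histories))
    return 4.0 * hits / histories

if __name__ == "__main__":
    tasks = [(seed, 100_000) for seed in range(12)]  # 12 independent runs
    with mp.Pool(processes=6) as pool:               # six CPUs, as in the text
        estimates = pool.map(mc_run, tasks)          # load-balanced dispatch
    print(sum(estimates) / len(estimates))
```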

  9. Estimation of the laser cutting operating cost by support vector regression methodology

    Science.gov (United States)

    Jović, Srđan; Radović, Aleksandar; Šarkoćević, Živče; Petković, Dalibor; Alizamir, Meysam

    2016-09-01

    Laser cutting is a popular manufacturing process utilized to cut various types of materials economically. The operating cost is affected by laser power, cutting speed, assist gas pressure, nozzle diameter and focus point position, as well as the workpiece material. In this article, the process factors investigated were laser power, cutting speed, air pressure and focal point position. The aim of this work is to relate the operating cost to the process parameters mentioned above. CO2 laser cutting of stainless steel of medical grade AISI 316L has been investigated. The main goal was to analyze the operating cost through the laser power, cutting speed, air pressure, focal point position and material thickness. Since estimating the laser operating cost is a complex, non-linear task, soft computing optimization algorithms can be used. An intelligent soft computing scheme, support vector regression (SVR), was implemented. The performance of the proposed estimator was confirmed with the simulation results. The SVR results are then compared with an artificial neural network and genetic programming. According to the results, a greater improvement in estimation accuracy can be achieved through the SVR compared to other soft computing methodologies. The new optimization methods benefit from the soft computing capabilities of global optimization and multiobjective optimization rather than choosing a starting point by trial and error and combining multiple criteria into a single criterion.
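
    The regression setup can be sketched with a standard SVR implementation. The feature list (laser power, cutting speed, air pressure, focal point position, material thickness) comes from the abstract, but the data below is synthetic and the hyperparameters are illustrative, not the paper's.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)
n = 300
X = np.column_stack([
    rng.uniform(500, 2000, n),    # laser power, W
    rng.uniform(1, 10, n),        # cutting speed, m/min
    rng.uniform(5, 15, n),        # assist air pressure, bar
    rng.uniform(-3, 3, n),        # focal point position, mm
    rng.uniform(1, 6, n),         # sheet thickness, mm
])
# Toy cost: rises with power and thickness, falls with speed, plus noise.
y = 0.01 * X[:, 0] + 3.0 * X[:, 4] - 1.5 * X[:, 1] + rng.normal(0, 1, n)

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
model.fit(X[:250], y[:250])
print("R^2 on held-out samples:", model.score(X[250:], y[250:]))
```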

  10. Cost-effectiveness of external cephalic version for term breech presentation.

    Science.gov (United States)

    Tan, Jonathan M; Macario, Alex; Carvalho, Brendan; Druzin, Maurice L; El-Sayed, Yasser Y

    2010-01-21

    External cephalic version (ECV) is recommended by the American College of Obstetricians and Gynecologists to convert a breech fetus to vertex position and reduce the need for cesarean delivery. The goal of this study was to determine the incremental cost-effectiveness ratio, from society's perspective, of ECV compared to scheduled cesarean for term breech presentation. A computer-based decision model (TreeAge Pro 2008, Tree Age Software, Inc.) was developed for a hypothetical base case parturient presenting with a term singleton breech fetus with no contraindications for vaginal delivery. The model incorporated actual hospital costs (e.g., $8,023 for cesarean and $5,581 for vaginal delivery), utilities to quantify health-related quality of life, and probabilities based on analysis of published literature of successful ECV trial, spontaneous reversion, mode of delivery, and need for unanticipated emergency cesarean delivery. The primary endpoint was the incremental cost-effectiveness ratio in dollars per quality-adjusted year of life gained. A threshold of $50,000 per quality-adjusted life-years (QALY) was used to determine cost-effectiveness. The incremental cost-effectiveness of ECV, assuming a baseline 58% success rate, equaled $7,900/QALY. If the estimated probability of successful ECV is less than 32%, then ECV costs more to society and has poorer QALYs for the patient. However, as the probability of successful ECV was between 32% and 63%, ECV cost more than cesarean delivery but with greater associated QALY such that the cost-effectiveness ratio was less than $50,000/QALY. If the probability of successful ECV was greater than 63%, the computer modeling indicated that a trial of ECV is less costly and with better QALYs than a scheduled cesarean. The cost-effectiveness of a trial of ECV is most sensitive to its probability of success, and not to the probabilities of a cesarean after ECV, spontaneous reversion to breech, successful second ECV trial, or adverse
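
    The study's decision rule reduces to computing an incremental cost-effectiveness ratio and comparing it with the $50,000/QALY willingness-to-pay threshold; a minimal sketch follows, with placeholder cost and QALY inputs rather than the model's actual outputs.

```python
def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Incremental cost-effectiveness ratio, with the two dominance cases."""
    d_cost, d_qaly = cost_new - cost_old, qaly_new - qaly_old
    if d_cost <= 0 and d_qaly >= 0:
        return "dominant"        # no costlier and at least as effective
    if d_cost >= 0 and d_qaly <= 0:
        return "dominated"       # no cheaper and no more effective
    return d_cost / d_qaly       # dollars per QALY gained

THRESHOLD = 50_000               # $/QALY willingness-to-pay, as in the study
result = icer(cost_new=7_500.0, qaly_new=0.95,   # placeholder ECV arm
              cost_old=8_023.0, qaly_old=0.88)   # placeholder cesarean arm
cost_effective = result == "dominant" or (
    isinstance(result, float) and result < THRESHOLD)
print(result, "->", "cost-effective" if cost_effective else "not cost-effective")
```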

  11. Cost-effectiveness of external cephalic version for term breech presentation

    Directory of Open Access Journals (Sweden)

    Carvalho Brendan

    2010-01-01

    Abstract Background External cephalic version (ECV) is recommended by the American College of Obstetricians and Gynecologists to convert a breech fetus to vertex position and reduce the need for cesarean delivery. The goal of this study was to determine the incremental cost-effectiveness ratio, from society's perspective, of ECV compared to scheduled cesarean for term breech presentation. Methods A computer-based decision model (TreeAge Pro 2008, Tree Age Software, Inc.) was developed for a hypothetical base case parturient presenting with a term singleton breech fetus with no contraindications for vaginal delivery. The model incorporated actual hospital costs (e.g., $8,023 for cesarean and $5,581 for vaginal delivery), utilities to quantify health-related quality of life, and probabilities based on analysis of published literature of successful ECV trial, spontaneous reversion, mode of delivery, and need for unanticipated emergency cesarean delivery. The primary endpoint was the incremental cost-effectiveness ratio in dollars per quality-adjusted year of life gained. A threshold of $50,000 per quality-adjusted life-years (QALY) was used to determine cost-effectiveness. Results The incremental cost-effectiveness of ECV, assuming a baseline 58% success rate, equaled $7,900/QALY. If the estimated probability of successful ECV is less than 32%, then ECV costs more to society and has poorer QALYs for the patient. However, as the probability of successful ECV was between 32% and 63%, ECV cost more than cesarean delivery but with greater associated QALY such that the cost-effectiveness ratio was less than $50,000/QALY. If the probability of successful ECV was greater than 63%, the computer modeling indicated that a trial of ECV is less costly and with better QALYs than a scheduled cesarean. The cost-effectiveness of a trial of ECV is most sensitive to its probability of success, and not to the probabilities of a cesarean after ECV, spontaneous reversion

  12. Specialized computer architectures for computational aerodynamics

    Science.gov (United States)

    Stevenson, D. K.

    1978-01-01

    In recent years, computational fluid dynamics has made significant progress in modelling aerodynamic phenomena. Currently, one of the major barriers to future development lies in the compute-intensive nature of the numerical formulations and the relatively high cost of performing these computations on commercially available general purpose computers, a cost that is high with respect to both dollar expenditure and elapsed time. Today's computing technology will support a program designed to create specialized computing facilities to be dedicated to the important problems of computational aerodynamics. One of the still unresolved questions is the organization of the computing components in such a facility. The characteristics of fluid dynamic problems which will have significant impact on the choice of computer architecture for a specialized facility are reviewed.

  13. Low Cost Night Vision System for Intruder Detection

    Science.gov (United States)

    Ng, Liang S.; Yusoff, Wan Azhar Wan; R, Dhinesh; Sak, J. S.

    2016-02-01

    The growth in production of Android devices has resulted in greater functionalities as well as lower costs. This has made previously more expensive systems such as night vision affordable for more businesses and end users. We designed and implemented robust and low cost night vision systems based on red-green-blue (RGB) colour histogram for a static camera as well as a camera on an unmanned aerial vehicle (UAV), using OpenCV library on Intel compatible notebook computers, running Ubuntu Linux operating system, with less than 8GB of RAM. They were tested against human intruders under low light conditions (indoor, outdoor, night time) and were shown to have successfully detected the intruders.
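
    A histogram-based detector in the spirit of the system described above can be sketched with OpenCV: compare each frame's RGB colour histogram against a reference histogram of the empty scene and flag large changes. The pipeline details, the 0.2 Bhattacharyya threshold, and the camera index are assumptions, not the paper's published values.

```python
import cv2

def frame_hist(frame):
    """3D RGB histogram, 8 bins per channel, normalised for comparability."""
    hist = cv2.calcHist([frame], [0, 1, 2], None, [8, 8, 8],
                        [0, 256, 0, 256, 0, 256])
    return cv2.normalize(hist, hist).flatten()

cap = cv2.VideoCapture(0)               # assumed static camera at index 0
ok, frame = cap.read()
if not ok:
    raise SystemExit("no camera frame available")
background = frame_hist(frame)          # reference histogram: empty scene

while True:
    ok, frame = cap.read()
    if not ok:
        break
    dist = cv2.compareHist(background, frame_hist(frame),
                           cv2.HISTCMP_BHATTACHARYYA)
    if dist > 0.2:                      # assumed threshold: colours changed
        print("possible intruder, distance =", round(dist, 3))
cap.release()
```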

  14. Cost-effectiveness of implementing computed tomography screening for lung cancer in Taiwan.

    Science.gov (United States)

    Yang, Szu-Chun; Lai, Wu-Wei; Lin, Chien-Chung; Su, Wu-Chou; Ku, Li-Jung; Hwang, Jing-Shiang; Wang, Jung-Der

    2017-06-01

    A screening program for lung cancer requires more empirical evidence. Based on the experience of the National Lung Screening Trial (NLST), we developed a method to adjust lead-time bias and quality-of-life changes for estimating the cost-effectiveness of implementing computed tomography (CT) screening in Taiwan. The target population was high-risk (≥30 pack-years) smokers between 55 and 75 years of age. From a nation-wide, 13-year follow-up cohort, we estimated quality-adjusted life expectancy (QALE), loss-of-QALE, and lifetime healthcare expenditures per case of lung cancer stratified by pathology and stage. Cumulative stage distributions for CT-screening and no-screening were assumed equal to those for CT-screening and radiography-screening in the NLST to estimate the savings of loss-of-QALE and additional costs of lifetime healthcare expenditures after CT screening. Costs attributable to screen-negative subjects, false-positive cases and radiation-induced lung cancer were included to obtain the incremental cost-effectiveness ratio from the public payer's perspective. The incremental costs were US$22,755 per person. After dividing this by savings of loss-of-QALE (1.16 quality-adjusted life year (QALY)), the incremental cost-effectiveness ratio was US$19,683 per QALY. This ratio would fall to US$10,947 per QALY if the stage distribution for CT-screening was the same as that of screen-detected cancers in the NELSON trial. Low-dose CT screening for lung cancer among high-risk smokers would be cost-effective in Taiwan. As only about 5% of our women are smokers, future research is necessary to identify the high-risk groups among non-smokers and increase the coverage. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.

  15. Strength and Reliability of Wood for the Components of Low-cost Wind Turbines: Computational and Experimental Analysis and Applications

    DEFF Research Database (Denmark)

    Mishnaevsky, Leon; Freere, Peter; Sharma, Ranjan

    2009-01-01

    This paper reports the latest results of the comprehensive program of experimental and computational analysis of strength and reliability of wooden parts of low cost wind turbines. The possibilities of prediction of strength and reliability of different types of wood are studied in the series of experiments and computational investigations. Low cost testing machines have been designed, and employed for the systematic analysis of different sorts of Nepali wood, to be used for the wind turbine construction. At the same time, computational micromechanical models of deformation and strength of wood are developed, which should provide the basis for microstructure-based correlating of observable and service properties of wood. Some correlations between microstructure, strength and service properties of wood have been established.

  16. Productivity associated with visual status of computer users.

    Science.gov (United States)

    Daum, Kent M; Clore, Katherine A; Simms, Suzanne S; Vesely, Jon W; Wilczek, Dawn D; Spittle, Brian M; Good, Greg W

    2004-01-01

    The aim of this project is to examine the potential connection between the astigmatic refractive corrections of subjects using computers and their productivity and comfort. We hypothesize that improving the visual status of subjects using computers results in greater productivity, as well as improved visual comfort. Inclusion criteria required subjects 19 to 30 years of age with complete vision examinations before being enrolled. Using a double-masked, placebo-controlled, randomized design, subjects completed three experimental tasks calculated to assess the effects of refractive error on productivity (time to completion and the number of errors) at a computer. The tasks resembled those commonly undertaken by computer users and involved visual search tasks of: (1) counties and populations; (2) nonsense word search; and (3) a modified text-editing task. Estimates of productivity for time to completion varied from a minimum of 2.5% upwards to 28.7% with 2 D cylinder miscorrection. Assuming a conservative estimate of an overall 2.5% increase in productivity with appropriate astigmatic refractive correction, our data suggest a favorable cost-benefit ratio of at least 2.3 for the visual correction of an employee (total cost 268 dollars) with a salary of 25,000 dollars per year. We conclude that astigmatic refractive error affected both productivity and visual comfort under the conditions of this experiment. These data also suggest a favorable cost-benefit ratio for employers who provide computer-specific eyewear to their employees.

  17. Modelling User-Costs in Life Cycle Cost-Benefit (LCCB) analysis

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle

    2008-01-01

    The importance of including user costs in Life-Cycle Cost-Benefit analysis of structures is discussed in this paper. This is of great importance especially for bridges. Repair and/or failure of a bridge will usually result in user costs greater than the repair or replacement costs of the bridge.

  18. Low cost phantom for computed radiology; Objeto de teste de baixo custo para radiologia computadorizada

    Energy Technology Data Exchange (ETDEWEB)

    Travassos, Paulo Cesar B.; Magalhaes, Luis Alexandre G., E-mail: pctravassos@ufrj.br [Universidade do Estado do Rio de Janeiro (IBRGA/UERJ), RJ (Brazil). Laboratorio de Ciencias Radiologicas; Augusto, Fernando M.; Sant' Yves, Thalis L.A.; Goncalves, Elicardo A.S. [Instituto Nacional de Cancer (INCA), Rio de Janeiro, RJ (Brazil); Botelho, Marina A. [Hospital Universitario Pedro Ernesto (UERJ), Rio de Janeiro, RJ (Brazil)

    2012-08-15

    This article presents the results obtained from a low cost phantom used to analyze Computed Radiology (CR) equipment. The phantom was constructed to test a few parameters related to image quality, as described in [1-9]. Materials which can be easily purchased were used in the construction of the phantom, with a total cost of approximately US$100.00. A bar pattern was included only to verify the efficacy of the grids in the spatial resolution determination, and was not included in the budget because the data were acquired from the grids. (author)

  19. Planning for greater-confinement disposal

    International Nuclear Information System (INIS)

    Gilbert, T.L.; Luner, C.; Meshkov, N.K.; Trevorrow, L.E.; Yu, C.

    1984-01-01

    This contribution is a progress report for preparation of a document that will summarize procedures and technical information needed to plan for and implement greater-confinement disposal (GCD) of low-level radioactive waste. Selection of a site and a facility design (Phase I), and construction, operation, and extended care (Phase II) will be covered in the document. This progress report is limited to Phase I. Phase I includes determination of the need for GCD, design alternatives, and selection of a site and facility design. Alternative designs considered are augered shafts, deep trenches, engineered structures, high-integrity containers, hydrofracture, and improved waste form. Design considerations and specifications, performance elements, cost elements, and comparative advantages and disadvantages of the different designs are covered. Procedures are discussed for establishing overall performance objectives and waste-acceptance criteria, and for comparative assessment of the performance and cost of the different alternatives. 16 references

  20. Planning for greater-confinement disposal

    International Nuclear Information System (INIS)

    Gilbert, T.L.; Luner, C.; Meshkov, N.K.; Trevorrow, L.E.; Yu, C.

    1984-01-01

    This contribution is a progress report for preparation of a document that will summarize procedures and technical information needed to plan for and implement greater-confinement disposal (GCD) of low-level radioactive waste. Selection of a site and a facility design (Phase I), and construction, operation, and extended care (Phase II) will be covered in the document. This progress report is limited to Phase I. Phase I includes determination of the need for GCD, design alternatives, and selection of a site and facility design. Alternative designs considered are augered shafts, deep trenches, engineered structures, high-integrity containers, hydrofracture, and improved waste form. Design considerations and specifications, performance elements, cost elements, and comparative advantages and disadvantages of the different designs are covered. Procedures are discussed for establishing overall performance objectives and waste-acceptance criteria, and for comparative assessment of the performance and cost of the different alternatives. 16 refs

  1. Benefit-cost-risk analysis of alternatives for greater-confinement disposal of radioactive waste

    International Nuclear Information System (INIS)

    Gilbert, T.L.; Luner, C.; Peterson, J.M.

    1983-01-01

    Seven alternatives are included in the analysis: near-surface disposal; improved waste form; below-ground engineered structure; augered shaft; shale fracturing; shallow geologic repository; and high-level waste repository. These alternatives are representative generic facilities that span the range from low-level waste disposal practice to high-level waste disposal practice, tentatively ordered according to an expected increasing cost and/or effectiveness of confinement. They have been chosen to enable an assessment of the degree of confinement that represents an appropriate balance between public health and safety requirements and costs rather than identification of a specific preferred facility design. The objective of the analysis is to provide a comparative ranking of the alternatives on the basis of benefit-cost-risk considerations

  2. Cost and resource utilization associated with use of computed tomography to evaluate chest pain in the emergency department: the Rule Out Myocardial Infarction using Computer Assisted Tomography (ROMICAT) study.

    Science.gov (United States)

    Hulten, Edward; Goehler, Alexander; Bittencourt, Marcio Sommer; Bamberg, Fabian; Schlett, Christopher L; Truong, Quynh A; Nichols, John; Nasir, Khurram; Rogers, Ian S; Gazelle, Scott G; Nagurney, John T; Hoffmann, Udo; Blankstein, Ron

    2013-09-01

    Coronary computed tomographic angiography (cCTA) allows rapid, noninvasive exclusion of obstructive coronary artery disease (CAD). However, concern exists whether implementation of cCTA in the assessment of patients presenting to the emergency department with acute chest pain will lead to increased downstream testing and costs compared with alternative strategies. Our aim was to compare observed actual costs of usual care (UC) with projected costs of a strategy including early cCTA in the evaluation of patients with acute chest pain in the Rule Out Myocardial Infarction Using Computer Assisted Tomography I (ROMICAT I) study. We compared cost and hospital length of stay of UC observed among 368 patients enrolled in the ROMICAT I study with projected costs of management based on cCTA. Costs of UC were determined by an electronic cost accounting system. Notably, UC was not influenced by cCTA results because patients and caregivers were blinded to the cCTA results. Costs after early implementation of cCTA were estimated assuming changes in management based on cCTA findings of the presence and severity of CAD. Sensitivity analysis was used to test the influence of key variables on both outcomes and costs. We determined that in comparison with UC, cCTA-guided triage, whereby patients with no CAD are discharged, could reduce total hospital costs by 23%. However, with increasing disease prevalence, cost increases such that when the prevalence of ≥50% stenosis is >28% to 33%, the use of cCTA becomes more costly than UC. cCTA may be a cost-saving tool in acute chest pain populations that have a low prevalence of potentially obstructive CAD, whereas greater cost would be anticipated in populations with a higher prevalence of disease.

  3. Comparison of different strategies in prenatal screening for Down's syndrome: cost effectiveness analysis of computer simulation.

    Science.gov (United States)

    Gekas, Jean; Gagné, Geneviève; Bujold, Emmanuel; Douillard, Daniel; Forest, Jean-Claude; Reinharz, Daniel; Rousseau, François

    2009-02-13

    To assess and compare the cost effectiveness of three different strategies for prenatal screening for Down's syndrome (integrated test, sequential screening, and contingent screening) and to determine the most useful cut-off values for risk. Computer simulations were used to study integrated, sequential, and contingent screening strategies with various cut-offs, leading to 19 potential screening algorithms. The computer simulation was populated with data from the Serum Urine and Ultrasound Screening Study (SURUSS), real unit costs for healthcare interventions, and a population of 110 948 pregnancies from the province of Québec for the year 2001. Cost effectiveness ratios, incremental cost effectiveness ratios, and screening options' outcomes. The contingent screening strategy dominated all other screening options: it had the best cost effectiveness ratio ($C26,833 per case of Down's syndrome) with fewer procedure related euploid miscarriages and unnecessary terminations (respectively, 6 and 16 per 100,000 pregnancies). It also outperformed serum screening at the second trimester. In terms of the incremental cost effectiveness ratio, contingent screening was still dominant: compared with screening based on maternal age alone, the savings were $C30,963 per additional birth with Down's syndrome averted. Contingent screening was the only screening strategy that offered early reassurance to the majority of women (77.81%) in the first trimester and minimised costs by limiting retesting during the second trimester (21.05%). For the contingent and sequential screening strategies, the choice of cut-off value for risk in the first trimester test significantly affected the cost effectiveness ratios (respectively, from $C26,833 to $C37,260 and from $C35,215 to $C45,314 per case of Down's syndrome), the number of procedure related euploid miscarriages (from 6 to 46 and from 6 to 45 per 100,000 pregnancies), and the number of unnecessary terminations (from 16 to 26 and from 16 to 25 per 100,000 pregnancies).

  4. Feasibility Study and Cost Benefit Analysis of Thin-Client Computer System Implementation Onboard United States Navy Ships

    National Research Council Canada - National Science Library

    Arbulu, Timothy D; Vosberg, Brian J

    2007-01-01

    The purpose of this MBA project was to conduct a feasibility study and a cost benefit analysis of using thin-client computer systems instead of traditional networks onboard United States Navy ships...

  5. Computer-based interventions for drug use disorders: A systematic review

    Science.gov (United States)

    Moore, Brent A.; Fazzino, Tera; Garnet, Brian; Cutter, Christopher J.; Barry, Declan T.

    2011-01-01

    A range of innovative computer-based interventions for psychiatric disorders have been developed, and are promising for drug use disorders, due to reduced cost and greater availability compared to traditional treatment. Electronic searches were conducted from 1966 to November 19, 2009 using MEDLINE, Psychlit, and EMBASE. 468 non-duplicate records were identified. Two reviewers classified abstracts for study inclusion, resulting in 12 studies of moderate quality. Eleven studies were pilot or full-scale trials compared to a control condition. Interventions showed high acceptability despite substantial variation in type and amount of treatment. Compared to treatment-as-usual, computer-based interventions led to less substance use as well as higher motivation to change, better retention, and greater knowledge of presented information. Computer-based interventions for drug use disorders have the potential to dramatically expand and alter the landscape of treatment. Evaluation of internet and phone-based delivery that allow for treatment-on-demand in patients’ own environment is needed. PMID:21185683

  6. Computational sensing of herpes simplex virus using a cost-effective on-chip microscope

    KAUST Repository

    Ray, Aniruddha

    2017-07-03

    Caused by the herpes simplex virus (HSV), herpes is a viral infection that is one of the most widespread diseases worldwide. Here we present a computational sensing technique for specific detection of HSV using both viral immuno-specificity and the physical size range of the viruses. This label-free approach involves a compact and cost-effective holographic on-chip microscope and a surface-functionalized glass substrate prepared to specifically capture the target viruses. To enhance the optical signatures of individual viruses and increase their signal-to-noise ratio, self-assembled polyethylene glycol based nanolenses are rapidly formed around each virus particle captured on the substrate using a portable interface. Holographic shadows of specifically captured viruses that are surrounded by these self-assembled nanolenses are then reconstructed, and the phase image is used for automated quantification of the size of each particle within our large field-of-view, ~30 mm². The combination of viral immuno-specificity due to surface functionalization and the physical size measurements enabled by holographic imaging is used to sensitively detect and enumerate HSV particles using our compact and cost-effective platform. This computational sensing technique can find numerous uses in global health related applications in resource-limited environments.

  7. A Comprehensive and Cost-Effective Computer Infrastructure for K-12 Schools

    Science.gov (United States)

    Warren, G. P.; Seaton, J. M.

    1996-01-01

    Since 1993, NASA Langley Research Center has been developing and implementing a low-cost Internet connection model, including system architecture, training, and support, to provide Internet access for an entire network of computers. This infrastructure allows local area networks which exceed 50 machines per school to independently access the complete functionality of the Internet by connecting to a central site, using state-of-the-art commercial modem technology, through a single standard telephone line. By locating high-cost resources at this central site and sharing these resources and their costs among the school districts throughout a region, a practical, efficient, and affordable infrastructure for providing scalable Internet connectivity has been developed. As the demand for faster Internet access grows, the model has a simple expansion path that eliminates the need to replace major system components and re-train personnel. Observations of typical Internet usage within an environment, particularly school classrooms, have shown that after an initial period of 'surfing,' the Internet traffic becomes repetitive. By automatically storing requested Internet information on a high-capacity networked disk drive at the local site (network-based disk caching), then updating this information only when it changes, well over 80 percent of the Internet traffic that leaves a location can be eliminated by retrieving the information from the local disk cache.
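
    The network-based disk caching idea reduces to "serve from local disk unless the copy is missing or stale"; a minimal sketch follows. The cache path and staleness window are illustrative, and the actual Langley system was a shared regional service rather than a script.

```python
import hashlib
import time
import urllib.request
from pathlib import Path

CACHE_DIR = Path("/tmp/webcache")   # assumed local cache location
MAX_AGE_S = 24 * 3600               # refresh cached copies daily (assumed)

def fetch(url: str) -> bytes:
    """Return the page body, preferring a fresh local copy over the network."""
    CACHE_DIR.mkdir(parents=True, exist_ok=True)
    entry = CACHE_DIR / hashlib.sha256(url.encode()).hexdigest()
    if entry.exists() and time.time() - entry.stat().st_mtime < MAX_AGE_S:
        return entry.read_bytes()   # cache hit: no outbound traffic
    data = urllib.request.urlopen(url).read()
    entry.write_bytes(data)         # cache miss: fetch once, reuse later
    return data
```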

  8. Deregulation and Nuclear Training: Cost Effective Alternatives

    International Nuclear Information System (INIS)

    Richard P. Coe; Patricia A. Lake

    2000-01-01

    Training is crucial to the success of any organization. It is also expensive, with some estimates exceeding $50 billion annually spent on training by U.S. corporations. Nuclear training, like that of many other highly technical organizations, is both crucial and costly. It is unlikely that the amount of training can be significantly reduced. If anything, current trends indicate that training needs will probably increase as the industry and workforce ages and changes. With the advent of energy deregulation in the United States, greater pressures will surface to make the costs of energy more cost-competitive. This in turn will drive businesses to more closely examine existing costs and find ways to do things in a more cost-effective way. The commercial nuclear industry will be no exception, and nuclear training will be equally affected. It is time for nuclear training and indeed the entire nuclear industry to begin using more aggressive techniques to reduce costs. This includes the need for nuclear training to find alternatives to traditional methods for the delivery of cost-effective high-quality training that meets regulatory requirements and produces well-qualified personnel capable of working in an efficient and safe manner. Computer-based and/or Web-based training are leading emerging technologies

  9. A practical technique for benefit-cost analysis of computer-aided design and drafting systems

    International Nuclear Information System (INIS)

    Shah, R.R.; Yan, G.

    1979-03-01

    Analyses of benefits and costs associated with the operation of Computer-Aided Design and Drafting Systems (CADDS) are needed to derive economic justification for acquiring new systems, as well as to evaluate the performance of existing installations. In practice, however, such analyses are difficult to perform since most technical and economic advantages of CADDS are 'irreducibles', i.e. cannot be readily translated into monetary terms. In this paper, a practical technique for economic analysis of CADDS in a drawing office environment is presented. A 'worst case' approach is taken since increase in productivity of existing manpower is the only benefit considered, while all foreseen costs are taken into account. Methods of estimating benefits and costs are described. The procedure for performing the analysis is illustrated by a case study based on the drawing office activities at Atomic Energy of Canada Limited. (auth)
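
    The 'worst case' analysis the paper describes reduces to a simple calculation: count only the productivity gain of existing drafting staff as the benefit, and weigh it against all foreseen costs. A sketch with entirely hypothetical figures:

```python
# Hypothetical worst-case benefit-cost calculation for a CADDS purchase:
# the only benefit counted is the productivity gain of existing drafters.
n_drafters = 10
annual_salary = 60_000.0          # fully loaded cost per drafter, hypothetical
productivity_gain = 0.25          # 25% more drawings per drafter with CADDS

capital_cost = 400_000.0          # system purchase, amortized over its life
system_life_years = 8
annual_operating_cost = 50_000.0  # maintenance, training, supplies

annual_benefit = n_drafters * annual_salary * productivity_gain
annual_cost = capital_cost / system_life_years + annual_operating_cost

print(f"annual benefit: ${annual_benefit:,.0f}")
print(f"annual cost:    ${annual_cost:,.0f}")
print(f"benefit/cost ratio: {annual_benefit / annual_cost:.2f}")
```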

  10. Greater oil investment opportunities

    International Nuclear Information System (INIS)

    Arenas, Ismael Enrique

    1997-01-01

    Geologically speaking, Colombia is a very attractive country for the world oil community. In line with this philosophy, new and important steps are being taken to reinforce the oil sector: expansion of the exploratory frontier by including a larger number of sedimentary areas, and the adoption of innovative contracting instruments. Colombia has to offer: greater economic incentives for the exploration of new areas to expand the exploratory frontier; stimulation of exploration in areas with prospectivity for small fields, where companies may offer Ecopetrol a participation in production over and above royalties, without its participating in the investments and costs of these fields; and more favorable conditions for natural gas seeking projects, in comparison with those governing the terms for oil

  11. An Alternative Method for Computing Unit Costs and Productivity Ratios. AIR 1984 Annual Forum Paper.

    Science.gov (United States)

    Winstead, Wayland H.; And Others

    An alternative measure for evaluating the performance of academic departments was studied. A comparison was made with the traditional manner of computing unit costs and productivity ratios: prorating the salary and effort of each faculty member to each course level based on the personal mix of courses taught. The alternative method used averaging…
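
    The traditional method that the paper uses as its baseline can be sketched directly: prorate each faculty member's salary to course levels according to that member's own mix of courses taught, then divide the pooled cost at each level by the credit hours produced there. All figures below are hypothetical:

```python
# Traditional unit-cost computation: prorate each faculty member's salary
# to course levels by his or her own teaching mix, then divide the pooled
# level cost by the credit hours produced at that level. Data hypothetical.
faculty = [
    # (salary, {course_level: share_of_effort}, {course_level: credit_hours})
    (90_000, {"lower": 0.50, "upper": 0.50}, {"lower": 300, "upper": 150}),
    (70_000, {"lower": 0.75, "upper": 0.25}, {"lower": 400, "upper": 100}),
]

cost = {"lower": 0.0, "upper": 0.0}
hours = {"lower": 0.0, "upper": 0.0}
for salary, effort, credit in faculty:
    for level in cost:
        cost[level] += salary * effort.get(level, 0.0)
        hours[level] += credit.get(level, 0.0)

for level in cost:
    print(f"{level}-division cost per credit hour: "
          f"${cost[level] / hours[level]:.2f}")
```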

  12. Computer tablet or telephone? A randomised controlled trial exploring two methods of collecting data from drug and alcohol outpatients.

    Science.gov (United States)

    Hobden, Breanne; Bryant, Jamie; Carey, Mariko; Sanson-Fisher, Rob; Oldmeadow, Christopher

    2017-08-01

    Both computerised and telephone surveys have potential advantages for research data collection. The current study aimed to determine the: (i) feasibility, (ii) acceptability, and (iii) cost per completed survey of computer tablet versus telephone data collection for clients attending an outpatient drug and alcohol treatment clinic. Two-arm randomised controlled trial. Clients attending a drug and alcohol outpatient clinic in New South Wales, Australia, were randomised to complete a baseline survey via computer tablet in the clinic or via telephone interview within two weeks of their appointment. All participants completed a three-month follow-up survey via telephone. Consent and completion rates for the baseline survey were significantly higher in the computer tablet condition. The time taken to complete the survey was shorter in the computer tablet condition (11 min) than in the telephone condition (17 min). There were no differences in the proportion of consenters or completed follow-up surveys between the two conditions at the 3-month follow-up. Acceptability was high across both modes of data collection. The cost of the computer tablet condition was $67.52 greater per completed survey than the telephone condition. There is a trade-off between computer tablet and telephone data collection. While both data collection methods were acceptable to participants, the computer tablet condition resulted in higher consent and completion rates at baseline, therefore yielding greater external validity, and was quicker for participants to complete. Telephone data collection was, however, more cost-effective. Researchers should carefully consider the mode of data collection that suits individual study needs. Copyright © 2017 Elsevier Ltd. All rights reserved.
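
    The cost-per-completed-survey metric reported here is straightforward to compute: divide each condition's total cost by its number of completed surveys. The sketch below uses entirely hypothetical component costs (the study itself reports only the $67.52 difference):

```python
# Cost per completed survey = total condition cost / completed surveys.
# All component figures are hypothetical; the study reported the tablet
# arm costing $67.52 more per completed survey than the telephone arm.
def cost_per_completed(fixed_cost, per_attempt_cost, attempts, completed):
    total = fixed_cost + per_attempt_cost * attempts
    return total / completed

tablet = cost_per_completed(fixed_cost=8_000.0,   # device purchase, setup
                            per_attempt_cost=2.0, # staff handover time
                            attempts=120, completed=100)
phone = cost_per_completed(fixed_cost=500.0,      # call scripts, training
                           per_attempt_cost=15.0, # interviewer time per call
                           attempts=150, completed=90)
print(f"tablet:    ${tablet:.2f} per completed survey")
print(f"telephone: ${phone:.2f} per completed survey")
```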

  13. Low-cost, high-performance and efficiency computational photometer design

    Science.gov (United States)

    Siewert, Sam B.; Shihadeh, Jeries; Myers, Randall; Khandhar, Jay; Ivanov, Vitaly

    2014-05-01

    Researchers at the University of Alaska Anchorage and the University of Colorado Boulder have built a low-cost, high-performance and high-efficiency drop-in-place Computational Photometer (CP) to test in field applications ranging from port security and safety monitoring to environmental compliance monitoring and surveying. The CP integrates off-the-shelf visible spectrum cameras with near to long wavelength infrared detectors and high resolution digital snapshots in a single device. The proof of concept combines three or more detectors into a single multichannel imaging system that can time correlate read-out, capture, and image process all of the channels concurrently with high performance and energy efficiency. The dual-channel continuous read-out is combined with a third high definition digital snapshot capability and has been designed using an FPGA (Field Programmable Gate Array) to capture, decimate, down-convert, re-encode, and transform images from two standard definition CCD (Charge Coupled Device) cameras at 30 Hz. The continuous stereo vision can be time correlated to megapixel high definition snapshots. This proof of concept has been fabricated as a four-layer PCB (Printed Circuit Board) suitable for use in education and research for low-cost, high-efficiency field monitoring applications that need multispectral and three dimensional imaging capabilities. Initial testing is in progress and includes field testing in ports, potential test flights in unmanned aerial systems, and future planned missions to image harsh environments in the arctic including volcanic plumes, ice formation, and arctic marine life.

  14. The cost-effectiveness of the RSI QuickScan intervention programme for computer workers: Results of an economic evaluation alongside a randomised controlled trial.

    Science.gov (United States)

    Speklé, Erwin M; Heinrich, Judith; Hoozemans, Marco J M; Blatter, Birgitte M; van der Beek, Allard J; van Dieën, Jaap H; van Tulder, Maurits W

    2010-11-11

    The costs of arm, shoulder and neck symptoms are high. In order to decrease these costs, employers implement interventions aimed at reducing these symptoms. One frequently used intervention is the RSI QuickScan intervention programme. It establishes a risk profile of the target population and subsequently advises interventions following a decision tree based on that risk profile. The purpose of this study was to perform an economic evaluation, from both the societal and companies' perspective, of the RSI QuickScan intervention programme for computer workers. In this study, effectiveness was defined at three levels: exposure to risk factors, prevalence of arm, shoulder and neck symptoms, and days of sick leave. The economic evaluation was conducted alongside a randomised controlled trial (RCT). Participating computer workers from 7 companies (N = 638) were assigned to either the intervention group (N = 320) or the usual care group (N = 318) by means of cluster randomisation (N = 50). The intervention consisted of a tailor-made programme, based on a previously established risk profile. At baseline and at 6- and 12-month follow-up, the participants completed the RSI QuickScan questionnaire. Analyses to estimate the effect of the intervention were done according to the intention-to-treat principle. To compare costs between groups, confidence intervals for cost differences were computed by bias-corrected and accelerated bootstrapping. The mean intervention costs, paid by the employer, were 59 euro per participant in the intervention group and 28 euro in the usual care group. Mean total health care and non-health care costs per participant were 108 euro in both groups. As to cost-effectiveness, improvements in the information received on healthy computer use, as well as in work posture and movement, were observed at higher costs. With regard to the other risk factors, symptoms and sick leave, only small and non-significant effects were found. In this study, the RSI Quick
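
    The bootstrapped confidence intervals for cost differences can be illustrated with a short sketch. This one uses a plain percentile bootstrap on simulated right-skewed cost data; the study used the bias-corrected and accelerated (BCa) variant, which is available in libraries such as scipy.stats.bootstrap with method='BCa'.

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulated annual cost data (euro) for two trial arms; real data would
# come from the trial records. Cost data are typically right-skewed.
intervention = rng.lognormal(mean=4.0, sigma=1.0, size=320)
usual_care = rng.lognormal(mean=4.0, sigma=1.0, size=318)

n_boot = 5_000
diffs = np.empty(n_boot)
for b in range(n_boot):
    # Resample each arm with replacement and record the mean difference.
    d_i = rng.choice(intervention, size=intervention.size, replace=True)
    d_u = rng.choice(usual_care, size=usual_care.size, replace=True)
    diffs[b] = d_i.mean() - d_u.mean()

lo, hi = np.percentile(diffs, [2.5, 97.5])
print(f"mean cost difference: {intervention.mean() - usual_care.mean():.1f}")
print(f"95% percentile bootstrap CI: ({lo:.1f}, {hi:.1f})")
# The study used the bias-corrected and accelerated (BCa) variant; see
# scipy.stats.bootstrap(..., method='BCa') for a library implementation.
```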

  15. Effectiveness of Multimedia Elements in Computer Supported Instruction: Analysis of Personalization Effects, Students' Performances and Costs

    Science.gov (United States)

    Zaidel, Mark; Luo, XiaoHui

    2010-01-01

    This study investigates the efficiency of multimedia instruction at the college level by comparing the effectiveness of multimedia elements used in the computer supported learning with the cost of their preparation. Among the various technologies that advance learning, instructors and students generally identify interactive multimedia elements as…

  16. Automatic Computer Mapping of Terrain

    Science.gov (United States)

    Smedes, H. W.

    1971-01-01

    Computer processing of 17 wavelength bands of visible, reflective infrared, and thermal infrared scanner spectrometer data, and of three wavelength bands derived from color aerial film has resulted in successful automatic computer mapping of eight or more terrain classes in a Yellowstone National Park test site. The tests involved: (1) supervised and non-supervised computer programs; (2) special preprocessing of the scanner data to reduce computer processing time and cost, and improve the accuracy; and (3) studies of the effectiveness of the proposed Earth Resources Technology Satellite (ERTS) data channels in the automatic mapping of the same terrain, based on simulations, using the same set of scanner data. The following terrain classes have been mapped with greater than 80 percent accuracy in a 12-square-mile area with 1,800 feet of relief: (1) bedrock exposures, (2) vegetated rock rubble, (3) talus, (4) glacial kame meadow, (5) glacial till meadow, (6) forest, (7) bog, and (8) water. In addition, shadows of clouds and cliffs are depicted, but were greatly reduced by using preprocessing techniques.

  17. Computational methods in drug discovery

    Directory of Open Access Journals (Sweden)

    Sumudu P. Leelananda

    2016-12-01

    The process for drug discovery and development is challenging, time consuming and expensive. Computer-aided drug discovery (CADD) tools can act as a virtual shortcut, assisting in the expedition of this long process and potentially reducing the cost of research and development. Today CADD has become an effective and indispensable tool in therapeutic development. The human genome project has made available a substantial amount of sequence data that can be used in various drug discovery projects. Additionally, increasing knowledge of biological structures, as well as increasing computer power, have made it possible to use computational methods effectively in various phases of the drug discovery and development pipeline. The importance of in silico tools is greater than ever before and has advanced pharmaceutical research. Here we present an overview of computational methods used in different facets of drug discovery and highlight some of the recent successes. In this review, both structure-based and ligand-based drug discovery methods are discussed. Advances in virtual high-throughput screening, protein structure prediction methods, protein–ligand docking, pharmacophore modeling and QSAR techniques are reviewed.

  18. Computer-assisted propofol administration.

    LENUS (Irish Health Repository)

    O'Connor, J P A

    2012-02-01

    The use of propofol for sedation in endoscopy may allow for better quality of sedation and quicker recovery, and facilitate greater throughput in endoscopy units. The cost-effectiveness and utility of propofol sedation for endoscopic procedures is contingent on the personnel and resources required to carry out the procedure. Computer-based platforms are based on the patient's response to stimulation and physiologic parameters. They offer an appealing means of delivering safe and effective doses of propofol. One such means is the bispectral index, where continuous EEG recordings are used to assess the degree of sedation. Another is the closed-loop target-controlled system, where a set of physical parameters, such as muscle relaxation and auditory-evoked potential, determine a level of medication appropriate to achieve sedation. Patient-controlled platforms may also be used. These electronic adjuncts may help endoscopists who wish to adopt propofol sedation to change current practices with greater confidence.

  19. Cost Savings Associated with the Adoption of a Cloud Computing Data Transfer System for Trauma Patients.

    Science.gov (United States)

    Feeney, James M; Montgomery, Stephanie C; Wolf, Laura; Jayaraman, Vijay; Twohig, Michael

    2016-09-01

    Among transferred trauma patients, challenges with the transfer of radiographic studies include problems loading or viewing the studies at the receiving hospitals, and problems manipulating, reconstructing, or evaluating the transferred images. Cloud-based image transfer systems may address some of these problems. We reviewed the charts of patients transferred during one year surrounding the adoption of a cloud computing data transfer system. We compared the rates of repeat imaging before (precloud) and after (postcloud) the adoption of the cloud-based data transfer system. During the precloud period, 28 out of 100 patients required 90 repeat studies. With the cloud computing transfer system in place, three out of 134 patients required seven repeat films. There was a statistically significant decrease in the proportion of patients requiring repeat films (28% to 2.2%, P < .0001). Based on an annualized volume of 200 trauma patient transfers, the cost savings estimated using three methods of cost analysis is between $30,272 and $192,453.
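
    The reported drop in repeat imaging (28 of 100 patients precloud versus 3 of 134 postcloud) can be checked with a standard two-by-two test. The abstract does not name the test used; the sketch below applies Fisher's exact test via SciPy:

```python
from scipy.stats import fisher_exact

# 2x2 table: rows = pre-cloud / post-cloud, cols = repeat films yes / no.
table = [[28, 100 - 28],
         [3, 134 - 3]]
odds_ratio, p_value = fisher_exact(table)
print(f"pre-cloud repeat-imaging rate:  {28 / 100:.1%}")
print(f"post-cloud repeat-imaging rate: {3 / 134:.1%}")
print(f"Fisher exact p-value: {p_value:.2e}")
```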

  20. Direct costs and cost-effectiveness of dual-source computed tomography and invasive coronary angiography in patients with an intermediate pretest likelihood for coronary artery disease.

    Science.gov (United States)

    Dorenkamp, Marc; Bonaventura, Klaus; Sohns, Christian; Becker, Christoph R; Leber, Alexander W

    2012-03-01

    The study aims to determine the direct costs and comparative cost-effectiveness of latest-generation dual-source computed tomography (DSCT) and invasive coronary angiography for diagnosing coronary artery disease (CAD) in patients suspected of having this disease. The study was based on a previously elaborated cohort with an intermediate pretest likelihood for CAD and on complementary clinical data. Cost calculations were based on a detailed analysis of direct costs, and generally accepted accounting principles were applied. Based on Bayes' theorem, a mathematical model was used to compare the cost-effectiveness of both diagnostic approaches. Total costs included direct costs, induced costs and costs of complications. Effectiveness was defined as the ability of a diagnostic test to accurately identify a patient with CAD. Direct costs amounted to €98.60 for DSCT and to €317.75 for invasive coronary angiography. Analysis of model calculations indicated that cost-effectiveness grew hyperbolically with increasing prevalence of CAD. Given the prevalence of CAD in the study cohort (24%), DSCT was found to be more cost-effective than invasive coronary angiography (€970 vs €1354 for one patient correctly diagnosed as having CAD). At a disease prevalence of 49%, DSCT and invasive angiography were equally effective with costs of €633. Above a threshold value of disease prevalence of 55%, proceeding directly to invasive coronary angiography was more cost-effective than DSCT. With proper patient selection and consideration of disease prevalence, DSCT coronary angiography is cost-effective for diagnosing CAD in patients with an intermediate pretest likelihood for it. However, the range of eligible patients may be smaller than previously reported.
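
    The prevalence dependence described here can be reproduced in miniature: the cost per patient correctly diagnosed is roughly the per-patient test cost divided by (prevalence × sensitivity). The sketch below uses the direct costs from the abstract but assumed sensitivities, and omits induced and complication costs, so its numbers differ from the paper's full model:

```python
# Cost per patient correctly diagnosed with CAD, as a function of disease
# prevalence. Direct test costs are from the abstract; the sensitivities
# are assumed, and induced costs / complication costs are omitted, so the
# figures differ from the full model in the paper.
def cost_per_true_positive(direct_cost, sensitivity, prevalence):
    # Expected true positives per patient tested = prevalence * sensitivity.
    return direct_cost / (prevalence * sensitivity)

for prevalence in (0.10, 0.24, 0.49, 0.55):
    dsct = cost_per_true_positive(98.60, sensitivity=0.95, prevalence=prevalence)
    ica = cost_per_true_positive(317.75, sensitivity=0.99, prevalence=prevalence)
    print(f"prevalence {prevalence:.0%}: DSCT {dsct:7.0f} EUR/TP, "
          f"invasive angiography {ica:7.0f} EUR/TP")
```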

  1. Low cost, scalable proteomics data analysis using Amazon's cloud computing services and open source search algorithms.

    Science.gov (United States)

    Halligan, Brian D; Geiger, Joey F; Vallejos, Andrew K; Greene, Andrew S; Twigger, Simon N

    2009-06-01

    One of the major difficulties for many laboratories setting up proteomics programs has been obtaining and maintaining the computational infrastructure required for the analysis of the large flow of proteomics data. We describe a system that combines distributed cloud computing and open source software to allow laboratories to set up scalable virtual proteomics analysis clusters without the investment in computational hardware or software licensing fees. Additionally, the pricing structure of distributed computing providers, such as Amazon Web Services, allows laboratories or even individuals to have large-scale computational resources at their disposal at a very low cost per run. We provide detailed step-by-step instructions on how to implement the virtual proteomics analysis clusters as well as a list of current available preconfigured Amazon machine images containing the OMSSA and X!Tandem search algorithms and sequence databases on the Medical College of Wisconsin Proteomics Center Web site (http://proteomics.mcw.edu/vipdac).
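
    Provisioning such a worker node can be sketched with the modern boto3 API (which postdates the paper). The AMI ID, key pair, and instance type below are placeholders, not the actual ViPDAC images hosted by the authors:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Launch a single worker from a preconfigured analysis image.
# The AMI ID, key pair and instance type below are placeholders, not the
# actual ViPDAC images referenced by the paper.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # hypothetical proteomics AMI
    InstanceType="c5.xlarge",
    MinCount=1,
    MaxCount=1,
    KeyName="my-keypair",              # hypothetical key pair name
)
instance_id = response["Instances"][0]["InstanceId"]
print("started worker:", instance_id)

# Terminate the worker when the search finishes, so charges stop accruing.
ec2.terminate_instances(InstanceIds=[instance_id])
```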

  2. Tapping Transaction Costs to Forecast Acquisition Cost Breaches

    Science.gov (United States)

    2016-01-01

    experience a cost breach. In our medical example, we could use survival analysis to identify risk factors, such as obesity, that might indicate a greater... exogenous variables on the probability of a dichotomous outcome, such as whether or not a cost breach occurs in any given program year. Logit is
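
    The logit specification the excerpt refers to models the probability of a binary breach outcome from exogenous program variables. A minimal sketch on simulated data, with entirely hypothetical risk factors:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
# Simulated program-year risk factors (all hypothetical).
schedule_slip = rng.normal(size=n)        # standardized schedule growth
funding_instability = rng.normal(size=n)  # standardized funding volatility

# True coefficients used only to simulate outcomes for the sketch.
logits = -1.0 + 0.8 * schedule_slip + 0.5 * funding_instability
breach = rng.random(n) < 1.0 / (1.0 + np.exp(-logits))

X = sm.add_constant(np.column_stack([schedule_slip, funding_instability]))
model = sm.Logit(breach.astype(float), X).fit(disp=False)
print(model.params)  # fitted coefficients: const, slip, funding
```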

  3. Computer automation in veterinary hospitals.

    Science.gov (United States)

    Rogers, H

    1996-05-01

    Computers have been used to automate complex and repetitive tasks in veterinary hospitals since the 1960s. Early systems were expensive, but their use was justified because they performed jobs which would have been impossible or which would have required greater resources in terms of time and personnel had they been performed by other methods. Systems found in most veterinary hospitals today are less costly, orders of magnitude more capable, and often underused. Modern multitasking operating systems and graphical interfaces bring many opportunities for automation. Commercial and custom programs developed and used in a typical multidoctor mixed species veterinary practice are described.

  4. Expedited Holonomic Quantum Computation via Net Zero-Energy-Cost Control in Decoherence-Free Subspace.

    Science.gov (United States)

    Pyshkin, P V; Luo, Da-Wei; Jing, Jun; You, J Q; Wu, Lian-Ao

    2016-11-25

    Holonomic quantum computation (HQC) may not show its full potential in quantum speedup due to the prerequisite of a long coherent runtime imposed by the adiabatic condition. Here we show that the conventional HQC can be dramatically accelerated by using external control fields, of which the effectiveness is exclusively determined by the integral of the control fields in the time domain. This control scheme can be realized with net zero energy cost and it is fault-tolerant against fluctuation and noise, significantly relaxing the experimental constraints. We demonstrate how to realize the scheme via decoherence-free subspaces. In this way we unify quantum robustness merits of this fault-tolerant control scheme, the conventional HQC and decoherence-free subspace, and propose an expedited holonomic quantum computation protocol.

  5. Greater trochanteric fracture with occult intertrochanteric extension.

    Science.gov (United States)

    Reiter, Michael; O'Brien, Seth D; Bui-Mansfield, Liem T; Alderete, Joseph

    2013-10-01

    Proximal femoral fractures are frequently encountered in the emergency department (ED). Prompt diagnosis is paramount as delay will exacerbate the already poor outcomes associated with these injuries. In cases where radiography is negative but clinical suspicion remains high, magnetic resonance imaging (MRI) is the study of choice as it has the capability to depict fractures which are occult on other imaging modalities. Awareness of a particular subset of proximal femoral fractures, namely greater trochanteric fractures, is vital for both radiologists and clinicians since it has been well documented that they invariably have an intertrochanteric component which may require surgical management. The detection of intertrochanteric or cervical extension of greater trochanteric fractures has been described utilizing MRI but is underestimated with both computed tomography (CT) and bone scan. Therefore, if MRI is unavailable or contraindicated, the diagnosis of an isolated greater trochanteric fracture should be met with caution. The importance of avoiding this potential pitfall is demonstrated in the following case of an elderly woman with hip pain and CT demonstrating an isolated greater trochanteric fracture who subsequently returned to the ED with a displaced intertrochanteric fracture.

  6. Operating Dedicated Data Centers - Is It Cost-Effective?

    Science.gov (United States)

    Ernst, M.; Hogue, R.; Hollowell, C.; Strecker-Kellog, W.; Wong, A.; Zaytsev, A.

    2014-06-01

    The advent of cloud computing centres such as Amazon's EC2 and Google's Computing Engine has elicited comparisons with dedicated computing clusters. Discussions on appropriate usage of cloud resources (both academic and commercial) and costs have ensued. This presentation discusses a detailed analysis of the costs of operating and maintaining the RACF (RHIC and ATLAS Computing Facility) compute cluster at Brookhaven National Lab and compares them with the cost of cloud computing resources under various usage scenarios. An extrapolation of likely future cost effectiveness of dedicated computing resources is also presented.

  7. The cost-effectiveness of methanol for reducing motor vehicle emissions and urban ozone

    International Nuclear Information System (INIS)

    Krupnick, A.J.; Walls, M.A.

    1992-01-01

    This article analyzes the costs and emissions characteristics of methanol vehicles. The cost-effectiveness of methanol - the cost per ton of reactive hydrocarbon emissions reduced - is calculated and compared to the cost-effectiveness of other hydrocarbon reduction strategies. Methanol is found to cost from $33,000 to nearly $60,000 per ton, while several other options are available for under $10,000 per ton. The cost per part-per-million reduction in peak ambient ozone levels is also computed for two cities, Houston and Philadelphia. Despite the greater improvement in ozone in Philadelphia than Houston, methanol is found to be more cost-effective in Houston. This result occurs because Houston's distribution and marketing costs are lower than Philadelphia's. The costs in both cities, however, are far higher than estimates of the benefits from acute health improvements. Finally, the reduction in ozone exposure in Los Angeles is estimated and the costs of the reduction compared with an estimate of acute health benefits. Again, the benefits fall far short of the costs. 51 refs., 5 tabs

  8. Advanced computer graphics techniques as applied to the nuclear industry

    International Nuclear Information System (INIS)

    Thomas, J.J.; Koontz, A.S.

    1985-08-01

    Computer graphics is a rapidly advancing technological area in computer science. This is being motivated by increased hardware capability coupled with reduced hardware costs. This paper will cover six topics in computer graphics, with examples forecasting how each of these capabilities could be used in the nuclear industry. These topics are: (1) Image Realism with Surfaces and Transparency; (2) Computer Graphics Motion; (3) Graphics Resolution Issues and Examples; (4) Iconic Interaction; (5) Graphic Workstations; and (6) Data Fusion - illustrating data coming from numerous sources, for display through high dimensional, greater than 3-D, graphics. All topics will be discussed using extensive examples with slides, video tapes, and movies. Illustrations have been omitted from the paper due to the complexity of color reproduction. 11 refs., 2 figs., 3 tabs

  9. Cost Analysis of the STONE Randomized Trial: Can Health Care Costs be Reduced One Test at a Time?

    Science.gov (United States)

    Melnikow, Joy; Xing, Guibo; Cox, Ginger; Leigh, Paul; Mills, Lisa; Miglioretti, Diana L; Moghadassi, Michelle; Smith-Bindman, Rebecca

    2016-04-01

    Decreasing the use of high-cost tests may reduce health care costs. To compare costs of care for patients presenting to the emergency department (ED) with suspected kidney stones randomized to 1 of 3 initial imaging tests. Patients were randomized to point-of-care ultrasound (POC US, least costly), radiology ultrasound (RAD US), or computed tomography (CT, most costly). Subsequent testing and treatment were the choice of the treating physician. A total of 2759 patients at 15 EDs were randomized to POC US (n=908), RAD US (n=893), or CT (n=958). Mean age was 40.4 years; 51.8% were male. All medical care documented in the trial database in the 7 days following enrollment was abstracted and coded to estimate costs using national average 2012 Medicare reimbursements. Costs for initial ED care and total 7-day costs were compared using nonparametric bootstrap to account for clustering of patients within medical centers. Initial ED visit costs were modestly lower for patients assigned to RAD US: $423 ($411, $434) compared with patients assigned to CT: $448 ($438, $459). Total 7-day costs were not significantly different between groups: $1014 ($912, $1129) for POC US, $970 ($878, $1078) for RAD US, and $959 ($870, $1044) for CT. Hospital admissions contributed over 50% of total costs, though only 11% of patients were admitted. Mean total costs (and admission rates) varied substantially by site from $749 to $1239. Assignment to a less costly test had no impact on overall health care costs for ED patients. System-level interventions addressing variation in admission rates from the ED might have greater impact on costs.

  10. The thermodynamic efficiency of computations made in cells across the range of life

    Science.gov (United States)

    Kempes, Christopher P.; Wolpert, David; Cohen, Zachary; Pérez-Mercader, Juan

    2017-11-01

    Biological organisms must perform computation as they grow, reproduce and evolve. Moreover, ever since Landauer's bound was proposed, it has been known that all computation has some thermodynamic cost, and that the same computation can be achieved with greater or smaller thermodynamic cost depending on how it is implemented. Accordingly an important issue concerning the evolution of life is assessing the thermodynamic efficiency of the computations performed by organisms. This issue is interesting both from the perspective of how close life has come to maximally efficient computation (presumably under the pressure of natural selection), and from the practical perspective of what efficiencies we might hope that engineered biological computers might achieve, especially in comparison with current computational systems. Here we show that the computational efficiency of translation, defined as free energy expended per amino acid operation, outperforms the best supercomputers by several orders of magnitude, and is only about an order of magnitude worse than the Landauer bound. However, this efficiency depends strongly on the size and architecture of the cell in question. In particular, we show that the useful efficiency of an amino acid operation, defined as the bulk energy per amino acid polymerization, decreases for increasing bacterial size and converges to the polymerization cost of the ribosome. This cost of the largest bacteria does not change in cells as we progress through the major evolutionary shifts to both single- and multicellular eukaryotes. However, the rates of total computation per unit mass are non-monotonic in bacteria with increasing cell size, and also change across different biological architectures, including the shift from unicellular to multicellular eukaryotes. This article is part of the themed issue 'Reconceptualizing the origins of life'.
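
    The benchmark in this comparison is Landauer's bound, E = kT ln 2 per bit. The sketch below evaluates the bound at roughly physiological temperature and compares it with an assumed per-amino-acid free-energy cost; the assumed figure is illustrative, not the paper's measured value, and the paper's own accounting (bits per amino acid operation) is more careful than this rough scale comparison:

```python
import math

k_B = 1.380649e-23          # Boltzmann constant, J/K
T = 310.0                   # roughly physiological temperature, K

landauer_bound = k_B * T * math.log(2)   # minimum cost per bit erased, J
print(f"Landauer bound at {T} K: {landauer_bound:.2e} J/bit")

# Assumed free energy per amino acid polymerization step, on the order of
# a few ATP hydrolyses (~4 x 20 kT); an illustrative figure, not the
# paper's measured value.
per_amino_acid = 80 * k_B * T
print(f"assumed cost per amino acid operation: {per_amino_acid:.2e} J")
print(f"ratio to Landauer bound: {per_amino_acid / landauer_bound:.0f}x")
```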

  11. Cost-effectiveness analysis of computerized ECG interpretation system in an ambulatory health care organization.

    Science.gov (United States)

    Carel, R S

    1982-04-01

    The cost-effectiveness of a computerized ECG interpretation system in an ambulatory health care organization has been evaluated in comparison with a conventional (manual) system. The automated system was shown to be more cost-effective at a minimum load of 2,500 patients/month. At larger monthly loads an even greater cost-effectiveness was found, the average cost/ECG being about $2. In the manual system the cost/unit is practically independent of patient load. This is primarily due to the fact that 87% of the cost/ECG is attributable to wages and fees of highly trained personnel. In the automated system, on the other hand, the cost/ECG is heavily dependent on examinee load. This is due to the relatively large impact of equipment depreciation on fixed (and total) cost. Utilization of a computer-assisted system leads to marked reduction in cardiologists' interpretation time, substantially shorter turnaround time (of unconfirmed reports), and potential provision of simultaneous service at several remotely located "heart stations."
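
    The load dependence described here follows from a simple fixed-versus-variable cost structure. In the sketch below the figures are hypothetical, chosen only so that the automated system breaks even near 2,500 ECGs/month and approaches about $2/ECG at high volume, as in the abstract:

```python
# Cost per ECG as a function of monthly load. The automated system is
# dominated by fixed equipment depreciation; the manual system by wages
# per interpretation. All figures are hypothetical.
def automated_cost(load):
    fixed_monthly = 6_000.0   # equipment depreciation + maintenance
    variable = 1.0            # confirmation, consumables per ECG
    return fixed_monthly / load + variable

def manual_cost(load):
    return 3.4                # cardiologist wages dominate; ~load-independent

for load in (500, 1_000, 2_500, 5_000, 10_000):
    a, m = automated_cost(load), manual_cost(load)
    marker = "automated cheaper" if a < m else "manual cheaper"
    print(f"{load:>6} ECGs/month: automated ${a:.2f}, manual ${m:.2f} ({marker})")
```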

  12. Computer-assisted propofol administration.

    Science.gov (United States)

    O'Connor, J P A; O'Moráin, C A; Vargo, J J

    2010-01-01

    The use of propofol for sedation in endoscopy may allow for better quality of sedation and quicker recovery, and facilitate greater throughput in endoscopy units. The cost-effectiveness and utility of propofol sedation for endoscopic procedures is contingent on the personnel and resources required to carry out the procedure. Computer-based platforms are based on the patient's response to stimulation and physiologic parameters. They offer an appealing means of delivering safe and effective doses of propofol. One such means is the bispectral index, where continuous EEG recordings are used to assess the degree of sedation. Another is the closed-loop target-controlled system, where a set of physical parameters, such as muscle relaxation and auditory-evoked potential, determine a level of medication appropriate to achieve sedation. Patient-controlled platforms may also be used. These electronic adjuncts may help endoscopists who wish to adopt propofol sedation to change current practices with greater confidence. Copyright 2010 S. Karger AG, Basel.

  13. Do Clouds Compute? A Framework for Estimating the Value of Cloud Computing

    Science.gov (United States)

    Klems, Markus; Nimis, Jens; Tai, Stefan

    On-demand provisioning of scalable and reliable compute services, along with a cost model that charges consumers based on actual service usage, has been an objective in distributed computing research and industry for a while. Cloud Computing promises to deliver on this objective: consumers are able to rent infrastructure in the Cloud as needed, deploy applications and store data, and access them via Web protocols on a pay-per-use basis. The acceptance of Cloud Computing, however, depends on the ability for Cloud Computing providers and consumers to implement a model for business value co-creation. Therefore, a systematic approach to measure costs and benefits of Cloud Computing is needed. In this paper, we discuss the need for valuation of Cloud Computing, identify key components, and structure these components in a framework. The framework assists decision makers in estimating Cloud Computing costs and to compare these costs to conventional IT solutions. We demonstrate by means of representative use cases how our framework can be applied to real world scenarios.

  14. POPCYCLE: a computer code for calculating nuclear and fossil plant levelized life-cycle power costs

    International Nuclear Information System (INIS)

    Hardie, R.W.

    1982-02-01

    POPCYCLE, a computer code designed to calculate levelized life-cycle power costs for nuclear and fossil electrical generating plants, is described. Included are (1) derivations of the equations and a discussion of the methodology used by POPCYCLE, (2) a description of the input required by the code, (3) a listing of the input for a sample case, and (4) the output for a sample case
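
    A generic levelized life-cycle power cost has the form LCOE = Σ_t C_t (1+r)^(-t) / Σ_t E_t (1+r)^(-t): discounted lifetime costs divided by discounted lifetime generation. The sketch below is this generic formulation with hypothetical plant figures, not the POPCYCLE code itself:

```python
# Levelized life-cycle power cost: discounted total costs divided by
# discounted electricity production. A generic formulation, not the
# POPCYCLE code itself; all plant figures are hypothetical.
def levelized_cost(capital, annual_om, annual_fuel, annual_mwh,
                   life_years, discount_rate):
    costs = capital  # year-0 capital outlay
    energy = 0.0
    for t in range(1, life_years + 1):
        df = (1.0 + discount_rate) ** -t
        costs += (annual_om + annual_fuel) * df
        energy += annual_mwh * df
    return costs / energy  # $/MWh

lcoe = levelized_cost(capital=2.0e9, annual_om=60e6, annual_fuel=40e6,
                      annual_mwh=7.9e6, life_years=40, discount_rate=0.05)
print(f"levelized cost: ${lcoe:.1f}/MWh")
```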

  15. Early assessment of the likely cost-effectiveness of a new technology: A Markov model with probabilistic sensitivity analysis of computer-assisted total knee replacement.

    Science.gov (United States)

    Dong, Hengjin; Buxton, Martin

    2006-01-01

    The objective of this study is to apply a Markov model to compare the cost-effectiveness of total knee replacement (TKR) using computer-assisted surgery (CAS) with that of TKR using a conventional manual method in the absence of formal clinical trial evidence. A structured search was carried out to identify evidence relating to the clinical outcome, cost, and effectiveness of TKR. Nine Markov states were identified based on the progress of the disease after TKR. Effectiveness was expressed by quality-adjusted life years (QALYs). The simulation was carried out initially for 120 cycles of a month each, starting with 1,000 TKRs. A discount rate of 3.5 percent was used for both cost and effectiveness in the incremental cost-effectiveness analysis. Then, a probabilistic sensitivity analysis was carried out using a Monte Carlo approach with 10,000 iterations. Computer-assisted TKR was a long-term cost-effective technology, but the QALYs gained were small. After the first 2 years, computer-assisted TKR was dominant, being both cheaper and yielding more QALYs. The incremental cost-effectiveness ratio (ICER) was sensitive to the "effect of CAS," to the CAS extra cost, and to the utility of the state "Normal health after primary TKR," but it was not sensitive to utilities of other Markov states. Both probabilistic and deterministic analyses produced similar cumulative serious or minor complication rates and complex or simple revision rates. They also produced similar ICERs. Compared with conventional TKR, computer-assisted TKR is a cost-saving technology in the long term and may offer small additional QALYs. The "effect of CAS" is to reduce revision rates and complications through more accurate and precise alignment, and although the conclusions from the model, even when allowing for a full probabilistic analysis of uncertainty, are clear, the "effect of CAS" on the rate of revisions awaits long-term clinical evidence.
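
    The Markov cohort mechanics behind such a model can be shown in miniature. The sketch below uses only three states (well after TKR, revision, dead), monthly cycles, and the paper's 3.5% annual discount rate; the transition probabilities, costs and utilities are hypothetical, and the paper's actual model has nine states:

```python
import numpy as np

# Three-state monthly Markov cohort model: 0 = well after TKR,
# 1 = revision, 2 = dead. Probabilities, costs and utilities are
# hypothetical; the paper's model has nine states.
P = np.array([[0.995, 0.003, 0.002],
              [0.900, 0.098, 0.002],
              [0.000, 0.000, 1.000]])
state_cost = np.array([30.0, 8_000.0, 0.0])       # cost per cycle in state
state_utility = np.array([0.80, 0.55, 0.0]) / 12  # QALYs per monthly cycle

cohort = np.array([1.0, 0.0, 0.0])   # start everyone in the well state
monthly_disc = 1.035 ** (1.0 / 12)   # 3.5%/year discount, applied monthly
total_cost = total_qalys = 0.0
for cycle in range(120):             # 120 monthly cycles, as in the paper
    df = monthly_disc ** -cycle
    total_cost += df * cohort @ state_cost
    total_qalys += df * cohort @ state_utility
    cohort = cohort @ P              # advance the cohort one cycle

print(f"discounted cost per patient:  {total_cost:.0f}")
print(f"discounted QALYs per patient: {total_qalys:.2f}")
```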

  16. Incremental cost of department-wide implementation of a picture archiving and communication system and computed radiography.

    Science.gov (United States)

    Pratt, H M; Langlotz, C P; Feingold, E R; Schwartz, J S; Kundel, H L

    1998-01-01

    To determine the incremental cash flows associated with department-wide implementation of a picture archiving and communication system (PACS) and computed radiography (CR) at a large academic medical center. The authors determined all capital and operational costs associated with PACS implementation during an 8-year time horizon. Economic effects were identified, adjusted for time value, and used to calculate net present values (NPVs) for each section of the department of radiology and for the department as a whole. The chest-bone section used the most resources. Changes in cost assumptions for the chest-bone section had a dominant effect on the department-wide NPV. The base-case NPV (i.e., that determined by using the initial assumptions) was negative, indicating that additional net costs are incurred by the radiology department from PACS implementation. PACS and CR provide cost savings only when a 12-year hardware life span is assumed, when CR equipment is removed from the analysis, or when digitized long-term archives are compressed at a rate of 10:1. Full PACS-CR implementation would not provide cost savings for a large, subspecialized department. However, institutions that are committed to CR implementation (for whom CR implementation would represent a sunk cost) or institutions that are able to archive images by using image compression will experience cost savings from PACS.

  17. Costs of traffic injuries

    DEFF Research Database (Denmark)

    Kruse, Marie

    2015-01-01

    (...) assessed using Danish national healthcare registers. Productivity costs were computed using duration analysis (Cox regression models). In a subanalysis, cost per severe traffic injury was computed for the 12 995 individuals that experienced a severe injury. RESULTS: The socioeconomic cost of a traffic injury was €1406 (2009 price level) in the first year, and €8950 over a 10-year period. Per 100 000 population, the 10-year cost was €6 565 668. A severe traffic injury costs €4969 per person in the first year, and €4 006 685 per 100 000 population over a 10-year period. Victims of traffic injuries...

  18. Selecting an Architecture for a Safety-Critical Distributed Computer System with Power, Weight and Cost Considerations

    Science.gov (United States)

    Torres-Pomales, Wilfredo

    2014-01-01

    This report presents an example of the application of multi-criteria decision analysis to the selection of an architecture for a safety-critical distributed computer system. The design problem includes constraints on minimum system availability and integrity, and the decision is based on the optimal balance of power, weight and cost. The analysis process includes the generation of alternative architectures, evaluation of individual decision criteria, and the selection of an alternative based on overall value. In this example presented here, iterative application of the quantitative evaluation process made it possible to deliberately generate an alternative architecture that is superior to all others regardless of the relative importance of cost.
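
    The value-ranking step of such a multi-criteria analysis can be sketched as a weighted sum over normalized criteria. The alternatives, attribute values and weights below are hypothetical, and only architectures that already satisfy the availability and integrity constraints would enter the scoring:

```python
# Weighted-sum scoring over power, weight and cost for candidate
# architectures that already meet the availability/integrity constraints.
# Alternatives and weights are hypothetical.
alternatives = {
    "triplex bus":        {"power_w": 120, "weight_kg": 18, "cost_k": 450},
    "braided ring":       {"power_w": 140, "weight_kg": 15, "cost_k": 520},
    "self-checking pair": {"power_w": 100, "weight_kg": 20, "cost_k": 400},
}
weights = {"power_w": 0.3, "weight_kg": 0.3, "cost_k": 0.4}

def score(attrs):
    # Lower raw values are better; normalize each criterion by the best
    # (smallest) value so the best alternative scores 1.0 on it.
    total = 0.0
    for crit, w in weights.items():
        best = min(a[crit] for a in alternatives.values())
        total += w * best / attrs[crit]
    return total

ranked = sorted(alternatives, key=lambda k: score(alternatives[k]), reverse=True)
for name in ranked:
    print(f"{name:20s} value = {score(alternatives[name]):.3f}")
```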

  19. Operating dedicated data centers – is it cost-effective?

    International Nuclear Information System (INIS)

    Ernst, M; Hogue, R; Hollowell, C; Strecker-Kellog, W; Wong, A; Zaytsev, A

    2014-01-01

    The advent of cloud computing centres such as Amazon's EC2 and Google's Computing Engine has elicited comparisons with dedicated computing clusters. Discussions on appropriate usage of cloud resources (both academic and commercial) and costs have ensued. This presentation discusses a detailed analysis of the costs of operating and maintaining the RACF (RHIC and ATLAS Computing Facility) compute cluster at Brookhaven National Lab and compares them with the cost of cloud computing resources under various usage scenarios. An extrapolation of likely future cost effectiveness of dedicated computing resources is also presented.

  20. Social incidence and economic costs of carbon limits; A computable general equilibrium analysis for Switzerland

    Energy Technology Data Exchange (ETDEWEB)

    Stephan, G.; Van Nieuwkoop, R.; Wiedmer, T. (Institute for Applied Microeconomics, Univ. of Bern (Switzerland))

    1992-01-01

    Both distributional and allocational effects of limiting carbon dioxide emissions in a small and open economy are discussed. It starts from the assumption that Switzerland attempts to stabilize its greenhouse gas emissions over the next 25 years, and evaluates costs and benefits of the respective reduction programme. From a methodological viewpoint, it is illustrated how a computable general equilibrium approach can be adopted for identifying economic effects of cutting greenhouse gas emissions on the national level. From a political economy point of view it considers the social incidence of a greenhouse policy. It shows in particular that public acceptance can be increased and economic costs of greenhouse policies can be reduced, if carbon taxes are accompanied by revenue redistribution. 8 tabs., 1 app., 17 refs.

  1. Unit Cost Compendium Calculations

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Unit Cost Compendium (UCC) Calculations raw data set was designed to provide for greater accuracy and consistency in the use of unit costs across the USEPA...

  2. A Low-Cost Computer-Controlled Arduino-Based Educational Laboratory System for Teaching the Fundamentals of Photovoltaic Cells

    Science.gov (United States)

    Zachariadou, K.; Yiasemides, K.; Trougkakos, N.

    2012-01-01

    We present a low-cost, fully computer-controlled, Arduino-based, educational laboratory (SolarInsight) to be used in undergraduate university courses concerned with electrical engineering and physics. The major goal of the system is to provide students with the necessary instrumentation, software tools and methodology in order to learn fundamental…

  3. Use of several Cloud Computing approaches for climate modelling: performance, costs and opportunities

    Science.gov (United States)

    Perez Montes, Diego A.; Añel Cabanelas, Juan A.; Wallom, David C. H.; Arribas, Alberto; Uhe, Peter; Caderno, Pablo V.; Pena, Tomas F.

    2017-04-01

    Cloud Computing is a technological option that offers great possibilities for modelling in geosciences. We have studied how two different climate models, HadAM3P-HadRM3P and CESM-WACCM, can be adapted in two different ways to run on Cloud Computing environments from three different vendors: Amazon, Google and Microsoft. Also, we have evaluated qualitatively how the use of Cloud Computing can affect the allocation of resources by funding bodies, as well as issues related to computing security, including scientific reproducibility. Our first experiments were developed using the well-known ClimatePrediction.net (CPDN), which uses BOINC, over the infrastructure from two cloud providers, namely Microsoft Azure and Amazon Web Services (hereafter AWS). For this comparison we ran a set of thirteen-month climate simulations for CPDN in Azure and AWS using a range of different virtual machines (VMs) for HadRM3P (50 km resolution over the South America CORDEX region) nested in the global atmosphere-only model HadAM3P. These simulations were run on a single processor and took between 3 and 5 days to compute depending on the VM type. The last part of our simulation experiments was running WACCM over different VMs on the Google Compute Engine (GCE) and making a comparison with the supercomputer (SC) Finisterrae1 from the Centro de Supercomputacion de Galicia. It was shown that GCE gives better performance than the SC for smaller numbers of cores/MPI tasks, but the model throughput shows clearly how the SC performance is better after approximately 100 cores (related to network speed and latency differences). From a cost point of view, Cloud Computing moves researchers from a traditional approach where experiments were limited by the available hardware resources to one limited by monetary resources (how many resources can be afforded). As there is an increasing movement and recommendation for budgeting HPC projects on this technology (budgets can be calculated in a more realistic way) we could see a shift on

  4. Computer-assisted cognitive remediation therapy in schizophrenia: Durability of the effects and cost-utility analysis.

    Science.gov (United States)

    Garrido, Gemma; Penadés, Rafael; Barrios, Maite; Aragay, Núria; Ramos, Irene; Vallès, Vicenç; Faixa, Carlota; Vendrell, Josep M

    2017-08-01

    The durability of computer-assisted cognitive remediation (CACR) therapy over time and the cost-effectiveness of treatment remain unclear. The aim of the current study is to investigate the effectiveness of CACR and to examine the use and cost of acute psychiatric admissions before and after CACR. Sixty-seven participants were initially recruited. For the follow-up study a total of 33 participants were enrolled, 20 to the CACR condition group and 13 to the active control condition group. All participants were assessed at baseline, post-therapy and 12 months post-therapy on neuropsychology, QoL and self-esteem measurements. The use and cost of acute psychiatric admissions were collected retrospectively at four assessment points: baseline, 12 months post-therapy, 24 months post-therapy, and 36 months post-therapy. The results indicated that treatment effectiveness persisted in the CACR group one year post-therapy on neuropsychological and well-being outcomes. The CACR group showed a clear decrease in the use of acute psychiatric admissions at 12, 24 and 36 months post-therapy, which lowered the global costs of acute psychiatric admissions over that period. The effects of CACR are durable over at least a 12-month period, and CACR may be helping to reduce health care costs for schizophrenia patients. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.

  5. Secure equality and greater-than tests with sublinear online complexity

    DEFF Research Database (Denmark)

    Lipmaa, Helger; Toft, Tomas

    2013-01-01

    Secure multiparty computation (MPC) allows multiple parties to evaluate functions without disclosing the private inputs. Secure comparisons (testing equality and greater-than) are important primitives required by many MPC applications. We propose two equality tests for ℓ-bit values with O(1) online...

  6. Cloud Computing Bible

    CERN Document Server

    Sosinsky, Barrie

    2010-01-01

    The complete reference guide to the hot technology of cloud computingIts potential for lowering IT costs makes cloud computing a major force for both IT vendors and users; it is expected to gain momentum rapidly with the launch of Office Web Apps later this year. Because cloud computing involves various technologies, protocols, platforms, and infrastructure elements, this comprehensive reference is just what you need if you'll be using or implementing cloud computing.Cloud computing offers significant cost savings by eliminating upfront expenses for hardware and software; its growing popularit

  7. Unenhanced computed tomography in acute renal colic reduces cost outside radiology department

    DEFF Research Database (Denmark)

    Lauritsen, J.; Andersen, J.R.; Nordling, J.

    2008-01-01

    BACKGROUND: Unenhanced multidetector computed tomography (UMDCT) is well established as the procedure of choice for radiologic evaluation of patients with renal colic. The procedure has both clinical and financial consequences for departments of surgery and radiology. However, the financial effect outside the radiology department is poorly elucidated. PURPOSE: To evaluate the financial consequences outside of the radiology department, a retrospective study comparing the ward occupation of patients examined with UMDCT to that of intravenous urography (IVU) was performed. MATERIAL AND METHODS: (...) Use of UMDCT saved the hospital USD 265,000 every 6 months compared to the use of IVU. CONCLUSION: Use of UMDCT compared to IVU in patients with renal colic leads to cost savings outside the radiology department. (Publication date: 2008/12)

  8. The impact of geography on energy infrastructure costs

    International Nuclear Information System (INIS)

    Zvoleff, Alex; Kocaman, Ayse Selin; Huh, Woonghee Tim; Modi, Vijay

    2009-01-01

    Infrastructure planning for networked infrastructure such as grid electrification (or piped supply of water) has historically been a process of outward network expansion, either by utilities in response to immediate economic opportunity, or in response to a government mandate or subsidy intended to catalyze economic growth. While significant progress has been made in access to grid electricity in Asia, where population densities are greater and rural areas tend to have nucleated settlements, access to grid electricity in Sub-Saharan Africa remains low; a problem generally ascribed to differences in settlement patterns. The discussion, however, has remained qualitative, and hence it has been difficult for planners to understand the differing costs of carrying out grid expansion in one region as opposed to another. This paper describes a methodology to estimate the cost of local-level distribution systems for a least-cost network, and to compute additional information of interest to policymakers, such as the marginal cost of connecting additional households to a grid as a function of the penetration rate. We present several large datasets of household locations developed from satellite imagery, and examine them with our methodology, providing insight into the relationship between settlement pattern and the cost of rural electrification.
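
    The least-cost network estimate at the heart of this methodology can be approximated with a minimum spanning tree over household locations, whose total edge length proxies the local distribution cost. A sketch with randomly placed households and a hypothetical per-kilometre line cost:

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree
from scipy.spatial import distance_matrix

rng = np.random.default_rng(2)
# Hypothetical household locations (km) in a 5 km x 5 km settlement.
households = rng.uniform(0.0, 5.0, size=(200, 2))

# Pairwise distances -> minimum spanning tree -> total network length.
dist = distance_matrix(households, households)
mst = minimum_spanning_tree(dist)
total_km = mst.sum()

cost_per_km = 10_000.0  # hypothetical low-voltage line cost, $/km
print(f"total line length: {total_km:.1f} km")
print(f"estimated network cost: ${total_km * cost_per_km:,.0f}")
print(f"marginal length per household: {total_km / len(households) * 1000:.0f} m")
```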

  9. COMPUTER SYSTEM FOR DETERMINATION OF COST DAILY SUGAR PRODUCTION AND INCIDENTS DECISIONS FOR COMPANIES SUGAR (SACODI

    Directory of Open Access Journals (Sweden)

    Alejandro Álvarez-Navarro

    2016-01-01

    The process of sugar production is complex; anything that affects this chain has direct repercussions on sugar production costs, a synthetic and decisive indicator for decision making. Currently, Cuban sugar factories determine this cost weekly, which delays their decision making. Looking for solutions to this problem, the present work, part of a territorial project approved by CITMA, set out to calculate the cost of production daily, weekly, monthly and cumulatively up to a given date, following an adaptation of the methodology of the National Costs System for sugarcane created by MINAZ, supported by a computer system named SACODI. This adaptation registers the physical and economic indicators of all direct and indirect expenses of sugarcane and, in addition, generates an economic-mathematical goal-programming model whose solution indicates the best short-term balance, in amount of sugar, among the entities of the sugar factory. The implementation of the system in the «Julio A. Mella» sugar factory in Santiago de Cuba during the 08-09 harvest produced an estimated cost decrease of up to 3.5% through better decision making.

  10. [Relating costs to activities in hospitals. Use of internal cost accounting].

    Science.gov (United States)

    Stavem, K

    1995-01-10

    During the last few years hospital cost accounting has become widespread in many countries, in parallel with increasing cost pressure, greater competition and new financing schemes. Cost accounting has been used in the manufacturing industry for many years. Costs can be related to activities and production, e.g. by the costing of procedures, episodes of care and other internally defined cost objectives. Norwegian hospitals have lagged behind in the adoption of cost accounting. They ought to act quickly if they want to be prepared for possible changes in health care financing. The benefits can be considerable to a hospital operating in a rapidly changing health care environment.

  11. VMware private cloud computing with vCloud director

    CERN Document Server

    Gallagher, Simon

    2013-01-01

    It's All About Delivering Service with vCloud Director Empowered by virtualization, companies are not just moving into the cloud, they're moving into private clouds for greater security, flexibility, and cost savings. However, this move involves more than just infrastructure. It also represents a different business model and a new way to provide services. In this detailed book, VMware vExpert Simon Gallagher makes sense of private cloud computing for IT administrators. From basic cloud theory and strategies for adoption to practical implementation, he covers all the issues. You'll lea

  12. Computer performance evaluation of FACOM 230-75 computer system, (2)

    International Nuclear Information System (INIS)

    Fujii, Minoru; Asai, Kiyoshi

    1980-08-01

    This report describes computer performance evaluations for the FACOM 230-75 computers at JAERI. The evaluations are performed on the following items: (1) cost/benefit analysis of timesharing terminals, (2) analysis of the response time of timesharing terminals, (3) analysis of throughput time for batch job processing, (4) estimation of current potential demands for computer time, and (5) determination of the appropriate number of card readers and line printers. These evaluations are done mainly from the standpoint of cost reduction of computing facilities. The techniques adopted are very practical ones. This report will be useful for those who are concerned with the management of a computing installation. (author)

  13. Rural New Zealand health professionals' perceived barriers to greater use of the internet for learning.

    Science.gov (United States)

    Janes, Ron; Arroll, Bruce; Buetow, Stephen; Coster, Gregor; McCormick, Ross; Hague, Iain

    2005-01-01

    The purpose of this research was to investigate rural North Island (New Zealand) health professionals' attitudes and perceived barriers to using the internet for ongoing professional learning. A cross-sectional postal survey of all rural North Island GPs, practice nurses and pharmacists was conducted in mid-2003. The questionnaire contained both quantitative and qualitative questions. The transcripts from two open questions requiring written answers were analysed for emergent themes, which are reported here. The first open question asked: 'Do you have any comments on the questionnaire, learning, computers or the Internet?' The second open question asked those who had taken a distance-learning course using the internet to list positive and negative aspects of their course, and suggest improvements. Out of 735 rural North Island health professionals surveyed, 430 returned useable questionnaires (a response rate of 59%). Of these, 137 answered the question asking for comments on learning, computers and the internet. Twenty-eight individuals who had completed a distance-learning course using the internet provided written responses to the second question. Multiple barriers to greater use of the internet were identified. They included lack of access to computers, poor availability of broadband (fast) internet access, lack of IT skills/knowledge, lack of time, concerns about IT costs and database security, difficulty finding quality information, lack of time, energy or motivation to learn new skills, competing priorities (e.g. family), and a preference for learning modalities which include more social interaction. Individuals also stated that rural health professionals needed to engage the technology, because it provided rapid, flexible access from home or work to a significant health information resource, and would save money and travelling time to urban-based education. In mid-2003, there were multiple barriers to rural North Island health professionals making greater

  14. Costs of fire suppression forces based on cost-aggregation approach

    Science.gov (United States)

    González-Cabán, Armando; Charles W. McKetta; Thomas J. Mills

    1984-01-01

    A cost-aggregation approach has been developed for determining the cost of Fire Management Inputs (FMIs), the direct fireline production units (personnel and equipment) used in initial attack and large-fire suppression activities. All components contributing to an FMI are identified, computed, and summed to estimate hourly costs. This approach can be applied to any FMI...

  15. Environmental damage costs from airborne pollution of industrial activities in the greater Athens, Greece area and the resulting benefits from the introduction of BAT

    International Nuclear Information System (INIS)

    Mirasgedis, S.; Hontou, V.; Georgopoulou, E.; Sarafidis, Y.; Gakis, N.; Lalas, D.P.; Loukatos, A.; Gargoulas, N.; Mentzis, A.; Economidis, D.; Triantafilopoulos, T.; Korizi, K.; Mavrotas, G.

    2008-01-01

    Attributing costs to the environmental impacts associated with industrial activities can greatly assist in protecting human health and the natural environment, as monetary values are capable of directly influencing technological and policy decisions without changing the rules of the market. This paper attempts to estimate the external cost attributable to the atmospheric pollution from 'medium and high environmental burden' industrial activities located in the greater Athens area and the benefits from Best Available Techniques (BAT) introduction. To this end a number of typical installations were defined to be used in conjunction with the Impact Pathway Approach developed in the context of the ExternE project to model all industrial sectors/sub-sectors located in the area of interest. Total environmental externalities due to air pollutants emitted by these industrial activities were found to reach 211 M Euro per year, associated mainly with human mortality and morbidity due to PM10 emissions, as well as with climate change impacts due to CO2 emissions, for which non-metallic minerals and oil processing industries are the main sources. The results obtained can be used as the basis for an integrated evaluation of potential BAT, taking into account not only private costs and benefits but also the environmental externalities, thus leading to policy decisions that maximize social welfare in each industrial sector/sub-sector.

  16. Computed tomography of the heart using thallium-201 in children

    International Nuclear Information System (INIS)

    Treves, S.; Hill, T.C.; VanPraagh, R.; Holman, B.L.

    1979-01-01

    Thallium-201 emission computed tomography (ECT) was performed in 3 pediatric patients in whom conventional scintigraphy was normal but there was a strong clinical suspicion of myocardial disease. Abnormalities in the distribution of myocardial perfusion appeared sharply delineated with ECT compared to normal conventional gamma camera scintigraphy. Single photon ECT provides a three dimensional reconstruction which results in greater enhancement since activity in overlying structures does not interfere. Its widespread use is limited only by the cost of the imaging device

  17. Using Amazon's Elastic Compute Cloud to dynamically scale CMS computational resources

    International Nuclear Information System (INIS)

    Evans, D; Fisk, I; Holzman, B; Pordes, R; Tiradani, A; Melo, A; Sheldon, P; Metson, S

    2011-01-01

    Large international scientific collaborations such as the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider have traditionally addressed their data reduction and analysis needs by building and maintaining dedicated computational infrastructure. Emerging cloud computing services such as Amazon's Elastic Compute Cloud (EC2) offer short-term CPU and storage resources with costs based on usage. These services allow experiments to purchase computing resources as needed, without significant prior planning and without long term investments in facilities and their management. We have demonstrated that services such as EC2 can successfully be integrated into the production-computing model of CMS, and find that they work very well as worker nodes. The cost structure and transient nature of EC2 services makes them inappropriate for some CMS production services and functions. We also found that the resources are not truly 'on-demand', as limits and caps on usage are imposed. Our trial workflows allow us to make a cost comparison between EC2 resources and dedicated CMS resources at a university, and conclude that it is most cost effective to purchase dedicated resources for the 'base-line' needs of experiments such as CMS. However, if the ability to use cloud computing resources is built into an experiment's software framework before demand requires their use, cloud computing resources make sense for bursting during demand spikes.
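
    The base-line-plus-bursting conclusion can be illustrated with back-of-envelope arithmetic. A minimal sketch, assuming purely hypothetical rates and utilization (none of the figures below come from the CMS study):

    ```python
    # Hypothetical comparison of dedicated vs. on-demand costs for a bursty workload.
    # All rates and utilization figures are illustrative assumptions.

    DEDICATED_COST_PER_CORE_YEAR = 150.0   # amortized hardware + operations (assumed)
    ONDEMAND_COST_PER_CORE_HOUR = 0.05     # cloud price (assumed)

    def yearly_cost(baseline_cores: int, peak_cores: int, peak_hours: int) -> dict:
        """Cost of serving a steady baseline on dedicated nodes and bursting
        the excess (peak_cores - baseline_cores) to the cloud for peak_hours."""
        dedicated = baseline_cores * DEDICATED_COST_PER_CORE_YEAR
        burst = (peak_cores - baseline_cores) * peak_hours * ONDEMAND_COST_PER_CORE_HOUR
        all_dedicated = peak_cores * DEDICATED_COST_PER_CORE_YEAR
        return {"dedicated+burst": dedicated + burst, "all dedicated": all_dedicated}

    print(yearly_cost(baseline_cores=1000, peak_cores=2000, peak_hours=500))
    # Bursting wins whenever peak_hours * on-demand rate < dedicated cost per core-year.
    ```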

  18. Planning for greater confinement disposal

    International Nuclear Information System (INIS)

    Gilbert, T.L.; Luner, C.; Meshkov, N.K.; Trevorrow, L.E.; Yu, C.

    1985-01-01

    A report that provides guidance for planning for greater-confinement disposal (GCD) of low-level radioactive waste is being prepared. The report addresses procedures for selecting a GCD technology and provides information for implementing these procedures. The focus is on GCD; planning aspects common to GCD and shallow-land burial are covered by reference. Planning procedure topics covered include regulatory requirements, waste characterization, benefit-cost-risk assessment and pathway analysis methodologies, determination of need, waste-acceptance criteria, performance objectives, and comparative assessment of attributes that support these objectives. The major technologies covered include augered shafts, deep trenches, engineered structures, hydrofracture, improved waste forms, and high-integrity containers. Descriptive information is provided, and attributes that are relevant for risk assessment and operational requirements are given. 10 refs., 3 figs., 2 tabs

  19. Reduced computational cost in the calculation of worst case response time for real time systems

    OpenAIRE

    Urriza, José M.; Schorb, Lucas; Orozco, Javier D.; Cayssials, Ricardo

    2009-01-01

    Modern Real Time Operating Systems require reducing computational costs even though microprocessors become more powerful each day. It is usual that Real Time Operating Systems for embedded systems have advanced features to administer the resources of the applications that they support. In order to guarantee either the schedulability of the system or the schedulability of a new task in a dynamic Real Time System, it is necessary to know the Worst Case Response Time of the Real Time tasks ...

  20. Computer architecture fundamentals and principles of computer design

    CERN Document Server

    Dumas II, Joseph D

    2005-01-01

    Introduction to Computer Architecture: What is Computer Architecture?; Architecture vs. Implementation; Brief History of Computer Systems: The First Generation; The Second Generation; The Third Generation; The Fourth Generation; Modern Computers - The Fifth Generation; Types of Computer Systems: Single Processor Systems; Parallel Processing Systems; Special Architectures; Quality of Computer Systems: Generality and Applicability; Ease of Use; Expandability; Compatibility; Reliability; Success and Failure of Computer Architectures and Implementations: Quality and the Perception of Quality; Cost Issues; Architectural Openness, Market Timi...

  1. Bioenergetic components of reproductive effort in viviparous snakes: costs of vitellogenesis exceed costs of pregnancy.

    Science.gov (United States)

    Van Dyke, James U; Beaupre, Steven J

    2011-12-01

    Reproductive effort has been defined as the proportion of an organism's energy budget that is allocated to reproduction over a biologically meaningful time period. Historically, studies of reproductive bioenergetics considered energy content of gametes, but not costs of gamete production. Although metabolic costs of vitellogenesis (MCV) fundamentally reflect the primary bioenergetic cost of reproductive allocation in female reptiles, the few investigations that have considered costs of reproductive allocation have focused on metabolic costs of pregnancy (MCP) in viviparous species. We define MCP as energetic costs incurred by pregnant females, including all costs of maintaining gestation conditions necessary for embryogenesis. MCP by our definition do not include fetal costs of embryogenesis. We measured metabolic rates in five species of viviparous snakes (Agkistrodon contortrix, Boa constrictor, Eryx colubrinus, Nerodia sipedon, and Thamnophis sirtalis) during vitellogenesis and pregnancy in order to estimate MCV and MCP. Across all species, MCV were responsible for 30% increases in maternal metabolism. Phylogenetically-independent contrasts showed that MCV were significantly greater in B. constrictor than in other species, likely because B. constrictor yolk energy content was greater than that of other species. Estimates of MCP were not significantly different from zero in any species. In viviparous snakes, MCV appear to represent significant bioenergetic expenditures, while MCP do not. We suggest that MCV, together with yolk energy content, represent the most significant component of reptilian reproductive effort, and therefore deserve greater attention than MCP in studies of reptilian reproductive bioenergetics. Copyright © 2011 Elsevier Inc. All rights reserved.

  2. Fast mode decision based on human noticeable luminance difference and rate distortion cost for H.264/AVC

    Science.gov (United States)

    Li, Mian-Shiuan; Chen, Mei-Juan; Tai, Kuang-Han; Sue, Kuen-Liang

    2013-12-01

    This article proposes a fast mode decision algorithm based on the correlation of the just-noticeable-difference (JND) and the rate distortion cost (RD cost) to reduce the computational complexity of H.264/AVC. First, the relationship between the average RD cost and the number of JND pixels is established by Gaussian distributions. Thus, the RD cost of the Inter 16 × 16 mode is compared with the predicted thresholds from these models for fast mode selection. In addition, we use the image content, the residual data, and JND visual model for horizontal/vertical detection, and then utilize the result to predict the partition in a macroblock. From the experimental results, a greater time saving can be achieved while the proposed algorithm also maintains performance and quality effectively.
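
    The abstract describes a thresholding scheme rather than giving one; a minimal sketch of the idea, assuming toy Gaussian-model fits (the mu/sigma fits and helper names below are hypothetical, not taken from the paper):

    ```python
    # Hypothetical Gaussian model linking JND pixel counts to average RD cost.
    # mu(n) and sigma(n) would be fitted offline; here they are toy linear fits.
    def mu(n_jnd: int) -> float:
        return 1200.0 + 4.5 * n_jnd      # assumed fit

    def sigma(n_jnd: int) -> float:
        return 150.0 + 0.8 * n_jnd       # assumed fit

    def early_terminate(rd_cost_inter16: float, n_jnd: int, k: float = 1.0) -> bool:
        """Skip the remaining mode evaluations when the Inter 16x16 RD cost is
        already below the model's threshold mu(n) - k * sigma(n)."""
        return rd_cost_inter16 < mu(n_jnd) - k * sigma(n_jnd)

    # Example: a flat macroblock (few JND pixels) with a cheap Inter 16x16 mode.
    print(early_terminate(rd_cost_inter16=900.0, n_jnd=20))
    ```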

  3. Scilab software as an alternative low-cost computing in solving the linear equations problem

    Science.gov (United States)

    Agus, Fahrul; Haviluddin

    2017-02-01

    Numerical computation packages are widely used in both teaching and research. These packages include licensed (proprietary) and open-source (non-proprietary) software. One reason to use such a package is the complexity of mathematical functions (e.g., linear problems); the number of variables in linear and non-linear functions has also increased. The aim of this paper was to reflect on key aspects related to method, didactics, and creative praxis in the teaching of linear equations in higher education. If implemented, this could contribute to better learning in mathematics (i.e., solving simultaneous linear equations), which is essential for future engineers. The focus of this study was to introduce the numerical computation package Scilab as an alternative low-cost computing environment. In this paper, Scilab was used to implement activities related to the mathematical models. In the experiment, four numerical methods were implemented: Gaussian elimination, Gauss-Jordan, inverse matrix, and lower-upper (LU) decomposition. The results showed that routines for these numerical methods could be created and explored using Scilab procedures, and that these routines could serve as teaching material for the course.
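
    For readers without Scilab, the same approaches can be sketched in Python with NumPy/SciPy; the calls below stand in for the hand-written Scilab routines the study describes, and the matrix and right-hand side are arbitrary examples:

    ```python
    import numpy as np
    from scipy.linalg import lu_factor, lu_solve

    A = np.array([[4.0, -2.0, 1.0],
                  [3.0,  6.0, -4.0],
                  [2.0,  1.0,  8.0]])
    b = np.array([1.0, 2.0, 3.0])

    # Gaussian elimination / Gauss-Jordan end in the same place as a direct solve:
    x_direct = np.linalg.solve(A, b)

    # Inverse-matrix method (didactically useful, numerically discouraged):
    x_inverse = np.linalg.inv(A) @ b

    # LU decomposition: factor once, reuse for many right-hand sides.
    lu, piv = lu_factor(A)
    x_lu = lu_solve((lu, piv), b)

    assert np.allclose(x_direct, x_lu) and np.allclose(x_direct, x_inverse)
    print(x_lu)
    ```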

  4. New Federal Cost Accounting Regulations

    Science.gov (United States)

    Wolff, George J.; Handzo, Joseph J.

    1973-01-01

    Discusses a new set of indirect cost accounting procedures which must be followed by school districts wishing to recover any indirect costs of administering federal grants and contracts. Also discusses the amount of indirect costs that may be recovered, computing indirect costs, classifying project costs, and restricted grants. (Author/DN)

  5. Reactive control processes contributing to residual switch cost and mixing cost in young and old adults

    Directory of Open Access Journals (Sweden)

    Lisa Rebecca Whitson

    2014-04-01

    In task-switching paradigms, performance is better when repeating the same task than when alternating between tasks (switch cost) and when repeating a task alone rather than intermixed with another task (mixing cost). These costs remain even after extensive practice and when task cues enable advanced preparation (residual costs). Moreover, residual RT mixing cost has been consistently shown to increase with age. Residual switch and mixing costs modulate the amplitude of the stimulus-locked P3b. This mixing effect is disproportionately larger in older adults, who also prepare more for and respond more cautiously on these 'mixed' repeat trials (Karayanidis et al., 2011). In this study, we examine stimulus-locked and response-locked P3 and lateralized readiness potentials to identify whether residual switch and mixing cost arise from the need to control interference at the level of stimulus processing or response processing. Residual mixing cost was associated with control of stimulus-level interference, whereas residual switch cost was also associated with a delay in response selection. In older adults, the disproportionate increase in mixing cost was associated with greater interference at the level of decision-response mapping and response programming for repeat trials in mixed-task blocks. We argue that, together with evidence of greater proactive control and more cautious responding for these trials, these findings suggest that older adults strategically recruit greater proactive and reactive control to overcome increased susceptibility to post-stimulus interference. This interpretation is consistent with recruitment of compensatory strategies to compensate for reduced repetition benefit rather than an overall decline in cognitive flexibility.

  6. Costs and role of ultrasound follow-up of polytrauma patients after initial computed tomography

    International Nuclear Information System (INIS)

    Maurer, M.H.; Winkler, A.; Powerski, M.J.; Elgeti, F.; Huppertz, A.; Roettgen, R.; Marnitz, T.; Wichlas, F.

    2012-01-01

    Purpose: To assess the costs and diagnostic gain of abdominal ultrasound follow-up of polytrauma patients initially examined by whole-body computed tomography (CT). Materials and Methods: A total of 176 patients with suspected multiple trauma (126 men, 50 women; age 43.5 ± 17.4 years) were retrospectively analyzed with regard to supplementary and new findings obtained by ultrasound follow-up compared with the results of exploratory FAST (focused assessment with sonography for trauma) at admission and the findings of whole-body CT. A process model was used to document the staff, materials, and total costs of the ultrasound follow-up examinations. Results: FAST yielded 26 abdominal findings (organ injury and/or free intra-abdominal fluid) in 19 patients, while the abdominal scan of whole-body CT revealed 32 findings in 25 patients. FAST had 81 % sensitivity and 100 % specificity. Follow-up ultrasound examinations revealed new findings in 2 of the 25 patients with abdominal injuries detected with initial CT. In the 151 patients without abdominal injuries in the initial CT scan, ultrasound follow-up did not yield any supplementary or new findings. The total costs of an ultrasound follow-up examination were EUR 28.93. The total costs of all follow-up ultrasound examinations performed in the study population were EUR 5658.23. Conclusion: Follow-up abdominal ultrasound yields only a low overall diagnostic gain in polytrauma patients in whom initial CT fails to detect any abdominal injuries but incurs high personnel expenses for radiological departments. (orig.)

  7. Torsion of the greater omentum: A rare preoperative diagnosis

    International Nuclear Information System (INIS)

    Tandon, Ankit Anil; Lim, Kian Soon

    2010-01-01

    Torsion of the greater omentum is a rare acute abdominal condition that is seldom diagnosed preoperatively. We report the characteristic computed tomography (CT) scan findings and the clinical implications of this unusual diagnosis in a 41-year-old man, who also had longstanding right inguinal hernia. Awareness of omental torsion as a differential diagnosis in the acute abdomen setting is necessary for correct patient management

  8. Cost-effectiveness of routine computed tomography in the evaluation of idiopathic unilateral vocal fold paralysis.

    Science.gov (United States)

    Hojjat, Houmehr; Svider, Peter F; Folbe, Adam J; Raza, Syed N; Carron, Michael A; Shkoukani, Mahdi A; Merati, Albert L; Mayerhoff, Ross M

    2017-02-01

    OBJECTIVE: To evaluate the cost-effectiveness of routine computed tomography (CT) in individuals with unilateral vocal fold paralysis (UVFP). STUDY DESIGN: Health economics decision tree analysis. METHODS: A decision tree was constructed to determine the incremental cost-effectiveness ratio (ICER) of CT imaging in UVFP patients. Univariate sensitivity analysis was utilized to calculate what the probability of having an etiology of the paralysis discovered would have to be to make CT with contrast more cost-effective than no imaging. We used two studies examining findings in UVFP patients. The decision pathways compared CT neck with intravenous contrast after diagnostic laryngoscopy versus laryngoscopy alone. The probability of detecting an etiology for UVFP and the associated costs were extracted to construct the decision tree. RESULTS: The only incorrect diagnosis was missing a mass in the no-imaging decision branch, which rendered an effectiveness of 0. The ICER of using CT was $3,306, below most acceptable willingness-to-pay (WTP) thresholds. Additionally, univariate sensitivity analysis indicated that at the WTP threshold of $30,000, obtaining CT imaging was the most cost-effective choice when the probability of having a lesion was above 1.7%. Multivariate probabilistic sensitivity analysis with Monte Carlo simulations also showed that at the WTP of $30,000, CT scanning is more cost-effective, with 99.5% certainty. CONCLUSIONS: Particularly in the current healthcare environment, characterized by increasing consciousness of utilization and defensive medicine, economic evaluations represent evidence-based findings that can be employed to facilitate appropriate decision making and enhance physician-patient communication. This economic evaluation strongly supports obtaining CT imaging in patients with newly diagnosed UVFP. Level of evidence: 2c. Laryngoscope, 127:440-444, 2017. © 2016 The American Laryngological, Rhinological and Otological Society, Inc.
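
    The metric at the heart of this record is the incremental cost-effectiveness ratio, ICER = (C1 - C0) / (E1 - E0). A minimal sketch with made-up inputs (the study's actual cost and effectiveness values are not reproduced here; only the $30,000 threshold comes from the abstract):

    ```python
    def icer(cost_new: float, cost_old: float, eff_new: float, eff_old: float) -> float:
        """Incremental cost-effectiveness ratio: extra dollars per extra unit of effect."""
        return (cost_new - cost_old) / (eff_new - eff_old)

    # Hypothetical numbers for illustration only.
    ratio = icer(cost_new=1200.0, cost_old=400.0, eff_new=0.95, eff_old=0.70)
    WILLINGNESS_TO_PAY = 30_000.0  # threshold used in the abstract
    print(f"ICER = ${ratio:,.0f} per unit of effectiveness;",
          "cost-effective" if ratio < WILLINGNESS_TO_PAY else "not cost-effective")
    ```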

  9. 24 CFR 908.108 - Cost.

    Science.gov (United States)

    2010-04-01

    ... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Cost. 908.108 Section 908.108..., RENTAL VOUCHER, AND MODERATE REHABILITATION PROGRAMS § 908.108 Cost. (a) General. The costs of the... computer hardware or software, or both, the cost of contracting for those services, or the cost of...

  10. [Cost effectiveness of workplace smoking policies].

    Science.gov (United States)

    Raaijmakers, Tamara; van den Borne, Inge

    2003-01-01

    This study reviews the motivations of companies to set out a policy for controlling smoking, the economic benefits for the company resulting from such a policy, and the costs, broken down by European Union country. The literature on the costs of implementing a policy related to smoking at the workplace is reviewed. The main objective of policies related to smoking at the workplace is that of safeguarding employees from environmental tobacco smoke. Other reasons are cutting costs, improving the company image, and reducing absenteeism, occupational accidents, internal quarrels and extra costs due to cigarette smoking. Protection against environmental tobacco smoke does not entail any higher costs for companies, and economic advantages are visible. The benefits are by far greater than the costs involved, particularly on a long-range basis, and seem to be greater when smoking at the workplace is completely prohibited and no designated smoking areas are set up.

  11. Understanding Time-driven Activity-based Costing.

    Science.gov (United States)

    Sharan, Alok D; Schroeder, Gregory D; West, Michael E; Vaccaro, Alexander R

    2016-03-01

    Transitioning to a value-based health care system will require providers to increasingly scrutinize their outcomes and costs. Although there has been a great deal of effort to understand outcomes, cost accounting in health care has been a greater challenge. Currently the cost accounting methods used by hospitals and providers are based off a fee-for-service system. As resources become increasingly scarce and the health care system attempts to understand which services provide the greatest value, it will be critically important to understand the true costs of delivering a service. An understanding of the true costs of a particular service will help providers make smarter decisions on how to allocate and utilize resources as well as determine which activities are nonvalue added. Achieving value will require providers to have a greater focus on accurate outcome data as well as better methods of cost accounting.

  12. Use of spikants in HTGR fuel and their effect on costs

    International Nuclear Information System (INIS)

    Brooks, L.H.

    1979-05-01

    The costs of fresh fuel fabrication have been estimated for flowsheets that have spikant added (Co-60) at various points. The costs are compared with refabricated and fresh fuel fabrication costs. It is shown that the costs increase as the spikant is added nearer to the plant feed point. The least cost is achieved by adding a detachable spikant to the finished fuel element. The highest cost is incurred when the spikant is fed in with the plant feed. This cost is greater than that for refabricated fuel because extra process cells are necessary to prepare the spikant and, in addition, the gamma flux from the Co-60 is much greater than from U-232 and greater precautions have to be taken

  13. Energy- and cost-efficient lattice-QCD computations using graphics processing units

    Energy Technology Data Exchange (ETDEWEB)

    Bach, Matthias

    2014-07-01

    Quarks and gluons are the building blocks of all hadronic matter, like protons and neutrons. Their interaction is described by Quantum Chromodynamics (QCD), a theory under test by large scale experiments like the Large Hadron Collider (LHC) at CERN and in the future at the Facility for Antiproton and Ion Research (FAIR) at GSI. However, perturbative methods can only be applied to QCD for high energies. Studies from first principles are possible via a discretization onto an Euclidean space-time grid. This discretization of QCD is called Lattice QCD (LQCD) and is the only ab-initio option outside of the high-energy regime. LQCD is extremely compute and memory intensive. In particular, it is by definition always bandwidth limited. Thus - despite the complexity of LQCD applications - it led to the development of several specialized compute platforms and influenced the development of others. However, in recent years General-Purpose computation on Graphics Processing Units (GPGPU) came up as a new means for parallel computing. Contrary to machines traditionally used for LQCD, graphics processing units (GPUs) are a massmarket product. This promises advantages in both the pace at which higher-performing hardware becomes available and its price. CL2QCD is an OpenCL based implementation of LQCD using Wilson fermions that was developed within this thesis. It operates on GPUs by all major vendors as well as on central processing units (CPUs). On the AMD Radeon HD 7970 it provides the fastest double-precision D kernel for a single GPU, achieving 120GFLOPS. D - the most compute intensive kernel in LQCD simulations - is commonly used to compare LQCD platforms. This performance is enabled by an in-depth analysis of optimization techniques for bandwidth-limited codes on GPUs. Further, analysis of the communication between GPU and CPU, as well as between multiple GPUs, enables high-performance Krylov space solvers and linear scaling to multiple GPUs within a single system. LQCD

  14. Energy- and cost-efficient lattice-QCD computations using graphics processing units

    International Nuclear Information System (INIS)

    Bach, Matthias

    2014-01-01

    Quarks and gluons are the building blocks of all hadronic matter, like protons and neutrons. Their interaction is described by Quantum Chromodynamics (QCD), a theory under test by large scale experiments like the Large Hadron Collider (LHC) at CERN and in the future at the Facility for Antiproton and Ion Research (FAIR) at GSI. However, perturbative methods can only be applied to QCD for high energies. Studies from first principles are possible via a discretization onto an Euclidean space-time grid. This discretization of QCD is called Lattice QCD (LQCD) and is the only ab-initio option outside of the high-energy regime. LQCD is extremely compute and memory intensive. In particular, it is by definition always bandwidth limited. Thus - despite the complexity of LQCD applications - it led to the development of several specialized compute platforms and influenced the development of others. However, in recent years General-Purpose computation on Graphics Processing Units (GPGPU) came up as a new means for parallel computing. Contrary to machines traditionally used for LQCD, graphics processing units (GPUs) are a massmarket product. This promises advantages in both the pace at which higher-performing hardware becomes available and its price. CL2QCD is an OpenCL based implementation of LQCD using Wilson fermions that was developed within this thesis. It operates on GPUs by all major vendors as well as on central processing units (CPUs). On the AMD Radeon HD 7970 it provides the fastest double-precision D kernel for a single GPU, achieving 120GFLOPS. D - the most compute intensive kernel in LQCD simulations - is commonly used to compare LQCD platforms. This performance is enabled by an in-depth analysis of optimization techniques for bandwidth-limited codes on GPUs. Further, analysis of the communication between GPU and CPU, as well as between multiple GPUs, enables high-performance Krylov space solvers and linear scaling to multiple GPUs within a single system. LQCD

  15. MONITOR: A computer model for estimating the costs of an integral monitored retrievable storage facility

    International Nuclear Information System (INIS)

    Reimus, P.W.; Sevigny, N.L.; Schutz, M.E.; Heller, R.A.

    1986-12-01

    The MONITOR model is a FORTRAN 77 based computer code that provides parametric life-cycle cost estimates for a monitored retrievable storage (MRS) facility. MONITOR is very flexible in that it can estimate the costs of an MRS facility operating under almost any conceivable nuclear waste logistics scenario. The model can also accommodate input data of varying degrees of complexity and detail (ranging from very simple to more complex), which makes it ideal for use in the MRS program, where new designs and new cost data are frequently offered for consideration. MONITOR can be run as an independent program, or it can be interfaced with the Waste System Transportation and Economic Simulation (WASTES) model, a program that simulates the movement of waste through a complete nuclear waste disposal system. The WASTES model drives the MONITOR model by providing it with the annual quantities of waste that are received, stored, and shipped at the MRS facility. Three runs of MONITOR are documented in this report. Two of the runs are for Version 1 of the MONITOR code: a simulation which uses the costs developed by the Ralph M. Parsons Company in the 2A (backup) version of the MRS cost estimate. In one of these runs MONITOR was run as an independent model, and in the other run MONITOR was run using an input file generated by the WASTES model. The two runs correspond to identical cases, and the fact that they gave identical results verified that the code performed the same calculations in both modes of operation. The third run was made for Version 2 of the MONITOR code: a simulation which uses the costs developed by the Ralph M. Parsons Company in the 2B (integral) version of the MRS cost estimate. This run was made with MONITOR being run as an independent model. The results of several cases have been verified by hand calculations.

  16. A precise goniometer/tensiometer using a low cost single-board computer

    Science.gov (United States)

    Favier, Benoit; Chamakos, Nikolaos T.; Papathanasiou, Athanasios G.

    2017-12-01

    Measuring the surface tension and the Young contact angle of a droplet is extremely important for many industrial applications. Here, given the growing interest in small and cheap but precise experimental instruments, we have constructed a low-cost contact angle goniometer/tensiometer based on a single-board computer (Raspberry Pi). The device runs an axisymmetric drop shape analysis (ADSA) algorithm written in Python. The code, here named DropToolKit, was developed in-house. We initially present the mathematical framework of our algorithm and then validate our software tool against other well-established ADSA packages, including the commercial ramé-hart DROPimage Advanced as well as the DropAnalysis plugin in ImageJ. After successfully testing various combinations of liquids and solid surfaces, we concluded that our prototype device would be highly beneficial for industrial applications as well as for scientific research in wetting phenomena compared to the commercial solutions.
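
    ADSA methods of this kind work by fitting the imaged drop profile to the axisymmetric Young-Laplace equation; in the usual notation (the abstract does not spell out its symbols, so the standard ones are used here),

    ```latex
    \gamma\left(\frac{1}{R_1}+\frac{1}{R_2}\right) = \Delta p_0 + \Delta\rho\, g\, z
    ```

    where gamma is the surface tension, R_1 and R_2 are the principal radii of curvature, Delta p_0 is the pressure jump at the drop apex, Delta rho the density difference between the phases, and z the vertical coordinate. The surface tension and contact angle are then recovered as best-fit parameters of the profile.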

  17. Wastewater Treatment Costs and Outlays in Organic Petrochemicals: Standards Versus Taxes With Methodology Suggestions for Marginal Cost Pricing and Analysis

    Science.gov (United States)

    Thompson, Russell G.; Singleton, F. D., Jr.

    1986-04-01

    With the methodology recommended by Baumol and Oates, comparable estimates of wastewater treatment costs and industry outlays are developed for effluent standard and effluent tax instruments for pollution abatement in five hypothetical organic petrochemicals (olefins) plants. The computational method uses a nonlinear simulation model for wastewater treatment to estimate the system state inputs for linear programming cost estimation, following a practice developed in a National Science Foundation (Research Applied to National Needs) study at the University of Houston and used to estimate Houston Ship Channel pollution abatement costs for the National Commission on Water Quality. Focusing on best practical and best available technology standards, with effluent taxes adjusted to give nearly equal pollution discharges, shows that average daily treatment costs (and the confidence intervals for treatment cost) would always be less for the effluent tax than for the effluent standard approach. However, industry's total outlay for these treatment costs, plus effluent taxes, would always be greater for the effluent tax approach than the total treatment costs would be for the effluent standard approach. Thus the practical necessity of showing smaller outlays as a prerequisite for a policy change toward efficiency dictates the need to link the economics at the microlevel with that at the macrolevel. Aggregation of the plants into a programming modeling basis for individual sectors and for the economy would provide a sound basis for effective policy reform, because the opportunity costs of the salient regulatory policies would be captured. Then, the government's policymakers would have the informational insights necessary to legislate more efficient environmental policies in light of the wealth distribution effects.
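
    The two-stage scheme described (simulate the treatment train, then price it with a linear program) can be sketched with scipy.optimize.linprog; all coefficients below are placeholders, not the study's olefins-plant data:

    ```python
    from scipy.optimize import linprog

    # Minimize treatment cost c @ x subject to removing at least the required
    # load of each pollutant: R @ x >= load  ->  -R @ x <= -load.
    c = [3.0, 5.0]          # $ per unit of treatment processes 1 and 2 (assumed)
    R = [[1.0, 0.5],        # BOD removed per unit of each process (assumed)
         [0.2, 1.0]]        # TSS removed per unit of each process (assumed)
    load = [40.0, 25.0]     # required removals (assumed)

    A_ub = [[-r for r in row] for row in R]
    b_ub = [-v for v in load]

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
    print(res.x, res.fun)   # optimal process levels and minimum treatment cost
    ```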

  18. The cost of electrocoagulation

    Energy Technology Data Exchange (ETDEWEB)

    Donini, J.C.; Kan, J.; Szynkarczuk, J.; Hassan, T.A.; Kar, K.L.

    1993-01-01

    Electrocoagulation could be an attractive and suitable method for separating solids from waste water. The electrocoagulation of kaolinite and bentonite suspensions was studied in a pilot electrocoagulation unit to assess the cost and efficiency of the process. Factors affecting cost, such as the formation of passivation layers on electrode plates and the recirculation and concentration of sodium chloride, were examined. Colorimetry was used to analyze aluminium content in the suspension. The results were used to calculate the cost due to consumption of electrode material (aluminium) during the process. Total cost was assumed to comprise the energy cost and the cost of electrode material. Comparison was based on the settling properties of the treated product: turbidity, settling rate, and cake height. In most cases, aluminium efficiency averaged around 200% and material cost accounted for 80% of total cost. Although higher concentrations of sodium chloride could only slightly increase aluminium efficiency and electrode efficiency, the higher concentrations resulted in much greater total cost, due to the greater current generated by the increased suspension conductivity, which in turn dissolved a larger amount of aluminium. The recirculation loop increased the flow rate by 3-10 times, enhancing the mass transport between the electrodes and resulting in lower cost and better settling properties. Over the course of two months the electrode coatings became thicker while efficiency decreased. The electrode efficiency was found to be as high as 94% for virgin electrodes and as low as 10% after two months. 8 refs., 25 figs., 9 tabs.
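
    The electrode-material term that dominates here follows from Faraday's law, m = I t M / (z F). A minimal sketch, assuming illustrative prices and the roughly 200% current efficiency the abstract reports (all rates and prices are assumptions):

    ```python
    F = 96485.0        # C/mol, Faraday constant
    M_AL = 26.98       # g/mol, molar mass of aluminium
    Z_AL = 3           # electrons transferred per Al3+ ion

    def electrocoagulation_cost(current_a: float, hours: float, voltage_v: float,
                                efficiency: float = 2.0,        # ~200% per the abstract
                                al_price_per_kg: float = 2.5,   # assumed
                                kwh_price: float = 0.10) -> dict:
        seconds = hours * 3600.0
        al_kg = efficiency * current_a * seconds * M_AL / (Z_AL * F) / 1000.0
        energy_kwh = current_a * voltage_v * hours / 1000.0
        material = al_kg * al_price_per_kg
        energy = energy_kwh * kwh_price
        return {"material_$": material, "energy_$": energy,
                "material_share": material / (material + energy)}

    print(electrocoagulation_cost(current_a=50.0, hours=1.0, voltage_v=12.0))
    ```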

  19. Risk assessment and management of brucellosis in the southern greater Yellowstone area (II): Cost-benefit analysis of reducing elk brucellosis prevalence.

    Science.gov (United States)

    Boroff, Kari; Kauffman, Mandy; Peck, Dannele; Maichak, Eric; Scurlock, Brandon; Schumaker, Brant

    2016-11-01

    Recent cases of bovine brucellosis (Brucella abortus) in cattle (Bos taurus) and domestic bison (Bison bison) of the southern Greater Yellowstone Area (SGYA) have been traced back to free-ranging elk (Cervus elaphus). Several management activities have been implemented to reduce brucellosis seroprevalence in elk, including test-and-slaughter, low-density feeding at elk winter feedgrounds, and elk vaccination. It is unclear which of these activities are most cost-effective at reducing the risk of elk transmitting brucellosis to cattle. In a companion paper, a stochastic risk model was used to translate a reduction in elk seroprevalence to a reduction in the risk of transmission to cattle. Here, we use those results to estimate the expected economic benefits and costs of reducing seroprevalence in elk using three different management activities: vaccination of elk with Brucella strain 19 (S19), low-density feeding of elk, and elk test-and-slaughter. Results indicate that the three elk management activities yield negative expected net benefits, ranging from -$2983 per year for low-density feeding to -$595,471 per year for test-and-slaughter. Society's risk preferences will determine whether strategies that generate small negative net benefit, such as low-density feeding, are worth implementing. However, activities with large negative net benefits, such as test-and-slaughter and S19 vaccination, are unlikely to be economically worthwhile. Given uncertainty about various model parameters, we identify some circumstances in which individual management activities might generate positive expected net benefit. Copyright © 2016 Elsevier B.V. All rights reserved.

  20. Near DC eddy current measurement of aluminum multilayers using MR sensors and commodity low-cost computer technology

    Science.gov (United States)

    Perry, Alexander R.

    2002-06-01

    Low Frequency Eddy Current (EC) probes are capable of measurement from 5 MHz down to DC through the use of Magnetoresistive (MR) sensors. Choosing components with appropriate electrical specifications allows them to be matched to the power and impedance characteristics of standard computer connectors. This permits direct attachment of the probe to inexpensive computers, thereby eliminating external power supplies, amplifiers and modulators that have heretofore precluded very low system purchase prices. Such price reduction is key to increased market penetration in General Aviation maintenance and consequent reduction in recurring costs. This paper examines our computer software CANDETECT, which implements this approach and permits effective probe operation. Results are presented to show the intrinsic sensitivity of the software and demonstrate its practical performance when seeking cracks in the underside of a thick aluminum multilayer structure. The majority of the General Aviation light aircraft fleet uses rivets and screws to attach sheet aluminum skin to the airframe, resulting in similar multilayer lap joints.

  1. [Cost analysis for navigation in knee endoprosthetics].

    Science.gov (United States)

    Cerha, O; Kirschner, S; Günther, K-P; Lützner, J

    2009-12-01

    Total knee arthroplasty (TKA) is one of the most frequent procedures in orthopaedic surgery. The outcome depends on a range of factors including alignment of the leg and the positioning of the implant, in addition to patient-associated factors. Computer-assisted navigation systems can improve the restoration of neutral leg alignment. This procedure has been established especially in Europe and North America. The additional expenses are not reimbursed in the German DRG system (Diagnosis Related Groups). In the present study a cost analysis of computer-assisted TKA compared to the conventional technique was performed. The acquisition expenses of various navigation systems (5 and 10 year depreciation), annual costs for maintenance and software updates, as well as the accompanying costs per operation (consumables, additional operating time) were considered. The additional operating time was determined on the basis of a meta-analysis of the current literature. Situations with 25, 50, 100, 200 and 500 computer-assisted TKAs per year were simulated. The amount of the incremental costs of computer-assisted TKA depends mainly on the annual volume and the additional operating time. A relevant decrease of the incremental costs was detected between 50 and 100 procedures per year. In a model with 100 computer-assisted TKAs per year, an additional operating time of 14 mins and a 10 year depreciation of the investment costs, the incremental expenses amount to €300-395 depending on the navigation system. Computer-assisted TKA is associated with additional costs. From an economical point of view a volume of more than 50 procedures per year appears to be favourable. The cost-effectiveness could be estimated if long-term results show a reduction of revisions or a better clinical outcome.
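
    The volume dependence reported here falls out of simple amortization arithmetic: fixed costs (depreciated acquisition plus maintenance) are spread over the annual case load, while consumables and extra operating time accrue per case. A sketch with hypothetical figures; only the case volumes, the 14 extra minutes and the 10-year depreciation mirror the abstract, every cost figure is assumed:

    ```python
    def incremental_cost_per_tka(acquisition: float, depreciation_years: int,
                                 annual_maintenance: float, cases_per_year: int,
                                 consumables_per_case: float,
                                 extra_or_minutes: float,
                                 or_cost_per_minute: float) -> float:
        fixed = acquisition / depreciation_years + annual_maintenance
        return (fixed / cases_per_year + consumables_per_case
                + extra_or_minutes * or_cost_per_minute)

    # Hypothetical navigation system, amortized over the abstract's case volumes.
    for n in (25, 50, 100, 200, 500):
        cost = incremental_cost_per_tka(acquisition=100_000, depreciation_years=10,
                                        annual_maintenance=5_000, cases_per_year=n,
                                        consumables_per_case=80,
                                        extra_or_minutes=14, or_cost_per_minute=10)
        print(f"{n:4d} cases/year -> EUR {cost:,.0f} per case")
    ```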

  2. Cloud computing for comparative genomics.

    Science.gov (United States)

    Wall, Dennis P; Kudtarkar, Parul; Fusaro, Vincent A; Pivovarov, Rimma; Patil, Prasad; Tonellato, Peter J

    2010-05-18

    Large comparative genomics studies and tools are becoming increasingly more compute-expensive as the number of available genome sequences continues to rise. The capacity and cost of local computing infrastructures are likely to become prohibitive with the increase, especially as the breadth of questions continues to rise. Alternative computing architectures, in particular cloud computing environments, may help alleviate this increasing pressure and enable fast, large-scale, and cost-effective comparative genomics strategies going forward. To test this, we redesigned a typical comparative genomics algorithm, the reciprocal smallest distance algorithm (RSD), to run within Amazon's Elastic Computing Cloud (EC2). We then employed the RSD-cloud for ortholog calculations across a wide selection of fully sequenced genomes. We ran more than 300,000 RSD-cloud processes within the EC2. These jobs were farmed simultaneously to 100 high capacity compute nodes using the Amazon Web Service Elastic Map Reduce and included a wide mix of large and small genomes. The total computation time took just under 70 hours and cost a total of $6,302 USD. The effort to transform existing comparative genomics algorithms from local compute infrastructures is not trivial. However, the speed and flexibility of cloud computing environments provides a substantial boost with manageable cost. The procedure designed to transform the RSD algorithm into a cloud-ready application is readily adaptable to similar comparative genomics problems.
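
    A back-of-envelope check using only the numbers quoted in the abstract:

    ```python
    jobs = 300_000          # RSD-cloud processes
    nodes = 100             # high capacity compute nodes
    wall_hours = 70         # total computation time
    total_usd = 6302.0      # total cost

    print(total_usd / jobs)                  # ~ $0.021 per RSD process
    print(total_usd / (nodes * wall_hours))  # ~ $0.90 per node-hour
    ```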

  3. Applied cost allocation

    DEFF Research Database (Denmark)

    Bogetoft, Peter; Hougaard, Jens Leth; Smilgins, Aleksandrs

    2016-01-01

    This paper deals with empirical computation of Aumann–Shapley cost shares for joint production. We show that if one uses a mathematical programing approach with its non-parametric estimation of the cost function there may be observations in the data set for which we have multiple Aumann–Shapley p...
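
    For context around the truncated final sentence: the Aumann–Shapley price of output i under cost function C and demand vector q is standardly defined as the average of marginal costs along the diagonal,

    ```latex
    p_i^{AS}(q) = \int_0^1 \frac{\partial C}{\partial q_i}(t\,q)\,dt,
    \qquad \text{cost share of output } i = q_i\, p_i^{AS}(q)
    ```

    so the shares sum to C(q) when C(0) = 0. The multiplicity the abstract alludes to presumably arises because a non-parametric, mathematical-programming estimate of the cost function need not be differentiable everywhere along that diagonal.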

  4. Management and cost accounting

    CERN Document Server

    Drury, Colin

    1992-01-01

    This third edition of a textbook on management and cost accounting features coverage of activity-based costing (ABC), advanced manufacturing technologies (AMTs), JIT, MRP, target costing, life-cycle costing, strategic management accounting, total quality management and customer profitability analysis. Also included are revised and new end-of-chapter problems taken from past examination papers of CIMA, ACCA and ICAEW. There is increased reference to management accounting in practice, including many of the results of the author's CIMA sponsored survey, and greater emphasis on operational control and performance measurement.

  5. DECOST: computer routine for decommissioning cost and funding analysis

    International Nuclear Information System (INIS)

    Mingst, B.C.

    1979-12-01

    One of the major controversies surrounding the decommissioning of nuclear facilities is the lack of financial information on just what the eventual costs will be. The Nuclear Regulatory Commission has studies underway to analyze the costs of decommissioning of nuclear fuel cycle facilities and some other similar studies have also been done by other groups. These studies all deal only with the final cost outlays needed to finance decommissioning in an unchangeable set of circumstances. Funding methods and planning to reduce the costs and financial risks are usually not attempted. The DECOST program package is intended to fill this void and allow wide-ranging study of the various options available when planning for the decommissioning of nuclear facilities

  6. An Introduction to Parallel Cluster Computing Using PVM for Computer Modeling and Simulation of Engineering Problems

    International Nuclear Information System (INIS)

    Spencer, VN

    2001-01-01

    An investigation has been conducted regarding the ability of clustered personal computers to improve the performance of executing software simulations for solving engineering problems. The power and utility of personal computers continues to grow exponentially through advances in computing capabilities such as newer microprocessors, advances in microchip technologies, electronic packaging, and cost effective gigabyte-size hard drive capacity. Many engineering problems require significant computing power. Therefore, the computation has to be done by high-performance computer systems that cost millions of dollars and need gigabytes of memory to complete the task. Alternately, it is feasible to provide adequate computing in the form of clustered personal computers. This method cuts the cost and size by linking (clustering) personal computers together across a network. Clusters also have the advantage that they can be used as stand-alone computers when they are not operating as a parallel computer. Parallel computing software to exploit clusters is available for computer operating systems like Unix, Windows NT, or Linux. This project concentrates on the use of Windows NT, and the Parallel Virtual Machine (PVM) system to solve an engineering dynamics problem in Fortran

  7. Chest Computed Tomographic Image Screening for Cystic Lung Diseases in Patients with Spontaneous Pneumothorax Is Cost Effective.

    Science.gov (United States)

    Gupta, Nishant; Langenderfer, Dale; McCormack, Francis X; Schauer, Daniel P; Eckman, Mark H

    2017-01-01

    Patients without a known history of lung disease presenting with a spontaneous pneumothorax are generally diagnosed as having primary spontaneous pneumothorax. However, occult diffuse cystic lung diseases such as Birt-Hogg-Dubé syndrome (BHD), lymphangioleiomyomatosis (LAM), and pulmonary Langerhans cell histiocytosis (PLCH) can also first present with a spontaneous pneumothorax, and their early identification by high-resolution computed tomographic (HRCT) chest imaging has implications for subsequent management. The objective of our study was to evaluate the cost-effectiveness of HRCT chest imaging to facilitate early diagnosis of LAM, BHD, and PLCH. We constructed a Markov state-transition model to assess the cost-effectiveness of screening HRCT to facilitate early diagnosis of diffuse cystic lung diseases in patients presenting with an apparent primary spontaneous pneumothorax. Baseline data for prevalence of BHD, LAM, and PLCH and rates of recurrent pneumothoraces in each of these diseases were derived from the literature. Costs were extracted from 2014 Medicare data. We compared a strategy of HRCT screening followed by pleurodesis in patients with LAM, BHD, or PLCH versus conventional management with no HRCT screening. In our base case analysis, screening for the presence of BHD, LAM, or PLCH in patients presenting with a spontaneous pneumothorax was cost effective, with a marginal cost-effectiveness ratio of $1,427 per quality-adjusted life-year gained. Sensitivity analysis showed that screening HRCT remained cost effective for a diffuse cystic lung disease prevalence as low as 0.01%. HRCT image screening for BHD, LAM, and PLCH in patients with apparent primary spontaneous pneumothorax is cost effective. Clinicians should consider performing a screening HRCT in patients presenting with apparent primary spontaneous pneumothorax.
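
    A Markov state-transition model of the kind described reduces to repeated vector-matrix products: a cohort distribution over health states is advanced one cycle at a time while per-state costs and utilities accrue. A deliberately tiny, entirely hypothetical three-state sketch (none of the probabilities, costs, or utilities below come from the study):

    ```python
    import numpy as np

    # States: 0 = stable, 1 = recurrent pneumothorax, 2 = dead (hypothetical).
    P = np.array([[0.90, 0.08, 0.02],    # annual transition probabilities (assumed)
                  [0.70, 0.25, 0.05],
                  [0.00, 0.00, 1.00]])
    cost = np.array([100.0, 8000.0, 0.0])   # $ per state-year (assumed)
    qaly = np.array([0.95, 0.70, 0.0])      # utility per state-year (assumed)

    dist = np.array([1.0, 0.0, 0.0])        # cohort starts in the stable state
    total_cost = total_qaly = 0.0
    for year in range(20):                  # 20-year horizon, no discounting
        total_cost += dist @ cost
        total_qaly += dist @ qaly
        dist = dist @ P                     # advance the cohort one cycle

    print(total_cost, total_qaly)           # feed into an ICER against a comparator
    ```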

  8. Developing the next generation of diverse computer scientists: the need for enhanced, intersectional computing identity theory

    Science.gov (United States)

    Rodriguez, Sarah L.; Lehman, Kathleen

    2017-10-01

    This theoretical paper explores the need for enhanced, intersectional computing identity theory for the purpose of developing a diverse group of computer scientists for the future. Greater theoretical understanding of the identity formation process specifically for computing is needed in order to understand how students come to understand themselves as computer scientists. To ensure that the next generation of computer scientists is diverse, this paper presents a case for examining identity development intersectionally, understanding the ways in which women and underrepresented students may have difficulty identifying as computer scientists and be systematically oppressed in their pursuit of computer science careers. Through a review of the available scholarship, this paper suggests that creating greater theoretical understanding of the computing identity development process will inform the way in which educational stakeholders consider computer science practices and policies.

  9. Effects on costs of frontline diagnostic evaluation in patients suspected of angina: coronary computed tomography angiography vs. conventional ischaemia testing

    DEFF Research Database (Denmark)

    Nielsen, Lene H; Olsen, Jens; Markenvard, John

    2013-01-01

    AIMS: The aim of this study was to investigate, in patients with stable angina, the effects on costs of frontline diagnostics by exercise-stress testing (ex-test) vs. coronary computed tomography angiography (CTA). METHODS AND RESULTS: In two coronary units at Lillebaelt Hospital, Denmark, 498 patients were identified in whom either ex-test (n = 247) or CTA (n = 251) was applied as the frontline diagnostic strategy in symptomatic patients with a low-intermediate pre-test probability of coronary artery disease (CAD). During 12 months of follow-up, death, myocardial infarction and costs were recorded in each group. The mean (SD) total costs per patient at the end of the follow-up were 14% lower in the CTA group than in the ex-test group, €1510 (3474) vs. €1777 (3746) (P = 0.03). CONCLUSION: Diagnostic assessment of symptomatic patients with a low-intermediate probability of CAD by CTA incurred lower costs ...

  10. On the Clouds: A New Way of Computing

    Directory of Open Access Journals (Sweden)

    Yan Han

    2010-06-01

    This article introduces cloud computing and discusses the author’s experience “on the clouds.” The author reviews cloud computing services and providers, then presents his experience of running multiple systems (e.g., integrated library systems, content management systems, and repository software). He evaluates costs, discusses advantages, and addresses some issues about cloud computing. Cloud computing fundamentally changes the ways institutions and companies manage their computing needs. Libraries can take advantage of cloud computing to start an IT project with low cost, to manage computing resources cost-effectively, and to explore new computing possibilities.

  11. Server Operation and Virtualization to Save Energy and Cost in Future Sustainable Computing

    Directory of Open Access Journals (Sweden)

    Jun-Ho Huh

    2018-06-01

    Since the introduction of the LTE (Long Term Evolution) service, we have lived in a time of expanding amounts of data. The amount of data produced has increased every year, in particular with the growing distribution of smartphones. Telecommunication service providers have to struggle to secure sufficient network capacity in order to maintain quick access to the data consumers need. Nonetheless, maintaining maximum capacity and bandwidth at all times requires considerable cost and excessive equipment. To solve this problem, telecommunication service providers need to maintain an appropriate level of network capacity and to provide sustainable service to customers through quick network expansion in case of shortage. So far, telecommunication service providers have bought and used network equipment produced directly by manufacturers such as Ericsson, Nokia, Cisco, and Samsung. Since this equipment is specialized for networking and satisfies consumers with excellent performance, it is very costly, being developed with advanced technologies. Moreover, procurement takes much time, because the telecommunication service provider places an order and the manufacturer produces and delivers. In addition, there are cases that require signaling and two-way data traffic as well as capacity because of the diversity of IoT devices. For these purposes, the need for NFV (Network Function Virtualization) arises. Equipment functions are virtualized so that they run on x86-based compatible servers instead of on the network equipment manufacturer's dedicated hardware. By operating on compatible servers, NFV reduces wasted hardware and copes with change thanks to rapid hardware development. This study proposed an efficient system for reducing network server operation cost using such NFV technology and found that the cost was reduced by 24 ...

  12. Can a Costly Intervention Be Cost-effective?

    Science.gov (United States)

    Foster, E. Michael; Jones, Damon

    2009-01-01

    Objectives: To examine the cost-effectiveness of the Fast Track intervention, a multi-year, multi-component intervention designed to reduce violence among at-risk children. A previous report documented the favorable effect of intervention on the highest-risk group of ninth-graders diagnosed with conduct disorder, as well as self-reported delinquency. The current report addressed the cost-effectiveness of the intervention for these measures of program impact. Design: Costs of the intervention were estimated using program budgets. Incremental cost-effectiveness ratios were computed to determine the cost per unit of improvement in the 3 outcomes measured in the 10th year of the study. Results: Examination of the total sample showed that the intervention was not cost-effective at likely levels of policymakers' willingness to pay for the key outcomes. Subsequent analysis of those most at risk, however, showed that the intervention likely was cost-effective given specified willingness-to-pay criteria. Conclusions: Results indicate that the intervention is cost-effective for the children at highest risk. From a policy standpoint, this finding is encouraging because such children are likely to generate higher costs for society over their lifetimes. However, substantial barriers to cost-effectiveness remain, such as the ability to effectively identify and recruit such higher-risk children in future implementations. PMID:17088509
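
    The willingness-to-pay reasoning used here is commonly made explicit through the net monetary benefit, which the abstract implies but does not write out: for a willingness-to-pay threshold lambda,

    ```latex
    \mathrm{NMB} = \lambda\,\Delta E - \Delta C
    ```

    and an intervention is judged cost-effective when NMB > 0, i.e. when the ICER, Delta C / Delta E, falls below lambda.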

  13. Cloud computing for comparative genomics

    Directory of Open Access Journals (Sweden)

    Pivovarov Rimma

    2010-05-01

    Background: Large comparative genomics studies and tools are becoming increasingly more compute-expensive as the number of available genome sequences continues to rise. The capacity and cost of local computing infrastructures are likely to become prohibitive with the increase, especially as the breadth of questions continues to rise. Alternative computing architectures, in particular cloud computing environments, may help alleviate this increasing pressure and enable fast, large-scale, and cost-effective comparative genomics strategies going forward. To test this, we redesigned a typical comparative genomics algorithm, the reciprocal smallest distance algorithm (RSD), to run within Amazon's Elastic Computing Cloud (EC2). We then employed the RSD-cloud for ortholog calculations across a wide selection of fully sequenced genomes. Results: We ran more than 300,000 RSD-cloud processes within the EC2. These jobs were farmed simultaneously to 100 high capacity compute nodes using the Amazon Web Service Elastic Map Reduce and included a wide mix of large and small genomes. The total computation time took just under 70 hours and cost a total of $6,302 USD. Conclusions: The effort to transform existing comparative genomics algorithms from local compute infrastructures is not trivial. However, the speed and flexibility of cloud computing environments provides a substantial boost with manageable cost. The procedure designed to transform the RSD algorithm into a cloud-ready application is readily adaptable to similar comparative genomics problems.

  14. Computational Sensing Using Low-Cost and Mobile Plasmonic Readers Designed by Machine Learning

    KAUST Repository

    Ballard, Zachary S.

    2017-01-27

    Plasmonic sensors have been used for a wide range of biological and chemical sensing applications. Emerging nanofabrication techniques have enabled these sensors to be cost-effectively mass manufactured onto various types of substrates. To accompany these advances, major improvements in sensor read-out devices must also be achieved to fully realize the broad impact of plasmonic nanosensors. Here, we propose a machine learning framework which can be used to design low-cost and mobile multispectral plasmonic readers that do not use traditionally employed bulky and expensive stabilized light sources or high-resolution spectrometers. By training a feature selection model over a large set of fabricated plasmonic nanosensors, we select the optimal set of illumination light-emitting diodes needed to create a minimum-error refractive index prediction model, which statistically takes into account the varied spectral responses and fabrication-induced variability of a given sensor design. This computational sensing approach was experimentally validated using a modular mobile plasmonic reader. We tested different plasmonic sensors with hexagonal and square periodicity nanohole arrays and revealed that the optimal illumination bands differ from those that are “intuitively” selected based on the spectral features of the sensor, e.g., transmission peaks or valleys. This framework provides a universal tool for the plasmonics community to design low-cost and mobile multispectral readers, helping the translation of nanosensing technologies to various emerging applications such as wearable sensing, personalized medicine, and point-of-care diagnostics. Beyond plasmonics, other types of sensors that operate based on spectral changes can broadly benefit from this approach, including e.g., aptamer-enabled nanoparticle assays and graphene-based sensors, among others.
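
    The LED-selection step can be sketched as greedy forward selection over candidate illumination bands, scoring each candidate subset by cross-validated regression error. The data below are synthetic stand-ins for the fabricated-sensor training set; the band count, model choice, and seed are all assumptions:

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 12))   # transmission at 12 candidate LED bands (synthetic)
    y = X[:, [1, 4, 7]] @ [0.5, 0.3, 0.2] + 0.01 * rng.normal(size=200)  # refractive index

    selected, remaining = [], list(range(12))
    for _ in range(4):               # pick 4 LEDs, as a small reader might carry
        scores = {j: cross_val_score(LinearRegression(), X[:, selected + [j]], y,
                                     scoring="neg_mean_squared_error", cv=5).mean()
                  for j in remaining}
        best = max(scores, key=scores.get)   # band whose addition most reduces CV error
        selected.append(best)
        remaining.remove(best)

    print("chosen LED bands:", selected)
    ```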

  15. Performance Assessment of a Custom, Portable, and Low-Cost Brain-Computer Interface Platform.

    Science.gov (United States)

    McCrimmon, Colin M; Fu, Jonathan Lee; Wang, Ming; Lopes, Lucas Silva; Wang, Po T; Karimi-Bidhendi, Alireza; Liu, Charles Y; Heydari, Payam; Nenadic, Zoran; Do, An Hong

    2017-10-01

    Conventional brain-computer interfaces (BCIs) are often expensive, complex to operate, and lack portability, which confines their use to laboratory settings. Portable, inexpensive BCIs can mitigate these problems, but it remains unclear whether their low-cost design compromises their performance. Therefore, we developed a portable, low-cost BCI and compared its performance to that of a conventional BCI. The BCI was assembled by integrating a custom electroencephalogram (EEG) amplifier with an open-source microcontroller and a touchscreen. The function of the amplifier was first validated against a commercial bioamplifier, followed by a head-to-head comparison between the custom BCI (using four EEG channels) and a conventional 32-channel BCI. Specifically, five able-bodied subjects were cued to alternate between hand opening/closing and remaining motionless while the BCI decoded their movement state in real time and provided visual feedback through a light-emitting diode. Subjects repeated the above task for a total of 10 trials, and were unaware of which system was being used. The performance in each trial was defined as the temporal correlation between the cues and the decoded states. The EEG data simultaneously acquired with the custom and commercial amplifiers were visually similar and highly correlated (ρ = 0.79). The decoding performances of the custom and conventional BCIs averaged across trials and subjects were 0.70 ± 0.12 and 0.68 ± 0.10, respectively, and were not significantly different. The performance of our portable, low-cost BCI is comparable to that of the conventional BCIs. Platforms, such as the one developed here, are suitable for BCI applications outside of a laboratory.
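
    The per-trial performance metric described above is simply the temporal correlation between the cue sequence and the decoded movement state. A minimal sketch on synthetic data (block lengths and error rate are made up):

        # Illustrative trial: alternating rest/move cue blocks, a noisy decoder,
        # and the correlation-based performance score described in the record.
        import numpy as np

        rng = np.random.default_rng(1)
        cues = np.repeat([0, 1, 0, 1, 0], 40)            # alternating rest/move cues
        decoded = cues ^ (rng.random(cues.size) < 0.15)  # ~15% decoding errors
        performance = np.corrcoef(cues, decoded)[0, 1]
        print(f"trial performance (correlation): {performance:.2f}")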

  16. Computer technology and computer programming research and strategies

    CERN Document Server

    Antonakos, James L

    2011-01-01

    Covering a broad range of new topics in computer technology and programming, this volume discusses encryption techniques, SQL generation, Web 2.0 technologies, and visual sensor networks. It also examines reconfigurable computing, video streaming, animation techniques, and more. Readers will learn about an educational tool and game to help students learn computer programming. The book also explores a new medical technology paradigm centered on wireless technology and cloud computing designed to overcome the problems of increasing health technology costs.

  17. Reactive control processes contributing to residual switch cost and mixing cost across the adult lifespan.

    Science.gov (United States)

    Whitson, Lisa R; Karayanidis, Frini; Fulham, Ross; Provost, Alexander; Michie, Patricia T; Heathcote, Andrew; Hsieh, Shulan

    2014-01-01

    In task-switching paradigms, performance is better when repeating the same task than when alternating between tasks (switch cost) and when repeating a task alone rather than intermixed with another task (mixing cost). These costs remain even after extensive practice and when task cues enable advanced preparation (residual costs). Moreover, residual reaction time mixing cost has been consistently shown to increase with age. Residual switch and mixing costs modulate the amplitude of the stimulus-locked P3b. This mixing effect is disproportionately larger in older adults, who also prepare more for and respond more cautiously on these "mixed" repeat trials (Karayanidis et al., 2011). In this paper, we analyze stimulus-locked and response-locked P3 and lateralized readiness potentials to identify whether residual switch and mixing cost arise from the need to control interference at the level of stimulus processing or response processing. Residual mixing cost was associated with control of stimulus-level interference, whereas residual switch cost was also associated with a delay in response selection. In older adults, the disproportionate increase in mixing cost was associated with greater interference at the level of decision-response mapping and response programming for repeat trials in mixed-task blocks. These findings suggest that older adults strategically recruit greater proactive and reactive control to overcome increased susceptibility to post-stimulus interference. This interpretation is consistent with recruitment of compensatory strategies to offset a reduced repetition benefit rather than an overall decline in cognitive flexibility.

  18. Cost-Effectiveness Analysis (CEA) of Intravenous Urography (IVU) and Unenhanced Multidetector Computed Tomography (MDCT) for Initial Investigation of Suspected Acute Ureterolithiasis

    International Nuclear Information System (INIS)

    Eikefjord, E.; Askildsen, J.E.; Roervik, J.

    2008-01-01

    Background: It is important to compare the cost and effectiveness of multidetector computed tomography (MDCT) and intravenous urography (IVU) to determine the most cost-effective alternative for the initial investigation of acute ureterolithiasis. Purpose: To analyze the task-specific variable costs combined with the diagnostic effect of MDCT and IVU for patients with acute flank pain, and to determine which is most cost-effective. Material and Methods: 119 patients with acute flank pain suggestive of stone disease (ureterolithiasis) were examined by both MDCT and IVU. Variable costs related to medical equipment, consumption material, equipment control, and personnel were calculated. The diagnostic effect was assessed. Results: The variable costs of MDCT versus IVU were EUR 32 and EUR 117, respectively. This significant difference was mainly due to savings in examination time, higher annual examination frequency, lower material costs, and no use of contrast media. As for diagnostic effect, MDCT proved considerably more accurate in the diagnosis of stone disease than IVU and markedly more accurate concerning differential diagnoses. Conclusion: MDCT had lower differential costs and a higher capacity to correctly determine stone disease and differential diagnoses, as compared to IVU, in patients with acute flank pain. Consequently, MDCT is a dominant alternative to IVU when evaluated exclusively from a cost-effectiveness perspective.

  19. Ubiquitous Computing: The Universal Use of Computers on College Campuses.

    Science.gov (United States)

    Brown, David G., Ed.

    This book is a collection of vignettes from 13 universities where everyone on campus has his or her own computer. These 13 institutions have instituted "ubiquitous computing" in very different ways at very different costs. The chapters are: (1) "Introduction: The Ubiquitous Computing Movement" (David G. Brown); (2) "Dartmouth College" (Malcolm…

  20. Computer Vision Syndrome.

    Science.gov (United States)

    Randolph, Susan A

    2017-07-01

    With the increased use of electronic devices with visual displays, computer vision syndrome is becoming a major public health issue. Improving the visual status of workers using computers results in greater productivity in the workplace and improved visual comfort.

  1. How to Bill Your Computer Services.

    Science.gov (United States)

    Dooskin, Herbert P.

    1981-01-01

    A computer facility billing procedure should be designed so that the full costs of a computer center operation are equitably charged to the users. Design criteria, costing methods, and management's role are discussed. (Author/MLF)

  2. Cloud Computing for radiologists.

    Science.gov (United States)

    Kharat, Amit T; Safvi, Amjad; Thind, Ss; Singh, Amarjit

    2012-07-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources, such as computer software and hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets radiology free from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future.

  3. Cloud Computing for radiologists

    International Nuclear Information System (INIS)

    Kharat, Amit T; Safvi, Amjad; Thind, SS; Singh, Amarjit

    2012-01-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources, such as computer software and hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets radiology free from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future

  4. Cloud computing for radiologists

    Directory of Open Access Journals (Sweden)

    Amit T Kharat

    2012-01-01

    Full Text Available Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources, such as computer software and hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets radiology free from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future.

  5. Applied cost allocation

    DEFF Research Database (Denmark)

    Bogetoft, Peter; Hougaard, Jens Leth; Smilgins, Aleksandrs

    2016-01-01

    This paper deals with empirical computation of Aumann–Shapley cost shares for joint production. We show that if one uses a mathematical programming approach with its non-parametric estimation of the cost function there may be observations in the data set for which we have multiple Aumann–Shapley prices. We address this by introducing a set of assumptions concerning firm behavior. These assumptions enable us to connect inefficient with efficient production and thereby provide consistent ways of allocating the costs arising from inefficiency.

  6. The role of dedicated data computing centers in the age of cloud computing

    Science.gov (United States)

    Caramarcu, Costin; Hollowell, Christopher; Strecker-Kellogg, William; Wong, Antonio; Zaytsev, Alexandr

    2017-10-01

    Brookhaven National Laboratory (BNL) anticipates significant growth in scientific programs with large computing and data storage needs in the near future and has recently reorganized support for scientific computing to meet these needs. A key component is the enhanced role of the RHIC-ATLAS Computing Facility (RACF) in support of high-throughput and high-performance computing (HTC and HPC) at BNL. This presentation discusses the evolving role of the RACF at BNL, in light of its growing portfolio of responsibilities and its increasing integration with cloud (academic and for-profit) computing activities. We also discuss BNL’s plan to build a new computing center to support the new responsibilities of the RACF and present a summary of the cost benefit analysis done, including the types of computing activities that benefit most from a local data center vs. cloud computing. This analysis is partly based on an updated cost comparison of Amazon EC2 computing services and the RACF, which was originally conducted in 2012.

  7. Comparison between low-cost marker-less and high-end marker-based motion capture systems for the computer-aided assessment of working ergonomics.

    Science.gov (United States)

    Patrizi, Alfredo; Pennestrì, Ettore; Valentini, Pier Paolo

    2016-01-01

    The paper deals with the comparison between a high-end marker-based acquisition system and a low-cost marker-less methodology for the assessment of the human posture during working tasks. The low-cost methodology is based on the use of a single Microsoft Kinect V1 device. The high-end acquisition system is the BTS SMART that requires the use of reflective markers to be placed on the subject's body. Three practical working activities involving object lifting and displacement have been investigated. The operational risk has been evaluated according to the lifting equation proposed by the American National Institute for Occupational Safety and Health. The results of the study show that the risk multipliers computed from the two acquisition methodologies are very close for all the analysed activities. In agreement with this outcome, the marker-less methodology based on the Microsoft Kinect V1 device seems very promising to promote the dissemination of computer-aided assessment of ergonomics while maintaining good accuracy and affordable costs. PRACTITIONER’S SUMMARY: The study is motivated by the increasing interest for on-site working ergonomics assessment. We compared a low-cost marker-less methodology with a high-end marker-based system. We tested them on three different working tasks, assessing the working risk of lifting loads. The two methodologies showed comparable precision in all the investigations.

  8. User's guide to SERICPAC: A computer program for calculating electric-utility avoided costs rates

    Energy Technology Data Exchange (ETDEWEB)

    Wirtshafter, R.; Abrash, M.; Koved, M.; Feldman, S.

    1982-05-01

    SERICPAC is a computer program developed to calculate average avoided cost rates for decentralized power producers and cogenerators that sell electricity to electric utilities. SERICPAC works in tandem with SERICOST, a program to calculate avoided costs, and determines the appropriate rates for buying and selling of electricity from electric utilities to qualifying facilities (QF) as stipulated under Section 210 of PURPA. SERICPAC contains simulation models for eight technologies including wind, hydro, biogas, and cogeneration. The simulations are converted into a diversified utility production, which can be either gross production or net production, the latter accounting for internal electricity usage by the QF. The program allows for adjustments to the production to be made for scheduled and forced outages. The final output of the model is a technology-specific average annual rate. The report contains a description of the technologies and the simulations as well as a complete user's guide to SERICPAC.

  9. Low-cost computer classification of land cover in the Portland area, Oregon, by signature extension techniques

    Science.gov (United States)

    Gaydos, Leonard

    1978-01-01

    Computer-aided techniques for interpreting multispectral data acquired by Landsat offer economies in the mapping of land cover. Even so, the actual establishment of the statistical classes, or "signatures," is one of the relatively more costly operations involved. Analysts have therefore been seeking cost-saving signature extension techniques that would accept training data acquired for one time or place and apply them to another. Opportunities to extend signatures occur in preprocessing steps and in the classification steps that follow. In the present example, land cover classes were derived by the simplest and most direct form of signature extension: Classes statistically derived from a Landsat scene for the Puget Sound area, Wash., were applied to the Portland area, Oreg., using data for the next Landsat scene acquired less than 25 seconds down orbit. Many features can be recognized on the reduced-scale version of the Portland land cover map shown in this report, although no statistical assessment of its accuracy is available.

  10. Adaptive Cost-Based Task Scheduling in Cloud Environment

    Directory of Open Access Journals (Sweden)

    Mohammed A. S. Mosleh

    2016-01-01

    Full Text Available Task execution in cloud computing requires obtaining stored data from remote data centers. Though this storage process reduces the memory constraints of the user’s computer, the time deadline is a serious concern. In this paper, Adaptive Cost-based Task Scheduling (ACTS) is proposed to provide data access to the virtual machines (VMs) within the deadline without increasing the cost. ACTS considers the data access completion time for selecting the cost-effective path to access the data. To allocate data access paths, the data access completion time is computed by considering the mean and variance of the network service time and the arrival rate of network input/output requests. Task priorities are then assigned based on data access time. Finally, the costs of the data paths are analyzed and paths are allocated based on task priority. The minimum-cost path is allocated to low-priority tasks and fast access paths are allocated to high-priority tasks so as to meet the time deadline. Thus efficient task scheduling can be achieved by using ACTS. The experimental results conducted in terms of execution time, computation cost, communication cost, bandwidth, and CPU utilization prove that the proposed algorithm provides better performance than the state-of-the-art methods.
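
    A minimal sketch of the allocation idea, not the authors' implementation: estimate each path's data-access completion time from the mean and variance of the network service time and the request arrival rate, then give every task, in deadline order, the cheapest path that still meets its deadline. All class and parameter names below are illustrative.

        # Priority-driven path allocation, loosely following the ACTS idea.
        from dataclasses import dataclass

        @dataclass
        class Path:
            name: str
            mean_service_time: float   # mean network service time per GB (s)
            variance: float            # variance of the service time
            cost_per_gb: float

        @dataclass
        class Task:
            name: str
            data_gb: float
            deadline_s: float
            assigned: Path = None

        def expected_completion(path, data_gb, arrival_rate):
            # Crude queueing-style delay estimate built from the mean, variance,
            # and arrival rate -- a stand-in for the paper's completion-time model.
            rho = arrival_rate * path.mean_service_time
            wait = (arrival_rate * (path.variance + path.mean_service_time ** 2)
                    / max(2 * (1 - rho), 1e-9))
            return data_gb * path.mean_service_time + wait

        def allocate(tasks, paths, arrival_rate=0.5):
            # Tight-deadline (high-priority) tasks choose first; each task takes
            # the cheapest path whose estimated completion meets its deadline.
            for task in sorted(tasks, key=lambda t: t.deadline_s):
                feasible = [p for p in paths
                            if expected_completion(p, task.data_gb, arrival_rate)
                            <= task.deadline_s]
                pool = feasible or paths   # fall back to best effort
                task.assigned = min(pool, key=lambda p: p.cost_per_gb)
            return tasks

        paths = [Path("fiber", 0.02, 1e-4, 0.08), Path("budget", 0.10, 1e-2, 0.02)]
        tasks = allocate([Task("t1", 50, 5.0), Task("t2", 50, 60.0)], paths)
        print({t.name: t.assigned.name for t in tasks})  # {'t1': 'fiber', 't2': 'budget'}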

  11. SOME NOTES ON COST ALLOCATION IN MULTICASTING

    Directory of Open Access Journals (Sweden)

    Darko Skorin-Kapov

    2012-12-01

    Full Text Available We analyze cost allocation strategies for the problem of broadcasting information from some source to a number of communication network users. A multicast routing chooses a minimum cost tree network that spans the source and all the receivers. The cost of such a network is distributed among its receivers, who may be individuals or organizations with possibly conflicting interests. Providing network developers, users, and owners with practical, computable, 'fair' cost allocation solution procedures is of great importance for network management. Consequently, this multidisciplinary problem has been extensively studied by operational researchers, economists, mathematicians, and computer scientists. The fairness of various proposed solutions was even argued in US courts. This presentation overviews some previously published, as well as some recent results, in the development of algorithmic mechanisms to efficiently compute 'attractive' cost allocation solutions for multicast networks. Specifically, we will analyze cooperative game theory based cost allocation models that avoid cross subsidies and/or are distance and population monotonic. We will also present some related open cost allocation problems and the potential contribution that such models might make to this problem in the future.

  12. Cost analysis of living donor kidney transplantation in China: a single-center experience.

    Science.gov (United States)

    Zhao, Wenyu; Zhang, Lei; Han, Shu; Zhu, Youhua; Wang, Liming; Zhou, Meisheng; Zeng, Li

    2012-01-01

    Kidney transplantation is the most cost-effective option for the treatment of end-stage renal disease, but the financial aspects of kidney transplantation have not yet been fully investigated. The purpose of this study was to determine the hospital cost of living donor kidney transplantation in China and to identify factors associated with the high cost. Demographic and clinical data of 103 consecutive patients who underwent living donor kidney transplantation from January 2007 to January 2011 at our center were reviewed, and detailed hospital cost of initial admission for kidney transplantation was analyzed. A stepwise multiple regression analysis was performed to determine predictors affecting the total hospital cost. The median total hospital cost was US $10,531, of which 69.2% was for medications, 13.2% for surgical procedures, 11.4% for paraclinical services, 3.7% for accommodations, 0.5% for nursing care, and 2.0% for other miscellaneous medical services. A multivariate stepwise logistic regression model for overall cost of transplantation revealed that the length of hospital stay, induction therapy, steroid-resistant rejection, maintenance therapy, infection status and body weight were independent predictors affecting the total hospitalization cost. Although the cost of living donor kidney transplantation in China is much lower than that in developed countries, it is a heavy burden for both the government and the patients. As medications formed the greater proportion of the total hospitalization cost, efforts to reduce the cost of drugs should be addressed.

  13. A computed tomography study in the location of greater palatine artery in South Indian population for maxillary osteotomy

    Directory of Open Access Journals (Sweden)

    I Packiaraj

    2016-01-01

    Full Text Available Introduction: The greater palatine artery is one of the important feeding vessels to the maxilla. The surgeon should know the surgical anatomy of the greater palatine artery to avoid trauma to the maxilla, which can lead to ischemic problems. Aim: CT evaluation of the distance between the pyriform aperture and the greater palatine foramen at various ages in both sexes. Result: The distance varies according to sex and age; it was measured by CT and standardized. Discussion: The lateral nasal osteotomy can be done up to a depth of 25 mm, instead of 20 mm. Conclusion: This study shows that the lateral nasal wall osteotomy can be performed without injury to the greater palatine artery.

  14. Positron emission tomography/computed tomography surveillance in patients with Hodgkin lymphoma in first remission has a low positive predictive value and high costs.

    Science.gov (United States)

    El-Galaly, Tarec Christoffer; Mylam, Karen Juul; Brown, Peter; Specht, Lena; Christiansen, Ilse; Munksgaard, Lars; Johnsen, Hans Erik; Loft, Annika; Bukh, Anne; Iyer, Victor; Nielsen, Anne Lerberg; Hutchings, Martin

    2012-06-01

    The value of performing post-therapy routine surveillance imaging in patients with Hodgkin lymphoma is controversial. This study evaluates the utility of positron emission tomography/computed tomography using 2-[18F]fluoro-2-deoxyglucose for this purpose and in situations with suspected lymphoma relapse. We conducted a multicenter retrospective study. Patients with newly diagnosed Hodgkin lymphoma achieving at least a partial remission on first-line therapy were eligible if they received positron emission tomography/computed tomography surveillance during follow-up. Two types of imaging surveillance were analyzed: "routine" when patients showed no signs of relapse at referral to positron emission tomography/computed tomography, and "clinically indicated" when recurrence was suspected. A total of 211 routine and 88 clinically indicated positron emission tomography/computed tomography studies were performed in 161 patients. In ten of 22 patients with recurrence of Hodgkin lymphoma, routine imaging surveillance was the primary tool for the diagnosis of the relapse. Extranodal disease, interim positron emission tomography-positive lesions and positron emission tomography activity at response evaluation were all associated with a positron emission tomography/computed tomography-diagnosed preclinical relapse. The true positive rates of routine and clinically indicated imaging were 5% and 13%, respectively (P = 0.02). The overall positive predictive value and negative predictive value of positron emission tomography/computed tomography were 28% and 100%, respectively. The estimated cost per routine imaging diagnosed relapse was US$ 50,778. Negative positron emission tomography/computed tomography reliably rules out a relapse. The high false positive rate is, however, an important limitation and a confirmatory biopsy is mandatory for the diagnosis of a relapse. With no proven survival benefit for patients with a pre-clinically diagnosed relapse, the high costs and low positive predictive value argue against routine imaging surveillance of patients with Hodgkin lymphoma in first remission.

  15. What does an MRI scan cost?

    Science.gov (United States)

    Young, David W

    2015-11-01

    Historically, hospital departments have computed the costs of individual tests or procedures using the ratio of cost to charges (RCC) method, which can produce inaccurate results. To determine a more accurate cost of a test or procedure, the activity-based costing (ABC) method must be used. Accurate cost calculations will ensure reliable information about the profitability of a hospital's DRGs.
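
    The two costing methods contrasted above differ in mechanics, not just accuracy. A toy comparison with made-up numbers (neither figure is from the article):

        # Ratio-of-cost-to-charges (RCC) vs. activity-based costing (ABC).
        # All charges, ratios, and activity rates below are illustrative.

        def rcc_cost(charge, cost_to_charge_ratio):
            # RCC: scale the billed charge by a department-wide ratio.
            return charge * cost_to_charge_ratio

        def abc_cost(activities):
            # ABC: sum (resource units consumed * unit cost) over each activity.
            return sum(units * unit_cost for units, unit_cost in activities.values())

        mri_charge = 2400.0
        print(rcc_cost(mri_charge, 0.35))   # 840.0 under a hypothetical 0.35 RCC

        print(abc_cost({
            "scanner time (h)":    (0.75, 520.0),  # depreciation + maintenance per hour
            "technologist (h)":    (0.75, 60.0),
            "radiologist read":    (1.0, 80.0),
            "contrast + supplies": (1.0, 45.0),
        }))                                  # 560.0: built up from actual activities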

  16. Computing networks from cluster to cloud computing

    CERN Document Server

    Vicat-Blanc, Pascale; Guillier, Romaric; Soudan, Sebastien

    2013-01-01

    "Computing Networks" explores the core of the new distributed computing infrastructures we are using today:  the networking systems of clusters, grids and clouds. It helps network designers and distributed-application developers and users to better understand the technologies, specificities, constraints and benefits of these different infrastructures' communication systems. Cloud Computing will give the possibility for millions of users to process data anytime, anywhere, while being eco-friendly. In order to deliver this emerging traffic in a timely, cost-efficient, energy-efficient, and

  17. Genomic cloud computing: legal and ethical points to consider.

    Science.gov (United States)

    Dove, Edward S; Joly, Yann; Tassé, Anne-Marie; Knoppers, Bartha M

    2015-10-01

    The biggest challenge in twenty-first century data-intensive genomic science is developing vast computer infrastructure and advanced software tools to perform comprehensive analyses of genomic data sets for biomedical research and clinical practice. Researchers are increasingly turning to cloud computing both as a solution to integrate data from genomics, systems biology and biomedical data mining and as an approach to analyze data to solve biomedical problems. Although cloud computing provides several benefits such as lower costs and greater efficiency, it also raises legal and ethical issues. In this article, we discuss three key 'points to consider' (data control; data security, confidentiality and transfer; and accountability) based on a preliminary review of several publicly available cloud service providers' Terms of Service. These 'points to consider' should be borne in mind by genomic research organizations when negotiating legal arrangements to store genomic data on a large commercial cloud service provider's servers. Diligent genomic cloud computing means leveraging security standards and evaluation processes as a means to protect data and entails many of the same good practices that researchers should always consider in securing their local infrastructure.

  18. The Protein Cost of Metabolic Fluxes: Prediction from Enzymatic Rate Laws and Cost Minimization.

    Directory of Open Access Journals (Sweden)

    Elad Noor

    2016-11-01

    Full Text Available Bacterial growth depends crucially on metabolic fluxes, which are limited by the cell's capacity to maintain metabolic enzymes. The necessary enzyme amount per unit flux is a major determinant of metabolic strategies both in evolution and bioengineering. It depends on enzyme parameters (such as kcat and KM constants), but also on metabolite concentrations. Moreover, similar amounts of different enzymes might incur different costs for the cell, depending on enzyme-specific properties such as protein size and half-life. Here, we developed enzyme cost minimization (ECM), a scalable method for computing enzyme amounts that support a given metabolic flux at a minimal protein cost. The complex interplay of enzyme and metabolite concentrations, e.g. through thermodynamic driving forces and enzyme saturation, would make it hard to solve this optimization problem directly. By treating enzyme cost as a function of metabolite levels, we formulated ECM as a numerically tractable, convex optimization problem. Its tiered approach allows for building models at different levels of detail, depending on the amount of available data. Validating our method with measured metabolite and protein levels in E. coli central metabolism, we found typical prediction fold errors of 4.1 and 2.6, respectively, for the two kinds of data. This result from the cost-optimized metabolic state is significantly better than randomly sampled metabolite profiles, supporting the hypothesis that enzyme cost is important for the fitness of E. coli. ECM can be used to predict enzyme levels and protein cost in natural and engineered pathways, and could be a valuable computational tool to assist metabolic engineering projects. Furthermore, it establishes a direct connection between protein cost and thermodynamics, and provides a physically plausible and computationally tractable way to include enzyme kinetics into constraint-based metabolic models, where kinetics have usually been ignored or oversimplified.
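
    As a schematic restatement (our paraphrase, with symbols as defined here rather than the paper's exact notation), ECM can be read as minimizing total enzyme cost over the vector of log-scale metabolite levels s:

        \min_{\mathbf{s}} \; q(\mathbf{s}) \;=\; \sum_i h_i \, \frac{v_i}{k_{\mathrm{cat},i}\,\eta_i(\mathbf{s})}
        \qquad \text{subject to} \quad \mathbf{s}^{\min} \le \mathbf{s} \le \mathbf{s}^{\max},

    where v_i is the given flux through reaction i, h_i an enzyme-specific cost weight (e.g., protein size or half-life), and \eta_i(\mathbf{s}) \in (0, 1] the combined saturation and thermodynamic efficiency of enzyme i. Treating the cost as a function of metabolite levels in this way is what makes the problem convex and numerically tractable.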

  19. Volunteer Computing for Science Gateways

    OpenAIRE

    Anderson, David

    2017-01-01

    This poster offers information about volunteer computing for science gateways that offer high-throughput computing services. Volunteer computing can be used to get computing power. This increases the visibility of the gateway to the general public as well as increasing computing capacity at little cost.

  20. Low-Cost Spectral Sensor Development Description.

    Energy Technology Data Exchange (ETDEWEB)

    Armijo, Kenneth Miguel; Yellowhair, Julius

    2014-11-01

    Solar spectral data for all parts of the US is limited due in part to the high cost of commercial spectrometers. Solar spectral information is necessary for accurate photovoltaic (PV) performance forecasting, especially for large utility-scale PV installations. A low-cost solar spectral sensor would address the obstacles and needs. In this report, a novel low-cost, discrete-band sensor device, comprised of five narrow-band sensors, is described. The hardware is comprised of commercial-off-the-shelf components to keep the cost low. Data processing algorithms were developed and are being refined for robustness. PV module short-circuit current (Isc) prediction methods were developed based on interaction-terms regression methodology and spectrum reconstruction methodology for computing Isc. The results suggest the computed spectrum using the reconstruction method agreed well with the measured spectrum from the wide-band spectrometer (RMS error of 38.2 W/m^2-nm). Further analysis of computed Isc found a close correspondence of 0.05 A RMS error. The goal is for ubiquitous adoption of the low-cost spectral sensor in solar PV and other applications such as weather forecasting.
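
    The spectrum reconstruction step can be sketched as a small regularized least-squares problem: given the five band readings and each band's spectral response, solve for the spectrum that best explains them. Band centers, widths, and the test spectrum below are placeholders, not the report's sensor design:

        # Reconstruct a spectrum from five narrow-band readings via ridge-
        # regularized least squares; all sensor parameters are hypothetical.
        import numpy as np

        wl = np.linspace(350, 1050, 141)                   # wavelength grid (nm)
        centers, width = [450, 550, 650, 750, 850], 40.0   # five sensor bands

        # Each row: one sensor's Gaussian spectral response sampled on the grid.
        R = np.exp(-0.5 * ((wl - np.array(centers)[:, None]) / width) ** 2)

        true_spectrum = 800 * np.exp(-0.5 * ((wl - 600) / 120) ** 2)  # dummy, W/m^2-nm
        readings = R @ true_spectrum                        # the five band readings

        # argmin ||R s - y||^2 + lam ||s||^2  (normal equations with ridge term)
        lam = 1e-3
        s_hat = np.linalg.solve(R.T @ R + lam * np.eye(len(wl)), R.T @ readings)
        rms = np.sqrt(np.mean((s_hat - true_spectrum) ** 2))
        print(f"RMS reconstruction error: {rms:.1f} W/m^2-nm")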

  1. 48 CFR 42.709-4 - Computing interest.

    Science.gov (United States)

    2010-10-01

    48 Federal Acquisition Regulations System, Contract Administration and Audit Services, Indirect Cost Rates, 42.709-4 Computing interest: For 42.709-1(a)(1)(ii), compute interest on any paid portion of the disallowed cost as follows: (a) Consider...

  2. Spacelab experiment computer study. Volume 1: Executive summary (presentation)

    Science.gov (United States)

    Lewis, J. L.; Hodges, B. C.; Christy, J. O.

    1976-01-01

    A quantitative cost for various Spacelab flight hardware configurations is provided, along with varied software development options. A cost analysis of Spacelab computer hardware and software is presented. The cost study is discussed based on utilization of a central experiment computer with optional auxiliary equipment. Groundrules and assumptions used in deriving the costing methods for all options in the Spacelab experiment study are presented. The groundrules and assumptions are analyzed, and the options, along with their cost considerations, are discussed. It is concluded that Spacelab program cost for software development and maintenance is independent of experimental hardware and software options, that the distributed standard computer concept simplifies software integration without a significant increase in cost, and that decisions on flight computer hardware configurations should not be made until payload selection for a given mission and a detailed analysis of the mission requirements are completed.

  3. Experimental Validation of Plastic Mandible Models Produced by a "Low-Cost" 3-Dimensional Fused Deposition Modeling Printer.

    Science.gov (United States)

    Maschio, Federico; Pandya, Mirali; Olszewski, Raphael

    2016-03-22

    The objective of this study was to investigate the accuracy of 3-dimensional (3D) plastic (ABS) models generated using a low-cost 3D fused deposition modelling printer. Two human dry mandibles were scanned with a cone beam computed tomography (CBCT) Accuitomo device. Preprocessing consisted of 3D reconstruction with Maxilim software and STL file repair with Netfabb software. Then, the data were used to print 2 plastic replicas with a low-cost 3D fused deposition modeling printer (Up plus 2®). Two independent observers performed the identification of 26 anatomic landmarks on the 4 mandibles (2 dry and 2 replicas) with a 3D measuring arm. Each observer repeated the identifications 20 times. The comparison between the dry and plastic mandibles was based on 13 distances: 8 distances less than 12 mm and 5 distances greater than 12 mm. The mean absolute difference (MAD) was 0.37 mm, and the mean dimensional error (MDE) was 3.76%. The MDE decreased to 0.93% for distances greater than 12 mm. Plastic models generated using the low-cost 3D printer UPplus2® provide dimensional accuracies comparable to other well-established rapid prototyping technologies. Validated low-cost 3D printers could represent a step toward the better accessibility of rapid prototyping technologies in the medical field.

  4. Virtualization and cloud computing in dentistry.

    Science.gov (United States)

    Chow, Frank; Muftu, Ali; Shorter, Richard

    2014-01-01

    The use of virtualization and cloud computing has changed the way we use computers. Virtualization is a method of placing software called a hypervisor on the hardware of a computer or a host operating system. It allows a guest operating system to run on top of the physical computer with a virtual machine (i.e., virtual computer). Virtualization allows multiple virtual computers to run on top of one physical computer and to share its hardware resources, such as printers, scanners, and modems. This increases the efficient use of the computer by decreasing costs (e.g., hardware, electricity, administration, and management) since only one physical computer is needed and running. This virtualization platform is the basis for cloud computing. It has expanded into areas of server and storage virtualization. One of the commonly used dental storage systems is cloud storage. Patient information is encrypted as required by the Health Insurance Portability and Accountability Act (HIPAA) and stored on off-site private cloud services for a monthly service fee. As computer costs continue to increase, so too will the need for more storage and processing power. Virtual and cloud computing will be a method for dentists to minimize costs and maximize computer efficiency in the near future. This article will provide some useful information on current uses of cloud computing.

  5. Computational Comparison of Several Greedy Algorithms for the Minimum Cost Perfect Matching Problem on Large Graphs

    DEFF Research Database (Denmark)

    Wøhlk, Sanne; Laporte, Gilbert

    2017-01-01

    The aim of this paper is to computationally compare several algorithms for the Minimum Cost Perfect Matching Problem on an undirected complete graph. Our work is motivated by the need to solve large instances of the Capacitated Arc Routing Problem (CARP) arising in the optimization of garbage collection in Denmark. Common heuristics for the CARP involve the optimal matching of the odd-degree nodes of a graph. The algorithms used in the comparison include the CPLEX solution of an exact formulation, the LEDA matching algorithm, a recent implementation of the Blossom algorithm, as well as six greedy algorithms.
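
    The matching sub-problem named above is easy to reproduce at small scale. The sketch below uses networkx's blossom-based matcher (not one of the implementations benchmarked in the paper) to pair the odd-degree nodes of a toy graph at minimum shortest-path cost:

        # Minimum-cost perfect matching of odd-degree nodes -- the CARP
        # sub-problem described in the record, on a toy graph.
        import itertools
        import networkx as nx

        G = nx.Graph()
        G.add_weighted_edges_from([
            ("a", "b", 2.0), ("b", "c", 1.5), ("c", "d", 2.5),
            ("d", "a", 3.0), ("b", "d", 1.0),
        ])

        odd = [v for v in G if G.degree(v) % 2 == 1]
        dist = dict(nx.all_pairs_dijkstra_path_length(G))

        # Complete graph on odd-degree nodes, weighted by shortest-path distance.
        H = nx.Graph()
        for u, v in itertools.combinations(odd, 2):
            H.add_edge(u, v, weight=dist[u][v])

        matching = nx.min_weight_matching(H)   # set of matched node pairs
        print(matching)                        # here: {('b', 'd')} (or reversed)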

  6. What Does It Cost a University to Educate One Student

    Directory of Open Access Journals (Sweden)

    Maria Andrea Lotho Santiago

    2007-02-01

    Full Text Available A dilemma administrators continually face is whether to continue offering degree programs despite low student uptake, especially because producing reliable cost data to aid decision making can prove difficult. Often, a university determines a standard cost per credit or unit and uses this figure as a basis for computing the total cost of running a degree program. This is then compared to a revenue stream and the difference, whether positive or negative, is used in decision making. However, this method of computing costs, although appealing for its simplicity, may fail to capture the effects of economies that may arise as one school or college services another. In this paper, we use a basic cost accounting methodology applied to the higher education system of the Philippines to compute for a cost per degree per student for a sample of public and private universities. Although the methodology is more time consuming, the computed figures are deemed closer to actual costs and, thus, we argue, are more reliable as inputs to financial decision making.

  7. Computerized cost estimation spreadsheet and cost data base for fusion devices

    International Nuclear Information System (INIS)

    Hamilton, W.R.; Rothe, K.E.

    1985-01-01

    An automated approach to performing and cataloging cost estimates has been developed at the Fusion Engineering Design Center (FEDC), wherein the cost estimate record is stored in the LOTUS 1-2-3 spreadsheet on an IBM personal computer. The cost estimation spreadsheet is based on the cost coefficient/cost algorithm approach and incorporates a detailed generic code of cost accounts for both tokamak and tandem mirror devices. Component design parameters (weight, surface area, etc.) and cost factors are input, and direct and indirect costs are calculated. The cost data base file derived from actual cost experience within the fusion community and refined to be compatible with the spreadsheet costing approach is a catalog of cost coefficients, algorithms, and component costs arranged into data modules corresponding to specific components and/or subsystems. Each data module contains engineering, equipment, and installation labor cost data for different configurations and types of the specific component or subsystem. This paper describes the assumptions, definitions, methodology, and architecture incorporated in the development of the cost estimation spreadsheet and cost data base, along with the type of input required and the output format
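
    The cost coefficient/cost algorithm approach can be illustrated in a few lines: each component's direct cost scales with a design parameter through a unit-cost coefficient, with installation labor and indirect costs applied as factors on top. All coefficients and factors below are placeholders, not FEDC data:

        # Sketch of cost-coefficient costing: direct cost from design parameters,
        # indirect costs as multipliers. Numbers are illustrative only.

        def component_cost(parameter_value, unit_cost, installation_factor=0.3):
            equipment = parameter_value * unit_cost         # e.g. kg * $/kg
            installation = installation_factor * equipment  # installation labor
            return equipment + installation

        def project_cost(components, indirect_factor=0.25):
            direct = sum(component_cost(*c) for c in components)
            return direct * (1.0 + indirect_factor)         # add indirect costs

        # (design parameter, unit-cost coefficient) pairs for two dummy components
        print(project_cost([(120_000, 55.0), (3_500, 210.0)]))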

  8. Development and implementation of a low cost micro computer system for LANDSAT analysis and geographic data base applications

    Science.gov (United States)

    Faust, N.; Jordon, L.

    1981-01-01

    Since the implementation of the GRID and IMGRID computer programs for multivariate spatial analysis in the early 1970's, geographic data analysis subsequently moved from large computers to minicomputers and now to microcomputers with radical reduction in the costs associated with planning analyses. Programs designed to process LANDSAT data to be used as one element in a geographic data base were used once NIMGRID (new IMGRID), a raster oriented geographic information system, was implemented on the microcomputer. Programs for training field selection, supervised and unsupervised classification, and image enhancement were added. Enhancements to the color graphics capabilities of the microsystem allow display of three channels of LANDSAT data in color infrared format. The basic microcomputer hardware needed to perform NIMGRID and most LANDSAT analyses is listed as well as the software available for LANDSAT processing.

  9. Greater accordance with the Dietary Approaches to Stop Hypertension dietary pattern is associated with lower diet-related greenhouse gas production but higher dietary costs in the United Kingdom.

    Science.gov (United States)

    Monsivais, Pablo; Scarborough, Peter; Lloyd, Tina; Mizdrak, Anja; Luben, Robert; Mulligan, Angela A; Wareham, Nicholas J; Woodcock, James

    2015-07-01

    The Dietary Approaches to Stop Hypertension (DASH) diet is a proven way to prevent and control hypertension and other chronic disease. Because the DASH diet emphasizes plant-based foods, including vegetables and grains, adhering to this diet might also bring about environmental benefits, including lower associated production of greenhouse gases (GHGs). The objective was to examine the interrelation between dietary accordance with the DASH diet and associated GHGs. A secondary aim was to examine the retail cost of diets by level of DASH accordance. In this cross-sectional study of adults aged 39-79 y from the European Prospective Investigation into Cancer and Nutrition-Norfolk, United Kingdom cohort (n = 24,293), dietary intakes estimated from food-frequency questionnaires were analyzed for their accordance with the 8 DASH food and nutrient-based targets. Associations between DASH accordance, GHGs, and dietary costs were evaluated in regression analyses. Dietary GHGs were estimated with United Kingdom-specific data on carbon dioxide equivalents associated with commodities and foods. Dietary costs were estimated by using national food prices from a United Kingdom-based supermarket comparison website. Greater accordance with the DASH dietary targets was associated with lower GHGs. Diets in the highest quintile of accordance had a GHG impact of 5.60 compared with 6.71 kg carbon dioxide equivalents/d for least-accordant diets. Greater accordance was also associated with higher dietary costs, with the mean cost of diets in the top quintile of DASH scores 18% higher than that of diets in the lowest quintile (P < 0.0001). Promoting wider uptake of the DASH diet in the United Kingdom may improve population health and reduce diet-related GHGs. However, to make the DASH diet more accessible, food affordability, particularly for lower income groups, will have to be addressed.

  10. Using Amazon's Elastic Compute Cloud to scale CMS' compute hardware dynamically.

    CERN Document Server

    Melo, Andrew Malone

    2011-01-01

    Large international scientific collaborations such as the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider have traditionally addressed their data reduction and analysis needs by building and maintaining dedicated computational infrastructure. Emerging cloud-computing services such as Amazon's Elastic Compute Cloud (EC2) offer short-term CPU and storage resources with costs based on usage. These services allow experiments to purchase computing resources as needed, without significant prior planning and without long term investments in facilities and their management. We have demonstrated that services such as EC2 can successfully be integrated into the production-computing model of CMS, and find that they work very well as worker nodes. The cost-structure and transient nature of EC2 services make them inappropriate for some CMS production services and functions. We also found that the resources are not truly on-demand as limits and caps on usage are imposed. Our trial workflows allow us t...

  11. Variability in the Initial Costs of Care and One-Year Outcomes of Observation Services

    Directory of Open Access Journals (Sweden)

    Abbass, Ibrahim

    2015-05-01

    Full Text Available Introduction: The use of observation units (OUs) following emergency department (ED) visits as a model of care has increased exponentially in the last decade. About one-third of U.S. hospitals now have OUs within their facilities. While their use is associated with lower costs and a comparable level of care compared to inpatient units, there is wide variation in OU characteristics and operational procedures. The objective of this research was to explore the variability in the initial costs of care of placing patients with non-specific chest pain in observation units (OUs) and the one-year outcomes. Methods: The author retrospectively investigated medical insurance claims of 22,962 privately insured patients (2009-2011) admitted to 41 OUs. Outcomes included the one-year chest pain/cardiovascular-related costs and primary and secondary outcomes. Primary outcomes included myocardial infarction, congestive heart failure, stroke or cardiac arrest, while secondary outcomes included revascularization procedures, ED revisits for angina pectoris or chest pain and hospitalization due to cardiovascular diseases. The author aggregated the adjusted costs and prevalence rates of outcomes for patients over OUs, and computed the weighted coefficients of variation (WCV) to compare variations across OUs. Results: There was minimal variability in the initial costs of care (WCV=2.2%), while the author noticed greater variability in the outcomes. Greater variability was associated with the adjusted cardiovascular-related costs of medical services (WCV=17.6%), followed by the adjusted prevalence odds ratio of patients experiencing primary outcomes (WCV=16.3%) and secondary outcomes (WCV=10%). Conclusion: Higher variability in the outcomes suggests the need for more standardization of the observation services for chest pain patients. [West J Emerg Med. 2015;16(3):395–400.]
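
    Assuming the usual definition (weighted standard deviation divided by weighted mean), the weighted coefficient of variation used above can be computed as follows; the per-OU costs and weights here are dummy values:

        # Weighted coefficient of variation (WCV) across observation units;
        # weights would be, e.g., per-OU patient counts.
        import numpy as np

        def wcv(values, weights):
            values = np.asarray(values, float)
            weights = np.asarray(weights, float)
            mean = np.average(values, weights=weights)
            var = np.average((values - mean) ** 2, weights=weights)
            return np.sqrt(var) / mean

        costs  = [1180.0, 1210.0, 1195.0, 1230.0]   # adjusted cost per OU (dummy)
        n_pats = [540, 820, 310, 655]               # patients per OU (dummy)
        print(f"WCV = {100 * wcv(costs, n_pats):.1f}%")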

  12. Greater confinement disposal of radioactive wastes

    International Nuclear Information System (INIS)

    Trevorrow, L.E.; Gilbert, T.L.; Luner, C.; Merry-Libby, P.A.; Meshkov, N.K.; Yu, C.

    1985-01-01

    Low-level radioactive waste (LLW) includes a broad spectrum of different radionuclide concentrations, half-lives, and hazards. Standard shallow-land burial practice can provide adequate protection of public health and safety for most LLW. A small volume fraction (approx. 1%) containing most of the activity inventory (approx. 90%) requires specific measures known as greater-confinement disposal (GCD). Different site characteristics and different waste characteristics - such as high radionuclide concentrations, long radionuclide half-lives, high radionuclide mobility, and physical or chemical characteristics that present exceptional hazards - lead to different GCD facility design requirements. Facility design alternatives considered for GCD include the augered shaft, deep trench, engineered structure, hydrofracture, improved waste form, and high-integrity container. Selection of an appropriate design must also consider the interplay between basic risk limits for protection of public health and safety, performance characteristics and objectives, costs, waste-acceptance criteria, waste characteristics, and site characteristics

  13. Drilling cost analysis

    International Nuclear Information System (INIS)

    Anand, A.B.

    1992-01-01

    Drilling assumes greater importance in present-day uranium exploration, which emphasizes exploring more areas on the basis of conceptual models rather than merely on surface anomalies. But drilling is as costly as it is important, and consumes a major share (50% to 60%) of the exploration budget. As such, the cost of drilling has great bearing on the exploration strategy as well as on the overall cost of the project. Therefore, understanding the cost analysis is very important when planning or intensifying an exploration programme. This not only helps in controlling the current operations but also in planning the budgetary provisions for future operations. Also, if the work is entrusted to a private party, knowledge of in-house cost analysis helps in fixing the rates of drilling in different formations and areas to be drilled. Under this topic, various factors that contribute to the cost of drilling per meter, as well as ways to minimize the drilling cost for better economic evaluation of mineral deposits, are discussed. (author)

  14. The thermodynamic cost of quantum operations

    International Nuclear Information System (INIS)

    Bedingham, D J; Maroney, O J E

    2016-01-01

    The amount of heat generated by computers is rapidly becoming one of the main problems for developing new generations of information technology. The thermodynamics of computation sets the ultimate physical bounds on heat generation. A lower bound is set by the Landauer limit, at which computation becomes thermodynamically reversible. For classical computation there is no physical principle which prevents this limit being reached, and approaches to it are already being experimentally tested. In this paper we show that for quantum computation with a set of signal states satisfying given conditions, there is an unavoidable excess heat generation that renders it inherently thermodynamically irreversible. The Landauer limit cannot, in general, be reached by quantum computers. We show the existence of a lower bound to the heat generated by quantum computing that exceeds that given by the Landauer limit, give the special conditions where this excess cost may be avoided, and provide a protocol for achieving the limiting heat cost when these conditions are met. We also show how classical computing falls within the special conditions. (paper)
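
    For reference, the Landauer limit invoked above is the standard bound on the heat Q dissipated when one bit of information is erased at temperature T (with k_B Boltzmann's constant):

        Q \;\ge\; k_B T \ln 2 .

    The result summarized in this record is that, for quantum computation with non-orthogonal signal states, the attainable heat cost strictly exceeds this bound except under the special conditions the authors identify.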

  15. How do high cost-sharing policies for physician care affect total care costs among people with chronic disease?

    Science.gov (United States)

    Xin, Haichang; Harman, Jeffrey S; Yang, Zhou

    2014-01-01

    This study examines whether high cost-sharing in physician care is associated with a differential impact on total care costs by health status. Total care includes physician care, emergency room (ER) visits and inpatient care. Since high cost-sharing policies can reduce needed care as well as unneeded care use, it raises the concern whether these policies are a good strategy for controlling costs among chronically ill patients. This study used the 2007 Medical Expenditure Panel Survey data with a cross-sectional study design. Difference in difference (DID), instrumental variable technique, two-part model, and bootstrap technique were employed to analyze cost data. Chronically ill individuals' probability of reducing any overall care costs was significantly less than that of healthier individuals (beta = 2.18, p = 0.04), while the integrated DID estimator from split results indicated that going from low cost-sharing to high cost-sharing significantly reduced costs by $12,853.23 more for sick people than for healthy people (95% CI: -$17,582.86, -$8,123.60). This greater cost reduction in total care among sick people likely resulted from greater cost reduction in physician care, and may have come at the expense of jeopardizing health outcomes by depriving patients of needed care. Thus, these policies would be inappropriate in the short run, and unlikely in the long run, to control health plan costs among chronically ill individuals. A generous benefit design with low cost-sharing policies in physician care or primary care is recommended for both health plans and chronically ill individuals, to save costs and protect these enrollees' health status.

  16. The Cost of Smoking in California.

    Science.gov (United States)

    Max, Wendy; Sung, Hai-Yen; Shi, Yanling; Stark, Brad

    2016-05-01

    The economic impact of smoking, including healthcare costs and the value of lost productivity due to illness and mortality, was estimated for California for 2009. Smoking-attributable healthcare costs were estimated using a series of econometric models that estimate expenditures for hospital care, ambulatory care, prescriptions, home health care, and nursing home care. Lost productivity due to illness was estimated using an econometric model predicting how smoking status affects the number of days lost from work or other activities. The value of lives lost from premature mortality due to smoking was estimated using an epidemiological approach. Almost 4 million Californians still smoke, including 146 000 adolescents. The cost of smoking in 2009 totaled $18.1 billion, including $9.8 billion in healthcare costs, $1.4 billion in lost productivity from illness, and $6.8 billion in lost productivity from premature mortality. This amounts to $487 per California resident and $4603 per smoker. Costs were greater for men than for women. Hospital costs comprised 44% of healthcare costs. Despite extensive efforts at tobacco control in California, healthcare and lost productivity costs attributable to smoking remain high. Compared to costs for 1999, the total cost was 15% greater in 2009. However, after adjusting for inflation, real costs have fallen by 13% over the past decade, indicating that efforts have been successful in reducing the economic burden of smoking in the state. © The Author 2015. Published by Oxford University Press on behalf of the Society for Research on Nicotine and Tobacco. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  17. 28 CFR 100.16 - Cost estimate submission.

    Science.gov (United States)

    2010-07-01

    28 Judicial Administration, 100.16 Cost estimate submission: ..., quantity, and cost. (ii) Direct labor. Provide a time-phased (e.g., monthly, quarterly) breakdown of labor... estimates. (iii) Allocable direct costs. Indicate how allocable costs are computed and applied, including...

  18. Acute costs and predictors of higher treatment costs of trauma in New South Wales, Australia.

    Science.gov (United States)

    Curtis, Kate; Lam, Mary; Mitchell, Rebecca; Black, Deborah; Taylor, Colman; Dickson, Cara; Jan, Stephen; Palmer, Cameron S; Langcake, Mary; Myburgh, John

    2014-01-01

    Accurate economic data are fundamental for improving current funding models and ultimately in promoting the efficient delivery of services. The financial burden of a high trauma casemix to designated trauma centres in Australia has not been previously determined, and there is some evidence that the episode funding model used in Australia results in the underfunding of trauma. To describe the costs of acute trauma admissions in trauma centres, identify predictors of higher treatment costs and cost variance in New South Wales (NSW), Australia. Data linkage of admitted trauma patient and financial data provided by 12 Level 1 NSW trauma centres for the 08/09 financial year was performed. Demographic, injury details and injury scores were obtained from trauma registries. Individual patient general ledger costs (actual trauma patient costs), Australian Refined Diagnostic Related Groups (AR-DRG) and state-wide average costs (which form the basis of funding) were obtained. The actual costs incurred by the hospital were then compared with the state-wide AR-DRG average costs. Multivariable multiple linear regression was used for identifying predictors of costs. There were 17,522 patients; the average per-patient cost was $10,603 and the median was $4628 (interquartile range: $2179-10,148). The actual costs incurred by trauma centres were on average $134 per bed day above AR-DRG-determined costs. Falls, road trauma and violence were the highest causes of total cost. Motor cyclists and pedestrians had higher median costs than motor vehicle occupants. As a result of greater numbers, patients with minor injury had comparable total costs with those generated by patients with severe injury. However, the median cost of severely injured patients was nearly four times greater. The count of body regions injured, sex, length of stay, serious traumatic brain injury and admission to the Intensive Care Unit were significantly associated with increased costs (p<0.001).

  19. Cost-effectiveness and the socialization of health care.

    Science.gov (United States)

    Musgrove, P

    1995-01-01

    The more health care is socialized, the more cost-effectiveness is an appropriate criterion for expenditure. Utility-maximizing individuals, facing divisibility of health care purchases and declining marginal health gains, and complete information about probable health improvements, should buy health care according to its cost-effectiveness. Absent these features, individual health spending will not be cost-effective; and in any case, differences in personal utilities and risk aversion will not lead to the same ranking of health care interventions for everyone. Private insurance frees consumers from concern for cost, which undermines cost-effectiveness, but lets them emphasize effectiveness, which favors value for money. This is most important for costly and cost-effective interventions, especially for poor people. Cost-effectiveness is more appropriate and easier to achieve under second-party insurance. More complete socialization of health care, via public finance, can yield greater efficiency by making insurance compulsory. Cost-effectiveness is also more attractive when taxpayers subsidize others' care: needs (effectiveness) take precedence over wants (utility). The gain in effectiveness may be greater, and the welfare loss from Pareto non-optimality smaller, in poor countries than in rich ones.

  20. An integrated on-line system for the evaluation of ECG patterns with a small process computer

    International Nuclear Information System (INIS)

    Schoffa, G.; Eggenberger, O.; Krueger, G.; Karlsruhe Univ.

    1975-01-01

    This paper describes an on-line system for ECG processing with a small computer (8K memory) and a magnetic tape cassette for mass storage, capable of evaluating 30 ECG patterns in a twelve-lead system per day. The use of a small computer was made possible by a compact and easy-to-handle operating system and space-saving programs. The system described was specifically intended for use in smaller hospitals with a low number of ECGs per day, which does not allow economic operation of larger DP installations. The economy calculations, based on the break-even-point method with special regard to installation, maintenance and personnel costs, show that economic operation of a small computer is already achieved at a rate of 5 ECGs per day. (orig.) [de
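
    The break-even-point method referred to above reduces to a simple calculation. A minimal sketch, with all cost figures invented for illustration (the paper's actual installation, maintenance and personnel figures are not reproduced):

    ```python
    # Hypothetical break-even calculation in the spirit of the
    # "break-even-point method" mentioned in the abstract.

    def break_even_volume(fixed_cost_per_day, saving_per_ecg, variable_cost_per_ecg):
        """Daily ECG volume at which the computer system pays for itself."""
        margin = saving_per_ecg - variable_cost_per_ecg
        if margin <= 0:
            raise ValueError("automation never breaks even")
        return fixed_cost_per_day / margin

    # Hypothetical: amortized installation + maintenance + personnel of 50/day,
    # 12 saved per manually evaluated ECG, 2 of consumables per ECG.
    print(break_even_volume(50.0, 12.0, 2.0))  # -> 5.0 ECGs per day
    ```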

  1. Suspected acute pulmonary emboli: cost-effectiveness of chest helical computed tomography versus a standard diagnostic algorithm incorporating ventilation-perfusion scintigraphy

    International Nuclear Information System (INIS)

    Larcos, G.; Chi, K.K.G.; Berry, G.; Westmead Hospital, Sydney, NSW; Shiell, A.

    2000-01-01

    There is controversy regarding the investigation of patients with suspected acute pulmonary embolism (PE). To compare the cost-effectiveness of alternative methods of diagnosing acute PE, chest helical computed tomography (CT) alone and in combination with venous ultrasound (US) of the legs and pulmonary angiography (PA) was compared with a conventional algorithm using ventilation-perfusion (V/Q) scintigraphy supplemented in selected cases by US and PA. A decision-analytical model was constructed to model the costs and effects of the three diagnostic strategies in a hypothetical cohort of 1000 patients each. Transition probabilities were based on published data. Life years gained by each strategy were estimated from published mortality rates. Schedule fees were used to estimate costs. The V/Q protocol is both more expensive and more effective than CT alone, resulting in 20.1 additional lives saved at a (discounted) cost of $940 per life year gained. An additional 2.5 lives can be saved if CT replaces V/Q scintigraphy in the diagnostic algorithm, but at a cost of $23,905 per life year saved. The more effective diagnostic strategies are thus also more expensive. In patients with suspected PE, the incremental cost-effectiveness of the V/Q-based strategy over CT alone is reasonable in comparison with other health interventions. The cost-effectiveness of the supplemented CT strategy is more questionable. Copyright (2000) The Australasian College of Physicians
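
    The per-life-year figures quoted above come from an incremental cost-effectiveness ratio (ICER). A minimal sketch of that arithmetic follows; the cohort totals are hypothetical, chosen only so the ratio lands near the abstract's $940 figure:

    ```python
    # Sketch of the incremental cost-effectiveness calculation underlying
    # comparisons like the one above. All inputs are illustrative.

    def icer(cost_a, effect_a, cost_b, effect_b):
        """Incremental cost per unit of effect of strategy B over A."""
        return (cost_b - cost_a) / (effect_b - effect_a)

    # Strategy A: CT alone; strategy B: V/Q-based algorithm (hypothetical
    # discounted totals per 1000-patient cohort, in dollars and life-years).
    cost_ct, ly_ct = 1_000_000.0, 9_000.0
    cost_vq, ly_vq = 1_200_000.0, 9_212.0  # ~212 extra discounted life-years

    print(f"ICER: ${icer(cost_ct, ly_ct, cost_vq, ly_vq):,.0f} per life-year")
    ```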

  2. The truncated conjugate gradient (TCG), a non-iterative/fixed-cost strategy for computing polarization in molecular dynamics: Fast evaluation of analytical forces

    Science.gov (United States)

    Aviat, Félix; Lagardère, Louis; Piquemal, Jean-Philip

    2017-10-01

    In a recent paper [F. Aviat et al., J. Chem. Theory Comput. 13, 180-190 (2017)], we proposed the Truncated Conjugate Gradient (TCG) approach to compute the polarization energy and forces in polarizable molecular simulations. The method consists in truncating the conjugate gradient algorithm at a fixed predetermined order leading to a fixed computational cost and can thus be considered "non-iterative." This gives the possibility to derive analytical forces avoiding the usual energy conservation (i.e., drifts) issues occurring with iterative approaches. A key point concerns the evaluation of the analytical gradients, which is more complex than that with a usual solver. In this paper, after reviewing the present state of the art of polarization solvers, we detail a viable strategy for the efficient implementation of the TCG calculation. The complete cost of the approach is then measured as it is tested using a multi-time step scheme and compared to timings using usual iterative approaches. We show that the TCG methods are more efficient than traditional techniques, making it a method of choice for future long molecular dynamics simulations using polarizable force fields where energy conservation matters. We detail the various steps required for the implementation of the complete method by software developers.
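
    The core idea, truncating conjugate gradient at a fixed order so every polarization solve costs the same, can be illustrated generically. The sketch below is a plain fixed-iteration CG on a small random system, not the authors' polarizable force-field implementation:

    ```python
    import numpy as np

    # Illustrative truncated conjugate gradient (TCG): run plain CG for a
    # fixed, predetermined number of iterations so the cost per call is
    # constant ("non-iterative" in the authors' sense).

    def truncated_cg(A, b, n_iter):
        """Approximately solve A x = b (A symmetric positive definite)
        with exactly n_iter conjugate gradient steps."""
        x = np.zeros_like(b)
        r = b - A @ x          # initial residual
        p = r.copy()
        for _ in range(n_iter):
            Ap = A @ p
            alpha = (r @ r) / (p @ Ap)
            x += alpha * p
            r_new = r - alpha * Ap
            beta = (r_new @ r_new) / (r @ r)
            p = r_new + beta * p
            r = r_new
        return x

    rng = np.random.default_rng(0)
    M = rng.standard_normal((50, 50))
    A = M @ M.T + 50 * np.eye(50)       # SPD and well conditioned
    b = rng.standard_normal(50)
    x2 = truncated_cg(A, b, n_iter=2)   # fixed-cost "order 2" solve
    print(np.linalg.norm(A @ x2 - b))   # residual after two steps
    ```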

  3. 25 CFR 700.81 - Monthly housing cost.

    Science.gov (United States)

    2010-04-01

    ... 25 Indians 2 2010-04-01 2010-04-01 false Monthly housing cost. 700.81 Section 700.81 Indians THE... Policies and Instructions Definitions § 700.81 Monthly housing cost. (a) General. The term monthly housing...) Computation of monthly housing cost for replacement dwelling. A person's monthly housing cost for a...

  4. Nuclear generating station and heavy water plant cost estimates for strategy studies

    International Nuclear Information System (INIS)

    Archinoff, G.H.

    1979-07-01

    Nuclear generating station capital, operating and maintenance costs are basic input data for strategy analyses of alternate nuclear fuel cycles. This report presents estimates of these costs for natural uranium CANDU stations, CANDU stations operating on advanced fuel cycles, and liquid metal fast breeder reactors. Cost estimates for heavy water plants are also presented. The results show that station capital costs for advanced fuel cycles are not expected to be significantly greater than those for natural uranium stations. LMFBR capital costs are expected to be 25-30 percent greater than for CANDU's. (auth)

  5. Greater-confinement disposal of low-level radioactive wastes

    International Nuclear Information System (INIS)

    Trevorrow, L.E.; Gilbert, T.L.; Luner, C.; Merry-Libby, P.A.; Meshkov, N.K.; Yu, C.

    1985-01-01

    Low-level radioactive wastes include a broad spectrum of wastes that have different radionuclide concentrations, half-lives, and physical and chemical properties. Standard shallow-land burial practice can provide adequate protection of public health and safety for most low-level wastes, but a small volume fraction (about 1%) containing most of the activity inventory (approx.90%) requires specific measures known as ''greater-confinement disposal'' (GCD). Different site characteristics and different waste characteristics - such as high radionuclide concentrations, long radionuclide half-lives, high radionuclide mobility, and physical or chemical characteristics that present exceptional hazards - lead to different GCD facility design requirements. Facility design alternatives considered for GCD include the augered shaft, deep trench, engineered structure, hydrofracture, improved waste form, and high-integrity container. Selection of an appropriate design must also consider the interplay between basic risk limits for protection of public health and safety, performance characteristics and objectives, costs, waste-acceptance criteria, waste characteristics, and site characteristics. This paper presents an overview of the factors that must be considered in planning the application of methods proposed for providing greater confinement of low-level wastes. 27 refs

  6. Personal Computer Based Controller For Switched Reluctance Motor Drives

    Science.gov (United States)

    Mang, X.; Krishnan, R.; Adkar, S.; Chandramouli, G.

    1987-10-01

    Th9, switched reluctance motor (SRM) has recently gained considerable attention in the variable speed drive market. Two important factors that have contributed to this are, the simplicity of construction and the possibility of developing low cost con-trollers with minimum number of switching devices in the drive circuits. This is mainly due to the state-of-art of the present digital circuits technology and the low cost of switching devices. The control of this motor drive is under research. Optimized performance of the SRM motor drive is very dependent on the integration of the controller, converter and the motor. This research on system integration involves considerable changes in the control algorithms and their implementation. A Personal computer (PC) based controller is very appropriate for this purpose. Accordingly, the present paper is concerned with the design of a PC based controller for a SRM. The PC allows for real-time microprocessor control with the possibility of on-line system parameter modifications. Software reconfiguration of this controller is easier than a hardware based controller. User friendliness is a natural consequence of such a system. Considering the low cost of PCs, this controller will offer an excellent cost-effective means of studying the control strategies for the SRM drive intop greater detail than in the past.

  7. Local matching indicators for concave transport costs

    OpenAIRE

    Delon , Julie; Salomon , Julien; Sobolevskii , A.

    2010-01-01

    In this note, we introduce a class of indicators that make it possible to compute efficiently the optimal transport plans associated with arbitrary distributions of $N$ demands and $N$ supplies in $\mathbf{R}$ in the case where the cost function is concave. The computational cost of these indicators is small and independent of $N$. A hierarchical use of them enables an efficient algorithm to be obtained.
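
    For context, the problem being solved can be stated as a small assignment instance. The sketch below uses a generic O(N^3) Hungarian solve as a baseline; it is emphatically not the paper's local-indicator algorithm, only an illustration of matching demands to supplies on the line under a concave cost:

    ```python
    import numpy as np
    from scipy.optimize import linear_sum_assignment

    # Baseline solve of the 1-D matching problem with a concave cost
    # c(x, y) = |x - y| ** 0.5. Data are random for illustration.

    rng = np.random.default_rng(1)
    demands = np.sort(rng.uniform(0, 1, 8))
    supplies = np.sort(rng.uniform(0, 1, 8))

    cost = np.abs(demands[:, None] - supplies[None, :]) ** 0.5  # concave
    rows, cols = linear_sum_assignment(cost)  # optimal transport plan
    print(list(zip(rows, cols)), cost[rows, cols].sum())
    ```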

  8. Greater happiness for a greater number: Is that possible in Austria?

    NARCIS (Netherlands)

    R. Veenhoven (Ruut)

    2011-01-01

    What is the final goal of public policy? Jeremy Bentham (1789) would say: greater happiness for a greater number. He thought of happiness as subjective enjoyment of life; in his words as “the sum of pleasures and pains”. In his time the happiness of the great number could not be measured

  9. Greater happiness for a greater number: Is that possible in Germany?

    NARCIS (Netherlands)

    R. Veenhoven (Ruut)

    2009-01-01

    What is the final goal of public policy? Jeremy Bentham (1789) would say: greater happiness for a greater number. He thought of happiness as subjective enjoyment of life; in his words as “the sum of pleasures and pains”. In his time the happiness of the great number could not be measured

  10. Effectiveness and cost-effectiveness of computer and other electronic aids for smoking cessation: a systematic review and network meta-analysis.

    Science.gov (United States)

    Chen, Y-F; Madan, J; Welton, N; Yahaya, I; Aveyard, P; Bauld, L; Wang, D; Fry-Smith, A; Munafò, M R

    2012-01-01

    Smoking is harmful to health. On average, lifelong smokers lose 10 years of life, and about half of all lifelong smokers have their lives shortened by smoking. Stopping smoking reverses or prevents many of these harms. However, cessation services in the NHS achieve variable success rates with smokers who want to quit. Approaches to behaviour change can be supplemented with electronic aids, and this may significantly increase quit rates and prevent a proportion of cases that relapse. The primary research question we sought to answer was: What is the effectiveness and cost-effectiveness of internet, PC and other electronic aids to help people stop smoking? We addressed the following three questions: (1) What is the effectiveness of internet sites, computer programs, mobile telephone text messages and other electronic aids for smoking cessation and/or reducing relapse? (2) What is the cost-effectiveness of incorporating internet sites, computer programs, mobile telephone text messages and other electronic aids into current NHS smoking cessation programmes? and (3) What are the current gaps in research into the effectiveness of internet sites, computer programs, mobile telephone text messages and other electronic aids to help people stop smoking? For the effectiveness review, relevant primary studies were sought from The Cochrane Library [Cochrane Central Register of Controlled Trials (CENTRAL)] 2009, Issue 4, and MEDLINE (Ovid), EMBASE (Ovid), PsycINFO (Ovid), Health Management Information Consortium (HMIC) (Ovid) and Cumulative Index to Nursing and Allied Health Literature (CINAHL) (EBSCOhost) from 1980 to December 2009. In addition, NHS Economic Evaluation Database (NHS EED) and Database of Abstracts of Reviews of Effects (DARE) were searched for information on cost-effectiveness and modelling for the same period. Reference lists of included studies and of relevant systematic reviews were examined to identify further potentially relevant studies. Research registries

  11. Development of a small-scale computer cluster

    Science.gov (United States)

    Wilhelm, Jay; Smith, Justin T.; Smith, James E.

    2008-04-01

    An increase in demand for computing power in academia has created the need for high-performance machines. Computing power of a single processor has been steadily increasing, but lags behind the demand for fast simulations. Since a single processor has hard limits to its performance, a cluster of computers can multiply the performance of a single computer with the proper software. Cluster computing has therefore become a much sought after technology. Typical desktop computers could be used for cluster computing, but are not intended for constant full speed operation and take up more space than rack mount servers. Specialty computers that are designed to be used in clusters meet high availability and space requirements, but can be costly. A market segment exists where custom built desktop computers can be arranged in a rack mount situation, gaining the space saving of traditional rack mount computers while remaining cost effective. To explore these possibilities, an experiment was performed to develop a computing cluster using desktop components for the purpose of decreasing computation time of advanced simulations. This study indicates that a small-scale cluster can be built from off-the-shelf components, multiplying the performance of a single desktop machine while minimizing occupied space and remaining cost effective.

  12. Possible Computer Vision Systems and Automated or Computer-Aided Edging and Trimming

    Science.gov (United States)

    Philip A. Araman

    1990-01-01

    This paper discusses research which is underway to help our industry reduce costs, increase product volume and value recovery, and market more accurately graded and described products. The research is part of a team effort to help the hardwood sawmill industry automate with computer vision systems, and computer-aided or computer controlled processing. This paper...

  13. Consumer Dispersion and Logistics Costs in Various Distribution Systems

    DEFF Research Database (Denmark)

    Turkensteen, Marcel; Klose, Andreas

    We address the relationship between the geographical dispersion of a set of demand points and the expected logistics costs. This is relevant in the strategic marketing decision of which groups of consumers to target. We devise quickly computable measures for the logistics costs. In our experiments, dispersed sets of demand points are created. For various types of distribution systems, expected logistics costs are computed using continuous approximation, location and routing methodologies. We find that the average distance between locations is an effective estimate of the logistics costs.
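
    One quickly computable measure of the kind the abstract alludes to is the mean pairwise distance between demand points. A sketch follows; the linear cost coefficient is hypothetical, and this is not the authors' exact estimator:

    ```python
    import numpy as np

    # Mean pairwise distance as a rough dispersion-based proxy for
    # expected routing/logistics cost.

    def mean_pairwise_distance(points):
        diff = points[:, None, :] - points[None, :, :]
        d = np.sqrt((diff ** 2).sum(-1))
        n = len(points)
        return d.sum() / (n * (n - 1))  # diagonal zeros are excluded

    rng = np.random.default_rng(2)
    compact = rng.normal(0, 1, (100, 2))
    dispersed = rng.normal(0, 5, (100, 2))

    for name, pts in [("compact", compact), ("dispersed", dispersed)]:
        est_cost = 1.7 * mean_pairwise_distance(pts)  # hypothetical rate/km
        print(name, round(est_cost, 2))
    ```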

  14. Optimizing Data Centre Energy and Environmental Costs

    Science.gov (United States)

    Aikema, David Hendrik

    Data centres use an estimated 2% of US electrical power which accounts for much of their total cost of ownership. This consumption continues to grow, further straining power grids attempting to integrate more renewable energy. This dissertation focuses on assessing and reducing data centre environmental and financial costs. Emissions of projects undertaken to lower the data centre environmental footprints can be assessed and the emission reduction projects compared using an ISO-14064-2-compliant greenhouse gas reduction protocol outlined herein. I was closely involved with the development of the protocol. Full lifecycle analysis and verifying that projects exceed business-as-usual expectations are addressed, and a test project is described. Consuming power when it is low cost or when renewable energy is available can be used to reduce the financial and environmental costs of computing. Adaptation based on the power price showed 10-50% potential savings in typical cases, and local renewable energy use could be increased by 10-80%. Allowing a fraction of high-priority tasks to proceed unimpeded still allows significant savings. Power grid operators use mechanisms called ancillary services to address variation and system failures, paying organizations to alter power consumption on request. By bidding to offer these services, data centres may be able to lower their energy costs while reducing their environmental impact. If providing contingency reserves which require only infrequent action, savings of up to 12% were seen in simulations. Greater power cost savings are possible for those ceding more control to the power grid operator. Coordinating multiple data centres adds overhead, and altering at which data centre requests are processed based on changes in the financial or environmental costs of power is likely to increase this overhead. Tests of virtual machine migrations showed that in some cases there was no visible increase in power use while in others power use
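
    The price-adaptive scheduling idea above can be reduced to a toy example: high-priority work runs immediately, while flexible work is deferred to the cheapest hours. Prices and job counts below are invented:

    ```python
    # Toy price-adaptive scheduler: defer flexible (low-priority) jobs to
    # the cheapest hour slots; compare against running them immediately.

    prices = [42, 38, 30, 25, 24, 27, 35, 50]  # $/MWh for 8 hourly slots
    flexible_jobs = 3                           # each needs one 1-hour slot

    cheapest = sorted(range(len(prices)), key=prices.__getitem__)[:flexible_jobs]
    naive_cost = sum(prices[:flexible_jobs])          # run-now baseline
    adaptive_cost = sum(prices[i] for i in cheapest)  # deferred schedule

    print(sorted(cheapest), naive_cost, adaptive_cost,
          f"saving {1 - adaptive_cost / naive_cost:.0%}")
    ```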

  15. 20 CFR 404.270 - Cost-of-living increases.

    Science.gov (United States)

    2010-04-01

    ... INSURANCE (1950- ) Computing Primary Insurance Amounts Cost-Of-Living Increases § 404.270 Cost-of-living... rises in the cost of living. These automatic increases also apply to other benefit amounts, as described...

  16. Greater accordance with the Dietary Approaches to Stop Hypertension dietary pattern is associated with lower diet-related greenhouse gas production but higher dietary costs in the United Kingdom

    Science.gov (United States)

    Monsivais, Pablo; Scarborough, Peter; Lloyd, Tina; Mizdrak, Anja; Luben, Robert; Mulligan, Angela A; Wareham, Nicholas J; Woodcock, James

    2015-01-01

    Background: The Dietary Approaches to Stop Hypertension (DASH) diet is a proven way to prevent and control hypertension and other chronic disease. Because the DASH diet emphasizes plant-based foods, including vegetables and grains, adhering to this diet might also bring about environmental benefits, including lower associated production of greenhouse gases (GHGs). Objective: The objective was to examine the interrelation between dietary accordance with the DASH diet and associated GHGs. A secondary aim was to examine the retail cost of diets by level of DASH accordance. Design: In this cross-sectional study of adults aged 39–79 y from the European Prospective Investigation into Cancer and Nutrition–Norfolk, United Kingdom cohort (n = 24,293), dietary intakes estimated from food-frequency questionnaires were analyzed for their accordance with the 8 DASH food and nutrient-based targets. Associations between DASH accordance, GHGs, and dietary costs were evaluated in regression analyses. Dietary GHGs were estimated with United Kingdom-specific data on carbon dioxide equivalents associated with commodities and foods. Dietary costs were estimated by using national food prices from a United Kingdom–based supermarket comparison website. Results: Greater accordance with the DASH dietary targets was associated with lower GHGs. Diets in the highest quintile of accordance had a GHG impact of 5.60 compared with 6.71 kg carbon dioxide equivalents/d for least-accordant diets (P < 0.0001). Greater DASH accordance was also associated with higher dietary costs, with the mean cost of diets in the top quintile of DASH scores 18% higher than that of diets in the lowest quintile (P < 0.0001). Conclusions: Promoting wider uptake of the DASH diet in the United Kingdom may improve population health and reduce diet-related GHGs. However, to make the DASH diet more accessible, food affordability, particularly for lower income groups, will have to be addressed. PMID:25926505

  17. Development of computer software for pavement life cycle cost analysis.

    Science.gov (United States)

    1988-01-01

    The life cycle cost analysis program (LCCA) is designed to automate and standardize life cycle costing in Virginia. It allows the user to input information necessary for the analysis, and it then completes the calculations and produces a printed copy...

  18. Secure cloud computing

    CERN Document Server

    Jajodia, Sushil; Samarati, Pierangela; Singhal, Anoop; Swarup, Vipin; Wang, Cliff

    2014-01-01

    This book presents a range of cloud computing security challenges and promising solution paths. The first two chapters focus on practical considerations of cloud computing. In Chapter 1, Chandramouli, Iorga, and Chokani describe the evolution of cloud computing and the current state of practice, followed by the challenges of cryptographic key management in the cloud. In Chapter 2, Chen and Sion present a dollar cost model of cloud computing and explore the economic viability of cloud computing with and without security mechanisms involving cryptographic mechanisms. The next two chapters address

  19. The Challenge of Computers.

    Science.gov (United States)

    Leger, Guy

    Computers may change teachers' lifestyles, teaching styles, and perhaps even their personal values. A brief survey of the history of computers demonstrates the incredible pace at which computer technology is moving ahead. The cost and size of microchips will continue to decline dramatically over the next 20 years, while the capability and variety…

  20. Engineering computations at the national magnetic fusion energy computer center

    International Nuclear Information System (INIS)

    Murty, S.

    1983-01-01

    The National Magnetic Fusion Energy Computer Center (NMFECC) was established by the U.S. Department of Energy's Division of Magnetic Fusion Energy (MFE). The NMFECC headquarters is located at Lawrence Livermore National Laboratory. Its purpose is to apply large-scale computational technology and computing techniques to the problems of controlled thermonuclear research. In addition to providing cost effective computing services, the NMFECC also maintains a large collection of computer codes in mathematics, physics, and engineering that is shared by the entire MFE research community. This review provides a broad perspective of the NMFECC, and a list of available codes at the NMFECC for engineering computations is given

  1. MODERN ADVANCES IMPLEMENTATION FOR A PASTROL VENTURE MODELS OF NOVEL CLOUD COMPUTING

    OpenAIRE

    Sandeep Kumar* Ankur Goel

    2018-01-01

    In this paper, innovations are expected to affect progress in the environment. A majority of enterprises are attempting to cut back their computing costs through the options for virtualization. This need for lowering computing costs has led to the innovation of Cloud Computing. Cloud Computing offers better computing through improved utilization and reduced administration and infrastructure costs. Cloud Computing is distributed around the world in distinct formats. This is the schema to emerge h...

  2. Cloud Computing with iPlant Atmosphere.

    Science.gov (United States)

    McKay, Sheldon J; Skidmore, Edwin J; LaRose, Christopher J; Mercer, Andre W; Noutsos, Christos

    2013-10-15

    Cloud Computing refers to distributed computing platforms that use virtualization software to provide easy access to physical computing infrastructure and data storage, typically administered through a Web interface. Cloud-based computing provides access to powerful servers, with specific software and virtual hardware configurations, while eliminating the initial capital cost of expensive computers and reducing the ongoing operating costs of system administration, maintenance contracts, power consumption, and cooling. This eliminates a significant barrier to entry into bioinformatics and high-performance computing for many researchers. This is especially true of free or modestly priced cloud computing services. The iPlant Collaborative offers a free cloud computing service, Atmosphere, which allows users to easily create and use instances on virtual servers preconfigured for their analytical needs. Atmosphere is a self-service, on-demand platform for scientific computing. This unit demonstrates how to set up, access and use cloud computing in Atmosphere. Copyright © 2013 John Wiley & Sons, Inc.

  3. Scaling cost-sharing to wages: how employers can reduce health spending and provide greater economic security.

    Science.gov (United States)

    Robertson, Christopher T

    2014-01-01

    In the employer-sponsored insurance market that covers most Americans, many workers are "underinsured." The evidence shows onerous out-of-pocket payments causing them to forgo needed care, miss work, and fall into bankruptcies and foreclosures. Nonetheless, many higher-paid workers are "overinsured": the evidence shows that in this domain, surplus insurance stimulates spending and price inflation without improving health. Employers can solve these problems together by scaling cost-sharing to wages. This reform would make insurance better protect against risk and guarantee access to care, while maintaining or even reducing insurance premiums. Yet, there are legal obstacles to scaled cost-sharing. The group-based nature of employer health insurance, reinforced by federal law, makes it difficult for scaling to be achieved through individual choices. The Affordable Care Act's (ACA) "essential coverage" mandate also caps cost-sharing even for wealthy workers that need no such cap. Additionally, there is a tax distortion in favor of highly paid workers purchasing healthcare through insurance rather than out-of-pocket. These problems are all surmountable. In particular, the ACA has expanded the applicability of an unenforced employee-benefits rule that prohibits "discrimination" in favor of highly compensated workers. A novel analysis shows that this statute gives the Internal Revenue Service the authority to require scaling and to thereby eliminate the current inequities and inefficiencies caused by the tax distortion. The promise is smarter insurance for over 150 million Americans.
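
    The mechanism itself is a one-line rule: cap out-of-pocket exposure as a fraction of wages, subject to any statutory flat cap. A minimal sketch, where the 10% fraction and the flat cap value are illustrative parameters, not figures from the article:

    ```python
    # Wage-scaled cost-sharing: annual out-of-pocket maximum set as a
    # fraction of wages, never exceeding a statutory flat cap.

    def out_of_pocket_cap(annual_wage, fraction=0.10, statutory_cap=9_450.0):
        """Return the wage-scaled out-of-pocket maximum (hypothetical rule)."""
        return min(fraction * annual_wage, statutory_cap)

    for wage in (25_000, 60_000, 150_000):
        print(wage, out_of_pocket_cap(wage))
    # 25000 -> 2500.0 ; 60000 -> 6000.0 ; 150000 -> 9450.0
    ```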

  4. CECP, Decommissioning Costs for PWR and BWR

    International Nuclear Information System (INIS)

    Bierschbach, M.C.

    1997-01-01

    1 - Description of program or function: The Cost Estimating Computer Program CECP, designed for use on an IBM personal computer or equivalent, was developed for estimating the cost of decommissioning boiling water reactor (BWR) and pressurized water reactor (PWR) power stations to the point of license termination. 2 - Method of solution: Cost estimates include component, piping, and equipment removal costs; packaging costs; decontamination costs; transportation costs; burial volume and costs; and manpower staffing costs. Using equipment and consumables costs and inventory data supplied by the user, CECP calculates unit cost factors and then combines these factors with transportation and burial cost algorithms to produce a complete report of decommissioning costs. In addition to costs, CECP also calculates person-hours, crew-hours, and exposure person-hours associated with decommissioning. 3 - Restrictions on the complexity of the problem: The program is designed for a specific waste charge structure. The waste cost data structure cannot handle intermediate waste handlers or changes in the charge rate structures. The decommissioning of a reactor can be divided into 5 periods. 200 different items for special equipment costs are possible. The maximum amount for each special equipment item is $99,999,999. Data can be supplied for up to 10 buildings, with 100 components each; ESTS1071/01: There are 65 components for 28 systems available to specify the contaminated systems costs (BWR). ESTS1071/02: There are 75 components for 25 systems available to specify the contaminated systems costs (PWR)
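
    A rough sketch of how a per-component estimate of the kind CECP assembles might look: removal labor, packaging, transport and burial combined. All rates below are hypothetical placeholders, not CECP's actual unit cost factors:

    ```python
    # Hypothetical per-component decommissioning cost roll-up.

    def component_cost(crew_hours, labor_rate, packaging, mass_t,
                       transport_rate, burial_m3, burial_rate):
        removal = crew_hours * labor_rate      # labor for removal
        transport = mass_t * transport_rate    # shipping by mass
        burial = burial_m3 * burial_rate       # disposal by volume
        return removal + packaging + transport + burial

    # e.g. a contaminated pump: 40 crew-hours at $85/h, one $1,200 liner,
    # 2.5 t shipped at $300/t, 1.8 m3 buried at $4,000/m3.
    print(component_cost(40, 85.0, 1_200.0, 2.5, 300.0, 1.8, 4_000.0))
    ```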

  5. Cost-effectiveness of alternative management strategies for patients with solitary pulmonary nodules.

    Science.gov (United States)

    Gould, Michael K; Sanders, Gillian D; Barnett, Paul G; Rydzak, Chara E; Maclean, Courtney C; McClellan, Mark B; Owens, Douglas K

    2003-05-06

    Positron emission tomography (PET) with 18-fluorodeoxyglucose (FDG) is a potentially useful but expensive test to diagnose solitary pulmonary nodules. To evaluate the cost-effectiveness of strategies for pulmonary nodule diagnosis and to specifically compare strategies that did and did not include FDG-PET. Decision model. Accuracy and complications of diagnostic tests were estimated by using meta-analysis and literature review. Modeled survival was based on data from a large tumor registry. Cost estimates were derived from Medicare reimbursement and other sources. All adult patients with a new, noncalcified pulmonary nodule seen on chest radiograph. Patient lifetime. Societal. 40 clinically plausible combinations of 5 diagnostic interventions, including computed tomography, FDG-PET, transthoracic needle biopsy, surgery, and watchful waiting. Costs, quality-adjusted life-years (QALYs), and incremental cost-effectiveness ratios. The cost-effectiveness of strategies depended critically on the pretest probability of malignancy. For patients with low pretest probability (26%), strategies that used FDG-PET selectively when computed tomography results were possibly malignant cost as little as 20 000 dollars per QALY gained. For patients with high pretest probability (79%), strategies that used FDG-PET selectively when computed tomography results were benign cost as little as 16 000 dollars per QALY gained. For patients with intermediate pretest probability (55%), FDG-PET strategies cost more than 220 000 dollars per QALY gained because they were more costly but only marginally more effective than computed tomography-based strategies. The choice of strategy also depended on the risk for surgical complications, the probability of nondiagnostic needle biopsy, the sensitivity of computed tomography, and patient preferences for time spent in watchful waiting. In probabilistic sensitivity analysis, FDG-PET strategies were cost saving or cost less than 100 000 dollars per QALY
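
    The pretest probabilities that drive this model feed into ordinary Bayes arithmetic. A sketch of that step follows, with illustrative sensitivity and specificity values (not the study's estimates):

    ```python
    # Bayes' rule: pretest probability of malignancy -> post-test
    # probability after a positive or negative imaging result.

    def post_test(pretest, sensitivity, specificity, positive):
        p_pos = sensitivity * pretest + (1 - specificity) * (1 - pretest)
        if positive:
            return sensitivity * pretest / p_pos
        return (1 - sensitivity) * pretest / (1 - p_pos)

    for pretest in (0.26, 0.55, 0.79):
        pos = post_test(pretest, 0.95, 0.80, positive=True)
        neg = post_test(pretest, 0.95, 0.80, positive=False)
        print(f"pretest {pretest:.0%}: positive -> {pos:.0%}, negative -> {neg:.0%}")
    ```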

  6. Thoracoabdominal computed tomography in trauma patients: a cost-consequences analysis

    NARCIS (Netherlands)

    Vugt, R. van; Kool, D.R.; Brink, M.; Dekker, H.M.; Deunk, J.; Edwards, M.J.R.

    2014-01-01

    BACKGROUND: CT is increasingly used during the initial evaluation of blunt trauma patients. In this era of increasing cost-awareness, the pros and cons of CT have to be assessed. OBJECTIVES: This study was performed to evaluate cost-consequences of different diagnostic algorithms that use

  7. Does the cost of robotic cholecystectomy translate to a financial burden?

    Science.gov (United States)

    Rosemurgy, Alexander; Ryan, Carrie; Klein, Richard; Sukharamwala, Prashant; Wood, Thomas; Ross, Sharona

    2015-08-01

    Robotic application to cholecystectomy has dramatically increased, though its impact on cost of care and reimbursement has not been elucidated. We undertook this study to evaluate and compare cost of care and reimbursement with robotic versus laparoscopic cholecystectomy. The charges and reimbursement of all robotic and laparoscopic cholecystectomies at one hospital undertaken from June 2012 to June 2013 were determined. Operative duration is defined as time into and time out of the operating room. Data are presented as median data. Comparisons were undertaken using the Mann-Whitney U-test with significance accepted at p ≤ 0.05. Robotic cholecystectomy took longer (47 min longer) and had greater charges ($8,182.57 greater) than laparoscopic cholecystectomy (p ≤ 0.05). Revenue, earnings before depreciation, interest, and taxes (EBDIT), and Net Income were not impacted by approach. Relative to laparoscopic cholecystectomy, robotic cholecystectomy takes longer and has greater charges. Revenue, EBDIT, and Net Income are similar after either approach; this indicates that costs with either approach are similar. Notably, this is possible because much of hospital-based costs are determined by cost allocation and not cost accounting. Thus, the cost of longer operations and costs inherent to the robotic approach for cholecystectomy do not translate to a perceived financial burden.
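
    The statistical comparison named above is a standard nonparametric test. A sketch using fabricated charge samples (the study's actual data are not public in the abstract):

    ```python
    from scipy.stats import mannwhitneyu

    # Mann-Whitney U-test on per-case charges, as described in the
    # abstract. Charge samples below are invented for illustration.

    robotic = [21_500, 24_300, 19_800, 26_100, 23_400, 22_750]
    laparoscopic = [14_900, 16_200, 13_750, 15_800, 17_100, 14_300]

    stat, p = mannwhitneyu(robotic, laparoscopic, alternative="two-sided")
    print(f"U = {stat}, p = {p:.4f}, significant: {p <= 0.05}")
    ```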

  8. Computed tomography for preoperative planning in minimal-invasive total hip arthroplasty: Radiation exposure and cost analysis

    Energy Technology Data Exchange (ETDEWEB)

    Huppertz, Alexander, E-mail: Alexander.Huppertz@charite.de [Imaging Science Institute Charite Berlin, Robert-Koch-Platz 7, D-10115 Berlin (Germany); Department of Radiology, Medical Physics, Charite-University Hospitals of Berlin, Chariteplatz 1, D-10117 Berlin (Germany); Radmer, Sebastian, E-mail: s.radmer@immanuel.de [Department of Orthopedic Surgery and Rheumatology, Immanuel-Krankenhaus, Koenigstr. 63, D-14109, Berlin (Germany); Asbach, Patrick, E-mail: Patrick.Asbach@charite.de [Department of Radiology, Medical Physics, Charite-University Hospitals of Berlin, Chariteplatz 1, D-10117 Berlin (Germany); Juran, Ralf, E-mail: ralf.juran@charite.de [Department of Radiology, Medical Physics, Charite-University Hospitals of Berlin, Chariteplatz 1, D-10117 Berlin (Germany); Schwenke, Carsten, E-mail: carsten.schwenke@scossis.de [Biostatistician, Scossis Statistical Consulting, Zeltinger Str. 58G, D-13465 Berlin (Germany); Diederichs, Gerd, E-mail: gerd.diederichs@charite.de [Department of Radiology, Medical Physics, Charite-University Hospitals of Berlin, Chariteplatz 1, D-10117 Berlin (Germany); Hamm, Bernd, E-mail: Bernd.Hamm@charite.de [Department of Radiology, Medical Physics, Charite-University Hospitals of Berlin, Chariteplatz 1, D-10117 Berlin (Germany); Sparmann, Martin, E-mail: m.sparmann@immanuel.de [Department of Orthopedic Surgery and Rheumatology, Immanuel-Krankenhaus, Koenigstr. 63, D-14109, Berlin (Germany)

    2011-06-15

    Computed tomography (CT) was used for preoperative planning of minimal-invasive total hip arthroplasty (THA). 92 patients (50 males, 42 females, mean age 59.5 years) with a mean body-mass-index (BMI) of 26.5 kg/m{sup 2} underwent 64-slice CT to depict the pelvis, the knee and the ankle in three independent acquisitions using combined x-, y-, and z-axis tube current modulation. Arthroplasty planning was performed using 3D-Hip Plan (Symbios, Switzerland) and patient radiation dose exposure was determined. The effects of BMI, gender, and contralateral THA on the effective dose were evaluated by an analysis-of-variance. A process-cost-analysis from the hospital perspective was done. All CT examinations were of sufficient image quality for 3D-THA planning. A mean effective dose of 4.0 mSv (SD 0.9 mSv) modeled by the BMI (p < 0.0001) was calculated. The presence of a contralateral THA (9/92 patients; p = 0.15) and the difference between males and females were not significant (p = 0.08). Personnel involved were the radiologist (4 min), the surgeon (16 min), the radiographer (12 min), and administrative personnel (4 min). A CT operation time of 11 min and direct per-patient costs of 52.80 Euro were recorded. Preoperative CT for THA was associated with a slight and justifiable increase of radiation exposure in comparison to conventional radiographs and low per-patient costs.

  9. Cost-effectiveness of EOB-MRI for Hepatocellular Carcinoma in Japan.

    Science.gov (United States)

    Nishie, Akihiro; Goshima, Satoshi; Haradome, Hiroki; Hatano, Etsuro; Imai, Yasuharu; Kudo, Masatoshi; Matsuda, Masanori; Motosugi, Utaroh; Saitoh, Satoshi; Yoshimitsu, Kengo; Crawford, Bruce; Kruger, Eliza; Ball, Graeme; Honda, Hiroshi

    2017-04-01

    The objective of the study was to evaluate the cost-effectiveness of gadoxetic acid-enhanced magnetic resonance imaging (EOB-MRI) in the diagnosis and treatment of hepatocellular carcinoma (HCC) in Japan compared with extracellular contrast media-enhanced MRI (ECCM-MRI) and contrast media-enhanced computed tomography (CE-CT) scanning. A 6-stage Markov model was developed to estimate lifetime direct costs and clinical outcomes associated with EOB-MRI. Diagnostic sensitivity and specificity, along with clinical data on HCC survival, recurrence, treatment patterns, costs, and health state utility values, were derived from predominantly Japanese publications. Parameters unavailable from publications were estimated in a Delphi panel of Japanese clinical experts who also confirmed the structure and overall approach of the model. Sensitivity analyses, including one-way, probabilistic, and scenario analyses, were conducted to account for uncertainty in the results. Over a lifetime horizon, EOB-MRI was associated with lower direct costs (¥2,174,869) and generated a greater number of quality-adjusted life years (QALYs) (9.502) than either ECCM-MRI (¥2,365,421, 9.303 QALYs) or CE-CT (¥2,482,608, 9.215 QALYs). EOB-MRI was superior to the other diagnostic strategies considered, and this finding was robust over sensitivity and scenario analyses. A majority of the direct costs associated with HCC in Japan were found to be costs of treatment. The model results revealed the superior cost-effectiveness of the EOB-MRI diagnostic strategy compared with ECCM-MRI and CE-CT. EOB-MRI could be the first-choice imaging modality for medical care of HCC among patients with hepatitis or liver cirrhosis in Japan. Widespread implementation of EOB-MRI could reduce health care expenditures, particularly downstream treatment costs, associated with HCC. Copyright © 2017 Elsevier HS Journals, Inc. All rights reserved.
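
    A Markov cohort model of the kind described reduces to repeated matrix-vector products with discounted per-state costs and utilities. The sketch below uses four invented states and invented transition probabilities, costs, utilities and a 2% discount rate; the study's model has six stages and Japan-specific inputs:

    ```python
    import numpy as np

    # Stylized Markov cohort model: yearly cycles, discounted costs/QALYs.
    states = ["no_hcc", "early_hcc", "advanced_hcc", "dead"]
    P = np.array([  # row: from-state, column: to-state (rows sum to 1)
        [0.92, 0.05, 0.02, 0.01],
        [0.00, 0.70, 0.20, 0.10],
        [0.00, 0.00, 0.60, 0.40],
        [0.00, 0.00, 0.00, 1.00],
    ])
    cost = np.array([1_000.0, 12_000.0, 30_000.0, 0.0])  # per state-year
    utility = np.array([0.85, 0.70, 0.50, 0.0])          # QALY weights

    cohort = np.array([1.0, 0.0, 0.0, 0.0])  # everyone starts HCC-free
    total_cost = total_qaly = 0.0
    for year in range(40):                    # ~lifetime horizon
        disc = 1.02 ** -year                  # 2% annual discounting
        total_cost += disc * cohort @ cost
        total_qaly += disc * cohort @ utility
        cohort = cohort @ P                   # advance one cycle

    print(round(total_cost), round(total_qaly, 3))
    ```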

  10. Gedanken Experiments in Educational Cost Effectiveness

    Science.gov (United States)

    Brudner, Harvey J.

    1978-01-01

    Discusses the effectiveness of cost determining techniques in education. The areas discussed are: education and management; cost-effectiveness models; figures of merit determination; and the implications as they relate to the areas of audio-visual and computer educational technology. (Author/GA)

  11. Towards a Low-Cost Real-Time Photogrammetric Landslide Monitoring System Utilising Mobile and Cloud Computing Technology

    Science.gov (United States)

    Chidburee, P.; Mills, J. P.; Miller, P. E.; Fieber, K. D.

    2016-06-01

    Close-range photogrammetric techniques offer a potentially low-cost approach in terms of implementation and operation for initial assessment and monitoring of landslide processes over small areas. In particular, the Structure-from-Motion (SfM) pipeline is now extensively used to help overcome many constraints of traditional digital photogrammetry, offering increased user-friendliness to nonexperts, as well as lower costs. However, a landslide monitoring approach based on the SfM technique also presents some potential drawbacks due to the difficulty in managing and processing a large volume of data in real-time. This research addresses the aforementioned issues by attempting to combine a mobile device with cloud computing technology to develop a photogrammetric measurement solution as part of a monitoring system for landslide hazard analysis. The research presented here focusses on (i) the development of an Android mobile application; (ii) the implementation of SfM-based open-source software in the Amazon cloud computing web service, and (iii) performance assessment through a simulated environment using data collected at a recognized landslide test site in North Yorkshire, UK. Whilst the landslide monitoring mobile application is under development, this paper describes experiments carried out to ensure effective performance of the system in the future. Investigations presented here describe the initial assessment of a cloud-implemented approach, which is developed around the well-known VisualSFM algorithm. Results are compared to point clouds obtained from alternative SfM 3D reconstruction approaches considering a commercial software solution (Agisoft PhotoScan) and a web-based system (Autodesk 123D Catch). Investigations demonstrate that the cloud-based photogrammetric measurement system is capable of providing results of centimeter-level accuracy, evidencing its potential to provide an effective approach for quantifying and analyzing landslide hazard at a local-scale.

  12. TOWARDS A LOW-COST, REAL-TIME PHOTOGRAMMETRIC LANDSLIDE MONITORING SYSTEM UTILISING MOBILE AND CLOUD COMPUTING TECHNOLOGY

    Directory of Open Access Journals (Sweden)

    P. Chidburee

    2016-06-01

    Close-range photogrammetric techniques offer a potentially low-cost approach in terms of implementation and operation for initial assessment and monitoring of landslide processes over small areas. In particular, the Structure-from-Motion (SfM) pipeline is now extensively used to help overcome many constraints of traditional digital photogrammetry, offering increased user-friendliness to nonexperts, as well as lower costs. However, a landslide monitoring approach based on the SfM technique also presents some potential drawbacks due to the difficulty in managing and processing a large volume of data in real-time. This research addresses the aforementioned issues by attempting to combine a mobile device with cloud computing technology to develop a photogrammetric measurement solution as part of a monitoring system for landslide hazard analysis. The research presented here focusses on (i) the development of an Android mobile application; (ii) the implementation of SfM-based open-source software in the Amazon cloud computing web service, and (iii) performance assessment through a simulated environment using data collected at a recognized landslide test site in North Yorkshire, UK. Whilst the landslide monitoring mobile application is under development, this paper describes experiments carried out to ensure effective performance of the system in the future. Investigations presented here describe the initial assessment of a cloud-implemented approach, which is developed around the well-known VisualSFM algorithm. Results are compared to point clouds obtained from alternative SfM 3D reconstruction approaches considering a commercial software solution (Agisoft PhotoScan) and a web-based system (Autodesk 123D Catch). Investigations demonstrate that the cloud-based photogrammetric measurement system is capable of providing results of centimeter-level accuracy, evidencing its potential to provide an effective approach for quantifying and analyzing landslide hazard

  13. 5 CFR 838.241 - Cost-of-living adjustments.

    Science.gov (United States)

    2010-01-01

    ... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false Cost-of-living adjustments. 838.241... Affecting Employee Annuities Procedures for Computing the Amount Payable § 838.241 Cost-of-living... provide for cost-of-living adjustments on the former spouse's payment from employee annuity, the cost-of...

  14. Cloud computing for radiologists

    OpenAIRE

    Amit T Kharat; Amjad Safvi; S S Thind; Amarjit Singh

    2012-01-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software, hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as...

  15. A Lightweight Distributed Framework for Computational Offloading in Mobile Cloud Computing

    Science.gov (United States)

    Shiraz, Muhammad; Gani, Abdullah; Ahmad, Raja Wasim; Adeel Ali Shah, Syed; Karim, Ahmad; Rahman, Zulkanain Abdul

    2014-01-01

    The latest developments in mobile computing technology have enabled intensive applications on the modern Smartphones. However, such applications are still constrained by limitations in processing potentials, storage capacity and battery lifetime of the Smart Mobile Devices (SMDs). Therefore, Mobile Cloud Computing (MCC) leverages the application processing services of computational clouds for mitigating resources limitations in SMDs. Currently, a number of computational offloading frameworks are proposed for MCC wherein the intensive components of the application are outsourced to computational clouds. Nevertheless, such frameworks focus on runtime partitioning of the application for computational offloading, which is time consuming and resources intensive. The resource constraint nature of SMDs require lightweight procedures for leveraging computational clouds. Therefore, this paper presents a lightweight framework which focuses on minimizing additional resources utilization in computational offloading for MCC. The framework employs features of centralized monitoring, high availability and on demand access services of computational clouds for computational offloading. As a result, the turnaround time and execution cost of the application are reduced. The framework is evaluated by testing prototype application in the real MCC environment. The lightweight nature of the proposed framework is validated by employing computational offloading for the proposed framework and the latest existing frameworks. Analysis shows that by employing the proposed framework for computational offloading, the size of data transmission is reduced by 91%, energy consumption cost is minimized by 81% and turnaround time of the application is decreased by 83.5% as compared to the existing offloading frameworks. Hence, the proposed framework minimizes additional resources utilization and therefore offers lightweight solution for computational offloading in MCC. PMID:25127245

  16. A lightweight distributed framework for computational offloading in mobile cloud computing.

    Directory of Open Access Journals (Sweden)

    Muhammad Shiraz

    Full Text Available The latest developments in mobile computing technology have enabled intensive applications on the modern Smartphones. However, such applications are still constrained by limitations in processing potentials, storage capacity and battery lifetime of the Smart Mobile Devices (SMDs. Therefore, Mobile Cloud Computing (MCC leverages the application processing services of computational clouds for mitigating resources limitations in SMDs. Currently, a number of computational offloading frameworks are proposed for MCC wherein the intensive components of the application are outsourced to computational clouds. Nevertheless, such frameworks focus on runtime partitioning of the application for computational offloading, which is time consuming and resources intensive. The resource constraint nature of SMDs require lightweight procedures for leveraging computational clouds. Therefore, this paper presents a lightweight framework which focuses on minimizing additional resources utilization in computational offloading for MCC. The framework employs features of centralized monitoring, high availability and on demand access services of computational clouds for computational offloading. As a result, the turnaround time and execution cost of the application are reduced. The framework is evaluated by testing prototype application in the real MCC environment. The lightweight nature of the proposed framework is validated by employing computational offloading for the proposed framework and the latest existing frameworks. Analysis shows that by employing the proposed framework for computational offloading, the size of data transmission is reduced by 91%, energy consumption cost is minimized by 81% and turnaround time of the application is decreased by 83.5% as compared to the existing offloading frameworks. Hence, the proposed framework minimizes additional resources utilization and therefore offers lightweight solution for computational offloading in MCC.

  17. Clinical and cost effectiveness of computer treatment for aphasia post stroke (Big CACTUS): study protocol for a randomised controlled trial.

    Science.gov (United States)

    Palmer, Rebecca; Cooper, Cindy; Enderby, Pam; Brady, Marian; Julious, Steven; Bowen, Audrey; Latimer, Nicholas

    2015-01-27

    Aphasia affects the ability to speak, comprehend spoken language, read and write. One third of stroke survivors experience aphasia. Evidence suggests that aphasia can continue to improve after the first few months with intensive speech and language therapy, which is frequently beyond what resources allow. The development of computer software for language practice provides an opportunity for self-managed therapy. This pragmatic randomised controlled trial will investigate the clinical and cost effectiveness of a computerised approach to long-term aphasia therapy post stroke. A total of 285 adults with aphasia at least four months post stroke will be randomly allocated to either usual care, computerised intervention in addition to usual care or attention and activity control in addition to usual care. Those in the intervention group will receive six months of self-managed word finding practice on their home computer with monthly face-to-face support from a volunteer/assistant. Those in the attention control group will receive puzzle activities, supplemented by monthly telephone calls. Study delivery will be coordinated by 20 speech and language therapy departments across the United Kingdom. Outcome measures will be made at baseline, six, nine and 12 months after randomisation by blinded speech and language therapist assessors. Primary outcomes are the change in number of words (of personal relevance) named correctly at six months and improvement in functional conversation. Primary outcomes will be analysed using a Hochberg testing procedure. Significance will be declared if differences in both word retrieval and functional conversation at six months are significant at the 5% level, or if either comparison is significant at 2.5%. A cost utility analysis will be undertaken from the NHS and personal social service perspective. Differences between costs and quality-adjusted life years in the three groups will be described and the incremental cost effectiveness ratio
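
    The co-primary decision rule stated in the protocol (both outcomes significant at 5%, or either at 2.5%) is easy to express directly. A minimal sketch:

    ```python
    # Hochberg-style decision rule for the two co-primary outcomes, as
    # stated in the protocol abstract above.

    def co_primary_success(p_words, p_conversation):
        both = p_words <= 0.05 and p_conversation <= 0.05
        either = min(p_words, p_conversation) <= 0.025
        return both or either

    print(co_primary_success(0.04, 0.03))  # True  (both < 0.05)
    print(co_primary_success(0.02, 0.40))  # True  (one < 0.025)
    print(co_primary_success(0.04, 0.20))  # False
    ```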

  18. Cost estimating relationships for nuclear power plant operation and maintenance

    International Nuclear Information System (INIS)

    Bowers, H.I.; Fuller, L.C.; Myers, M.L.

    1987-11-01

    Revised cost estimating relationships for 1987 are presented for estimating annual nonfuel operation and maintenance (O and M) costs for light-water reactor (LWR) nuclear power plants, which update guidelines published previously in 1982. The purpose of these cost estimating relationships is for use in long range planning and evaluations of the economics of nuclear energy for electric power generation. A listing of a computer program, LWROM, implementing the cost estimating relationships and written in advanced BASIC for IBM personal computers, is included
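
    Cost estimating relationships of this kind are typically parametric functions of plant characteristics. The sketch below shows the general shape only, a fixed base plus a size-dependent term with invented coefficients; the actual LWROM relationships are documented in the report:

    ```python
    # Hypothetical cost estimating relationship (CER) for annual nonfuel
    # O&M cost as a function of plant size. Coefficients are invented.

    def annual_om_cost(net_mwe, base=25.0e6, slope=18_000.0, exponent=0.8):
        """Annual nonfuel O&M cost ($/yr) for a plant of net_mwe capacity."""
        return base + slope * net_mwe ** exponent

    for size in (600, 900, 1_200):
        print(size, f"${annual_om_cost(size) / 1e6:.1f}M/yr")
    ```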

  19. Patient level costing in Ireland: process, challenges and opportunities.

    Science.gov (United States)

    Murphy, A; McElroy, B

    2015-03-01

    In 2013, the Department of Health released their policy paper on hospital financing entitled Money Follows the Patient. A fundamental building block for the proposed financing model is patient level costing. This paper outlines the patient level costing process, identifies the opportunities and considers the challenges associated with the process in the Irish hospital setting. Methods involved a review of the existing literature which was complemented with an interview with health service staff. There are considerable challenges associated with implementing patient level costing including deficits in information and communication technologies and financial expertise as well as timeliness of coding. In addition, greater clinical input into the costing process is needed compared to traditional costing processes. However, there are long-term benefits associated with patient level costing; these include empowerment of clinical staff, improved transparency and price setting and greater fairness, especially in the treatment of outliers. These can help to achieve the Government's Health Strategy. The benefits of patient level costing need to be promoted and a commitment to investment in overcoming the challenges is required.

  20. Direct cost of monitoring conventional hemodialysis conducted by nursing professionals.

    Science.gov (United States)

    Lima, Antônio Fernandes Costa

    2017-04-01

    to analyze the mean direct cost of conventional hemodialysis monitored by nursing professionals in three public teaching and research hospitals in the state of São Paulo, Brazil. this was a quantitative, explorative and descriptive investigation, based on a multiple case study approach. The mean direct cost was calculated by multiplying (clocked) time spent per procedure by the unit cost of direct labor. Values were calculated in Brazilian real (BRL). Hospital C presented the highest mean direct cost (BRL 184.52), 5.23 times greater than the value for Hospital A (BRL 35.29) and 3.91 times greater than Hospital B (BRL 47.22). the costing method used in this study can be reproduced at other dialysis centers to inform strategies aimed at efficient allocation of necessary human resources to successfully monitor conventional hemodialysis.
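
    The costing rule used in the study is a one-liner: clocked time per procedure multiplied by the unit cost of direct labor, summed over the professional categories involved. A sketch with illustrative times and BRL rates (not the study's measurements):

    ```python
    # Direct cost = sum over professionals of (hours x hourly labor rate).

    def direct_cost(entries):
        """entries: list of (hours, hourly_rate_brl) per professional."""
        return sum(hours * rate for hours, rate in entries)

    session = [(4.0, 6.0), (0.5, 22.6)]        # e.g. technician + nurse time
    print(f"BRL {direct_cost(session):.2f}")   # -> BRL 35.30
    ```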

  1. The lifetime costs of overweight and obesity in childhood and adolescence: a systematic review.

    Science.gov (United States)

    Hamilton, D; Dee, A; Perry, I J

    2018-04-01

    Research into lifetime costs of obesity in childhood is growing. This review synthesizes that knowledge. A computerized search of the international literature since 2000 was conducted. Mean total lifetime healthcare and productivity costs were estimated and inflated to 2014 Irish euros. The search yielded 13 published articles. The methodology used in these studies varied widely, and only one study estimated both healthcare and productivity costs. Cognizant of this heterogeneity, the mean total lifetime cost of a child or adolescent with obesity was €149,206 (range, €129,410 to €178,933) for a boy and €148,196 (range, €136,576 to €173,842) for a girl. This was divided into an average of €16,229 (range, €6,580 to €35,810) in healthcare costs and €132,977 (range, €122,830 to €143,123) in productivity losses for boys and €19,636 (range, €8,016 to €45,283) and €128,560, respectively, for girls. Income penalty accounted for the greater part of productivity costs, amounting to €97,118 (range, €86,971 to €107,264) per male adolescent with obesity and €126,108 per female adolescent. Healthcare costs and income penalty appear greater in girls while costs because of workdays lost seem greater in boys. There is proportionality between body mass index and costs. Productivity costs are greater than healthcare costs. © 2017 World Obesity Federation.

  2. Assessment of the structural shielding integrity of some selected computed tomography facilities in the Greater Accra Region of Ghana

    International Nuclear Information System (INIS)

    Nkansah, A.

    2010-01-01

    The structural shielding integrity was assessed for four CT facilities, at Trust Hospital, Korle-Bu Teaching Hospital, the 37 Military Hospital and Medical Imaging Ghana Ltd., in the Greater Accra Region of Ghana. From the shielding calculations, the concrete wall thicknesses computed are 120, 145, 140 and 155 mm for Medical Imaging Ghana Ltd., 37 Military Hospital, Trust Hospital and Korle-Bu Teaching Hospital, respectively, using default DLP values. The wall thicknesses using derived DLP values are 110, 110, 120 and 168 mm for Medical Imaging Ghana Ltd., 37 Military Hospital, Trust Hospital and Korle-Bu Teaching Hospital, respectively. These values are within the accepted standard concrete thickness of 102-152 mm prescribed by the National Council on Radiation Protection and Measurements. The ultrasonic pulse testing indicated that all the sandcrete walls are of good quality and free of voids, since the estimated pulse velocities were approximately equal to 3.45 km/s. The average dose rate measured for supervised areas is 3.4 μSv/wk and for controlled areas 18.0 μSv/wk. These dose rates were below the acceptable levels of 100 μSv per week for the occupationally exposed and 20 μSv per week for members of the public provided by the ICRU. The results mean that the structural shielding thicknesses are adequate to protect members of the public and occupationally exposed workers (au).
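
    As a rough illustration of the kind of barrier calculation involved, the required wall thickness can be estimated from tenth-value layers (TVLs): the barrier must attenuate the unshielded weekly dose down to the design limit. This sketch is a generic TVL approach, not the DLP-based method used in the study, and every number in it is an assumed placeholder.

        import math

        def barrier_thickness_mm(unshielded_uSv_wk, design_limit_uSv_wk, tvl_mm):
            # Number of tenth-value layers needed to reach the design limit,
            # converted to a concrete thickness in millimetres.
            n_tvl = math.log10(unshielded_uSv_wk / design_limit_uSv_wk)
            return max(n_tvl, 0.0) * tvl_mm

        # Assumed values: 2000 uSv/wk unshielded, 20 uSv/wk public design limit,
        # TVL of ~70 mm concrete for scattered CT radiation.
        print(f"{barrier_thickness_mm(2000, 20, 70):.0f} mm of concrete")  # 140 mm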

  3. The use of 3D CADD (Computer Aided Design and Drafting) models in operation and maintenance cost reduction

    International Nuclear Information System (INIS)

    Didsbury, R.; Bains, N.; Cho, U.Y.

    1998-01-01

    The use of three-dimensional (3D) computer-aided design and drafting (CADD) models, and the associated information technology and databases, in the engineering and construction phases of large projects is well established and is yielding significant improvements in project cost, schedule and quality. The information contained in these models can also be extremely valuable to operating plants, particularly when the visual and spatial information contained in the 3D models is interfaced to other plant information databases. Indeed, many plant owners and operators in the process and power industries are already using this technology to assist with such activities as plant configuration management, staff training, work planning and radiation protection. This paper will explore the application of 3D models and the associated databases in an operating plant environment and describe the resulting operational benefits and cost reduction benefits. Several industrial experience case studies will be presented along with suggestions for further future applications. (author). 4 refs., 1 tab., 8 figs

  4. Research on cloud computing solutions

    OpenAIRE

    Liudvikas Kaklauskas; Vaida Zdanytė

    2015-01-01

    Cloud computing can be defined as a new style of computing in which dynamically scalable and often virtualized resources are provided as services over the Internet. Advantages of the cloud computing technology include cost savings, high availability, and easy scalability. Voas and Zhang adapted six phases of computing paradigms, from dummy terminals/mainframes, to PCs, to network computing, to grid and cloud computing. There are four types of cloud computing: public cloud, private cloud, ...

  5. Costs and clinical outcomes in individuals without known coronary artery disease undergoing coronary computed tomographic angiography from an analysis of Medicare category III transaction codes.

    Science.gov (United States)

    Min, James K; Shaw, Leslee J; Berman, Daniel S; Gilmore, Amanda; Kang, Ning

    2008-09-15

    Multidetector coronary computed tomographic angiography (CCTA) demonstrates high accuracy for the detection and exclusion of coronary artery disease (CAD) and predicts adverse prognosis. To date, the opportunity costs relating the clinical and economic outcomes of CCTA to those of other methods of diagnosing CAD, such as myocardial perfusion single-photon emission computed tomography (SPECT), remain unknown. An observational, multicenter, patient-level analysis of patients without known CAD who underwent CCTA or SPECT was performed. Patients who underwent CCTA (n = 1,938) were matched to those who underwent SPECT (n = 7,752) on 8 demographic and clinical characteristics and 2 summary measures of cardiac medications and co-morbidities, and were evaluated for 9-month expenditures and clinical outcomes. Adjusted total health care and CAD expenditures were 27% lower for the CCTA group, supporting CCTA as a cost-efficient alternative to SPECT for the initial coronary evaluation of patients without known CAD.

  6. Higher-order techniques in computational electromagnetics

    CERN Document Server

    Graglia, Roberto D

    2016-01-01

    Higher-Order Techniques in Computational Electromagnetics explains 'high-order' techniques that can significantly improve the accuracy and reliability, and reduce the computational cost, of computational methods for high-frequency electromagnetics, such as antenna, microwave device and radar scattering applications.

  7. Tracking and computing

    International Nuclear Information System (INIS)

    Niederer, J.

    1983-01-01

    This note outlines several ways in which large-scale simulation computing and programming support may be provided to the SSC design community. One aspect of the problem is getting supercomputer power without the high cost and long lead times of large-scale institutional computing. Another aspect is the blending of modern programming practices with more conventional accelerator design programs in ways that do not also swamp designers with the details of complicated computer technology.

  8. Computed tomographic colonography to screen for colorectal cancer, extracolonic cancer, and aortic aneurysm: model simulation with cost-effectiveness analysis.

    Science.gov (United States)

    Hassan, Cesare; Pickhardt, Perry J; Pickhardt, Perry; Laghi, Andrea; Kim, Daniel H; Kim, Daniel; Zullo, Angelo; Iafrate, Franco; Di Giulio, Lorenzo; Morini, Sergio

    2008-04-14

    In addition to detecting colorectal neoplasia, abdominal computed tomography (CT) with colonography technique (CTC) can also detect unsuspected extracolonic cancers and abdominal aortic aneurysms (AAA). The efficacy and cost-effectiveness of this combined abdominal CT screening strategy are unknown. A computerized Markov model was constructed to simulate the occurrence of colorectal neoplasia, extracolonic malignant neoplasm, and AAA in a hypothetical cohort of 100,000 subjects from the United States who were 50 years of age. Simulated screening with CTC, using a 6-mm polyp size threshold for reporting, was compared with a competing model of optical colonoscopy (OC), both without and with abdominal ultrasonography for AAA detection (OC-US strategy). In the simulated population, CTC was the dominant screening strategy, gaining an additional 1458 and 462 life-years compared with the OC and OC-US strategies and being less costly, with savings of $266 and $449 per person, respectively. The additional gains for CTC were largely due to a decrease in AAA-related deaths, whereas the modeled benefit from extracolonic cancer downstaging was a relatively minor factor. In sensitivity analysis, OC-US became more cost-effective only when the CTC sensitivity for large polyps dropped to 61% or when broad variations of costs were simulated, such as an increase in CTC cost from $814 to $1300 or a decrease in OC cost from $1100 to $500. With the OC-US approach, suboptimal compliance had a strong negative influence on efficacy and cost-effectiveness. The estimated mortality from CT-induced cancer was less than the estimated colonoscopy-related mortality (8 vs 22 deaths), both of which were minor compared with the positive benefit from screening. When the detection of extracolonic findings such as AAA and extracolonic cancer is considered in addition to colorectal neoplasia in our model simulation, CT colonography is a dominant screening strategy (ie, more clinically effective and more cost-effective).

  9. Method and computer program product for maintenance and modernization backlogging

    Science.gov (United States)

    Mattimore, Bernard G; Reynolds, Paul E; Farrell, Jill M

    2013-02-19

    According to one embodiment, a computer program product for determining future facility conditions includes a computer readable medium having computer readable program code stored therein. The computer readable program code includes computer readable program code for calculating a time period specific maintenance cost, for calculating a time period specific modernization factor, and for calculating a time period specific backlog factor. Future facility conditions equal the time period specific maintenance cost plus the time period specific modernization factor plus the time period specific backlog factor. In another embodiment, a computer-implemented method for calculating future facility conditions includes calculating a time period specific maintenance cost, calculating a time period specific modernization factor, and calculating a time period specific backlog factor. Future facility conditions equal the time period specific maintenance cost plus the time period specific modernization factor plus the time period specific backlog factor. Other embodiments are also presented.
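
    The claimed computation reduces to a three-term sum per time period. A minimal sketch of that arithmetic, with illustrative names and values that are not taken from the patent:

        def future_facility_conditions(maintenance_cost, modernization_factor, backlog_factor):
            # Future facility conditions = time-period-specific maintenance cost
            # + modernization factor + backlog factor, as described above.
            return maintenance_cost + modernization_factor + backlog_factor

        # Hypothetical per-period inputs (USD):
        print(future_facility_conditions(1.2e6, 0.4e6, 0.9e6))  # 2500000.0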

  10. [Operating cost analysis of anaesthesia: activity based costing (ABC analysis)].

    Science.gov (United States)

    Majstorović, Branislava M; Kastratović, Dragana A; Vučović, Dragan S; Milaković, Branko D; Miličić, Biljana R

    2011-01-01

    Costs of anaesthesiology represent defined measures used to determine a precise profile of expenditure for surgical treatment, which is important for planning healthcare activities, prices and budgets. In order to determine the actual value of anaesthesiological services, we performed an activity based costing (ABC) analysis. Retrospectively, for 2005 and 2006, we estimated the direct costs of anaesthesiological services (salaries, drugs, supply materials and others: analyses and equipment) at the Institute of Anaesthesia and Resuscitation of the Clinical Centre of Serbia. The group included all anaesthetized patients of both sexes and all ages. We compared direct costs with the direct expenditure for "each cost object (service or unit)" of the Republican Healthcare Insurance, using summary data of the Departments of Anaesthesia documented in the database of the Clinical Centre of Serbia. Numerical data were estimated and analyzed with Microsoft Office Excel 2003 and SPSS for Windows. Using a linear model, we compared the direct costs with the unit costs of anaesthesiological services from the Costs List of the Republican Healthcare Insurance. Direct costs showed that 40% of costs were spent on salaries, 32% on drugs and supplies, and 28% on other costs, such as analyses and equipment. The direct costs of anaesthesiological services showed a linear correlation with the unit costs of the Republican Healthcare Insurance. During surgery, the costs of anaesthesia increase the cost of a patient's surgical treatment by about 10%. Regarding the actual costs of drugs and supplies, we do not see any possibility of cost reduction. The fixed elements of direct costs provide a possibility for rationalization of resources in anaesthesia.

  11. Development of a Computer Program for an Analysis of the Logistics and Transportation Costs of the PWR Spent Fuels in Korea

    International Nuclear Information System (INIS)

    Cha, Jeong Hun; Choi, Heui Joo; Lee, Jong Youl; Choi, Jong Won

    2009-01-01

    It is expected that a substantial amount of spent fuel will be transported from the four nuclear power plant (NPP) sites in Korea to a hypothetical centralized interim storage facility or a final repository in the near future. The cost of transportation is proportional to the amount of spent fuel. In this paper, a cost estimation program is developed based on the conceptual design of a transportation system and a logistics analysis. Using the developed computer program, named CASK, the minimum capacity of a centralized interim storage facility (CISF) and the transportation cost for PWR spent fuels are calculated. The PWR spent fuels are transported from the 4 NPP sites to a final repository (FR) via the CISF. Since the NPP sites and the CISF are located along the coast, sea transportation is considered between them, while road transportation is considered between the CISF and the FR. The result shows that the minimum capacity of the interim storage facility is 15,000 MTU.
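
    Since the cost is stated to be proportional to the amount of spent fuel moved, the logistics cost can be sketched as per-tonne rates over the two legs (sea from the NPP sites to the CISF, road from the CISF to the FR). This is an illustrative model, not the CASK code, and the rates are hypothetical.

        def transport_cost(mtu, sea_rate_per_mtu, road_rate_per_mtu):
            # Total cost: NPP -> CISF by sea, then CISF -> FR by road.
            return mtu * (sea_rate_per_mtu + road_rate_per_mtu)

        # Hypothetical: 15,000 MTU at $3,000/MTU by sea and $1,200/MTU by road.
        print(f"${transport_cost(15_000, 3_000, 1_200):,.0f}")  # $63,000,000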

  12. Data mining in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Ruxandra-Ştefania PETRE

    2012-10-01

    This paper describes how data mining is used in cloud computing. Data mining is used for extracting potentially useful information from raw data. The integration of data mining techniques into normal day-to-day activities has become commonplace. Every day people are confronted with targeted advertising, and data mining techniques help businesses become more efficient by reducing costs. Data mining techniques and applications are very much needed in the cloud computing paradigm. The implementation of data mining techniques through cloud computing allows users to retrieve meaningful information from a virtually integrated data warehouse that reduces the costs of infrastructure and storage.

  13. Greater physician involvement improves coding outcomes in endobronchial ultrasound-guided transbronchial needle aspiration procedures.

    Science.gov (United States)

    Pillai, Anilkumar; Medford, Andrew R L

    2013-01-01

    Correct coding is essential for accurate reimbursement for clinical activity. Published data confirm that significant aberrations in coding occur, leading to considerable financial inaccuracies, especially in interventional procedures such as endobronchial ultrasound-guided transbronchial needle aspiration (EBUS-TBNA). Previous data reported a 15% coding error for EBUS-TBNA in a U.K. service. We hypothesised that greater physician involvement with coders would reduce EBUS-TBNA coding errors and financial disparity. The study was conducted as a prospective cohort study in the tertiary EBUS-TBNA service in Bristol. 165 consecutive patients between October 2009 and March 2012 underwent EBUS-TBNA for evaluation of unexplained mediastinal adenopathy on computed tomography. The chief coder was prospectively electronically informed of all procedures, which were cross-checked on a prospective database and by Trust Informatics. Cost and coding analysis was performed using the 2010-2011 tariffs. All 165 procedures (100%) were coded correctly, as verified by Trust Informatics. This compares favourably with the 14.4% coding inaccuracy rate for EBUS-TBNA in a previous U.K. prospective cohort study [odds ratio 201.1 (1.1-357.5), p = 0.006]. Projected income loss was GBP 40,000 per year in the previous study, compared to a GBP 492,195 income here with no coding-attributable loss in revenue. Greater physician engagement with coders prevents coding errors and financial losses, which can be significant, especially in interventional specialties. The intervention can be as cheap, quick and simple as a prospective email to the coding team, with cross-checks by Trust Informatics and against a procedural database. We suggest that all specialties should engage more with their coders using such a simple intervention to prevent revenue losses. Copyright © 2013 S. Karger AG, Basel.

  14. The High Direct Medical Costs of Prader-Willi Syndrome.

    Science.gov (United States)

    Shoffstall, Andrew J; Gaebler, Julia A; Kreher, Nerissa C; Niecko, Timothy; Douglas, Diah; Strong, Theresa V; Miller, Jennifer L; Stafford, Diane E; Butler, Merlin G

    2016-08-01

    To assess medical resource utilization associated with Prader-Willi syndrome (PWS) in the US, hypothesized to be greater relative to a matched control group without PWS. We used a retrospective case-matched control design and longitudinal US administrative claims data (MarketScan) during a 5-year enrollment period (2009-2014). Patients with PWS were identified by International Classification of Diseases, Ninth Revision, Clinical Modification diagnosis code 759.81. Controls were matched on age, sex, and payer type. Outcomes included total, outpatient, inpatient and prescription costs. After matching and application of inclusion/exclusion criteria, we identified 2030 patients with PWS (1161 commercial, 38 Medicare supplemental, and 831 Medicaid). Commercially insured patients with PWS (median age 10 years) had 8.8 times greater total annual direct medical costs than their counterparts without PWS (median age 10 years): median costs $14,907 vs $819 (P < .0001), and mean costs $28,712 vs $3,246. Outpatient care comprised the largest portion of medical resource utilization for enrollees with and without PWS (median $5,605 vs $675; P < .0001; mean $11,032 vs $1,804), followed by mean annual inpatient and medication costs, which were $10,879 vs $1,015 (P < .001) and $6,801 vs $428 (P < .001), respectively. Total annual direct medical costs were ~42% greater for Medicaid-insured patients with PWS than for their commercially insured counterparts, an increase partly explained by claims for Medicaid Waiver day and residential habilitation. Direct medical resource utilization was considerably greater among patients with PWS than members without the condition. This study provides a first step toward quantifying the financial burden of PWS posed to individuals, families, and society. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  15. Beat-ID: Towards a computationally low-cost single heartbeat biometric identity check system based on electrocardiogram wave morphology

    Science.gov (United States)

    Paiva, Joana S.; Dias, Duarte

    2017-01-01

    In recent years, safer and more reliable biometric methods have been developed. Apart from the need for enhanced security, the media and entertainment sectors have also been applying biometrics in the emerging market of user-adaptable objects/systems to make these systems more user-friendly. However, the complexity of some state-of-the-art biometric systems (e.g., iris recognition) or their high false rejection rate (e.g., fingerprint recognition) is neither compatible with the simple hardware architecture required by reduced-size devices nor with the new trend of implementing smart objects within the dynamic market of the Internet of Things (IoT). It was recently shown that an individual can be recognized by extracting features from their electrocardiogram (ECG). However, most current ECG-based biometric algorithms are computationally demanding and/or rely on relatively large (several seconds) ECG samples, which are incompatible with the aforementioned application fields. Here, we present a computationally low-cost method (patent pending), involving only simple mathematical operations, for identifying a person using just three ECG morphology-based characteristics from a single heartbeat. The algorithm was trained/tested using ECG signals of different durations from the Physionet database on more than 60 different training/test datasets. The proposed method achieved a maximal averaged accuracy of 97.450% in distinguishing each subject from a ten-subject set, and false acceptance and rejection rates (FAR and FRR) of 5.710±1.900% and 3.440±1.980%, respectively, placing Beat-ID in a very competitive position in terms of FRR/FAR among state-of-the-art methods. Furthermore, the proposed method can identify a person using an average of 1.020 heartbeats. It therefore has FRR/FAR behavior similar to obtaining a fingerprint, yet it is simpler and requires less expensive hardware. This method targets low-computational/energy-cost scenarios, such as tiny wearable devices.
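
    FAR and FRR are straightforward to compute once genuine and impostor match scores are available. The sketch below is a generic evaluation routine under the assumption that a higher score means a better match; the scores and threshold are invented for illustration and are not Beat-ID's.

        def far_frr(genuine_scores, impostor_scores, threshold):
            # FAR: fraction of impostor attempts accepted (score >= threshold).
            # FRR: fraction of genuine attempts rejected (score < threshold).
            far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
            frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
            return far, frr

        genuine = [0.91, 0.88, 0.95, 0.79, 0.97]
        impostor = [0.35, 0.62, 0.48, 0.81, 0.22]
        print(far_frr(genuine, impostor, threshold=0.75))  # (0.2, 0.0)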

  16. Price-Transparency and Cost Accounting

    Science.gov (United States)

    Eakin, Cynthia; Fischer, Katrina

    2015-01-01

    Health care reform is directed toward improving access and quality while containing costs. An essential part of this is improvement of pricing models to more accurately reflect the costs of providing care. Transparent prices that reflect costs are necessary to signal information to consumers and producers. This information is central in a consumer-driven marketplace. The rapid increase in high deductible insurance and other forms of cost sharing incentivizes the search for price information. The organizational ability to measure costs across a cycle of care is an integral component of creating value, and will play a greater role as reimbursements transition to episode-based care, value-based purchasing, and accountable care organization models. This article discusses use of activity-based costing (ABC) to better measure the cost of health care. It describes examples of ABC in health care organizations and discusses impediments to adoption in the United States including cultural and institutional barriers. PMID:25862425

  17. Price-Transparency and Cost Accounting

    Directory of Open Access Journals (Sweden)

    Peter Hilsenrath PhD

    2015-04-01

    Health care reform is directed toward improving access and quality while containing costs. An essential part of this is improvement of pricing models to more accurately reflect the costs of providing care. Transparent prices that reflect costs are necessary to signal information to consumers and producers. This information is central in a consumer-driven marketplace. The rapid increase in high deductible insurance and other forms of cost sharing incentivizes the search for price information. The organizational ability to measure costs across a cycle of care is an integral component of creating value, and will play a greater role as reimbursements transition to episode-based care, value-based purchasing, and accountable care organization models. This article discusses the use of activity-based costing (ABC) to better measure the cost of health care. It describes examples of ABC in health care organizations and discusses impediments to adoption in the United States, including cultural and institutional barriers.

  18. Direct cost of monitoring conventional hemodialysis conducted by nursing professionals

    Directory of Open Access Journals (Sweden)

    Antônio Fernandes Costa Lima

    Objective: to analyze the mean direct cost of conventional hemodialysis monitored by nursing professionals in three public teaching and research hospitals in the state of São Paulo, Brazil. Method: this was a quantitative, exploratory and descriptive investigation, based on a multiple case study approach. The mean direct cost was calculated by multiplying the (clocked) time spent per procedure by the unit cost of direct labor. Values were calculated in Brazilian real (BRL). Results: Hospital C presented the highest mean direct cost (BRL 184.52), 5.23 times greater than the value for Hospital A (BRL 35.29) and 3.91 times greater than Hospital B (BRL 47.22). Conclusion: the costing method used in this study can be reproduced at other dialysis centers to inform strategies aimed at efficient allocation of the human resources needed to successfully monitor conventional hemodialysis.

  19. GPU-accelerated micromagnetic simulations using cloud computing

    Energy Technology Data Exchange (ETDEWEB)

    Jermain, C.L., E-mail: clj72@cornell.edu [Cornell University, Ithaca, NY 14853 (United States); Rowlands, G.E.; Buhrman, R.A. [Cornell University, Ithaca, NY 14853 (United States); Ralph, D.C. [Cornell University, Ithaca, NY 14853 (United States); Kavli Institute at Cornell, Ithaca, NY 14853 (United States)

    2016-03-01

    Highly parallel graphics processing units (GPUs) can improve the speed of micromagnetic simulations significantly as compared to conventional computing using central processing units (CPUs). We present a strategy for performing GPU-accelerated micromagnetic simulations by utilizing cost-effective GPU access offered by cloud computing services with an open-source Python-based program for running the MuMax3 micromagnetics code remotely. We analyze the scaling and cost benefits of using cloud computing for micromagnetics. - Highlights: • The benefits of cloud computing for GPU-accelerated micromagnetics are examined. • We present the MuCloud software for running simulations on cloud computing. • Simulation run times are measured to benchmark cloud computing performance. • Comparison benchmarks are analyzed between CPU and GPU based solvers.
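
    A back-of-the-envelope version of the cost/benefit comparison described here can be written directly; the run times and hourly rate below are hypothetical, not the paper's benchmarks.

        def cloud_run_cost(runtime_hours, hourly_rate_usd):
            # Cloud cost is simply billed time at the instance's hourly rate.
            return runtime_hours * hourly_rate_usd

        # Hypothetical: a problem taking 20 h on a local CPU vs 1.5 h on a
        # cloud GPU instance billed at $0.90/h.
        cpu_hours, gpu_hours, gpu_rate = 20.0, 1.5, 0.90
        print(f"GPU run cost: ${cloud_run_cost(gpu_hours, gpu_rate):.2f}")  # $1.35
        print(f"Speedup over CPU: {cpu_hours / gpu_hours:.1f}x")            # 13.3x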

  20. GPU-accelerated micromagnetic simulations using cloud computing

    International Nuclear Information System (INIS)

    Jermain, C.L.; Rowlands, G.E.; Buhrman, R.A.; Ralph, D.C.

    2016-01-01

    Highly parallel graphics processing units (GPUs) can improve the speed of micromagnetic simulations significantly as compared to conventional computing using central processing units (CPUs). We present a strategy for performing GPU-accelerated micromagnetic simulations by utilizing cost-effective GPU access offered by cloud computing services with an open-source Python-based program for running the MuMax3 micromagnetics code remotely. We analyze the scaling and cost benefits of using cloud computing for micromagnetics. - Highlights: • The benefits of cloud computing for GPU-accelerated micromagnetics are examined. • We present the MuCloud software for running simulations on cloud computing. • Simulation run times are measured to benchmark cloud computing performance. • Comparison benchmarks are analyzed between CPU and GPU based solvers.

  1. Cost analysis and cost justification of automated data processing in the clinical laboratory.

    Science.gov (United States)

    Westlake, G E

    1983-03-01

    Prospective cost analysis of alternative data processing systems can be facilitated by proper selection of the costs to be analyzed and realistic appraisal of the effect on staffing. When comparing projects with dissimilar cash flows, techniques such as analysis of net present value can be helpful in identifying financial benefits. Confidence and accuracy in prospective analyses will increase as more retrospective studies are published. Several accounts now in the literature describe long-term experience with turnkey laboratory information systems. Acknowledging the difficulty in longitudinal studies, they all report favorable effects on labor costs and recovery of lost charges. Enthusiasm is also expressed for the many intangible benefits of the systems. Several trends suggest that cost justification and cost effectiveness will be more easily demonstrated in the future. These are the rapidly decreasing cost of hardware (with corresponding reduction in service costs) and the entry into the market of additional systems designed for medium to small hospitals. The effect of broadening the sales base may be lower software prices. Finally, operational and executive data management and reporting are destined to become the premier extensions of the LIS for cost justification. Aptly applied, these facilities can promote understanding of costs, control of costs, and greater efficiency in providing laboratory services.
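
    The net-present-value technique mentioned above discounts each year's cash flow back to the present before summing. A minimal sketch with hypothetical cash flows (an up-front system cost followed by annual labor savings):

        def npv(rate, cash_flows):
            # cash_flows[0] occurs now (typically the negative purchase cost);
            # later entries are end-of-period annual flows.
            return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

        # Hypothetical: a $250k system saving $70k/year for 5 years, at 8%.
        print(f"NPV: ${npv(0.08, [-250_000] + [70_000] * 5):,.0f}")  # ~$29,489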

  2. Natural Computing in Computational Finance Volume 4

    CERN Document Server

    O’Neill, Michael; Maringer, Dietmar

    2012-01-01

    This book follows on from Natural Computing in Computational Finance, Volumes I, II and III. As in the previous volumes of this series, the book consists of a series of chapters, each of which was selected following a rigorous, peer-reviewed selection process. The chapters illustrate the application of a range of cutting-edge natural computing and agent-based methodologies in computational finance and economics. The applications explored include option model calibration, financial trend reversal detection, enhanced indexation, algorithmic trading, corporate payout determination, and agent-based modeling of liquidity costs and trade strategy adaptation. While describing cutting-edge applications, the chapters are written so that they are accessible to a wide audience. Hence, they should be of interest to academics, students and practitioners in the fields of computational finance and economics.

  3. An Offload NIC for NASA, NLR, and Grid Computing

    Science.gov (United States)

    Awrach, James

    2013-01-01

    This work addresses distributed data management and dynamically configurable high-speed access to data distributed and shared over wide-area high-speed network environments. An offload-engine NIC (network interface card) is proposed that scales in n×10-Gbps increments through 100-Gbps full duplex. The Globus de facto standard was used in projects requiring secure, robust, high-speed bulk data transport. Novel extension mechanisms were derived that will combine these technologies for use by GridFTP, bandwidth management resources, and host CPU (central processing unit) acceleration. The result will be wire-rate encrypted Globus grid data transactions through offload for splintering, encryption, and compression. As the need for greater network bandwidth increases, there is an inherent need for faster CPUs. The best way to accelerate CPUs is through a network acceleration engine. Grid computing data transfers for the Globus tool set did not have wire-rate encryption or compression. Existing technology cannot keep pace with the greater bandwidths of backplane and network connections. Present offload engines with ports to Ethernet are 32 to 40 Gbps full duplex at best. The best of the ultra-high-speed offload engines use expensive ASICs (application-specific integrated circuits) or NPUs (network processing units). The present state of the art also includes bonding and the use of multiple NICs, which are also in the planning stages for future portability to ASICs and software to accommodate data rates at 100 Gbps. The remaining industry solutions are for carrier-grade equipment manufacturers, with costly line cards having multiples of 10-Gbps ports, or 100-Gbps ports such as CFP modules that interface to costly ASICs and related circuitry. All of the existing solutions vary in configuration based on requirements of the host, motherboard, or carrier-grade equipment. The purpose of the innovation is to eliminate data bottlenecks within cluster, grid, and cloud computing systems.

  4. Designer's unified cost model

    Science.gov (United States)

    Freeman, William T.; Ilcewicz, L. B.; Swanson, G. D.; Gutowski, T.

    1992-01-01

    A conceptual and preliminary designers' cost prediction model has been initiated. The model will provide a technically sound method for evaluating the relative cost of different composite structural designs, fabrication processes, and assembly methods that can be compared to equivalent metallic parts or assemblies. The feasibility of developing cost prediction software in a modular form for interfacing with state-of-the-art preliminary design tools and computer-aided design programs is being evaluated. The goal of this task is to establish theoretical cost functions that relate geometric design features to summed material cost and labor content in terms of process mechanics and physics. The output of the designers' present analytical tools will be input for the designers' cost prediction model to provide the designer with a database and deterministic cost methodology that allows one to trade and synthesize designs with both cost and weight as objective functions for optimization. The approach, goals, plans, and progress are presented for the development of COSTADE (Cost Optimization Software for Transport Aircraft Design Evaluation).

  5. Forestry-related pathways for the movement of exotic plant pests into and within the greater Caribbean region

    Science.gov (United States)

    Leslie Newton; Heike Meissner; Andrea. Lemay

    2011-01-01

    Forests of the Greater Caribbean Region (GCR) are important ecologically and economically. These unique ecosystems are under increasing pressure from exotic pests, which may cause extensive environmental damage and cost billions of dollars in control programs, lost production, and forest restoration.

  6. Assessing and forecasting groundwater development costs in Sub ...

    African Journals Online (AJOL)

    Greater use of groundwater in Sub-Saharan Africa is a pre-requisite for improved human welfare; however, the costs associated with groundwater development are prohibitively high and poorly defined. This study identifies and disaggregates the costs of groundwater development in 11 Sub-Saharan African countries, while ...

  7. An assessment of mass burn incineration costs

    International Nuclear Information System (INIS)

    Fox, M.R.; Scutter, J.N.; Sutton, A.M.

    1993-01-01

    This study comprises the third and final part of a cost assessment exercise of waste-to-energy options. The specific objectives of this study were: to determine the capital and operating costs of three generic types of mass burn waste-to-energy systems, for waste inputs of 200,000 and 400,000 t/y of municipal solid waste (MSW); to verify the mass and energy balances of the systems; to develop a computer cost model to manipulate the data as required; to carry out sensitivity checks on the computer model for changes to key parameters; and to conduct the study in a manner approximating as closely as possible a real commercial situation. (author)

  8. The cost-effectiveness and cost-utility of high-dose palliative radiotherapy for advanced non-small-cell lung cancer

    International Nuclear Information System (INIS)

    Coy, Peter; Schaafsma, Joseph; Schofield, John A.

    2000-01-01

    Purpose: To compute cost-effectiveness/cost-utility (CE/CU) ratios, from the treatment clinic and societal perspectives, for high-dose palliative radiotherapy treatment (RT) for advanced non-small-cell lung cancer (NSCLC) against best supportive care (BSC) as comparator, and thereby demonstrate a method for computing CE/CU ratios when randomized clinical trial (RCT) data cannot be generated. Methods and Materials: Unit cost estimates based on an earlier reported 1989-90 analysis of treatment costs at the Vancouver Island Cancer Centre, Victoria, British Columbia, Canada, are updated to 1997-1998 and then used to compute the incremental cost of an average dose of high-dose palliative RT. The incremental number of life days and quality-adjusted life days (QALDs) attributable to treatment are from earlier reported regression analyses of the survival and quality-of-life data from patients who enrolled prospectively in a lung cancer management cost-effectiveness study at the clinic over a 2-year period from 1990 to 1992. Results: The baseline CE and CU ratios are $9,245 Cdn per life year (LY) and $12,836 per quality-adjusted life year (QALY), respectively, from the clinic perspective, and $12,253/LY and $17,012/QALY, respectively, from the societal perspective. Multivariate sensitivity analysis for the CE ratio produces a range of $5,513-28,270/LY from the clinic perspective, and $7,307-37,465/LY from the societal perspective. Similar calculations for the CU ratio produce a range of $7,205-37,134/QALY from the clinic perspective, and $9,550-49,213/QALY from the societal perspective. Conclusion: The cost-effectiveness and cost-utility of high-dose palliative RT for advanced NSCLC compare favorably with the cost-effectiveness of other forms of treatment for NSCLC, of treatments of other forms of cancer, and of many other commonly used medical interventions, and lie within the US $50,000/QALY benchmark often cited for cost-effective care.
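
    The CE and CU ratios reported above take the standard incremental form: the extra cost of treatment over the comparator divided by the extra life-years (CE) or QALYs (CU) gained. A sketch with hypothetical inputs, not the study's figures:

        def icer(cost_tx, cost_comp, effect_tx, effect_comp):
            # Incremental cost-effectiveness ratio: extra cost per extra
            # unit of effect (life-years for CE, QALYs for CU).
            return (cost_tx - cost_comp) / (effect_tx - effect_comp)

        # Hypothetical: RT adds $5,000 over BSC, gaining 0.45 LY and 0.35 QALYs.
        print(f"CE: ${icer(8000, 3000, 1.05, 0.60):,.0f}/LY")    # ~$11,111/LY
        print(f"CU: ${icer(8000, 3000, 0.80, 0.45):,.0f}/QALY")  # ~$14,286/QALY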

  9. ULTRAFISH: generalization of SUPERFISH to m greater than or equal to 1

    International Nuclear Information System (INIS)

    Gluckstern, R.L.; Holsinger, R.F.; Halbach, K.; Minerbo, G.N.

    1982-01-01

    The present version of the SUPERFISH program computes the fundamental and higher-order resonant frequencies and corresponding fields of azimuthally symmetric TE and TM modes (m = 0) in an electromagnetic cavity which is a figure of revolution about a longitudinal axis. We have developed the program ULTRAFISH, which computes the resonant frequencies and fields in such a cavity for azimuthally asymmetric modes (cos mφ with m ≥ 1). These modes can no longer be characterized as TE and TM, and they lead to simultaneous equations involving two field components. These are taken for convenience to be rEφ and rHφ, in terms of which all four other field components are expressed. Several different formulations for solving the equations are being investigated.

  10. Computers appreciated by marketers

    International Nuclear Information System (INIS)

    Mantho, M.

    1993-01-01

    The computer has been worth its weight in gold to the fuel-oil man. In fact, with falling prices on both software and machines, the worth is greater than gold. Every so often, about every three years, we ask some questions about the utilization of computers. This time, we looked into the future to find out the acceptance of other marvels such as the cellular phone and the hand-held computer. At the moment, there isn't much penetration. Contact by two-way radio, as well as computing meters on trucks, still reigns supreme.

  11. Low-Dose Chest Computed Tomography for Lung Cancer Screening Among Hodgkin Lymphoma Survivors: A Cost-Effectiveness Analysis

    International Nuclear Information System (INIS)

    Wattson, Daniel A.; Hunink, M.G. Myriam; DiPiro, Pamela J.; Das, Prajnan; Hodgson, David C.; Mauch, Peter M.; Ng, Andrea K.

    2014-01-01

    Purpose: Hodgkin lymphoma (HL) survivors face an increased risk of treatment-related lung cancer. Screening with low-dose computed tomography (LDCT) may allow detection of early stage, resectable cancers. We developed a Markov decision-analytic and cost-effectiveness model to estimate the merits of annual LDCT screening among HL survivors. Methods and Materials: Population databases and HL-specific literature informed key model parameters, including lung cancer rates and stage distribution, cause-specific survival estimates, and utilities. Relative risks accounted for radiation therapy (RT) technique, smoking status (>10 pack-years or current smokers vs not), age at HL diagnosis, time from HL treatment, and excess radiation from LDCTs. LDCT assumptions, including expected stage-shift, false-positive rates, and likely additional workup were derived from the National Lung Screening Trial and preliminary results from an internal phase 2 protocol that performed annual LDCTs in 53 HL survivors. We assumed a 3% discount rate and a willingness-to-pay (WTP) threshold of $50,000 per quality-adjusted life year (QALY). Results: Annual LDCT screening was cost effective for all smokers. A male smoker treated with mantle RT at age 25 achieved maximum QALYs by initiating screening 12 years post-HL, with a life expectancy benefit of 2.1 months and an incremental cost of $34,841/QALY. Among nonsmokers, annual screening produced a QALY benefit in some cases, but the incremental cost was not below the WTP threshold for any patient subsets. As age at HL diagnosis increased, earlier initiation of screening improved outcomes. Sensitivity analyses revealed that the model was most sensitive to the lung cancer incidence and mortality rates and expected stage-shift from screening. Conclusions: HL survivors are an important high-risk population that may benefit from screening, especially those treated in the past with large radiation fields including mantle or involved-field RT. Screening

  12. Low-Dose Chest Computed Tomography for Lung Cancer Screening Among Hodgkin Lymphoma Survivors: A Cost-Effectiveness Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wattson, Daniel A., E-mail: dwattson@partners.org [Harvard Radiation Oncology Program, Boston, Massachusetts (United States); Hunink, M.G. Myriam [Departments of Radiology and Epidemiology, Erasmus Medical Center, Rotterdam, the Netherlands and Center for Health Decision Science, Harvard School of Public Health, Boston, Massachusetts (United States); DiPiro, Pamela J. [Department of Imaging, Dana-Farber Cancer Institute, Boston, Massachusetts (United States); Das, Prajnan [Department of Radiation Oncology, University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Hodgson, David C. [Department of Radiation Oncology, University of Toronto, Toronto, Ontario (Canada); Mauch, Peter M.; Ng, Andrea K. [Department of Radiation Oncology, Brigham and Women's Hospital and Dana-Farber Cancer Institute, Boston, Massachusetts (United States)

    2014-10-01

    Purpose: Hodgkin lymphoma (HL) survivors face an increased risk of treatment-related lung cancer. Screening with low-dose computed tomography (LDCT) may allow detection of early stage, resectable cancers. We developed a Markov decision-analytic and cost-effectiveness model to estimate the merits of annual LDCT screening among HL survivors. Methods and Materials: Population databases and HL-specific literature informed key model parameters, including lung cancer rates and stage distribution, cause-specific survival estimates, and utilities. Relative risks accounted for radiation therapy (RT) technique, smoking status (>10 pack-years or current smokers vs not), age at HL diagnosis, time from HL treatment, and excess radiation from LDCTs. LDCT assumptions, including expected stage-shift, false-positive rates, and likely additional workup were derived from the National Lung Screening Trial and preliminary results from an internal phase 2 protocol that performed annual LDCTs in 53 HL survivors. We assumed a 3% discount rate and a willingness-to-pay (WTP) threshold of $50,000 per quality-adjusted life year (QALY). Results: Annual LDCT screening was cost effective for all smokers. A male smoker treated with mantle RT at age 25 achieved maximum QALYs by initiating screening 12 years post-HL, with a life expectancy benefit of 2.1 months and an incremental cost of $34,841/QALY. Among nonsmokers, annual screening produced a QALY benefit in some cases, but the incremental cost was not below the WTP threshold for any patient subsets. As age at HL diagnosis increased, earlier initiation of screening improved outcomes. Sensitivity analyses revealed that the model was most sensitive to the lung cancer incidence and mortality rates and expected stage-shift from screening. Conclusions: HL survivors are an important high-risk population that may benefit from screening, especially those treated in the past with large radiation fields including mantle or involved-field RT. Screening

  13. Low-dose chest computed tomography for lung cancer screening among Hodgkin lymphoma survivors: a cost-effectiveness analysis.

    Science.gov (United States)

    Wattson, Daniel A; Hunink, M G Myriam; DiPiro, Pamela J; Das, Prajnan; Hodgson, David C; Mauch, Peter M; Ng, Andrea K

    2014-10-01

    Hodgkin lymphoma (HL) survivors face an increased risk of treatment-related lung cancer. Screening with low-dose computed tomography (LDCT) may allow detection of early stage, resectable cancers. We developed a Markov decision-analytic and cost-effectiveness model to estimate the merits of annual LDCT screening among HL survivors. Population databases and HL-specific literature informed key model parameters, including lung cancer rates and stage distribution, cause-specific survival estimates, and utilities. Relative risks accounted for radiation therapy (RT) technique, smoking status (>10 pack-years or current smokers vs not), age at HL diagnosis, time from HL treatment, and excess radiation from LDCTs. LDCT assumptions, including expected stage-shift, false-positive rates, and likely additional workup were derived from the National Lung Screening Trial and preliminary results from an internal phase 2 protocol that performed annual LDCTs in 53 HL survivors. We assumed a 3% discount rate and a willingness-to-pay (WTP) threshold of $50,000 per quality-adjusted life year (QALY). Annual LDCT screening was cost effective for all smokers. A male smoker treated with mantle RT at age 25 achieved maximum QALYs by initiating screening 12 years post-HL, with a life expectancy benefit of 2.1 months and an incremental cost of $34,841/QALY. Among nonsmokers, annual screening produced a QALY benefit in some cases, but the incremental cost was not below the WTP threshold for any patient subsets. As age at HL diagnosis increased, earlier initiation of screening improved outcomes. Sensitivity analyses revealed that the model was most sensitive to the lung cancer incidence and mortality rates and expected stage-shift from screening. HL survivors are an important high-risk population that may benefit from screening, especially those treated in the past with large radiation fields including mantle or involved-field RT. Screening may be cost effective for all smokers but possibly not for nonsmokers.
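
    The 3% discount rate in such a model is applied to future costs and future QALYs alike before any incremental comparison is made. A generic sketch of that discounting step, with hypothetical five-year streams rather than the model's outputs:

        def discounted_total(annual_values, rate=0.03):
            # Present value of a stream of annual costs or QALYs.
            return sum(v / (1 + rate) ** t for t, v in enumerate(annual_values))

        # Hypothetical per-person streams for a screened cohort member:
        costs = [450, 450, 450, 450, 450]        # annual LDCT plus workup (USD)
        qalys = [0.92, 0.91, 0.90, 0.89, 0.88]
        print(round(discounted_total(costs), 2))  # ~2122.69
        print(round(discounted_total(qalys), 3))  # ~4.248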

  14. Cloud Computing. Technology Briefing. Number 1

    Science.gov (United States)

    Alberta Education, 2013

    2013-01-01

    Cloud computing is Internet-based computing in which shared resources, software and information are delivered as a service that computers or mobile devices can access on demand. Cloud computing is already used extensively in education. Free or low-cost cloud-based services are used daily by learners and educators to support learning, social…

  15. Advanced Fuel Cycle Cost Basis

    Energy Technology Data Exchange (ETDEWEB)

    D. E. Shropshire; K. A. Williams; W. B. Boore; J. D. Smith; B. W. Dixon; M. Dunzik-Gougar; R. D. Adams; D. Gombert; E. Schneider

    2009-12-01

    This report, commissioned by the U.S. Department of Energy (DOE), provides a comprehensive set of cost data supporting a cost analysis for the relative economic comparison of options for use in the Advanced Fuel Cycle Initiative (AFCI) Program. The report describes the AFCI cost basis development process, reference information on AFCI cost modules, a procedure for estimating fuel cycle costs, economic evaluation guidelines, and a discussion on the integration of cost data into economic computer models. This report contains reference cost data for 25 cost modules—23 fuel cycle cost modules and 2 reactor modules. The cost modules were developed in the areas of natural uranium mining and milling, conversion, enrichment, depleted uranium disposition, fuel fabrication, interim spent fuel storage, reprocessing, waste conditioning, spent nuclear fuel (SNF) packaging, long-term monitored retrievable storage, near surface disposal of low-level waste (LLW), geologic repository and other disposal concepts, and transportation processes for nuclear fuel, LLW, SNF, transuranic, and high-level waste.

  16. Advanced Fuel Cycle Cost Basis

    Energy Technology Data Exchange (ETDEWEB)

    D. E. Shropshire; K. A. Williams; W. B. Boore; J. D. Smith; B. W. Dixon; M. Dunzik-Gougar; R. D. Adams; D. Gombert

    2007-04-01

    This report, commissioned by the U.S. Department of Energy (DOE), provides a comprehensive set of cost data supporting a cost analysis for the relative economic comparison of options for use in the Advanced Fuel Cycle Initiative (AFCI) Program. The report describes the AFCI cost basis development process, reference information on AFCI cost modules, a procedure for estimating fuel cycle costs, economic evaluation guidelines, and a discussion on the integration of cost data into economic computer models. This report contains reference cost data for 26 cost modules—24 fuel cycle cost modules and 2 reactor modules. The cost modules were developed in the areas of natural uranium mining and milling, conversion, enrichment, depleted uranium disposition, fuel fabrication, interim spent fuel storage, reprocessing, waste conditioning, spent nuclear fuel (SNF) packaging, long-term monitored retrievable storage, near surface disposal of low-level waste (LLW), geologic repository and other disposal concepts, and transportation processes for nuclear fuel, LLW, SNF, and high-level waste.

  17. Advanced Fuel Cycle Cost Basis

    Energy Technology Data Exchange (ETDEWEB)

    D. E. Shropshire; K. A. Williams; W. B. Boore; J. D. Smith; B. W. Dixon; M. Dunzik-Gougar; R. D. Adams; D. Gombert; E. Schneider

    2008-03-01

    This report, commissioned by the U.S. Department of Energy (DOE), provides a comprehensive set of cost data supporting a cost analysis for the relative economic comparison of options for use in the Advanced Fuel Cycle Initiative (AFCI) Program. The report describes the AFCI cost basis development process, reference information on AFCI cost modules, a procedure for estimating fuel cycle costs, economic evaluation guidelines, and a discussion on the integration of cost data into economic computer models. This report contains reference cost data for 25 cost modules—23 fuel cycle cost modules and 2 reactor modules. The cost modules were developed in the areas of natural uranium mining and milling, conversion, enrichment, depleted uranium disposition, fuel fabrication, interim spent fuel storage, reprocessing, waste conditioning, spent nuclear fuel (SNF) packaging, long-term monitored retrievable storage, near surface disposal of low-level waste (LLW), geologic repository and other disposal concepts, and transportation processes for nuclear fuel, LLW, SNF, transuranic, and high-level waste.

  18. 20 CFR 404.278 - Additional cost-of-living increase.

    Science.gov (United States)

    2010-04-01

    20 Employees' Benefits. FEDERAL OLD-AGE, SURVIVORS AND DISABILITY INSURANCE (1950- ), Computing Primary Insurance Amounts, Cost-of-Living Increases. § 404.278 Additional cost-of-living increase. (a) General. In addition to the cost-of-living increase explained in...

  19. Reducing healthcare costs facilitated by surgical auditing: a systematic review.

    Science.gov (United States)

    Govaert, Johannes Arthuur; van Bommel, Anne Charlotte Madeline; van Dijk, Wouter Antonie; van Leersum, Nicoline Johanneke; Tollenaar, Robertus Alexandre Eduard Mattheus; Wouters, Michael Wilhemus Jacobus Maria

    2015-07-01

    Surgical auditing has been developed in order to benchmark and to facilitate quality improvement. The aim of this review is to determine whether auditing, combined with systematic feedback of information on the process and outcomes of care, results in lower costs of surgical care. A systematic search of literature published before 21-08-2013 was conducted in PubMed, Embase, Web of Science, and the Cochrane Library. Articles were selected if they met the inclusion criteria of describing a surgical audit with cost evaluation. The systematic search resulted in 3608 papers. Six studies were identified as relevant, all showing a positive effect of surgical auditing on the quality of healthcare, and therefore cost savings were reported. Cost reductions ranging from $16 to $356 per patient were seen in audits evaluating general or vascular procedures. The highest potential cost reduction was described in a colorectal surgical audit (up to $1,986 per patient). All six identified articles in this review describe a reduction in complications and thereby a reduction in costs due to surgical auditing. Surgical auditing may be of greater value when high-risk procedures are evaluated, since prevention of adverse events in these procedures might be of greater clinical and therefore of greater financial impact. This systematic review shows that surgical auditing can function as a quality instrument and therefore as a tool to reduce costs. Since evidence is scarce so far, further studies should be performed to investigate whether surgical auditing can help turn the rising healthcare costs around. In the future, incorporating (actual) cost analyses and patient-related outcome measures would increase the audits' value and provide a complete overview of the value of healthcare.

  20. Brief: Managing computing technology

    International Nuclear Information System (INIS)

    Startzman, R.A.

    1994-01-01

    While computing is applied widely in the production segment of the petroleum industry, its effective application is the primary goal of computing management. Computing technology has changed significantly since the 1950's, when computers first began to influence petroleum technology. The ability to accomplish traditional tasks faster and more economically probably is the most important effect that computing has had on the industry. While speed and lower cost are important, are they enough? Can computing change the basic functions of the industry? When new computing technology is introduced improperly, it can clash with traditional petroleum technology. This paper examines the role of management in merging these technologies

  1. 5 CFR 847.705 - Cost-of-living adjustments.

    Science.gov (United States)

    2010-01-01

    5 Administrative Personnel. ELECTIONS OF RETIREMENT COVERAGE BY CURRENT AND FORMER EMPLOYEES OF NONAPPROPRIATED FUND INSTRUMENTALITIES, Computation of Benefits Under the Retroactive Provisions. § 847.705 Cost-of-living adjustments. Cost-of-living adjustments are applied to the rate payable to the retiree or survivor...

  2. The economic costs of energy

    International Nuclear Information System (INIS)

    Brookes, L.G.

    1980-01-01

    At a recent symposium, the economic costs of nuclear power were examined in four lectures which considered: (1) the performance of different types, sizes and ages of nuclear power plants; (2) the comparison between coal and nuclear power costs based on the principle of net effective cash; (3) the capital requirements of a nuclear programme; and (4) the comparative costs, now and in the future, of coal-fired and nuclear plants. It is concluded that uncertainties seem to get greater rather than smaller with time, probably due to the high and fluctuating world inflation rates and the great uncertainty about world economic performance introduced by the politicising of world oil supplies. (UK)

  3. Differential fitness costs of reproduction between the sexes

    OpenAIRE

    Penn, Dustin J.; Smith, Ken R.

    2006-01-01

    Natural selection does not necessarily favor maximal reproduction, because reproduction imposes fitness costs, reducing parental survival and offspring quality. Here, we show that parents in a preindustrial population in North America incurred fitness costs from reproduction, and women incurred greater costs than men. We examined the survivorship and reproductive success (Darwinian fitness) of 21,684 couples married between 1860 and 1895 identified in the Utah Population Database. We found th...

  4. 48 CFR 49.303-4 - Adjustment of indirect costs.

    Science.gov (United States)

    2010-10-01

    48 CFR 49.303-4 Adjustment of indirect costs. Federal Acquisition Regulations System, FEDERAL ACQUISITION REGULATION, CONTRACT MANAGEMENT, TERMINATION OF CONTRACTS, Additional Principles for Cost-Reimbursement Contracts. ... compute indirect costs for other contracts performed during the applicable accounting period. [48 FR 42447...

  5. Accelerating artificial intelligence with reconfigurable computing

    Science.gov (United States)

    Cieszewski, Radoslaw

    Reconfigurable computing is emerging as an important area of research in computer architectures and software systems. Many algorithms can be greatly accelerated by placing the computationally intensive portions of an algorithm into reconfigurable hardware. Reconfigurable computing combines many benefits of both software and ASIC implementations. Like software, the mapped circuit is flexible and can be changed over the lifetime of the system. Similar to an ASIC, reconfigurable systems provide a method to map circuits into hardware. Reconfigurable systems therefore have the potential to achieve far greater performance than software as a result of bypassing the fetch-decode-execute cycle of traditional processors, and possibly exploiting a greater level of parallelism. Artificial intelligence is one such field, with many different algorithms that can be accelerated. This paper presents example hardware implementations of artificial neural networks, genetic algorithms and expert systems.

  6. Framework for Computation Offloading in Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Dejan Kovachev

    2012-12-01

The inherently limited processing power and battery lifetime of mobile phones hinder the possible execution of computationally intensive applications like content-based video analysis or 3D modeling. Offloading of computationally intensive application parts from the mobile platform into a remote cloud infrastructure or nearby idle computers addresses this problem. This paper presents our Mobile Augmentation Cloud Services (MACS) middleware which enables adaptive extension of Android application execution from a mobile client into the cloud. Applications are developed by using the standard Android development pattern. The middleware does the heavy lifting of adaptive application partitioning, resource monitoring and computation offloading. These elastic mobile applications can run as usual mobile applications, but they can also use remote computing resources transparently. Two prototype applications using the MACS middleware demonstrate the benefits of the approach. The evaluation shows that applications which involve costly computations can benefit from offloading, with around 95% energy savings and significant performance gains compared to local execution only.
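
Offloading frameworks of this kind rest on a simple cost comparison: a partition is shipped to the cloud only when remote execution plus data transfer beats local execution. A minimal sketch of such a decision rule follows; the variable names and the linear cost model are illustrative assumptions, not the MACS partitioning algorithm itself:

```python
def should_offload(local_cycles: float, cpu_hz: float,
                   payload_bytes: int, bandwidth_bps: float,
                   cloud_speedup: float) -> bool:
    """Offload when estimated remote time (faster execution plus
    transfer delay) is below estimated local execution time."""
    t_local = local_cycles / cpu_hz
    t_remote = (local_cycles / (cpu_hz * cloud_speedup)
                + 8 * payload_bytes / bandwidth_bps)
    return t_remote < t_local
```

The reported energy savings follow the same comparison, with device power draw in place of time.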

  7. The operating cost of electrocoagulation

    Energy Technology Data Exchange (ETDEWEB)

    Donini, J.C.; Kan, J.; Szynkarczuk, J.; Hassan, T.A.; Kar, K.L. (Canadian Centre for Mineral and Energy Technology, Devon, AB (Canada))

    1994-12-01

The electrocoagulation of kaolinite and bentonite suspensions was studied in a pilot-scale electrocoagulation system to assess the operating cost and efficiency of the process. Factors affecting the operating cost, such as the formation of passivation layers on electrode plates, flow velocity, and the concentration of NaCl in the suspension, were examined. The operating costs investigated were the power cost of the electrocoagulation cell and the material cost due to the consumption of the aluminum electrode. Comparison was based on the settling properties of the treated product: turbidity, settling rate, and cake height. Higher NaCl concentration resulted in greater amounts of Al dissolved chemically and electrochemically into the suspension and thus better clarity of the supernatant of the treated product. Increased flow velocity could significantly reduce the operating cost while improving both the clarity of the supernatant and the compactness of the sludge volume. The passivation layers developed quickly during the electrocoagulation process, and increasing amounts of energy were wasted on them. 10 refs., 12 figs.
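
The two operating-cost components named above can be estimated directly: electrical energy from cell current, voltage and run time, and electrode consumption from Faraday's law for aluminum (Al -> Al3+ + 3e-). A rough sketch under those assumptions, ignoring the chemically dissolved aluminum and passivation losses the study measures:

```python
FARADAY = 96485.0   # C per mol of electrons
M_AL = 0.02698      # kg per mol of aluminum

def operating_cost(current_a: float, time_s: float, voltage_v: float,
                   price_kwh: float, price_al_kg: float) -> float:
    """Power cost of the cell plus cost of electrochemically dissolved Al."""
    energy_kwh = current_a * voltage_v * time_s / 3.6e6
    al_kg = current_a * time_s / (3 * FARADAY) * M_AL  # Faraday's law
    return energy_kwh * price_kwh + al_kg * price_al_kg
```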

  8. A predation cost to bold fish in the wild

    DEFF Research Database (Denmark)

    Hulthén, Kaj; Chapman, Ben; Nilsson, Anders P.

    2017-01-01

    in the animal kingdom. Theory predicts that individual behavioural types differ in a cost-benefit trade-off where bolder individuals benefit from greater access to resources while paying higher predation-risk costs. However, explicitly linking predation events to individual behaviour under natural conditions...

  9. Dark respiration of leaves and traps of terrestrial carnivorous plants: are there greater energetic costs in traps?

    Czech Academy of Sciences Publication Activity Database

    Adamec, Lubomír

    2010-01-01

    Roč. 5, č. 1 (2010), s. 121-124 ISSN 1895-104X Institutional research plan: CEZ:AV0Z60050516 Keywords : Aerobic respiration * metabolic costs * trap specialization Subject RIV: EF - Botanics Impact factor: 0.685, year: 2010

  10. Distribution costs -- the cost of local delivery

    International Nuclear Information System (INIS)

    Winger, N.; Zarnett, P.; Carr, J.

    2000-01-01

Most of the power transmission system in the province of Ontario is owned and operated as a regulated monopoly by Ontario Hydro Services Company (OHSC). Local distribution systems deliver to end-users from bulk supply points within a service territory. OHSC distributes to approximately one million, mostly rural customers, while the approximately 250 municipal utilities together serve about two million, mostly urban customers. Under the Energy Competition Act of 1998 local distribution companies will face some new challenges, including unbundled billing systems, a broader range of distribution costs, and increased costs made up of corporate taxes or payments in lieu of taxes and added costs for regulatory affairs. The consultants provide a detailed discussion of the components of distribution costs, the three components of the typical budget process (capital expenditures (CAPEX), operating and maintenance (O and M), and administration and corporate (GA and C)), a summary of some typical distribution costs in Ontario, and the estimated impacts of Energy Competition Act (ECA) compliance on charges and rates. Various mitigation strategies are also reviewed. Among these are joint ventures by local distribution companies to reduce ECA compliance costs, re-examination of controllable costs, temporary reduction of the allowable return on equity (ROE) by 50 per cent, and/or reducing the competitive transition charge (CTC). It is estimated that either one of these two reductions could eliminate the full amount of the five to seven per cent uplift in delivered energy service costs. The conclusion of the consultants is that local distribution delivery charges will make up a greater proportion of end-user cost in the future than they have in the past. An increase to customers of about five per cent is expected when the competitive electricity market opens and unbundled billing begins. The cost increase could be mitigated by a combination of actions that would be needed for about

  11. The cost of nuclear electricity: France after Fukushima

    International Nuclear Information System (INIS)

    Boccard, Nicolas

    2014-01-01

The Fukushima disaster has led the French government to release novel cost information on its nuclear electricity program, allowing us to compute a levelized cost. We identify a modest escalation of capital cost and a larger than expected operational cost. Under the best scenario, the cost of French nuclear power over the last four decades is 59€/MWh (at 2010 prices), while in the worst case it is 83€/MWh. On the basis of these findings, we estimate the future cost of nuclear power in France to be at least 76€/MWh and possibly 117€/MWh. A comparison with the US confirms that French nuclear electricity nevertheless remains cheaper. Comparisons with coal, natural gas and wind power are carried out to assess their relative advantages. - Highlights: • We compute the levelized cost of French nuclear power over 40 years using a novel Court of Audit report. • We include R and D, technology development, fissile fuel, financing cost, decommissioning and the back-end cycle. • We find a mild capital cost escalation and a high operation cost driven by low fleet availability. • The levelized cost ranges between 59 and 83€/MWh (at 2010 prices) and compares favorably to the US. • A tentative cost for future nuclear power ranges between 76 and 117€/MWh and compares unfavorably against alternative fuels
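
A levelized cost of the kind computed here is, in essence, the ratio of discounted lifetime costs to discounted lifetime generation. A generic sketch; the cash-flow series and discount rate below are placeholders, not the paper's data:

```python
def levelized_cost(costs_by_year, mwh_by_year, discount_rate):
    """Levelized cost of electricity: discounted costs over discounted output."""
    def disc(series):
        return sum(x / (1 + discount_rate) ** t for t, x in enumerate(series))
    return disc(costs_by_year) / disc(mwh_by_year)

# Toy example: three years of costs (EUR) and output (MWh), 5% discount rate.
print(levelized_cost([1.2e9, 0.4e9, 0.4e9], [6.0e6, 6.5e6, 6.5e6], 0.05))
```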

  12. Computing Bounds on Resource Levels for Flexible Plans

    Science.gov (United States)

    Muscvettola, Nicola; Rijsman, David

    2009-01-01

A new algorithm efficiently computes the tightest exact bound on the levels of resources induced by a flexible activity plan. Tightness of bounds is extremely important for computations involved in planning because tight bounds can save potentially exponential amounts of search (through early backtracking and detection of solutions), relative to looser bounds. The bound computed by the new algorithm, denoted the resource-level envelope, constitutes the measure of maximum and minimum consumption of resources at any time for all fixed-time schedules in the flexible plan. At each time, the envelope guarantees that there are two fixed-time instantiations: one that produces the minimum level and one that produces the maximum level. Therefore, the resource-level envelope is the tightest possible resource-level bound for a flexible plan because any tighter bound would exclude the contribution of at least one fixed-time schedule. If the resource-level envelope can be computed efficiently, one could substitute it for the looser bounds that are currently used in the inner cores of constraint-posting scheduling algorithms, with the potential for great improvements in performance. What is needed to reduce the cost of computation is an algorithm, the measure of complexity of which is no greater than a low-degree polynomial in N (where N is the number of activities). The new algorithm satisfies this need. In this algorithm, the computation of resource-level envelopes is based on a novel combination of (1) the theory of shortest paths in the temporal-constraint network for the flexible plan and (2) the theory of maximum flows for a flow network derived from the temporal and resource constraints. The measure of asymptotic complexity of the algorithm is O(N · maxflow(N)), where O(x) denotes an amount of computing time or a number of arithmetic operations proportional to a number of the order of x, and maxflow(N) is the measure of complexity (and thus of cost) of a maximum-flow computation.
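
The maximum-flow computation at the core of the envelope algorithm can be delegated to any off-the-shelf solver. A toy illustration with networkx follows; the graph here is a stand-in, not the actual network derived from the plan's temporal and resource constraints:

```python
import networkx as nx

# Illustrative flow network: the source feeds producer events, the sink
# drains consumer events; capacities stand in for resource amounts.
G = nx.DiGraph()
G.add_edge("s", "p1", capacity=3)
G.add_edge("s", "p2", capacity=2)
G.add_edge("p1", "c1", capacity=2)
G.add_edge("p2", "c1", capacity=2)
G.add_edge("c1", "t", capacity=4)

max_flow_value, flow_by_edge = nx.maximum_flow(G, "s", "t")
```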

  13. Cost of space-based laser ballistic missile defense.

    Science.gov (United States)

    Field, G; Spergel, D

    1986-03-21

    Orbiting platforms carrying infrared lasers have been proposed as weapons forming the first tier of a ballistic missile defense system under the President's Strategic Defense Initiative. As each laser platform can destroy a limited number of missiles, one of several methods of countering such a system is to increase the number of offensive missiles. Hence it is important to know whether the cost-exchange ratio, defined as the ratio of the cost to the defense of destroying a missile to the cost to the offense of deploying an additional missile, is greater or less than 1. Although the technology to be used in a ballistic missile defense system is still extremely uncertain, it is useful to examine methods for calculating the cost-exchange ratio. As an example, the cost of an orbiting infrared laser ballistic missile defense system employed against intercontinental ballistic missiles launched simultaneously from a small area is compared to the cost of additional offensive missiles. If one adopts lower limits to the costs for the defense and upper limits to the costs for the offense, the cost-exchange ratio comes out substantially greater than 1. If these estimates are confirmed, such a ballistic missile defense system would be unable to maintain its effectiveness at less cost than it would take to proliferate the ballistic missiles necessary to overcome it and would therefore not satisfy the President's requirements for an effective strategic defense. Although the method is illustrated by applying it to a space-based infrared laser system, it should be straightforward to apply it to other proposed systems.
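
The decision quantity in this analysis can be written compactly; the symbols below are ours, not the paper's:

```latex
\[
  R \;=\; \frac{C_{\mathrm{def}}}{C_{\mathrm{off}}}, \qquad
  \begin{cases}
    R < 1 & \text{the defense can afford to outpace proliferation,} \\
    R > 1 & \text{the offense wins by adding missiles.}
  \end{cases}
\]
% C_def: cost to the defense of destroying one additional missile
% C_off: cost to the offense of deploying one additional missile
```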

  14. Greater happiness for a greater number: Is that possible? If so how? (Arabic)

    NARCIS (Netherlands)

    R. Veenhoven (Ruut); E. Samuel (Emad)

    2012-01-01

What is the final goal of public policy? Jeremy Bentham (1789) would say: greater happiness for a greater number. He thought of happiness as subjective enjoyment of life; in his words as “the sum of pleasures and pains”. In his time, the happiness of the great number could not be

  15. Nuclear-fuel-cycle costs. Consolidated Fuel-Reprocessing Program

    International Nuclear Information System (INIS)

    Burch, W.D.; Haire, M.J.; Rainey, R.H.

    1981-01-01

The costs for the back-end of the nuclear fuel cycle, which were developed as part of the Nonproliferation Alternative Systems Assessment Program (NASAP), are presented. Total fuel-cycle costs are given for the pressurized-water reactor once-through and fuel-recycle systems, and for the liquid-metal fast-breeder-reactor system. These calculations show that fuel-cycle costs are a small part of total power costs. For breeder reactors, fuel-cycle costs are about half those of the present once-through system. The total power cost of the breeder-reactor system is greater than that of the light-water reactor at today's prices for uranium and enrichment

  16. Computation of piecewise affine terminal cost functions for model predictive control

    NARCIS (Netherlands)

    Brunner, F.D.; Lazar, M.; Allgöwer, F.; Fränzle, Martin; Lygeros, John

    2014-01-01

    This paper proposes a method for the construction of piecewise affine terminal cost functions for model predictive control (MPC). The terminal cost function is constructed on a predefined partition by solving a linear program for a given piecewise affine system, a stabilizing piecewise affine

  17. Designers' unified cost model

    Science.gov (United States)

    Freeman, W.; Ilcewicz, L.; Swanson, G.; Gutowski, T.

    1992-01-01

    The Structures Technology Program Office (STPO) at NASA LaRC has initiated development of a conceptual and preliminary designers' cost prediction model. The model will provide a technically sound method for evaluating the relative cost of different composite structural designs, fabrication processes, and assembly methods that can be compared to equivalent metallic parts or assemblies. The feasibility of developing cost prediction software in a modular form for interfacing with state-of-the-art preliminary design tools and computer aided design programs is being evaluated. The goal of this task is to establish theoretical cost functions that relate geometric design features to summed material cost and labor content in terms of process mechanics and physics. The output of the designers' present analytical tools will be input for the designers' cost prediction model to provide the designer with a database and deterministic cost methodology that allows one to trade and synthesize designs with both cost and weight as objective functions for optimization. This paper presents the team members, approach, goals, plans, and progress to date for development of COSTADE (Cost Optimization Software for Transport Aircraft Design Evaluation).

  18. Hardware for soft computing and soft computing for hardware

    CERN Document Server

    Nedjah, Nadia

    2014-01-01

Single and Multi-Objective Evolutionary Computation (MOEA), Genetic Algorithms (GAs), Artificial Neural Networks (ANNs), Fuzzy Controllers (FCs), Particle Swarm Optimization (PSO) and Ant Colony Optimization (ACO) are becoming omnipresent in almost every intelligent system design. Unfortunately, the application of the majority of these techniques is complex and requires a huge computational effort to yield useful and practical results. Therefore, dedicated hardware for evolutionary, neural and fuzzy computation is a key issue for designers. With the spread of reconfigurable hardware such as FPGAs, digital as well as analog hardware implementations of such computation become cost-effective. The idea behind this book is to offer a variety of hardware designs for soft computing techniques that can be embedded in any final product. It also introduces the successful application of soft computing techniques to the many hard problems encountered during the design of embedded hardware. Reconfigurable em...

  19. A Novel Mu Rhythm-based Brain Computer Interface Design that uses a Programmable System on Chip.

    Science.gov (United States)

    Joshi, Rohan; Saraswat, Prateek; Gajendran, Rudhram

    2012-01-01

This paper describes the system design of a portable and economical mu rhythm based Brain Computer Interface which employs Cypress Semiconductor's Programmable System on Chip (PSoC). By carrying out essential processing on the PSoC, the use of an extra computer is eliminated, resulting in considerable cost savings. Microsoft Visual Studio 2005 and PSoC Designer 5.01 are employed in developing the software for the system, the hardware being custom designed. In order to test the usability of the BCI, preliminary testing was carried out by training three subjects, who were able to demonstrate control over their electroencephalogram by moving a cursor at the center of the screen towards the indicated direction with an average accuracy greater than 70% and a bit communication rate of up to 7 bits/min.

  20. Computer graphics in engineering education

    CERN Document Server

    Rogers, David F

    2013-01-01

    Computer Graphics in Engineering Education discusses the use of Computer Aided Design (CAD) and Computer Aided Manufacturing (CAM) as an instructional material in engineering education. Each of the nine chapters of this book covers topics and cites examples that are relevant to the relationship of CAD-CAM with engineering education. The first chapter discusses the use of computer graphics in the U.S. Naval Academy, while Chapter 2 covers key issues in instructional computer graphics. This book then discusses low-cost computer graphics in engineering education. Chapter 4 discusses the uniform b

  1. Trouble Sleeping Associated With Lower Work Performance and Greater Health Care Costs: Longitudinal Data From Kansas State Employee Wellness Program.

    Science.gov (United States)

    Hui, Siu-kuen Azor; Grandner, Michael A

    2015-10-01

To examine the relationships between employees' trouble sleeping and absenteeism, work performance, and health care expenditures over a 2-year period. Utilizing the Kansas State employee wellness program (EWP) data set from 2008 to 2009, multinomial logistic regression analyses were conducted with trouble sleeping as the predictor and absenteeism, work performance, and health care costs as the outcomes. EWP participants (N = 11,698 in 2008; 5636 followed up in 2009) who had higher levels of sleep disturbance were more likely to be absent from work, to receive lower work performance ratings, and to incur higher health care costs. Trouble sleeping was thus associated with worse work attendance, lower work performance, and greater health care costs.

  2. The Potential Cost-Effectiveness of Amblyopia Screening Programs

    Science.gov (United States)

    Rein, David B.; Wittenborn, John S.; Zhang, Xinzhi; Song, Michael; Saaddine, Jinan B.

    2013-01-01

    Background To estimate the incremental cost-effectiveness of amblyopia screening at preschool and kindergarten, we compared the costs and benefits of 3 amblyopia screening scenarios to no screening and to each other: (1) acuity/stereopsis (A/S) screening at kindergarten, (2) A/S screening at preschool and kindergarten, and (3) photoscreening at preschool and A/S screening at kindergarten. Methods We programmed a probabilistic microsimulation model of amblyopia natural history and response to treatment with screening costs and outcomes estimated from 2 state programs. We calculated the probability that no screening and each of the 3 interventions were most cost-effective per incremental quality-adjusted life year (QALY) gained and case avoided. Results Assuming a minimal 0.01 utility loss from monocular vision loss, no screening was most cost-effective with a willingness to pay (WTP) of less than $16,000 per QALY gained. A/S screening at kindergarten alone was most cost-effective between a WTP of $17,000 and $21,000. A/S screening at preschool and kindergarten was most cost-effective between a WTP of $22,000 and $75,000, and photoscreening at preschool and A/S screening at kindergarten was most cost-effective at a WTP greater than $75,000. Cost-effectiveness substantially improved when assuming a greater utility loss. All scenarios were cost-effective when assuming a WTP of $10,500 per case of amblyopia cured. Conclusions All 3 screening interventions evaluated are likely to be considered cost-effective relative to many other potential public health programs. The choice of screening option depends on budgetary resources and the value placed on monocular vision loss prevention by funding agencies. PMID:21877675

  3. Security Architecture of Cloud Computing

    OpenAIRE

    V.KRISHNA REDDY; Dr. L.S.S.REDDY

    2011-01-01

Cloud computing offers services over the Internet with dynamically scalable resources. Cloud computing services provide benefits to users in terms of cost and ease of use. These services need to address security during the transmission of sensitive data and critical applications to shared and public cloud environments. Cloud environments are scaling up to meet data processing and storage needs. The cloud computing environment has various advantages as well as disadvantages o...

  4. Optimization Using Metamodeling in the Context of Integrated Computational Materials Engineering (ICME)

    Energy Technology Data Exchange (ETDEWEB)

    Hammi, Youssef; Horstemeyer, Mark F; Wang, Paul; David, Francis; Carino, Ricolindo

    2013-11-18

model in Abaqus FE analyses of the tube forming, sizing, drawing, welding, and normalizing processes. The simulation results coupled with the manufacturing cost data were used to develop prototype metamodeling (quick response) codes which could be used to predict and optimize the microstructure-process-property-cost relationships. The developed ICME metamodeling toolkits are flexible enough to be applied to other manufacturing processes (e.g. forging, forming, casting, extrusion, rolling, stamping, and welding/joining) and the metamodeling codes can run on laptop computers. Based on the work completed in Phase I, in Phase II, PDT proposes to continue to refine the ISV model by correlating and incorporating the uncertainties in the microstructure, mechanical testing, and modeling. Following the model refinement, FE analyses will be simulated and will provide even more realistic predictions as they include an appropriate window of uncertainty. Using the HPC output (FE analyses) as input, the quick-response metamodel codes will more accurately predict and optimize the microstructure-process-property-cost relationships. Furthermore, PDT proposes to employ the ICME metamodeling toolkits to help develop a new tube product using entirely new high strength steel. The modeling of the high strength steel manufacturing process will replace the costly and time-consuming trial-and-error methods that were used in the tubing industry previously. This simulation-based process prototyping will greatly benefit our industrial partners by opening up new market spaces due to new products with greater capabilities.

  5. How rebates, copayments, and administration costs affect the cost-effectiveness of osteoporosis therapies.

    Science.gov (United States)

    Ferko, Nicole C; Borisova, Natalie; Airia, Parisa; Grima, Daniel T; Thompson, Melissa F

    2012-11-01

Because of rising drug expenditures, cost considerations have become essential, necessitating cost-effectiveness analyses for managed care organizations (MCOs). The study objective is to examine the impact of various drug-cost components, in addition to wholesale acquisition cost (WAC), on the cost-effectiveness of osteoporosis therapies. A Markov model of osteoporosis was used to exemplify different drug cost scenarios. We examined the effect of varying rebates for oral bisphosphonates (risedronate and ibandronate) as well as the impact of varying copayments and administration costs for intravenous zoledronate. The population modeled was 1,000 American women aged 50 years or older with osteoporosis. Patients were followed for 1 year to reflect an annual budget review of formularies by MCOs. The cost of therapy was based on an adjusted WAC, and is referred to as net drug cost. The total annual cost incurred by an MCO for each drug regimen was calculated using the net drug cost and fracture cost. We estimated cost on a quality adjusted life year (QALY) basis. When considering different rebates, results for risedronate versus ibandronate vary from cost-saving (i.e., costing less and being more effective) to approximately $70,000 per QALY. With no risedronate rebate, an ibandronate rebate of approximately 65% is required before cost per QALY surpasses $50,000. With rebates greater than 25% for risedronate, irrespective of ibandronate rebates, results become cost-saving. Results also showed the magnitude of cost savings to the MCO varied by as much as 65% when considering no administration cost and the highest coinsurance rate for zoledronate. Our study showed that cost-effectiveness varies considerably when factors in addition to the WAC are considered. This paper provides recommendations for pharmaceutical manufacturers and MCOs when developing and interpreting such analyses.

  6. Estimate of the cost of multiple sclerosis in Spain by literature review.

    Science.gov (United States)

    Fernández, Oscar; Calleja-Hernández, Miguel Angel; Meca-Lallana, José; Oreja-Guevara, Celia; Polanco, Ana; Pérez-Alcántara, Ferran

    2017-08-01

Multiple Sclerosis (MS) is a progressive disease leading to increasing disability and costs. A literature review was carried out to identify MS costs and to estimate its economic burden in Spain. Areas Covered: The public electronic databases PubMed, ScienceDirect and IBECS were consulted and a manual review of communications presented at related congresses was carried out. A total of 225 references were obtained, of which 43 were finally included in the study. Expert Commentary: Three major cost groups were identified: direct healthcare costs, direct non-healthcare costs and indirect costs. There is a direct relationship between disease progression and increased costs, mainly direct non-healthcare costs (greater need for informal care) and indirect costs (greater loss of productivity). The total cost associated with MS in Spain is €1,395 million per year, and the mean annual cost per patient is €30,050. Beyond costs, a large impact on patients' quality of life, with an annual loss of up to 13,000 quality-adjusted life years, was also estimated. MS has a large economic impact on Spanish society and a significant impact on the quality of life of patients.

  7. Cost benefit analysis of instrumentation, supervision and control systems for nuclear power plants

    International Nuclear Information System (INIS)

    Hagen, P.

    1973-08-01

    A cost benefit analysis is carried out on a BWR type reactor power plant in which an on-line computer performs plant supervision, reporting, logging, calibration and control functions, using display devices and plotters, while an off-line computer is available for bigger jobs such as fuel management calculations. All on-line functions are briefly described and specified. Three types of computer system are considered, a simplex system, a dual computer system and a multi-processor system. These systems are analysed with respect to reliability, back-up instrumentation requirements and costs. While the multiprocessor system gave in all cases the lowest annual failure costs, the margin to the duplex system was so small that hardware, maintenance and software costs would play an important role in making a decision. (JIW)

  8. BENEFIT-COST ANALYSIS IN U.S. ENVIRONMENTAL REGULATORY DECISIONS

    OpenAIRE

    Easter, K. William; Archibald, Sandra O.

    1998-01-01

    As the number and cost of environmental regulations have increased over the last thirty years, the regulated community, taxpayers, and policy makers have begun to demand that the benefits of regulations justify their costs. The use of benefit-cost analysis as an integral part of developing new regulations is increasing and the demands and expectations being placed on the method have expanded. Although benefit-cost analysis is expected to play an even greater role in environmental decision mak...

  9. Factors cost effectively improved using computer simulations of ...

    African Journals Online (AJOL)

    LPhidza

    effectively managed using computer simulations in semi-arid conditions pertinent to much of sub-Saharan Africa. ... small scale farmers to obtain optimal crop yields thus ensuring their food security and livelihood is ... those that simultaneously incorporate and simulate processes involved throughout the course of crop ...

  10. 20 CFR 228.60 - Cost-of-living increase.

    Science.gov (United States)

    2010-04-01

    ... 20 Employees' Benefits 1 2010-04-01 2010-04-01 false Cost-of-living increase. 228.60 Section 228... COMPUTATION OF SURVIVOR ANNUITIES The Tier II Annuity Component § 228.60 Cost-of-living increase. The tier II... tier II component at the time the survivor annuity begins, all cost-of-living increases that were...

  11. Integrated Cost Allocation of Transmission Usage under Electricity Markets

    Directory of Open Access Journals (Sweden)

    Hermagasantos Zein

    2012-08-01

Cost allocation of transmission usage on power networks is an important issue, especially in the modern electricity market mechanism. In this context, all costs embedded in the transmission system (the embedded cost) should be covered by the transmission users. This paper follows general methods in which generators are fully responsible for covering the embedded cost. It proposes a method to determine the cost allocation of transmission usage based on decomposition through the superposition technique, which determines power flow contributions from an integrated base case built from the power flow calculations of all transactions, both bilateral and non-bilateral contracts. The applied formulations are illustrated mathematically in this paper. The proposed method has been tested on a 5-bus system, and its results differ markedly from those of several published methods: the published methods produce total power flow contributions in each line greater than the actual flows, and they earn total revenues approximately 11.6% greater than the embedded cost, whereas under the proposed method the power flow contributions equal the actual flows and the revenues equal the embedded cost. The proposed method thus gives results as expected.
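
Once per-transaction flow contributions have been obtained by superposition, each line's embedded cost can be shared in proportion to them. A minimal proportional-sharing sketch under that assumption (a generic scheme, not the paper's exact decomposition):

```python
def allocate_embedded_cost(line_flows, line_costs):
    """line_flows: {line: {transaction: MW contribution}};
    line_costs: {line: embedded cost}. Returns charge per transaction."""
    charges = {}
    for line, flows in line_flows.items():
        total_mw = sum(flows.values())
        for txn, mw in flows.items():
            charges[txn] = charges.get(txn, 0.0) + line_costs[line] * mw / total_mw
    return charges

charges = allocate_embedded_cost(
    {"L1": {"A": 30.0, "B": 10.0}, "L2": {"A": 5.0, "B": 15.0}},
    {"L1": 1000.0, "L2": 400.0})   # A pays 850.0, B pays 550.0
```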

  12. Stereo Disparity through Cost Aggregation with Guided Filter

    Directory of Open Access Journals (Sweden)

    Pauline Tan

    2014-10-01

Estimating the depth, or equivalently the disparity, of a stereo scene is a challenging problem in computer vision. The method proposed by Rhemann et al. in 2011 is based on a filtering of the cost volume, which gives for each pixel and for each hypothesized disparity a cost derived from pixel-by-pixel comparison. The filtering is performed by the guided filter proposed by He et al. in 2010. It computes a weighted local average of the costs. The weights are such that similar pixels tend to have similar costs. Eventually, a winner-take-all strategy selects the disparity with the minimal cost for each pixel. Non-consistent labels according to left-right consistency are rejected; a densification step can then be launched to fill the disparity map. The method can be used to solve other labeling problems (optical flow, segmentation), but this article focuses on the stereo matching problem.
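
The pipeline described (per-pixel matching costs, aggregation, winner-take-all) can be sketched compactly. In this sketch a plain box filter stands in for the guided filter of He et al., which would instead weight the average by colour similarity to preserve edges:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def disparity_wta(left: np.ndarray, right: np.ndarray,
                  max_disp: int, radius: int = 4) -> np.ndarray:
    """Winner-take-all disparity from a box-filtered cost volume
    (grayscale float images; the left image is the reference)."""
    h, w = left.shape
    volume = np.full((max_disp + 1, h, w), np.inf)
    for d in range(max_disp + 1):
        cost = np.abs(left[:, d:] - right[:, :w - d])    # absolute-difference cost
        volume[d, :, d:] = uniform_filter(cost, size=2 * radius + 1)
    return volume.argmin(axis=0)   # disparity with minimal aggregated cost
```

Left-right consistency checking and densification would follow as described in the article.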

  13. Computer Augmented Learning; A Survey.

    Science.gov (United States)

    Kindred, J.

The report contains a description and summary of computer augmented learning devices and systems. The devices are of two general types: programmed instruction systems based on the teaching machines pioneered by Pressey and developed by Skinner, and the so-called "docile" systems that permit greater user-direction with the computer under student…

  14. Effectiveness and cost effectiveness of television, radio and print advertisements in promoting the New York smokers' quitline.

    Science.gov (United States)

    Farrelly, Matthew C; Hussin, Altijani; Bauer, Ursula E

    2007-12-01

This study assessed the relative effectiveness and cost effectiveness of television, radio and print advertisements in generating calls to the New York smokers' quitline. Regression analysis was used to link total county-level monthly quitline calls to television, radio and print advertising expenditures. Based on regression results, standardised measures of the relative effectiveness and cost effectiveness of expenditures were computed. There was a positive and statistically significant relation between call volume and expenditures on television and radio advertisements, and a marginally significant effect for expenditures on newspaper advertisements; the largest effect was for television advertising. However, because of differences in advertising costs, for every $1000 increase in television, radio and newspaper expenditures, call volume increased by 0.1%, 5.7% and 2.8%, respectively. Television, radio and print media all effectively increased calls to the New York smokers' quitline. Although increases in expenditures for television were the most effective, their relatively high costs suggest they are not currently the most cost effective means to promote a quitline. This implies that a more efficient mix of media would place greater emphasis on radio than television. However, because the current study does not adequately assess the extent to which radio expenditures would sustain their effectiveness with substantial expenditure increases, it is not feasible to determine a more optimal mix of expenditures.

  15. Global timber investments, wood costs, regulation, and risk

    Science.gov (United States)

    F. Cubbage; S. Koesbandana; P Mac Donagh; R. Rubilar; G Balmelli; V. Morales Olmos; R. De La Torre; M. Murara; V.A. Hoeflich; H. Kotze; R Gonzalez; O. Carrero; G. Frey; T. Adams; J. Turner; R. Lord; J. Huang; C. MacIntyre; Kathleen McGinley; R. Abt; R. Phillips

    2010-01-01

We estimated financial returns and wood production costs in 2008 for the primary timber plantation species. Excluding land costs, returns for exotic plantations in almost all of South America (Brazil, Argentina, Uruguay, Chile, Colombia, Venezuela, and Paraguay) were substantial. Eucalyptus species returns were generally greater than those for Pinus species in each...

  16. Computer software to estimate timber harvesting system production, cost, and revenue

    Science.gov (United States)

    Dr. John E. Baumgras; Dr. Chris B. LeDoux

    1992-01-01

    Large variations in timber harvesting cost and revenue can result from the differences between harvesting systems, the variable attributes of harvesting sites and timber stands, or changing product markets. Consequently, system and site specific estimates of production rates and costs are required to improve estimates of harvesting revenue. This paper describes...

  17. Computers for Your Classroom: CAI and CMI.

    Science.gov (United States)

    Thomas, David B.; Bozeman, William C.

    1981-01-01

    The availability of compact, low-cost computer systems provides a means of assisting classroom teachers in the performance of their duties. Computer-assisted instruction (CAI) and computer-managed instruction (CMI) are two applications of computer technology with which school administrators should become familiar. CAI is a teaching medium in which…

  18. Computer systems: What the future holds

    Science.gov (United States)

    Stone, H. S.

    1976-01-01

Development of computer architecture is discussed in terms of the proliferation of the microprocessor, the utility of the medium-scale computer, and the sheer computational power of the large-scale machine. Changes in applications brought about by ever-lowering costs, smaller sizes, and faster switching times are also discussed.

  19. Battlefield awareness computers: the engine of battlefield digitization

    Science.gov (United States)

    Ho, Jackson; Chamseddine, Ahmad

    1997-06-01

To modernize the army for the 21st century, the U.S. Army Digitization Office (ADO) initiated in 1995 the Force XXI Battle Command Brigade-and-Below (FBCB2) Applique program which became a centerpiece in the U.S. Army's master plan to win future information wars. The Applique team led by TRW fielded a 'tactical Internet' for Brigade and below command to demonstrate the advantages of 'shared situation awareness' and battlefield digitization in advanced war-fighting experiments (AWE) to be conducted in March 1997 at the Army's National Training Center in California. Computing Devices is designated the primary hardware developer for the militarized version of the battlefield awareness computers. The first generation of militarized battlefield awareness computer, designated as the V3 computer, was an integration of off-the-shelf components developed to meet the aggressive delivery requirements of the Task Force XXI AWE. The design efficiency and cost effectiveness of the computer hardware were secondary in importance to delivery deadlines imposed by the March 1997 AWE. However, declining defense budgets will impose cost constraints on the Force XXI production hardware that can only be met by rigorous value engineering to further improve design optimization for battlefield awareness without compromising the level of reliability the military has come to expect in modern military hardened vetronics. To answer the Army's needs for a more cost effective computing solution, Computing Devices developed a second generation 'combat ready' battlefield awareness computer, designated the V3+, which is designed specifically to meet the upcoming demands of Force XXI (FBCB2) and beyond. The primary design objective is to achieve a technologically superior design, value engineered to strike an optimal balance between reliability, life cycle cost, and procurement cost. Recognizing that the diverse digitization demands of Force XXI cannot be adequately met by any one computer hardware

  20. Land cover mapping of Greater Mesoamerica using MODIS data

    Science.gov (United States)

    Giri, Chandra; Jenkins, Clinton N.

    2005-01-01

A new land cover database of Greater Mesoamerica has been prepared using moderate resolution imaging spectroradiometer (MODIS, 500 m resolution) satellite data. Daily surface reflectance MODIS data and a suite of ancillary data were used in preparing the database by employing a decision tree classification approach. The new land cover data are an improvement over traditional advanced very high resolution radiometer (AVHRR) based land cover data in terms of both spatial and thematic details. The dominant land cover type in Greater Mesoamerica is forest (39%), followed by shrubland (30%) and cropland (22%). Country analysis shows forest as the dominant land cover type in Belize (62%), Costa Rica (52%), Guatemala (53%), Honduras (56%), Nicaragua (53%), and Panama (48%), cropland as the dominant land cover type in El Salvador (60.5%), and shrubland as the dominant land cover type in Mexico (37%). A three-step approach was used to assess the quality of the classified land cover data: (i) qualitative assessment provided good insight in identifying and correcting gross errors; (ii) correlation analysis of MODIS- and Landsat-derived land cover data revealed strong positive association for forest (r2 = 0.88), shrubland (r2 = 0.75), and cropland (r2 = 0.97) but weak positive association for grassland (r2 = 0.26); and (iii) an error matrix generated using unseen training data provided an overall accuracy of 77.3% with a Kappa coefficient of 0.73608. Overall, MODIS 500 m data and the methodology used were found to be quite useful for broad-scale land cover mapping of Greater Mesoamerica.

  1. Cloud computing for comparative genomics with windows azure platform.

    Science.gov (United States)

    Kim, Insik; Jung, Jae-Yoon; Deluca, Todd F; Nelson, Tristan H; Wall, Dennis P

    2012-01-01

Cloud computing services have emerged as a cost-effective alternative to cluster systems as the number of genomes, and the computational power required to analyze them, have increased in recent years. Here we introduce the Microsoft Azure platform with detailed execution steps and a cost comparison with Amazon Web Services.

  2. Time Domain Partitioning of Electricity Production Cost Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Barrows, C. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Hummon, M. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Jones, W. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Hale, E. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2014-01-01

    Production cost models are often used for planning by simulating power system operations over long time horizons. The simulation of a day-ahead energy market can take several weeks to compute. Tractability improvements are often made through model simplifications, such as: reductions in transmission modeling detail, relaxation of commitment variable integrality, reductions in cost modeling detail, etc. One common simplification is to partition the simulation horizon so that weekly or monthly horizons can be simulated in parallel. However, horizon partitions are often executed with overlap periods of arbitrary and sometimes zero length. We calculate the time domain persistence of historical unit commitment decisions to inform time domain partitioning of production cost models. The results are implemented using PLEXOS production cost modeling software in an HPC environment to improve the computation time of simulations while maintaining solution integrity.
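
The overlap question studied here is easy to make concrete: each parallel window re-simulates some hours of its predecessor so that unit-commitment state can settle before results are kept. A hypothetical partitioning helper (the function and its parameters are illustrative, not part of PLEXOS):

```python
def partition_horizon(n_hours: int, window: int, overlap: int):
    """Yield (start, end) hour ranges; consecutive windows share
    `overlap` hours whose results are discarded from the later window."""
    step = window - overlap
    if step <= 0:
        raise ValueError("window must exceed overlap")
    start = 0
    while start < n_hours:
        yield start, min(start + window, n_hours)
        start += step

# One year in roughly month-long windows with two days of overlap:
parts = list(partition_horizon(8760, 730, 48))
```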

  3. Cost-effectiveness of clinical decision support system in improving maternal health care in Ghana.

    Directory of Open Access Journals (Sweden)

    Maxwell Ayindenaba Dalaba

This paper investigated the cost-effectiveness of a computer-assisted Clinical Decision Support System (CDSS) in the identification of maternal complications in Ghana. A cost-effectiveness analysis was performed in a before- and after-intervention study. Analysis was conducted from the provider's perspective. The intervention area was the Kassena-Nankana district, where computer-assisted CDSS was used by midwives in maternal care in six selected health centres. Six selected health centers in the Builsa district served as the non-intervention group, where the normal Ghana Health Service activities were being carried out. Computer-assisted CDSS increased the detection of pregnancy complications during antenatal care (ANC) in the intervention health centres (before-intervention = 9/1,000 ANC attendances; after-intervention = 12/1,000 ANC attendances; P-value = 0.010). In the intervention health centres, there was a decrease in the number of complications during labour by 1.1%, though the difference was not statistically significant (before-intervention = 107/1,000 labour clients; after-intervention = 96/1,000 labour clients; P-value = 0.305). Also, at the intervention health centres, the average cost per pregnancy complication detected during ANC (cost-effectiveness ratio) decreased from US$17,017.58 (before-intervention) to US$15,207.5 (after-intervention). The incremental cost-effectiveness ratio (ICER) was estimated at US$1,142. Considering only additional costs (cost of computer-assisted CDSS), the cost per pregnancy complication detected was US$285. Computer-assisted CDSS has the potential to identify complications during pregnancy and yield a marginal reduction in labour complications. Implementing computer-assisted CDSS is more costly but more effective in the detection of pregnancy complications compared to routine maternal care, making the decision to implement CDSS complex. Policy makers should, however, be guided by whether the additional benefit is worth the additional cost.
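
The ICER reported above follows the standard definition: extra cost divided by extra effect relative to the comparator. A sketch with hypothetical numbers, not the study's underlying counts:

```python
def icer(cost_new, cost_old, effect_new, effect_old):
    """Incremental cost-effectiveness ratio: additional cost per
    additional unit of effect (here, a complication detected)."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Hypothetical: the intervention detects 12 vs 9 complications per
# 1,000 ANC attendances at some additional cost per 1,000 attendances.
extra_cost_per_1000 = 3426.0            # illustrative figure only
print(icer(extra_cost_per_1000, 0.0, 12, 9))   # 1142.0 per extra detection
```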

  4. Cost-effective pediatric head and body phantoms for computed tomography dosimetry and its evaluation using pencil ion chamber and CT dose profiler

    Directory of Open Access Journals (Sweden)

    A Saravanakumar

    2015-01-01

In the present work, a pediatric head and body phantom was fabricated using polymethyl methacrylate (PMMA) at a low cost compared to commercially available phantoms for the purpose of computed tomography (CT) dosimetry. The dimensions of the head and body phantoms were 10 cm diameter, 15 cm length and 16 cm diameter, 15 cm length, respectively. The doses received by the head and body phantoms at the center and periphery from a 128-slice CT machine were measured using a 100 mm pencil ion chamber and a 150 mm CT dose profiler (CTDP). Using these values, the weighted computed tomography dose index (CTDIw) and in turn the volumetric CTDI (CTDIv) were calculated for various combinations of tube voltage and current-time product. A similar study was carried out using a standard calibrated phantom and the results were compared with those of the fabricated phantoms to ascertain that the performance of the latter is equivalent to that of the former. Finally, CTDIv measured using the fabricated and standard phantoms was compared with the respective values displayed on the console. The difference between the values was well within the limits specified by the Atomic Energy Regulatory Board (AERB), India. These results indicate that the cost-effective pediatric phantom can be employed for CT dosimetry.
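
The weighted and volumetric indices mentioned follow the conventional CT dosimetry definitions, combining the centre and periphery measurements and correcting for pitch:

```latex
\[
  \mathrm{CTDI}_w = \tfrac{1}{3}\,\mathrm{CTDI}_{100,\mathrm{centre}}
                  + \tfrac{2}{3}\,\mathrm{CTDI}_{100,\mathrm{periphery}},
  \qquad
  \mathrm{CTDI}_{\mathrm{vol}} = \frac{\mathrm{CTDI}_w}{\mathrm{pitch}}
\]
```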

  5. ResourceGate: A New Solution for Cloud Computing Resource Allocation

    OpenAIRE

    Abdullah A. Sheikh

    2012-01-01

Cloud computing has become a focus of educational and business communities. Their concerns include improving the quality of service (QoS) provided, as well as reliability and performance, while reducing costs. Cloud computing provides many benefits in terms of low cost and accessibility of data. Ensuring these benefits is considered to be the major factor in the cloud computing environment. This paper surveys recent research related to cloud computing resource al...

  6. ANL statement of site strategy for computing workstations

    Energy Technology Data Exchange (ETDEWEB)

    Fenske, K.R. (ed.); Boxberger, L.M.; Amiot, L.W.; Bretscher, M.E.; Engert, D.E.; Moszur, F.M.; Mueller, C.J.; O' Brien, D.E.; Schlesselman, C.G.; Troyer, L.J.

    1991-11-01

    This Statement of Site Strategy describes the procedure at Argonne National Laboratory for defining, acquiring, using, and evaluating scientific and office workstations and related equipment and software in accord with DOE Order 1360.1A (5-30-85), and Laboratory policy. It is Laboratory policy to promote the installation and use of computing workstations to improve productivity and communications for both programmatic and support personnel, to ensure that computing workstations acquisitions meet the expressed need in a cost-effective manner, and to ensure that acquisitions of computing workstations are in accord with Laboratory and DOE policies. The overall computing site strategy at ANL is to develop a hierarchy of integrated computing system resources to address the current and future computing needs of the laboratory. The major system components of this hierarchical strategy are: Supercomputers, Parallel computers, Centralized general purpose computers, Distributed multipurpose minicomputers, and Computing workstations and office automation support systems. Computing workstations include personal computers, scientific and engineering workstations, computer terminals, microcomputers, word processing and office automation electronic workstations, and associated software and peripheral devices costing less than $25,000 per item.

  7. 19 CFR 152.106 - Computed value.

    Science.gov (United States)

    2010-04-01

    ... 19 Customs Duties 2 2010-04-01 2010-04-01 false Computed value. 152.106 Section 152.106 Customs... (CONTINUED) CLASSIFICATION AND APPRAISEMENT OF MERCHANDISE Valuation of Merchandise § 152.106 Computed value. (a) Elements. The computed value of imported merchandise is the sum of: (1) The cost or value of the...

  8. Greater power and computational efficiency for kernel-based association testing of sets of genetic variants.

    Science.gov (United States)

    Lippert, Christoph; Xiang, Jing; Horta, Danilo; Widmer, Christian; Kadie, Carl; Heckerman, David; Listgarten, Jennifer

    2014-11-15

Set-based variance component tests have been identified as a way to increase power in association studies by aggregating weak individual effects. However, the choice of test statistic has been largely ignored even though it may play an important role in obtaining optimal power. We compared a standard statistical test (a score test) with a recently developed likelihood ratio (LR) test. Further, when correction for hidden structure is needed, or gene-gene interactions are sought, state-of-the-art algorithms for both the score and LR tests can be computationally impractical. Thus we develop new computationally efficient methods. After reviewing theoretical differences in performance between the score and LR tests, we find empirically on real data that the LR test generally has more power. In particular, on 15 of 17 real datasets, the LR test yielded at least as many associations as the score test (up to 23 more associations), whereas the score test yielded at most one more association than the LR test in the two remaining datasets. On synthetic data, we find that the LR test yielded up to 12% more associations, consistent with our results on real data, but also observe a regime of extremely small signal where the score test yielded up to 25% more associations than the LR test, consistent with theory. Finally, our computational speedups now enable (i) efficient LR testing when the background kernel is full rank, and (ii) efficient score testing when the background kernel changes with each test, as for gene-gene interaction tests. The latter yielded a factor of 2000 speedup on a cohort of size 13,500. Software available at http://research.microsoft.com/en-us/um/redmond/projects/MSCompBio/Fastlmm/. heckerma@microsoft.com Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.

  9. Cost optimization for buildings with hybrid ventilation systems

    Science.gov (United States)

    Ji, Kun; Lu, Yan

    2018-02-13

    A method including: computing a total cost for a first zone in a building, wherein the total cost is equal to an actual energy cost of the first zone plus a thermal discomfort cost of the first zone; and heuristically optimizing the total cost to identify temperature setpoints for a mechanical heating/cooling system and a start time and an end time of the mechanical heating/cooling system, based on external weather data and occupancy data of the first zone.
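
The claimed method reduces to minimizing energy cost plus a discomfort penalty over setpoints and system on/off times. A toy exhaustive-search sketch follows; the two cost functions are invented stand-ins, since the patent's models depend on external weather and occupancy data:

```python
import itertools

def energy_cost(setpoint, start, end):
    return 0.15 * (end - start) * max(0.0, 22.0 - setpoint)   # invented model

def discomfort_cost(setpoint, start, end, occupied):
    covered = len(occupied & set(range(start, end)))
    return 2.0 * (len(occupied) - covered) + 0.5 * abs(21.0 - setpoint) * covered

def optimize(setpoints, hours, occupied):
    """Heuristic optimization by exhaustive search over candidate
    setpoints and mechanical-system start/end times."""
    grid = ((sp, t0, t1) for sp, t0, t1
            in itertools.product(setpoints, hours, hours) if t0 < t1)
    return min(grid, key=lambda c: energy_cost(*c) + discomfort_cost(*c, occupied))

best = optimize([19, 20, 21, 22], range(24), occupied=set(range(8, 18)))
```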

  10. Evaluating biomechanics of user-selected sitting and standing computer workstation.

    Science.gov (United States)

    Lin, Michael Y; Barbir, Ana; Dennerlein, Jack T

    2017-11-01

A standing computer workstation has become a popular workplace intervention to reduce sedentary behavior at work. However, users' interaction with a standing computer workstation, and its differences from a sitting workstation, need to be understood to assist in developing recommendations for use and setup. The study compared differences in upper extremity posture and muscle activity between user-selected sitting and standing workstation setups. Twenty participants (10 females, 10 males) volunteered for the study. 3-D posture, surface electromyography, and user-reported discomfort were measured while completing simulated tasks with each participant's self-selected workstation setups. The sitting workstation was associated with more non-neutral shoulder postures and greater shoulder muscle activity, while the standing workstation induced a greater wrist adduction angle and greater extensor carpi radialis muscle activity. The sitting workstation was also associated with greater variation in shoulder abduction posture (90th-10th percentile), while the standing workstation was associated with greater variation in shoulder rotation and wrist extension. Users reported similar overall discomfort levels within the first 10 min of work but more than twice as much discomfort while standing than sitting after 45 min, with most discomfort reported in the low back for standing and the shoulder for sitting. These measures illuminate users' different interactions with sitting and standing setups; alternating between the two configurations in short bouts may be a way of varying the loading pattern on the upper extremity. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. The reliability of running economy expressed as oxygen cost and energy cost in trained distance runners.

    Science.gov (United States)

    Shaw, Andrew J; Ingham, Stephen A; Fudge, Barry W; Folland, Jonathan P

    2013-12-01

    This study assessed the between-test reliability of oxygen cost (OC) and energy cost (EC) in distance runners, and contrasted it with the smallest worthwhile change (SWC) of these measures. OC and EC displayed similar levels of within-subject variation (typical error < 3.85%). However, the typical error (2.75% vs 2.74%) was greater than the SWC (1.38% vs 1.71%) for both OC and EC, respectively, indicating insufficient sensitivity to confidently detect small, but meaningful, changes in OC and EC.

  12. Costs associated with implementation of computer-assisted clinical decision support system for antenatal and delivery care: case study of Kassena-Nankana district of northern Ghana.

    Directory of Open Access Journals (Sweden)

    Maxwell Ayindenaba Dalaba

This study analyzed the cost of implementing a computer-assisted Clinical Decision Support System (CDSS) in selected health care centres in Ghana. A descriptive cross sectional study was conducted in the Kassena-Nankana district (KND). CDSS was deployed in selected health centres in KND as an intervention to manage patients attending antenatal clinics and the labour ward. The CDSS users were mainly nurses who were trained. Activities and associated costs involved in the implementation of CDSS (pre-intervention and intervention) were collected for the period between 2009 and 2013 from the provider perspective. The ingredients approach was used for the cost analysis. Costs were grouped into personnel, trainings, overheads (recurrent costs) and equipment costs (capital cost). We calculated cost without annualizing capital cost to represent financial cost and cost with annualizing capital costs to represent economic cost. Twenty-two trained CDSS users (at least 2 users per health centre) participated in the study. Between April 2012 and March 2013, users managed 5,595 antenatal clients and 872 labour clients using the CDSS. We observed a decrease in the proportion of complications during delivery (pre-intervention 10.74% versus post-intervention 9.64%) and a reduction in the number of maternal deaths (pre-intervention 4 deaths versus post-intervention 1 death). The overall financial cost of CDSS implementation was US$23,316, approximately US$1,060 per CDSS user trained. Of the total cost of implementation, 48% (US$11,272) was pre-intervention cost and 52% (US$12,044) was intervention cost. Equipment costs accounted for the largest proportion of financial cost: 34% (US$7,917). When economic cost was considered, the total cost of implementation was US$17,128, lower than the financial cost by 26.5%. The study provides useful information on the implementation of CDSS at health facilities to enhance health workers' adherence to practice guidelines and taking accurate decisions to

  13. Costs associated with implementation of computer-assisted clinical decision support system for antenatal and delivery care: case study of Kassena-Nankana district of northern Ghana.

    Science.gov (United States)

    Dalaba, Maxwell Ayindenaba; Akweongo, Patricia; Williams, John; Saronga, Happiness Pius; Tonchev, Pencho; Sauerborn, Rainer; Mensah, Nathan; Blank, Antje; Kaltschmidt, Jens; Loukanova, Svetla

    2014-01-01

This study analyzed the cost of implementing a computer-assisted Clinical Decision Support System (CDSS) in selected health care centres in Ghana. A descriptive cross sectional study was conducted in the Kassena-Nankana district (KND). CDSS was deployed in selected health centres in KND as an intervention to manage patients attending antenatal clinics and the labour ward. The CDSS users were mainly nurses who were trained. Activities and associated costs involved in the implementation of CDSS (pre-intervention and intervention) were collected for the period between 2009 and 2013 from the provider perspective. The ingredients approach was used for the cost analysis. Costs were grouped into personnel, trainings, overheads (recurrent costs) and equipment costs (capital cost). We calculated cost without annualizing capital cost to represent financial cost and cost with annualizing capital costs to represent economic cost. Twenty-two trained CDSS users (at least 2 users per health centre) participated in the study. Between April 2012 and March 2013, users managed 5,595 antenatal clients and 872 labour clients using the CDSS. We observed a decrease in the proportion of complications during delivery (pre-intervention 10.74% versus post-intervention 9.64%) and a reduction in the number of maternal deaths (pre-intervention 4 deaths versus post-intervention 1 death). The overall financial cost of CDSS implementation was US$23,316, approximately US$1,060 per CDSS user trained. Of the total cost of implementation, 48% (US$11,272) was pre-intervention cost and intervention cost was 52% (US$12,044). Equipment costs accounted for the largest proportion of financial cost: 34% (US$7,917). When economic cost was considered, the total cost of implementation was US$17,128, lower than the financial cost by 26.5%. The study provides useful information on the implementation of CDSS at health facilities to enhance health workers' adherence to practice guidelines and taking accurate decisions to improve

  14. The Determinants of Costs and Length of Stay for Hip Fracture Patients

    Science.gov (United States)

    Castelli, Adriana; Daidone, Silvio; Jacobs, Rowena; Kasteridis, Panagiotis; Street, Andrew David

    2015-01-01

    Background and Purpose An ageing population at greater risk of proximal femoral fracture places an additional clinical and financial burden on hospital and community medical services. We analyse the variation in i) length of stay (LoS) in hospital and ii) costs across the acute care pathway for hip fracture from emergency admission, to hospital stay and follow-up outpatient appointments. Patients and Methods We analyse patient-level data from England for 2009/10 for around 60,000 hip fracture cases in 152 hospitals using a random effects generalized linear multi-level model where the dependent variable is given by the patient’s cost or length of stay (LoS). We control for socio-economic characteristics, type of fracture and intervention, co-morbidities, discharge destination of patients, and quality indicators. We also control for provider and social care characteristics. Results Older patients and those from more deprived areas have higher costs and LoS, as do those with specific co-morbidities or that develop pressure ulcers, and those transferred between hospitals or readmitted within 28 days. Costs are also higher for those having a computed tomography (CT) scan or cemented arthroplasty. Costs and LoS are lower for those admitted via a 24h emergency department, receiving surgery on the same day of admission, and discharged to their own homes. Interpretation Patient and treatment characteristics are more important as determinants of cost and LoS than provider or social care factors. A better understanding of the impact of these characteristics can support providers to develop treatment strategies and pathways to better manage this patient population. PMID:26204450

  15. Finding New Math Identities by Computer

    Science.gov (United States)

    Bailey, David H.; Chancellor, Marisa K. (Technical Monitor)

    1996-01-01

    Recently a number of interesting new mathematical identities have been discovered by means of numerical searches on high performance computers, using some newly discovered algorithms. These include the following: $$\pi = \sum_{k=0}^{\infty} \frac{1}{16^k}\left(\frac{4}{8k+1} - \frac{2}{8k+4} - \frac{1}{8k+5} - \frac{1}{8k+6}\right)$$ and $$\frac{17\pi^4}{360} = \sum_{k=1}^{\infty}\left(1 + \frac{1}{2} + \frac{1}{3} + \cdots + \frac{1}{k}\right)^2 k^{-2},$$ $$\zeta(3,1,3,1,\ldots,3,1) = \frac{2\pi^{4m}}{(4m+2)!}$$ where $m$ is the number of (3,1) pairs, and where $\zeta(n_1,n_2,\ldots,n_r) = \sum_{k_1 > k_2 > \cdots > k_r} \frac{1}{k_1^{n_1} k_2^{n_2} \cdots k_r^{n_r}}$. The first identity is remarkable in that it permits one to compute the n-th binary or hexadecimal digit of pi directly, without computing any of the previous digits, and without using multiple precision arithmetic. Recently the ten billionth hexadecimal digit of pi was computed using this formula. The third identity has connections to quantum field theory. (The first and second of these have been formally established; the third is affirmed by numerical evidence only.) The background and results of this work will be described, including an overview of the algorithms and computer techniques used in these studies.
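    The first identity above is the Bailey-Borwein-Plouffe (BBP) formula. As a minimal Python sketch of how it yields individual hexadecimal digits of pi, using the standard modular-exponentiation trick (helper names are illustrative):

        def bbp_series(j, d, tail_terms=25):
            # Fractional part of the sum over k >= 0 of 16^(d-k) / (8k + j).
            s = 0.0
            for k in range(d + 1):  # head terms: modular exponentiation keeps values small
                s = (s + pow(16, d - k, 8 * k + j) / (8 * k + j)) % 1.0
            for k in range(d + 1, d + 1 + tail_terms):  # rapidly vanishing tail
                s += 16.0 ** (d - k) / (8 * k + j)
            return s % 1.0

        def pi_hex_digit(d):
            # Hexadecimal digit of pi at position d + 1 after the point.
            x = (4 * bbp_series(1, d) - 2 * bbp_series(4, d)
                 - bbp_series(5, d) - bbp_series(6, d)) % 1.0
            return "0123456789abcdef"[int(16 * x)]

        print("".join(pi_hex_digit(d) for d in range(8)))  # prints 243f6a88

    No digit before position d is ever computed, and ordinary double precision suffices because only fractional parts are carried through the sums.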

  16. Benchmarking undedicated cloud computing providers for analysis of genomic datasets.

    Science.gov (United States)

    Yazar, Seyhan; Gooden, George E C; Mackey, David A; Hewitt, Alex W

    2014-01-01

    A major bottleneck in biological discovery is now emerging at the computational level. Cloud computing offers a dynamic means whereby small and medium-sized laboratories can rapidly adjust their computational capacity. We benchmarked two established cloud computing services, Amazon Web Services Elastic MapReduce (EMR) on Amazon EC2 instances and Google Compute Engine (GCE), using publicly available genomic datasets (E.coli CC102 strain and a Han Chinese male genome) and a standard bioinformatic pipeline on a Hadoop-based platform. Wall-clock time for complete assembly differed by 52.9% (95% CI: 27.5-78.2) for E.coli and 53.5% (95% CI: 34.4-72.6) for human genome, with GCE being more efficient than EMR. The cost of running this experiment on EMR and GCE differed significantly, with the costs on EMR being 257.3% (95% CI: 211.5-303.1) and 173.9% (95% CI: 134.6-213.1) more expensive for E.coli and human assemblies respectively. Thus, GCE was found to outperform EMR both in terms of cost and wall-clock time. Our findings confirm that cloud computing is an efficient and potentially cost-effective alternative for analysis of large genomic datasets. In addition to releasing our cost-effectiveness comparison, we present available ready-to-use scripts for establishing Hadoop instances with Ganglia monitoring on EC2 or GCE.

  17. Benchmarking undedicated cloud computing providers for analysis of genomic datasets.

    Directory of Open Access Journals (Sweden)

    Seyhan Yazar

    Full Text Available A major bottleneck in biological discovery is now emerging at the computational level. Cloud computing offers a dynamic means whereby small and medium-sized laboratories can rapidly adjust their computational capacity. We benchmarked two established cloud computing services, Amazon Web Services Elastic MapReduce (EMR) on Amazon EC2 instances and Google Compute Engine (GCE), using publicly available genomic datasets (E.coli CC102 strain and a Han Chinese male genome) and a standard bioinformatic pipeline on a Hadoop-based platform. Wall-clock time for complete assembly differed by 52.9% (95% CI: 27.5-78.2) for E.coli and 53.5% (95% CI: 34.4-72.6) for human genome, with GCE being more efficient than EMR. The cost of running this experiment on EMR and GCE differed significantly, with the costs on EMR being 257.3% (95% CI: 211.5-303.1) and 173.9% (95% CI: 134.6-213.1) more expensive for E.coli and human assemblies respectively. Thus, GCE was found to outperform EMR both in terms of cost and wall-clock time. Our findings confirm that cloud computing is an efficient and potentially cost-effective alternative for analysis of large genomic datasets. In addition to releasing our cost-effectiveness comparison, we present available ready-to-use scripts for establishing Hadoop instances with Ganglia monitoring on EC2 or GCE.

  18. A model for calculating the optimal replacement interval of computer systems

    International Nuclear Information System (INIS)

    Fujii, Minoru; Asai, Kiyoshi

    1981-08-01

    A mathematical model for calculating the optimal replacement interval of computer systems is described. The model estimates the most economical replacement interval when the computing demand and the cost and performance of the computers are known. The computing demand is assumed to increase monotonically every year. Four kinds of models are described. In model 1, a computer system is represented by only a central processing unit (CPU) and all the computing demand is to be processed on the present computer until the next replacement. In model 2, on the other hand, excessive demand is admitted and may be transferred to another computing center and processed there at a cost. In model 3, the computer system is represented by a CPU, memories (MEM) and input/output devices (I/O), and it must process all the demand. Model 4 is the same as model 3, but excessive demand is admitted to be processed in another center. (1) Computing demand at the JAERI, (2) conformity of Grosch's law for recent computers, (3) replacement cost of computer systems, etc. are also described. (author)
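    The flavour of model 1 can be sketched in a few lines: choose the interval T that minimizes the average yearly cost of buying, at each replacement, a machine large enough for the demand reached at the end of its life, with the purchase price tied to capacity through Grosch's law. All figures below are hypothetical, not JAERI's data:

        import math

        def avg_annual_cost(T, demand0=100.0, growth=0.20, k=50.0, overhead=200.0):
            # Average yearly cost when the computer is replaced every T years.
            # The machine bought at each replacement must cover the demand reached
            # at the end of its T-year life; following Grosch's law, its price is
            # taken proportional to the square root of that capacity.
            capacity = demand0 * (1.0 + growth) ** T   # demand grows monotonically
            capital = k * math.sqrt(capacity)          # Grosch's law: cost ~ sqrt(performance)
            return (capital + overhead) / T

        best = min(range(1, 16), key=avg_annual_cost)
        print(best, round(avg_annual_cost(best), 1))   # interval with the lowest average cost

    Short intervals pay the fixed replacement overhead too often; long intervals force oversized machines up front, so an interior minimum appears.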

  19. Computer networks and advanced communications

    International Nuclear Information System (INIS)

    Koederitz, W.L.; Macon, B.S.

    1992-01-01

    One of the major methods for getting the most productivity and benefits from computer usage is networking. However, for those who are contemplating a change from stand-alone computers to a network system, the investigation of actual networks in use presents a paradox: network systems can be highly productive and beneficial; at the same time, these networks can create many complex, frustrating problems. The issue becomes a question of whether the benefits of networking are worth the extra effort and cost. In response to this issue, the authors review in this paper the implementation and management of an actual network in the LSU Petroleum Engineering Department. The network, which has been in operation for four years, is large and diverse (50 computers, 2 sites, PC's, UNIX RISC workstations, etc.). The benefits, costs, and method of operation of this network will be described, and an effort will be made to objectively weigh these elements from the point of view of the average computer user

  20. Cost justification of chiller replacement

    International Nuclear Information System (INIS)

    Baker, T.J.; Baumer, R.A.

    1993-01-01

    We often hear of products with paybacks that are too good to be true. Just a few weeks ago, a client received a recommendation from a national service company's local office. In the letter the company recommended that "due to the age and condition of the boiler ... the school consider replacing the boiler... The cost for the new boiler can usually be recovered by lower fuel bills in 2 to 3 years". This was for an installation in Southeast Texas where the boiler is only used 4 to 5 months per year. Analysis shows the above claims to be nonsense. A new boiler would cost about $47,000 installed. Current total gas bills for the facility are $15,630 per year. They would have to shut off the gas to the facility to have a three year payback. In fact, only two-thirds of the gas is used to heat the facility, so we have only $10,000 to write off against the new boiler. How much will the greater efficiency save? A 30% savings due to greater efficiency produces $3,000 per year in gas savings to offset the $47,000 cost, a 16-year payback. And much of the efficiency savings can be realized by adjusting the existing boiler. In another case a client wanted to investigate replacement of a twenty year old chiller plant with more efficient equipment. We investigated the project and determined that the payback would be greater than ten years. They did not operate the equipment during the summer, and ran it at less than 50% of capacity the balance of the year.
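    The arithmetic behind this kind of claim is a simple undiscounted payback: capital cost divided by the annual savings actually attributable to the measure. A sketch reproducing the boiler figures above:

        def simple_payback(capital_cost, annual_savings):
            # Years needed to recover a capital cost from annual savings (no discounting).
            return capital_cost / annual_savings

        heating_gas = 15630 * 2 / 3    # only two-thirds of the gas heats the facility
        savings = 0.30 * heating_gas   # assume a 30% efficiency gain on that share
        print(round(savings), round(simple_payback(47000, savings)))  # ~3126, ~15

    which is the roughly 16-year payback cited above, not the claimed 2 to 3 years.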

  1. 47 CFR 32.2124 - General purpose computers.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 2 2010-10-01 General purpose computers. 32.2124 Section 32... General purpose computers. (a) This account shall include the original cost of computers and peripheral... financial, statistical, or other business analytical reports; preparation of payroll, customer bills, and...

  2. Cost of treating sagittal synostosis in the first year of life.

    Science.gov (United States)

    Abbott, Megan M; Rogers, Gary F; Proctor, Mark R; Busa, Kathleen; Meara, John G

    2012-01-01

    Endoscopically assisted suturectomy (EAS) has been reported to reduce the morbidity and cost of treating sagittal synostosis when compared with traditional open cranial vault remodeling (CVR) procedures. Whereas the former claim is well substantiated and intuitive, the latter has not been validated by rigorous cost analysis. Patient medical records and financial database reports were culled retrospectively to determine the total cost associated with both EAS and CVR during 1 year of care. Recorded cost data included physician and hospital services, orthotic equipment and fittings, and indirect patient cost. Ten patients treated with CVR were compared with 10 patients who underwent EAS. The CVR patients incurred greater costs in nearly all categories studied, including overall 1-year costs, physician services, hospital services, supplies/equipment, medications/intravenous fluids, and laboratory and blood bank services. Postoperative costs were greater in the EAS group, primarily because of the cost associated with orthotic services and indirect patient costs for travel and lost work. However, overall indirect patient costs for the whole year did not differ between the groups. One-year median costs were $55,121 for CVR and $23,377 for EAS. Early clinical results were similar for the 2 groups. Cranial vault remodeling was more costly in the first year of treatment than EAS, although indirect patient costs were similar. The favorable cost of EAS compared with CVR provides further justification to consider this procedure as first-line treatment of sagittal synostosis in young infants.

  3. The Cost-effectiveness of Alcohol Screening, Brief Intervention, and Referral to Treatment (SBIRT) in Emergency and Outpatient Medical Settings.

    Science.gov (United States)

    Barbosa, Carolina; Cowell, Alexander; Bray, Jeremy; Aldridge, Arnie

    2015-06-01

    This study analyzed the cost-effectiveness of delivering alcohol screening, brief intervention, and referral to treatment (SBIRT) in emergency departments (ED) when compared to outpatient medical settings. A probabilistic decision analytic tree categorized patients into health states. Utility weights and social costs were assigned to each health state. Health outcome measures were the proportion of patients not drinking above threshold levels at follow-up, the proportion of patients transitioning from above threshold levels at baseline to abstinent or below threshold levels at follow-up, and the quality-adjusted life years (QALYs) gained. Expected costs under a provider perspective were the marginal costs of SBIRT, and under a societal perspective were the sum of SBIRT cost per patient and the change in social costs. Incremental cost-effectiveness ratios were computed. When considering provider costs only, compared to outpatient, SBIRT in ED cost $8.63 less, generated 0.005 more QALYs per patient, and resulted in 13.8% more patients drinking below threshold levels. Sensitivity analyses in which patients were assumed to receive a fixed number of treatment sessions that met clinical sites' guidelines made SBIRT more expensive in ED than outpatient; the ED remained more effective. In this sensitivity analysis, the ED was the most cost-effective setting if decision makers were willing to pay more than $1500 per QALY gained. Alcohol SBIRT generates costs savings and improves health in both ED and outpatient settings. EDs provide better effectiveness at a lower cost and greater social cost reductions than outpatient. Copyright © 2015 Elsevier Inc. All rights reserved.
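    The incremental cost-effectiveness ratios reported here follow the standard definition: extra cost divided by extra effect. A minimal sketch with illustrative numbers, not the study's data:

        def icer(cost_new, cost_old, effect_new, effect_old):
            # Incremental cost-effectiveness ratio: extra cost per extra unit of effect.
            return (cost_new - cost_old) / (effect_new - effect_old)

        # Illustrative only: an option costing $20 more per patient that yields
        # 0.01 more QALYs gives $2,000 per QALY gained.
        print(round(icer(cost_new=120.0, cost_old=100.0,
                         effect_new=0.015, effect_old=0.005)))  # 2000

    Note that when one option is both cheaper and more effective, as SBIRT in the ED is here under the provider perspective, it dominates and no ratio needs to be reported.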

  4. City-scale analysis of water-related energy identifies more cost-effective solutions.

    Science.gov (United States)

    Lam, Ka Leung; Kenway, Steven J; Lant, Paul A

    2017-02-01

    Energy and greenhouse gas management in urban water systems typically focus on optimising within the direct system boundary of water utilities that covers the centralised water supply and wastewater treatment systems, despite a greater energy influence by the water end use. This work develops a cost curve of water-related energy management options from a city perspective for a hypothetical Australian city. It is compared with that from the water utility perspective. The curves are based on 18 water-related energy management options that have been implemented or evaluated in Australia. In the studied scenario, the cost-effective energy saving potential from a city perspective (292 GWh/year) is far more significant than that from a utility perspective (65 GWh/year). In some cases, for similar capital cost, if regional water planners invested in end use options instead of utility options, a greater energy saving potential at a greater cost-effectiveness could be achieved in urban water systems. For example, upgrading a wastewater treatment plant for biogas recovery at a capital cost of $27.2 million would save 31 GWh/year with a marginal cost saving of $63/MWh, while solar hot water system rebates at a cost of $28.6 million would save 67 GWh/year with a marginal cost saving of $111/MWh. Options related to hot water use such as water-efficient shower heads, water-efficient clothes washers and solar hot water system rebates are among the most cost-effective city-scale opportunities. This study demonstrates the use of cost curves to compare both utility and end use options in a consistent framework. It also illustrates that focusing solely on managing the energy use within the utility would miss substantial non-utility water-related energy saving opportunities. There is a need to broaden the conventional scope of cost curve analysis to include water-related energy and greenhouse gas at the water end use, and to value their management from a city perspective. This
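    A cost curve of this kind is built by ranking options on net annualized cost per MWh of water-related energy saved. A sketch under hypothetical annualized figures, chosen so the first two rows reproduce the -63 and -111 $/MWh marginal cost savings quoted above (negative values are net savings):

        def cost_curve(options):
            # Rank options by net annualized cost per MWh saved (an ascending cost curve).
            for o in sorted(options, key=lambda o: o["net_cost"] / o["mwh_saved"]):
                print(f'{o["name"]:>24}: {o["net_cost"] / o["mwh_saved"]:7.0f} $/MWh, '
                      f'{o["mwh_saved"]:8,.0f} MWh/yr')

        cost_curve([
            {"name": "biogas recovery upgrade", "net_cost": -1_953_000, "mwh_saved": 31_000},
            {"name": "solar hot water rebates", "net_cost": -7_437_000, "mwh_saved": 67_000},
            {"name": "efficient shower heads",  "net_cost": -4_000_000, "mwh_saved": 25_000},
        ])

    Plotting options this way makes the study's point visible at a glance: end-use options can sit further left (cheaper per MWh) and be larger than anything inside the utility boundary.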

  5. Cost-effectiveness of computed tomography colonography in colorectal cancer screening: a systematic review.

    Science.gov (United States)

    Hanly, Paul; Skally, Mairead; Fenlon, Helen; Sharp, Linda

    2012-10-01

    The European Code Against Cancer recommends individuals aged ≥ 50 should participate in colorectal cancer screening. CT-colonography (CTC) is one of several screening tests available. We systematically reviewed evidence on, and identified key factors influencing, cost-effectiveness of CTC screening. PubMed, Medline, and the Cochrane library were searched for cost-effectiveness or cost-utility analyses of CTC-based screening, published in English, January 1999 to July 2010. Data was abstracted on setting, model type and horizon, screening scenario(s), comparator(s), participants, uptake, CTC performance and cost, effectiveness, ICERs, and whether extra-colonic findings and medical complications were considered. Sixteen studies were identified from the United States (n = 11), Canada (n = 2), and France, Italy, and the United Kingdom (1 each). Markov state-transition (n = 14) or microsimulation (n = 2) models were used. Eleven considered direct medical costs only; five included indirect costs. Fourteen compared CTC with no screening; fourteen compared CTC with colonoscopy-based screening; fewer compared CTC with sigmoidoscopy (8) or fecal tests (4). Outcomes assessed were life-years gained/saved (13), QALYs (2), or both (1). Three considered extra-colonic findings; seven considered complications. CTC appeared cost-effective versus no screening and, in general, flexible sigmoidoscopy and fecal occult blood testing. Results were mixed comparing CTC to colonoscopy. Parameters most influencing cost-effectiveness included: CTC costs, screening uptake, threshold for polyp referral, and extra-colonic findings. Evidence on cost-effectiveness of CTC screening is heterogeneous, due largely to between-study differences in comparators and parameter values. Future studies should: compare CTC with currently favored tests, especially fecal immunochemical tests; consider extra-colonic findings; and conduct comprehensive sensitivity analyses.

  6. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    Computer Based ... universities, and later did system analysis, ... personal computers (PC) and low cost software packages and tools. They can serve as useful learning experience through student projects. Models are .... Let us consider a numerical example: to calculate the velocity of a trainer aircraft ...

  7. Ready To Buy a Computer?

    Science.gov (United States)

    Rourke, Martha; Rourke, Patrick

    1974-01-01

    The school district business manager can make sound, cost-conscious decisions in the purchase of computer equipment by developing a list of cost-justified applications for automation, considering the software, writing performance specifications for bidding or negotiating a contract, and choosing the vendor wisely prior to the purchase; and by…

  8. [Cost per capita in outpatients by gender].

    Science.gov (United States)

    Villarreal-Ríos, Enrique; Campos Esparza, Maribel; Galicia Rodríguez, Liliana; Martínez González, Lidia; Vargas Daza, Emma Rosa; Torres Labra, Guadalupe; Patiño Vega, Adolfo; Rivera Martínez, María Teresa; Aparicio Rojas, Raúl; Juárez Durán, Martín

    2011-03-01

    The objective of this study is to identify the annual cost per capita by gender at the first level of attention. It is a cost study conducted in Family Physician Units in Mexico. The information corresponded to the year 2004, and the study was divided into use profile and attention cost. USE PROFILE OF SERVICES: 1,585 clinical records of patients were studied, with the use profile defined by average and attention reasons by department, gender and age group. COST ATTENTION: considered in American dollars, it included fixed unit cost (departmentalization adjusted by productivity), variable unit cost (micro-costing technique), department unit cost by type of attention, and department unit cost by age and gender. The life expectancy was 73 years for men and 78 for women. Three scenarios were identified. The annual cost per capita is higher among women [US$73.24 (95% CI $11.38 - $197.49)] than among men [$53.11 (95% CI 2.51 - 207.71)]. We conclude that at the first level of attention the cost per capita is greater in women than in men.

  9. The Future of Cloud Computing

    Directory of Open Access Journals (Sweden)

    Anamaroa SIclovan

    2011-12-01

    Full Text Available Cloud computing was, and will be, a new way of providing Internet services and computing. This approach builds on many existing services, such as the Internet, grid computing, and Web services. Cloud computing as a system aims to provide on-demand services that are more acceptable in terms of price and infrastructure. It is precisely the transition from the computer to a service offered to consumers as a product delivered online. This represents an advantage for the organization, both regarding cost and as an opportunity for new business. This paper presents future perspectives in cloud computing and discusses some issues of the cloud computing paradigm. It is a theoretical paper. Keywords: Cloud Computing, Pay-per-use

  10. Herd-Level Mastitis-Associated Costs on Canadian Dairy Farms

    Science.gov (United States)

    Aghamohammadi, Mahjoob; Haine, Denis; Kelton, David F.; Barkema, Herman W.; Hogeveen, Henk; Keefe, Gregory P.; Dufour, Simon

    2018-01-01

    Mastitis imposes considerable and recurring economic losses on the dairy industry worldwide. The main objective of this study was to estimate herd-level costs incurred by expenditures and production losses associated with mastitis on Canadian dairy farms in 2015, based on producer reports. Previously published mastitis economic frameworks were used to develop an economic model with the most important cost components. Components investigated were divided between clinical mastitis (CM), subclinical mastitis (SCM), and other costs components (i.e., preventive measures and product quality). A questionnaire was mailed to 374 dairy producers randomly selected from the (Canadian National Dairy Study 2015) to collect data on these costs components, and 145 dairy producers returned a completed questionnaire. For each herd, costs due to the different mastitis-related components were computed by applying the values reported by the dairy producer to the developed economic model. Then, for each herd, a proportion of the costs attributable to a specific component was computed by dividing absolute costs for this component by total herd mastitis-related costs. Median self-reported CM incidence was 19 cases/100 cow-year and mean self-reported bulk milk somatic cell count was 184,000 cells/mL. Most producers reported using post-milking teat disinfection (97%) and dry cow therapy (93%), and a substantial proportion of producers reported using pre-milking teat disinfection (79%) and wearing gloves during milking (77%). Mastitis costs were substantial (662 CAD per milking cow per year for a typical Canadian dairy farm), with a large portion of the costs (48%) being attributed to SCM, and 34 and 15% due to CM and implementation of preventive measures, respectively. For SCM, the two most important cost components were the subsequent milk yield reduction and culling (72 and 25% of SCM costs, respectively). For CM, first, second, and third most important cost components were culling (48

  11. Herd-Level Mastitis-Associated Costs on Canadian Dairy Farms

    Directory of Open Access Journals (Sweden)

    Mahjoob Aghamohammadi

    2018-05-01

    Full Text Available Mastitis imposes considerable and recurring economic losses on the dairy industry worldwide. The main objective of this study was to estimate herd-level costs incurred by expenditures and production losses associated with mastitis on Canadian dairy farms in 2015, based on producer reports. Previously published mastitis economic frameworks were used to develop an economic model with the most important cost components. Components investigated were divided between clinical mastitis (CM), subclinical mastitis (SCM), and other costs components (i.e., preventive measures and product quality). A questionnaire was mailed to 374 dairy producers randomly selected from the (Canadian National Dairy Study 2015) to collect data on these costs components, and 145 dairy producers returned a completed questionnaire. For each herd, costs due to the different mastitis-related components were computed by applying the values reported by the dairy producer to the developed economic model. Then, for each herd, a proportion of the costs attributable to a specific component was computed by dividing absolute costs for this component by total herd mastitis-related costs. Median self-reported CM incidence was 19 cases/100 cow-year and mean self-reported bulk milk somatic cell count was 184,000 cells/mL. Most producers reported using post-milking teat disinfection (97%) and dry cow therapy (93%), and a substantial proportion of producers reported using pre-milking teat disinfection (79%) and wearing gloves during milking (77%). Mastitis costs were substantial (662 CAD per milking cow per year for a typical Canadian dairy farm), with a large portion of the costs (48%) being attributed to SCM, and 34 and 15% due to CM and implementation of preventive measures, respectively. For SCM, the two most important cost components were the subsequent milk yield reduction and culling (72 and 25% of SCM costs, respectively). For CM, first, second, and third most important cost components were

  12. Prototyping low-cost and flexible vehicle diagnostic systems

    Directory of Open Access Journals (Sweden)

    Marisol GARCÍA-VALLS

    2016-12-01

    Full Text Available Diagnostic systems are software- and hardware-based equipment that interoperate with an external monitored system. Traditionally, they have been expensive equipment running test algorithms to monitor the physical properties of, e.g., vehicles or civil infrastructure equipment, among others. As computer hardware becomes increasingly powerful (whereas its cost and size are decreasing) and communication software becomes easier to program and more run-time efficient, new scenarios are enabled that yield lower-cost monitoring solutions. This paper presents a low-cost approach towards the development of a diagnostic system relying on a modular component-based approach and running on a resource-limited embedded computer. Results on a prototype implementation are shown that validate the presented design, its flexibility, performance, and communication latency.

  13. Advanced computer-based training

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, H D; Martin, H D

    1987-05-01

    The paper presents new techniques of computer-based training for personnel of nuclear power plants. Training on full-scope simulators is further increased by use of dedicated computer-based equipment. An interactive communication system runs on a personal computer linked to a video disc; a part-task simulator runs on 32 bit process computers and shows two versions: as functional trainer or as on-line predictor with an interactive learning system (OPAL), which may be well-tailored to a specific nuclear power plant. The common goal of both developments is the optimization of the cost-benefit ratio for training and equipment.

  14. Advanced computer-based training

    International Nuclear Information System (INIS)

    Fischer, H.D.; Martin, H.D.

    1987-01-01

    The paper presents new techniques of computer-based training for personnel of nuclear power plants. Training on full-scope simulators is further increased by use of dedicated computer-based equipment. An interactive communication system runs on a personal computer linked to a video disc; a part-task simulator runs on 32 bit process computers and shows two versions: as functional trainer or as on-line predictor with an interactive learning system (OPAL), which may be well-tailored to a specific nuclear power plant. The common goal of both developments is the optimization of the cost-benefit ratio for training and equipment. (orig.) [de

  15. LPGC, Levelized Steam Electric Power Generator Cost

    International Nuclear Information System (INIS)

    Coen, J.J.; Delene, J.G.

    1994-01-01

    1 - Description of program or function: LPGC is a set of nine microcomputer programs for estimating power generation costs for large steam-electric power plants. These programs permit rapid evaluation using various sets of economic and technical ground rules. The levelized power generation costs calculated may be used to compare the relative economics of nuclear and coal-fired plants based on life-cycle costs. Cost calculations include capital investment cost, operation and maintenance cost, fuel cycle cost, decommissioning cost, and total levelized power generation cost. These programs can be used for quick analyses of power generation costs using alternative economic parameters, such as interest rate, escalation rate, inflation rate, plant lead times, capacity factor, fuel prices, etc. The two major types of electric generating plants considered are pressurized-water reactor (PWR) and pulverized coal-fired plants. Data are also provided for the Large Scale Prototype Breeder (LSPB) type liquid metal reactor. Costs for plants having either one or two units may be obtained. 2 - Method of solution: LPGC consists of nine individual menu-driven programs controlled by a driver program, MAINPWR. The individual programs are PLANTCAP, for calculating capital investment costs; NUCLOM, for determining operation and maintenance (O and M) costs for nuclear plants; COALOM, for computing O and M costs for coal-fired plants; NFUEL, for calculating levelized fuel costs for nuclear plants; COALCOST, for determining levelized fuel costs for coal-fired plants; FCRATE, for computing the fixed charge rate on the capital investment; LEVEL, for calculating levelized power generation costs; CAPITAL, for determining capitalized cost from overnight cost; and MASSGEN, for generating, deleting, or changing fuel cycle mass balance data for use with NFUEL. LPGC has three modes of operation. In the first, each individual code can be executed independently to determine one aspect of the total
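    The levelization step that LEVEL performs can be sketched, in much simplified form, as a fixed charge rate applied to capital plus O and M and fuel, spread over annual generation. This is not the ORNL code itself, and all plant figures below are hypothetical:

        def levelized_cost(overnight_cost, fixed_charge_rate, om_per_year,
                           fuel_per_year, capacity_mw, capacity_factor):
            # Levelized power generation cost in $/MWh (simplified single-year form).
            annual_mwh = capacity_mw * capacity_factor * 8760.0
            annual_cost = overnight_cost * fixed_charge_rate + om_per_year + fuel_per_year
            return annual_cost / annual_mwh

        # Hypothetical plant: $3.0B capital, 10% fixed charge rate, 1100 MW at 80%.
        print(round(levelized_cost(3.0e9, 0.10, 9.0e7, 5.0e7, 1100.0, 0.80), 1))  # 57.1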

  16. Impact of generic alendronate cost on the cost-effectiveness of osteoporosis screening and treatment.

    Directory of Open Access Journals (Sweden)

    Smita Nayak

    Full Text Available Since alendronate became available in generic form in the United States in 2008, its price has been decreasing. The objective of this study was to investigate the impact of alendronate cost on the cost-effectiveness of osteoporosis screening and treatment in postmenopausal women. Microsimulation cost-effectiveness model of osteoporosis screening and treatment for U.S. women age 65 and older. We assumed screening initiation at age 65 with central dual-energy x-ray absorptiometry (DXA), and alendronate treatment for individuals with osteoporosis; with a comparator of "no screening" and treatment only after fracture occurrence. We evaluated annual alendronate costs of $20 through $800; outcome measures included fractures; nursing home admission; medication adverse events; death; costs; quality-adjusted life-years (QALYs); and incremental cost-effectiveness ratios (ICERs) in 2010 U.S. dollars per QALY gained. A lifetime time horizon was used, and direct costs were included. Base-case and sensitivity analyses were performed. Base-case analysis results showed that at annual alendronate costs of $200 or less, osteoporosis screening followed by treatment was cost-saving, resulting in lower total costs than no screening as well as more QALYs (10.6 additional quality-adjusted life-days). When assuming alendronate costs of $400 through $800, screening and treatment resulted in greater lifetime costs than no screening but was highly cost-effective, with ICERs ranging from $714 per QALY gained through $13,902 per QALY gained. Probabilistic sensitivity analyses revealed that the cost-effectiveness of osteoporosis screening followed by alendronate treatment was robust to joint input parameter estimate variation at a willingness-to-pay threshold of $50,000/QALY at all alendronate costs evaluated. Osteoporosis screening followed by alendronate treatment is effective and highly cost-effective for postmenopausal women across a range of alendronate costs, and may be cost

  17. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... images can be viewed on a computer monitor, printed on film or transferred to a CD or DVD. CT images of internal organs, bones, soft tissue and blood vessels provide greater detail ...

  18. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... images can be viewed on a computer monitor, printed on film or transferred to a CD or DVD. CT images of internal organs, bones, soft tissue and blood vessels provide greater detail ...

  19. Trouble Sleeping Associated with Lower Work Performance and Greater Healthcare Costs: Longitudinal Data from Kansas State Employee Wellness Program

    Science.gov (United States)

    Hui, Siu-kuen Azor; Grandner, Michael A.

    2015-01-01

    Objective To examine the relationships between employees’ trouble sleeping and absenteeism, work performance, and healthcare expenditures over a two-year period. Methods Utilizing the Kansas State employee wellness program (EWP) dataset from 2008-2009, multinomial logistic regression analyses were conducted with trouble sleeping as the predictor and absenteeism, work performance, and healthcare costs as the outcomes. Results EWP participants (N=11,698 in 2008; 5,636 followed up in 2009) who had higher levels of sleep disturbance were more likely to be absent from work, received worse work performance ratings, and incurred greater healthcare costs. PMID:26461857

  20. 48 CFR 1602.170-5 - Cost or pricing data.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 6 2010-10-01 Cost or pricing data. 1602... Terms 1602.170-5 Cost or pricing data. (a) Experience-rated carriers. Cost or pricing data for... pricing data for community rated carriers is the specialized rating data used by carriers in computing a...

  1. PET-CT in oncological patients: analysis of informal care costs in cost-benefit assessment.

    Science.gov (United States)

    Orlacchio, Antonio; Ciarrapico, Anna Micaela; Schillaci, Orazio; Chegai, Fabrizio; Tosti, Daniela; D'Alba, Fabrizio; Guazzaroni, Manlio; Simonetti, Giovanni

    2014-04-01

    The authors analysed the impact of nonmedical costs (travel, loss of productivity) in an economic analysis of PET-CT (positron-emission tomography-computed tomography) performed with standard contrast-enhanced CT protocols (CECT). From October to November 2009, a total of 100 patients referred to our institute were administered a questionnaire to evaluate the nonmedical costs of PET-CT. In addition, the medical costs (equipment maintenance and depreciation, consumables and staff) related to PET-CT performed with CECT and PET-CT with low-dose nonenhanced CT and separate CECT were also estimated. The medical costs were 919.3 euro for PET-CT with separate CECT, and 801.3 euro for PET-CT with CECT. Therefore, savings of approximately 13% are possible. Moreover, savings in nonmedical costs can be achieved by reducing the number of hospital visits required by patients undergoing diagnostic imaging. Nonmedical costs heavily affect patients' finances as well as having an indirect impact on national health expenditure. Our results show that PET-CT performed with standard dose CECT in a single session provides benefits in terms of both medical and nonmedical costs.

  2. A low-cost computer-controlled Arduino-based educational laboratory system for teaching the fundamentals of photovoltaic cells

    International Nuclear Information System (INIS)

    Zachariadou, K; Yiasemides, K; Trougkakos, N

    2012-01-01

    We present a low-cost, fully computer-controlled, Arduino-based, educational laboratory (SolarInsight) to be used in undergraduate university courses concerned with electrical engineering and physics. The major goal of the system is to provide students with the necessary instrumentation, software tools and methodology in order to learn fundamental concepts of semiconductor physics by exploring the process of an experimental physics inquiry. The system runs under the Windows operating system and is composed of a data acquisition/control board, a power supply and processing boards, sensing elements, a graphical user interface and data analysis software. The data acquisition/control board is based on the Arduino open source electronics prototyping platform. The graphical user interface and communication with the Arduino are developed in C# and C++ programming languages respectively, by using IDE Microsoft Visual Studio 2010 Professional, which is freely available to students. Finally, the data analysis is performed by using the open source, object-oriented framework ROOT. Currently the system supports five teaching activities, each one corresponding to an independent tab in the user interface. SolarInsight has been partially developed in the context of a diploma thesis conducted within the Technological Educational Institute of Piraeus under the co-supervision of the Physics and Electronic Computer Systems departments’ academic staff. (paper)

  3. A low-cost computer-controlled Arduino-based educational laboratory system for teaching the fundamentals of photovoltaic cells

    Energy Technology Data Exchange (ETDEWEB)

    Zachariadou, K; Yiasemides, K; Trougkakos, N [Technological Educational Institute of Piraeus, P Ralli and Thivon 250, 12244 Egaleo (Greece)

    2012-11-15

    We present a low-cost, fully computer-controlled, Arduino-based, educational laboratory (SolarInsight) to be used in undergraduate university courses concerned with electrical engineering and physics. The major goal of the system is to provide students with the necessary instrumentation, software tools and methodology in order to learn fundamental concepts of semiconductor physics by exploring the process of an experimental physics inquiry. The system runs under the Windows operating system and is composed of a data acquisition/control board, a power supply and processing boards, sensing elements, a graphical user interface and data analysis software. The data acquisition/control board is based on the Arduino open source electronics prototyping platform. The graphical user interface and communication with the Arduino are developed in C# and C++ programming languages respectively, by using IDE Microsoft Visual Studio 2010 Professional, which is freely available to students. Finally, the data analysis is performed by using the open source, object-oriented framework ROOT. Currently the system supports five teaching activities, each one corresponding to an independent tab in the user interface. SolarInsight has been partially developed in the context of a diploma thesis conducted within the Technological Educational Institute of Piraeus under the co-supervision of the Physics and Electronic Computer Systems departments' academic staff. (paper)

  4. Efficient physical embedding of topologically complex information processing networks in brains and computer circuits.

    Directory of Open Access Journals (Sweden)

    Danielle S Bassett

    2010-04-01

    Full Text Available Nervous systems are information processing networks that evolved by natural selection, whereas very large scale integrated (VLSI) computer circuits have evolved by commercially driven technology development. Here we follow historic intuition that all physical information processing systems will share key organizational properties, such as modularity, that generally confer adaptivity of function. It has long been observed that modular VLSI circuits demonstrate an isometric scaling relationship between the number of processing elements and the number of connections, known as Rent's rule, which is related to the dimensionality of the circuit's interconnect topology and its logical capacity. We show that human brain structural networks, and the nervous system of the nematode C. elegans, also obey Rent's rule, and exhibit some degree of hierarchical modularity. We further show that the estimated Rent exponent of human brain networks, derived from MRI data, can explain the allometric scaling relations between gray and white matter volumes across a wide range of mammalian species, again suggesting that these principles of nervous system design are highly conserved. For each of these fractal modular networks, the dimensionality of the interconnect topology was greater than the 2 or 3 Euclidean dimensions of the space in which it was embedded. This relatively high complexity entailed extra cost in physical wiring: although all networks were economically or cost-efficiently wired they did not strictly minimize wiring costs. Artificial and biological information processing systems both may evolve to optimize a trade-off between physical cost and topological complexity, resulting in the emergence of homologous principles of economical, fractal and modular design across many different kinds of nervous and computational networks.
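    Rent's rule, as used in this record, is conventionally written as the empirical power law

        T = t \, g^{p}, \qquad 0 \le p \le 1,

    where $g$ is the number of processing elements in a module (gates, or here neurons and grey-matter regions), $T$ the number of that module's external connections, $t$ the average number of terminals per element, and $p$ the Rent exponent, which the study estimates and relates to interconnect dimensionality and wiring cost.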

  5. Yankee links computing needs, increases productivity

    International Nuclear Information System (INIS)

    Anon.

    1994-01-01

    Yankee Atomic Electric Company provides design and consultation services to electric utility companies that operate nuclear power plants. This means bringing together the skills and talents of more than 500 people in many disciplines, including computer-aided design, human resources, financial services, and nuclear engineering. The company was facing a problem familiar to many companies in the nuclear industry. Key corporate data and applications resided on UNIX or other types of computer systems, but most users at Yankee had personal computers on their desks. How could Yankee enable the PC users to share the data, applications, and resources of the larger computing environment such as UNIX, while ensuring they could still use their favorite PC applications? The solution was PC-NFS from Sunsoft, of Chelmsford, Mass., which links PCs to UNIX and other systems. The Yankee computing story is an example of computer downsizing: the trend of moving away from mainframe computers in favor of lower-cost, more flexible client/server computing. Today, Yankee Atomic has more than 350 PCs on desktops throughout the company, using PC-NFS, which enables them to use the data, applications, disks, and printers of the UNIX server systems. This new client/server environment has reduced Yankee's computing costs while increasing its computing power and its ability to respond to customers

  6. 21 Cost-Saving Measures For The Judiciary

    Directory of Open Access Journals (Sweden)

    Jessica Vapnek

    2013-02-01

    Full Text Available Courts around the world are increasingly facing budget cuts and funding shortfalls. Budget problems are particularly acute in developing countries, where courts need to increase efficiency and access to justice while also managing resource limitations. International development agencies and donors expect measurable progress to justify continued funding of judicial reform projects. Yet, as rule of law efforts in developing countries improve public perception of courts and streamline court administration, more cases may be filed. Greater use of the courts puts greater strain on court resources, triggering the need to implement cost-saving measures while maintaining effective court administration. This paper outlines 21 measures that courts can implement to reduce costs. Specific examples from developing countries are presented wherever possible, with additional examples drawn from the United States and Europe. Although this paper is intended mainly for audiences in developing countries, the issues facing those courts are similar to issues addressed through court reforms in the United States over the past 50 years. For this reason, examples of cost-saving measures from developed countries such as the United States may be directly applicable or could be used as starting points to spur further cost savings innovation in the developing world. Section I of this paper explains the context for the implementation of judicial cost-saving measures, and raises some issues for reflection. Section II sets out specific judicial cost-saving measures, dividing them into three categories: measures that address court operations; measures directed at staffing and salaries; and measures that relate to court and case management. Section III discusses ways that countries and judiciaries can generate ideas for new and innovative cost-saving mechanisms.

  7. An efficient and cost effective nuclear medicine image network

    International Nuclear Information System (INIS)

    Sampathkumaran, K.S.; Miller, T.R.

    1987-01-01

    An image network that is in use in a large nuclear medicine department is described. This network was designed to efficiently handle a large volume of clinical data at reasonable cost. Small, limited function computers are attached to each scintillation camera for data acquisition. The images are transferred by cable network or floppy disc to a large, powerful central computer for processing and display. Cost is minimized by use of small acquisition computers not equipped with expensive video display systems or elaborate analysis software. Thus, financial expenditure can be concentrated in a powerful central computer providing a centralized data base, rapid processing, and an efficient environment for program development. Clinical work is greatly facilitated because the physicians can process and display all studies without leaving the main reading area. (orig.)

  8. Targeting the probability versus cost of feared outcomes in public speaking anxiety.

    Science.gov (United States)

    Nelson, Elizabeth A; Deacon, Brett J; Lickel, James J; Sy, Jennifer T

    2010-04-01

    Cognitive-behavioral theory suggests that social phobia is maintained, in part, by overestimates of the probability and cost of negative social events. Indeed, empirically supported cognitive-behavioral treatments directly target these cognitive biases through the use of in vivo exposure or behavioral experiments. While cognitive-behavioral theories and treatment protocols emphasize the importance of targeting probability and cost biases in the reduction of social anxiety, few studies have examined specific techniques for reducing probability and cost bias, and thus the relative efficacy of exposure to the probability versus cost of negative social events is unknown. In the present study, 37 undergraduates with high public speaking anxiety were randomly assigned to a single-session intervention designed to reduce either the perceived probability or the perceived cost of negative outcomes associated with public speaking. Compared to participants in the probability treatment condition, those in the cost treatment condition demonstrated significantly greater improvement on measures of public speaking anxiety and cost estimates for negative social events. The superior efficacy of the cost treatment condition was mediated by greater treatment-related changes in social cost estimates. The clinical implications of these findings are discussed. Published by Elsevier Ltd.

  9. An Analysis of the RCA Price-S Cost Estimation Model as it Relates to Current Air Force Computer Software Acquisition and Management.

    Science.gov (United States)

    1979-12-01

    ...because of the use of complex computational algorithms (Ref 25). Another important factor affecting the cost of software is the size of the development... involved the alignment and navigational algorithm portions of the software. The second avionics system application was the development of an inertial...

  10. Impact of an Advanced Imaging Utilization Review Program on Downstream Health Care Utilization and Costs for Low Back Pain.

    Science.gov (United States)

    Graves, Janessa M; Fulton-Kehoe, Deborah; Jarvik, Jeffrey G; Franklin, Gary M

    2018-06-01

    Early magnetic resonance imaging (MRI) for acute low back pain (LBP) has been associated with increased costs, greater health care utilization, and longer disability duration in workers' compensation claimants. To assess the impact of a state policy implemented in June 2010 that required prospective utilization review (UR) for early MRI among workers' compensation claimants with LBP. Interrupted time series. In total, 76,119 Washington State workers' compensation claimants with LBP between 2006 and 2014. Proportion of workers receiving imaging per month (MRI, computed tomography, radiographs) and lumbosacral injections and surgery; mean total health care costs per worker; mean duration of disability per worker. Measures were aggregated monthly and attributed to injury month. After accounting for secular trends, decreases in early MRI [level change: -5.27 (95% confidence interval, -4.22 to -6.31); trend change: -0.06 (-0.01 to -0.12)], any MRI [-4.34 (-3.01 to -5.67); -0.10 (-0.04 to -0.17)], and injection [trend change: -0.12 (-0.06 to -0.18)] utilization were associated with the policy. Radiograph utilization increased in parallel [level change: 2.46 (1.24-3.67)]. In addition, the policy resulted in significant decreasing changes in mean costs per claim, mean disability duration, and proportion of workers who received disability benefits. The policy had no effect on computed tomography or surgery utilization. The UR policy had discernable effects on health care utilization, costs, and disability. Integrating evidence-based guidelines with UR can improve quality of care and patient outcomes, while reducing use of low-value health services.

  11. Filmless versus film-based systems in radiographic examination costs: an activity-based costing method

    Directory of Open Access Journals (Sweden)

    Sase Yuji

    2011-09-01

    Full Text Available Background Since the shift from a radiographic film-based system to that of a filmless system, the change in radiographic examination costs and cost structure has been undetermined. The activity-based costing (ABC) method measures the cost and performance of activities, resources, and cost objects. The purpose of this study is to identify the cost structure of a radiographic examination comparing a filmless system to that of a film-based system using the ABC method. Methods We calculated the costs of radiographic examinations for both a filmless and a film-based system, and assessed the costs or cost components by simulating radiographic examinations in a health clinic. The cost objects of the radiographic examinations included lumbar (six views), knee (three views), wrist (two views), and other. Indirect costs were allocated to cost objects using the ABC method. Results The costs of a radiographic examination using a filmless system are as follows: lumbar 2,085 yen; knee 1,599 yen; wrist 1,165 yen; and other 1,641 yen. The costs for a film-based system are: lumbar 3,407 yen; knee 2,257 yen; wrist 1,602 yen; and other 2,521 yen. The primary activities were "calling patient," "explanation of scan," "take photographs," and "aftercare" for both filmless and film-based systems. The cost of these activities represented 36.0% of the total cost for a filmless system and 23.6% of a film-based system. Conclusions The costs of radiographic examinations using a filmless system and a film-based system were calculated using the ABC method. Our results provide clear evidence that the filmless system is more effective than the film-based system in providing greater value services directly to patients.

  12. Filmless versus film-based systems in radiographic examination costs: an activity-based costing method.

    Science.gov (United States)

    Muto, Hiroshi; Tani, Yuji; Suzuki, Shigemasa; Yokooka, Yuki; Abe, Tamotsu; Sase, Yuji; Terashita, Takayoshi; Ogasawara, Katsuhiko

    2011-09-30

    Since the shift from a radiographic film-based system to that of a filmless system, the change in radiographic examination costs and cost structure has been undetermined. The activity-based costing (ABC) method measures the cost and performance of activities, resources, and cost objects. The purpose of this study is to identify the cost structure of a radiographic examination comparing a filmless system to that of a film-based system using the ABC method. We calculated the costs of radiographic examinations for both a filmless and a film-based system, and assessed the costs or cost components by simulating radiographic examinations in a health clinic. The cost objects of the radiographic examinations included lumbar (six views), knee (three views), wrist (two views), and other. Indirect costs were allocated to cost objects using the ABC method. The costs of a radiographic examination using a filmless system are as follows: lumbar 2,085 yen; knee 1,599 yen; wrist 1,165 yen; and other 1,641 yen. The costs for a film-based system are: lumbar 3,407 yen; knee 2,257 yen; wrist 1,602 yen; and other 2,521 yen. The primary activities were "calling patient," "explanation of scan," "take photographs," and "aftercare" for both filmless and film-based systems. The cost of these activities represented 36.0% of the total cost for a filmless system and 23.6% of a film-based system. The costs of radiographic examinations using a filmless system and a film-based system were calculated using the ABC method. Our results provide clear evidence that the filmless system is more effective than the film-based system in providing greater value services directly to patients.
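    The allocation step both versions of this study describe can be sketched as classic activity-based costing: each activity's pooled indirect cost is spread over the cost objects in proportion to their consumption of that activity's driver. All figures below are hypothetical, not the study's data:

        # Minimal activity-based costing sketch (hypothetical figures).
        activity_cost = {"calling patient": 300.0, "explanation of scan": 450.0,
                         "take photographs": 900.0, "aftercare": 350.0}
        driver_use = {  # driver units (e.g. staff minutes) per examination type
            "lumbar": {"calling patient": 2, "explanation of scan": 4,
                       "take photographs": 12, "aftercare": 3},
            "knee":   {"calling patient": 2, "explanation of scan": 3,
                       "take photographs": 6, "aftercare": 2},
        }
        # Total driver units consumed per activity, across all cost objects.
        totals = {a: sum(use[a] for use in driver_use.values()) for a in activity_cost}
        for exam, use in driver_use.items():
            cost = sum(activity_cost[a] * use[a] / totals[a] for a in use)
            print(f"{exam}: {cost:.0f} yen of allocated indirect cost")

    The same mechanics explain why patient-facing activities dominate the filmless cost structure: once film handling disappears, the remaining driver consumption is concentrated in those four activities.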

  13. Transaction cost of micro and small enterprises financing

    Directory of Open Access Journals (Sweden)

    Ghana Atma Sulistya

    2016-10-01

    Full Text Available High transaction costs have become one of the obstacles preventing micro and small enterprises (MSEs) from accessing financial loans from banks. In order to minimize transaction costs, group lending schemes become an alternative, so that both sides pay lower transaction costs and MSEs are able to improve their welfare. This study aims to analyze the credit process and the transaction costs incurred under individual and group lending models, and to compare the magnitude of transaction costs in both models. Mixed Method Analysis is used to analyze the components of transaction costs and their magnitude in both models. The results indicate there are differences in the transaction costs incurred under the two schemes. Overall, the transaction costs of the group scheme are still greater than those of the individual scheme, dominated by the cost of disbursement. Even so, the transaction cost per group member is much smaller than under the individual scheme.

  14. An application of the Rayleigh distribution to contract cost data

    OpenAIRE

    Abernethy, Thomas S.

    1984-01-01

    Approved for public release; distribution unlimited. Accurate cost models are essential to the proper monitoring of contract cost data. The greater the accuracy of the model, the earlier contract cost overruns can be recognized and their cause(s) ascertained. The availability of a variety of cost models allows flexibility in choosing the correct model for the particular circumstances and increases the chances of being able to select a model that can provide reliable forecasts about future c...

  15. Cloud computing: An innovative tool for library services

    OpenAIRE

    Sahu, R.

    2015-01-01

    Cloud computing is a new technique of information communication technology offering potential benefits such as reduced cost, accessibility anywhere at any time, and elasticity and flexibility. This paper defines cloud computing and describes its essential characteristics, models of cloud computing, components of the cloud, advantages and drawbacks of cloud computing, and the use of cloud computing in libraries.

  16. Precise fixpoint computation through strategy iteration

    DEFF Research Database (Denmark)

    Gawlitza, Thomas; Seidl, Helmut

    2007-01-01

    We present a practical algorithm for computing least solutions of systems of equations over the integers with addition, multiplication with positive constants, maximum and minimum. The algorithm is based on strategy iteration. Its run-time (w.r.t. the uniform cost measure) is independent of the s…
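
    The paper's contribution is strategy iteration; as a point of contrast, the hedged sketch below shows naive Kleene iteration on a toy system of this shape (max, min, addition, multiplication by positive constants). Kleene iteration can take a number of steps proportional to the constants involved, which is exactly the slow convergence strategy iteration avoids.

        # Toy least-solution system over the integers:
        #   x = max(0, min(y + 1, 100)),   y = max(x, 2*x - 5)
        NEG_INF = float("-inf")

        def F(x, y):
            return max(0, min(y + 1, 100)), max(x, 2 * x - 5)

        x, y = NEG_INF, NEG_INF          # start below every solution
        steps = 0
        while (x, y) != F(x, y):
            x, y = F(x, y)
            steps += 1

        # Least fixpoint (100, 195), reached only after a few hundred steps:
        # the step count grows with the constant 100 in the equations.
        print(x, y, steps)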

  17. Model reduction by weighted Component Cost Analysis

    Science.gov (United States)

    Kim, Jae H.; Skelton, Robert E.

    1990-01-01

    Component Cost Analysis considers any given system driven by a white noise process as an interconnection of different components, and assigns a metric called 'component cost' to each component. These component costs measure the contribution of each component to a predefined quadratic cost function. A reduced-order model of the given system may be obtained by deleting those components that have the smallest component costs. The theory of Component Cost Analysis is extended to include finite-bandwidth colored noises. The results also apply when actuators have dynamics of their own. Closed-form analytical expressions of component costs are also derived for a mechanical system described by its modal data. This is very useful to compute the modal costs of very high-order systems. A numerical example for the MINIMAST system is presented.
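
    A hedged numerical sketch of the idea, assuming a stable linear system driven by unit white noise: the steady-state covariance X solves the Lyapunov equation AX + XA' + BB' = 0, and the component cost of state i is taken here as the i-th diagonal entry of X·C'QC, so the component costs sum to the total cost. The matrices are illustrative, not from the paper.

        import numpy as np
        from scipy.linalg import solve_continuous_lyapunov

        A = np.array([[-1.0, 0.2, 0.0],
                      [0.0, -2.0, 0.1],
                      [0.0, 0.0, -10.0]])
        B = np.array([[1.0], [0.5], [0.1]])
        C = np.eye(3)
        Q = np.eye(3)

        X = solve_continuous_lyapunov(A, -B @ B.T)  # solves A X + X A' = -B B'
        component_costs = np.diag(X @ (C.T @ Q @ C))

        print("component costs:", component_costs)
        print("delete component:", int(np.argmin(component_costs)))  # smallest cost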

  18. Learning Together; part 2: training costs and health gain - a cost analysis.

    Science.gov (United States)

    Cullen, Katherine; Riches, Wendy; Macaulay, Chloe; Spicer, John

    2017-01-01

    Learning Together is a complex educational intervention aimed at improving health outcomes for children and young people. There is an additional cost, as two doctors see patients together in a longer appointment than a standard general practice (GP) appointment. Our approach combines the impact of the training clinics on activity in South London in 2014-15 with health gain, using NICE guidance and standards to allow comparison of training options. Activity data were collected from Training Practices hosting Learning Together. A computer-based model was developed to analyse the costs of the Learning Together intervention compared to usual training in a partial economic evaluation. The results of the model were used to value the health gain required to make the intervention cost effective. Data were returned for 363 patients booked into 61 clinics across 16 Training Practices. Learning Together clinics resulted in an increase in costs of £37 per clinic. Threshold analysis illustrated that one child with a common illness like constipation needs to be well for two weeks, in one Practice hosting four training clinics, for the clinics to be considered cost effective. Learning Together comes at minimal training cost. Our threshold analysis produced a rubric that can be used locally to test cost effectiveness at a Practice or Programme level.
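
    A back-of-envelope version of the threshold check, using the study's £37 per clinic and four clinics per Practice; the £20,000-per-QALY willingness-to-pay and the 0.2 utility gain for a child free of constipation are assumptions added here for illustration, not figures from the study.

        extra_cost_per_clinic = 37          # GBP, reported by the study
        clinics_per_practice = 4
        qaly_threshold = 20_000             # assumed NICE threshold, GBP/QALY
        utility_gain = 0.2                  # assumed utility gain while well
        weeks_well = 2

        value_of_gain = utility_gain * weeks_well / 52 * qaly_threshold  # ~GBP 154
        cost_of_clinics = extra_cost_per_clinic * clinics_per_practice   # GBP 148
        print(value_of_gain >= cost_of_clinics)  # True: clinics cost effective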

  19. Computation of spot prices and congestion costs in large interconnected power systems

    International Nuclear Information System (INIS)

    Mukerji, R.; Jordan, G.A.; Clayton, R.; Haringa, G.E.

    1995-01-01

    Foremost among the new paradigms for the US utility industry is the "poolco" concept proposed by Prof. William W. Hogan of Harvard University. This concept uses a central pool or power exchange in which physical power is traded based on spot prices or market clearing prices. The rapid and accurate calculation of these "spot" prices and associated congestion costs for large interconnected power systems is the central tenet upon which the poolco concept is based. The market clearing price would be the same throughout the system if there were no system losses and transmission limitations did not exist. System losses cause small differences in market clearing prices, as the cost of supplying a MW at various load buses includes the cost of losses. Transmission limits may cause large differences in market clearing prices between regions, as low-cost generation is blocked by transmission constraints from serving certain loads. In models currently in use in the electric power industry, spot price calculations range from "bubble diagram" contract-path models to full electrical representations such as GE-MAPS. The modeling aspects of the full electrical representation are included in the Appendix. The problem with the bubble diagram representation is that these models are liable to produce unacceptably large errors in the calculation of spot prices and congestion costs. The subtleties of the calculation of spot prices and congestion costs are illustrated in this paper
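
    The price-separation mechanism can be shown with a deliberately tiny two-bus example (a toy sketch, not the full electrical representation of GE-MAPS): cheap generation at bus 1, the load plus an expensive local unit at bus 2, and one transmission line between them. All figures are hypothetical and losses are ignored.

        cheap_cost, dear_cost = 20.0, 50.0   # $/MWh marginal costs
        line_limit = 80.0                    # MW transfer capability
        load = 100.0                         # MW at bus 2

        flow = min(load, line_limit)         # cheap energy imported over the line
        # Spot price = cost of serving one more MW at each bus.
        price_bus1 = cheap_cost
        price_bus2 = cheap_cost if load < line_limit else dear_cost

        congestion_cost = (price_bus2 - price_bus1) * flow
        print(price_bus1, price_bus2, congestion_cost)   # 20.0 50.0 2400.0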

  20. Behaviors of cost functions in image registration between 201Tl brain tumor single-photon emission computed tomography and magnetic resonance images

    International Nuclear Information System (INIS)

    Soma, Tsutomu; Takaki, Akihiro; Teraoka, Satomi; Ishikawa, Yasushi; Murase, Kenya; Koizumi, Kiyoshi

    2008-01-01

    We studied the behaviors of cost functions in the registration of thallium-201 (201Tl) brain tumor single-photon emission computed tomography (SPECT) and magnetic resonance (MR) images, as the similarity index of image positioning. A marker for image registration [technetium-99m (99mTc) point source] was attached at three sites on the heads of 13 patients with brain tumor, from whom 42 sets of 99mTc-201Tl SPECT (the dual-isotope acquisition) and MR images were obtained. The 201Tl SPECT and MR images were manually registered according to the markers. From the positions where the two images were registered, the position of the 201Tl SPECT was moved to examine the behaviors of the three cost functions, i.e., ratio image uniformity (RIU), mutual information (MI), and normalized MI (NMI). The cost functions MI and NMI reached the maximum at positions adjacent to those where the SPECT and MR images were manually registered. As for the accuracy of image registration in terms of the cost functions MI and NMI, on average, the images were accurately registered within 3 deg of rotation around the X-, Y-, and Z-axes, and within 1.5 mm (within 2 pixels), 3 mm (within 3 pixels), and 4 mm (within 1 slice) of translation to the X-, Y-, and Z-axes, respectively. In terms of rotation around the Z-axis, the cost function RIU reached the minimum at positions where the manual registration of the two images was substantially inadequate. The MI and NMI were suitable cost functions in the registration of 201Tl SPECT and MR images. The behavior of the RIU, in contrast, was unstable, being unsuitable as an index of image registration. (author)
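
    For readers unfamiliar with the MI cost function, here is a generic joint-histogram implementation (a sketch of the standard definition, not the authors' code); registration searches over rotations and translations of the moving image for the pose that maximizes this value.

        import numpy as np

        def mutual_information(img_a, img_b, bins=32):
            """MI of two equally sized images via their joint histogram."""
            joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
            pxy = joint / joint.sum()
            px = pxy.sum(axis=1, keepdims=True)
            py = pxy.sum(axis=0, keepdims=True)
            nz = pxy > 0                     # avoid log(0)
            return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

        rng = np.random.default_rng(0)
        fixed = rng.random((64, 64))
        # MI is highest when the two images are aligned (here: identical).
        print(mutual_information(fixed, fixed) >
              mutual_information(fixed, rng.random((64, 64))))   # True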

  1. High performance computing in Windows Azure cloud

    OpenAIRE

    Ambruš, Dejan

    2013-01-01

    High performance, security, availability, scalability, flexibility and lower costs of maintenance have essentially contributed to the growing popularity of cloud computing in all spheres of life, especially in business. In fact cloud computing offers even more than this. With usage of virtual computing clusters a runtime environment for high performance computing can be efficiently implemented also in a cloud. There are many advantages but also some disadvantages of cloud computing, some ...

  2. Model implementation for dynamic computation of system cost for advanced life support

    Science.gov (United States)

    Levri, J. A.; Vaccari, D. A.

    2004-01-01

    Life support system designs for long-duration space missions have a multitude of requirements drivers, such as mission objectives, political considerations, cost, crew wellness, inherent mission attributes, as well as many other influences. Evaluation of requirements satisfaction can be difficult, particularly at an early stage of mission design. Because launch cost is a critical factor and relatively easy to quantify, it is a point of focus in early mission design. The method used to determine launch cost influences the accuracy of the estimate. This paper discusses the appropriateness of dynamic mission simulation in estimating the launch cost of a life support system. This paper also provides an abbreviated example of a dynamic simulation life support model and possible ways in which such a model might be utilized for design improvement.

  3. Computer tomography: a cost-saving examination?

    International Nuclear Information System (INIS)

    Barneveld Binkhuysen, F.H.; Puijlaert, C.B.A.J.

    1987-01-01

    The research concerns the influence of the body computer tomograph (BCT) on efficiency in radiology and in the hospital as a whole in The Netherlands. Hospitals with CT are compared with hospitals without CT. In radiology, the substitution effect is investigated, using the number of radiological performances per clinical patient as a parameter. This parameter proves to decrease in hospitals with a CT, in contrast to hospitals without a CT. The often-expressed opinion that the CT should specifically perform complementary examinations appears incorrect. As to efficiency in the hospital, this is related to the average hospital in-patient stay, which proves to be shorter in hospitals with a CT than in those without a CT. The CT has turned out to be a very effective expedient which unfortunately, however, is being used inefficiently in The Netherlands, owing to limited installation. 17 refs.; 6 figs.; 5 tabs

  4. Integrated waste management system costs in a MPC system

    International Nuclear Information System (INIS)

    Supko, E.M.

    1995-01-01

    The impact on system costs of including a centralized interim storage facility as part of an integrated waste management system based on multi-purpose canister (MPC) technology was assessed in analyses by Energy Resources International, Inc. A system cost savings of $1 to $2 billion occurs if the Department of Energy begins spent fuel acceptance in 1998 at a centralized interim storage facility. That is, the savings associated with decreased utility spent fuel management costs will be greater than the cost of constructing and operating a centralized interim storage facility

  5. Challenges and opportunities of cloud computing for atmospheric sciences

    Science.gov (United States)

    Pérez Montes, Diego A.; Añel, Juan A.; Pena, Tomás F.; Wallom, David C. H.

    2016-04-01

    Cloud computing is an emerging technological solution widely used in many fields. Initially developed as a flexible way of managing peak demand, it has begun to make its way into scientific research. One of the greatest advantages of cloud computing for scientific research is independence from having to own or access a large cyberinfrastructure to fund or perform a research project. Cloud computing can avoid maintenance expenses for large supercomputers and has the potential to 'democratize' access to high-performance computing, giving flexibility to funding bodies for allocating budgets for the computational costs associated with a project. Two of the most challenging problems in atmospheric sciences are computational cost and uncertainty in meteorological forecasting and climate projections. Both problems are closely related. Usually uncertainty can be reduced with the availability of computational resources to better reproduce a phenomenon or to perform a larger number of experiments. Here we present results of the application of cloud computing resources for climate modeling using cloud computing infrastructures of three major vendors and two climate models. We show how the cloud infrastructure compares in performance to traditional supercomputers and how it provides the capability to complete experiments in shorter periods of time. The associated monetary cost is also analyzed. Finally we discuss the future potential of this technology for meteorological and climatological applications, both from the point of view of operational use and research.

  6. A Comparative Cost Analysis of Picture Archiving and ...

    African Journals Online (AJOL)

    Method: An incremental cost analysis for chest radiographs, computed tomography, and magnetic resonance imaging brain scans, with and without contrast, was performed. The overall incremental cost for PACS in comparison with a conventional radiology site was determined. The net present value was also determined to ...

  7. An opportunity cost model of subjective effort and task performance

    Science.gov (United States)

    Kurzban, Robert; Duckworth, Angela; Kable, Joseph W.; Myers, Justus

    2013-01-01

    Why does performing certain tasks cause the aversive experience of mental effort and concomitant deterioration in task performance? One explanation posits a physical resource that is depleted over time. We propose an alternate explanation that centers on mental representations of the costs and benefits associated with task performance. Specifically, certain computational mechanisms, especially those associated with executive function, can be deployed for only a limited number of simultaneous tasks at any given moment. Consequently, the deployment of these computational mechanisms carries an opportunity cost – that is, the next-best use to which these systems might be put. We argue that the phenomenology of effort can be understood as the felt output of these cost/benefit computations. In turn, the subjective experience of effort motivates reduced deployment of these computational mechanisms in the service of the present task. These opportunity cost representations, then, together with other cost/benefit calculations, determine effort expended and, everything else equal, result in performance reductions. In making our case for this position, we review alternate explanations both for the phenomenology of effort associated with these tasks and for performance reductions over time. Likewise, we review the broad range of relevant empirical results from across subdisciplines, especially psychology and neuroscience. We hope that our proposal will help to build links among the diverse fields that have been addressing similar questions from different perspectives, and we emphasize ways in which alternate models might be empirically distinguished. PMID:24304775

  8. Reusability Framework for Cloud Computing

    OpenAIRE

    Singh, Sukhpal; Singh, Rishideep

    2012-01-01

    Cloud based development is a challenging task for several software engineering projects, especially for those which need development with reusability. Cloud computing is enabling new professional models for software development. Cloud computing is expected to be the next major trend in computing because of its speed of application deployment, shorter time to market, and lower cost of operation. Until a Cloud Computing Reusability Model is considered a fundamen...

  9. How Europe's Low-Cost Carriers Sidestepped Traditional Carriers' Competitive Advantages

    DEFF Research Database (Denmark)

    Hvass, Kristian Anders

    European low-cost airlines were more successful and had a greater initial impact in their early years than their U.S. compatriots. This paper will attempt to highlight some of the differences between the two markets and explain why European low-cost airlines had more advantages following their market deregulation...

  10. A probabilistic approach to the computation of the levelized cost of electricity

    International Nuclear Information System (INIS)

    Geissmann, Thomas

    2017-01-01

    This paper sets forth a novel approach to calculate the levelized cost of electricity (LCOE) using a probabilistic model that accounts for endogenous input parameters. The approach is applied to the example of a nuclear and gas power project. Monte Carlo simulation results show that a correlation between input parameters has a significant effect on the model outcome. By controlling for endogeneity, a statistically significant difference in the mean LCOE estimate and a change in the order of input leverages is observed. Moreover, the paper discusses the role of discounting options and external costs in detail. In contrast to the gas power project, the economic viability of the nuclear project is considerably weaker. - Highlights: • First model of levelized cost of electricity accounting for uncertainty and endogeneities in input parameters. • Allowance for endogeneities significantly affects results. • Role of discounting options and external costs is discussed and modelled.
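
    The core construction is straightforward to reproduce: LCOE is discounted lifetime cost divided by discounted lifetime generation, and a probabilistic estimate draws correlated inputs and recomputes the ratio per draw. The sketch below uses invented capex/opex figures, a 0.5 input correlation, and an 85% capacity factor, none of which come from the paper.

        import numpy as np

        rng = np.random.default_rng(1)
        n, life, r = 10_000, 40, 0.05
        disc_sum = sum(1.0 / (1.0 + r) ** t for t in range(1, life + 1))

        mean = [4000.0, 100.0]                   # capex $/kW, O&M $/kW-yr
        cov = [[400.0**2, 0.5 * 400.0 * 20.0],   # rho = 0.5 between inputs
               [0.5 * 400.0 * 20.0, 20.0**2]]
        capex, opex = rng.multivariate_normal(mean, cov, size=n).T

        energy_mwh = 8760 * 0.85 / 1000 * disc_sum       # discounted MWh per kW
        lcoe = (capex + opex * disc_sum) / energy_mwh    # $/MWh per draw

        print(f"mean {lcoe.mean():.1f} $/MWh, 95% interval "
              f"{np.percentile(lcoe, 2.5):.1f}-{np.percentile(lcoe, 97.5):.1f}")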

  11. Report of the Task Force on Computer Charging.

    Science.gov (United States)

    Computer Co-ordination Group, Ottawa (Ontario).

    The objectives of the Task Force on Computer Charging as approved by the Committee of Presidents of Universities of Ontario were: (1) to identify alternative methods of costing computing services; (2) to identify alternative methods of pricing computing services; (3) to develop guidelines for the pricing of computing services; (4) to identify…

  12. The costs of inequality: whole-population modelling study of lifetime inpatient hospital costs in the English National Health Service by level of neighbourhood deprivation

    Science.gov (United States)

    Doran, Tim; Cookson, Richard

    2016-01-01

    Background There are substantial socioeconomic inequalities in both life expectancy and healthcare use in England. In this study, we describe how these two sets of inequalities interact by estimating the social gradient in hospital costs across the life course. Methods Hospital episode statistics, population and index of multiple deprivation data were combined at lower-layer super output area level to estimate inpatient hospital costs for 2011/2012 by age, sex and deprivation quintile. Survival curves were estimated for each of the deprivation groups and used to estimate expected annual costs and cumulative lifetime costs. Results A steep social gradient was observed in overall inpatient hospital admissions, with rates ranging from 31,298 per 100,000 population in the most affluent fifth of areas to 43,385 in the most deprived fifth. This gradient was steeper for emergency than for elective admissions. The total cost associated with this inequality in 2011/2012 was £4.8 billion. A social gradient was also observed in the modelled lifetime costs, where the lower life expectancy was not sufficient to outweigh the higher average costs in the more deprived populations. Lifetime costs for women were 14% greater than for men, due to higher costs in the reproductive years and greater life expectancy. Conclusions Socioeconomic inequalities result in increased morbidity and decreased life expectancy. Interventions to reduce inequality and improve health in more deprived neighbourhoods have the potential to save money for health systems not only within years but across people's entire lifetimes, despite increased costs due to longer life expectancies. PMID:27189975
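
    The lifetime-cost construction combines a survival curve with an age-cost profile: expected lifetime cost is the sum over ages of the probability of surviving to each age times the expected annual cost at that age. The schedules below are invented placeholders; the study estimates them separately for each deprivation quintile.

        def survival(age, hazard=0.012):        # toy exponential survival curve
            return (1 - hazard) ** age

        def annual_cost(age):                   # toy age-cost profile, GBP
            return 300 + 8 * age if age < 65 else 820 + 60 * (age - 65)

        lifetime_cost = sum(survival(a) * annual_cost(a) for a in range(101))
        print(f"expected lifetime inpatient cost: GBP {lifetime_cost:,.0f}")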

  13. Simple and Effective Algorithms: Computer-Adaptive Testing.

    Science.gov (United States)

    Linacre, John Michael

    Computer-adaptive testing (CAT) allows improved security, greater scoring accuracy, shorter testing periods, quicker availability of results, and reduced guessing and other undesirable test behavior. Simple approaches can be applied by the classroom teacher, or other content specialist, who possesses simple computer equipment and elementary…

  14. Cost Minimization for Joint Energy Management and Production Scheduling Using Particle Swarm Optimization

    Science.gov (United States)

    Shah, Rahul H.

    A joint energy management and production planning framework is discussed. A modified Particle Swarm Optimization solution technique is adopted to solve the proposed scheduling problem. The algorithm is described in detail and compared to Genetic Algorithm. Case studies are presented to illustrate the benefits of using the proposed model and the effectiveness of the Particle Swarm Optimization approach. Numerical experiments are implemented and analyzed to test the effectiveness of the proposed model. The proposed scheduling strategy can achieve savings of around 19 to 27% in cost per part when compared to the baseline scheduling scenarios. By optimizing key production system parameters from the cost-per-part model, the baseline scenarios can obtain around 20 to 35% in savings for the cost per part. These savings further increase by 42 to 55% when system parameter optimization is integrated with the proposed scheduling problem. Using this method, the most influential parameters on the cost per part are the rated power from production, the production rate, and the initial machine reliabilities. The modified Particle Swarm Optimization algorithm adopted allows greater diversity and exploration compared to Genetic Algorithm for the proposed joint model, which makes it more computationally efficient in determining the optimal scheduling. While Genetic Algorithm achieved a solution quality of 2,279.63 at an expense of 2,300 seconds of computational effort, the proposed Particle Swarm Optimization algorithm achieved a solution quality of 2,167.26 in less than half the computational effort.
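
    For reference, a minimal generic PSO loop is sketched below (plain PSO on a stand-in objective, not the paper's modified variant or its cost-per-part model): each particle is pulled toward its own best position and the swarm's best.

        import numpy as np

        def cost(x):                            # placeholder objective
            return ((x - 3.0) ** 2).sum(axis=1)

        rng = np.random.default_rng(0)
        n, dim, iters = 30, 2, 200
        w, c1, c2 = 0.7, 1.5, 1.5               # inertia, cognitive, social

        x = rng.uniform(-10, 10, (n, dim))
        v = np.zeros((n, dim))
        pbest, pbest_f = x.copy(), cost(x)
        gbest = pbest[pbest_f.argmin()].copy()

        for _ in range(iters):
            r1, r2 = rng.random((n, dim)), rng.random((n, dim))
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
            x = x + v
            f = cost(x)
            better = f < pbest_f
            pbest[better], pbest_f[better] = x[better], f[better]
            gbest = pbest[pbest_f.argmin()].copy()

        print(gbest)                            # converges near [3, 3]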

  15. Computer-assisted estimating for the Los Alamos Scientific Laboratory

    International Nuclear Information System (INIS)

    Spooner, J.E.

    1976-02-01

    An analysis is made of the cost estimating system currently in use at the Los Alamos Scientific Laboratory (LASL) and the benefits of computer assistance are evaluated. A computer-assisted estimating system (CAE) is proposed for LASL. CAE can decrease turnaround and provide more flexible response to management requests for cost information and analyses. It can enhance value optimization at the design stage, improve cost control and change-order justification, and widen the use of cost information in the design process. CAE costs are not well defined at this time although they appear to break even with present operations. It is recommended that a CAE system description be submitted for contractor consideration and bid while LASL system development continues concurrently

  16. SISTEM DETEKSI WAJAH PADA OPEN SOURCE PHYSICAL COMPUTING

    Directory of Open Access Journals (Sweden)

    Yupit Sudianto

    2014-01-01

    Full Text Available Face detection is one of the interesting research areas. The majority of this research has been implemented on a computer. Developing face detection on a computer requires a significant investment: in addition to the procurement cost of the computers themselves, operational costs such as electricity are required, because a computer draws considerable power. This research proposes building a face detection system using Arduino. The system is autonomous; in other words, the role of the computer is replaced by the Arduino. The board used is an Arduino Mega 2560, with an ATmega2560 microcontroller, a speed of 16 MHz, 256 KB of flash memory, 8 KB of SRAM, and 4 KB of EEPROM, so not every face detection algorithm can be implemented on the Arduino. The memory limitations of the Arduino are addressed by applying a template matching method using a facial template shaped like a mask. The detection rate achieved in this study is 80%-100%, where the Arduino's success in identifying a face is influenced by the distance between the camera and the human face and by the person's movement.
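
    The template matching idea is easy to prototype on a desktop before squeezing it into microcontroller memory; the sketch below is a generic normalized cross-correlation matcher in Python, not the Arduino implementation itself.

        import numpy as np

        def match(image, template):
            """Return the (row, col) where the template correlates best."""
            th, tw = template.shape
            t = (template - template.mean()) / (template.std() + 1e-9)
            best, best_pos = -np.inf, (0, 0)
            for i in range(image.shape[0] - th + 1):
                for j in range(image.shape[1] - tw + 1):
                    win = image[i:i+th, j:j+tw]
                    w = (win - win.mean()) / (win.std() + 1e-9)
                    score = (w * t).mean()       # normalized cross-correlation
                    if score > best:
                        best, best_pos = score, (i, j)
            return best_pos, best

        img = np.random.default_rng(2).random((40, 40))
        tmpl = img[10:20, 15:25].copy()
        print(match(img, tmpl))                  # recovers (10, 15), score ~1.0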

  17. Cost Benefit of Comprehensive Primary and Preventive School-Based Health Care.

    Science.gov (United States)

    Padula, William V; Connor, Katherine A; Mueller, Josiah M; Hong, Jonathan C; Velazquez, Gabriela Calderon; Johnson, Sara B

    2018-01-01

    The Rales Health Center is a comprehensive school-based health center at an urban elementary/middle school. The Rales Health Center provides a full range of pediatric services using an enriched staffing model consisting of a pediatrician, nurse practitioner, registered nurses, and a medical office assistant. This staffing model provides greater care but costs more than traditional school-based health centers staffed by part-time nurses. The objective was to analyze the cost benefit of the Rales Health Center enhanced staffing model compared with a traditional school-based health center (standard care), focusing on asthma care, which is among the most prevalent chronic conditions of childhood. In 2016, a cost-benefit analysis using a decision tree determined the net social benefit of the Rales Health Center compared with standard care from the U.S. societal perspective, based on the 2015-2016 academic year. It was assumed that the Rales Health Center could handle greater patient throughput related to asthma, decrease prescription costs, reduce parental resources in terms of missed work time, and improve student attendance. Univariate and multivariate probabilistic sensitivity analyses were conducted. The expected cost to operate the Rales Health Center was $409,120, compared with the standard care cost of $172,643. Total monetized incremental benefits of the Rales Health Center were estimated to be $993,414. The expected net social benefit for the Rales Health Center was $756,937, which demonstrated substantial societal benefit at a return of $4.20 for every dollar invested. This net social benefit estimate was robust to sensitivity analyses. Despite the greater cost associated with the Rales Health Center's enhanced staffing model, the results of this analysis highlight the cost benefit of providing comprehensive, high-quality pediatric care in schools, particularly schools with a large proportion of underserved students.
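
    The headline numbers reconcile exactly: the net social benefit is the monetized benefits minus the incremental cost of the enhanced model, and the return per dollar is their ratio. The figures below are the ones reported in the abstract.

        rales_cost, standard_cost = 409_120, 172_643   # reported annual costs
        benefits = 993_414                             # reported monetized benefits

        incremental_cost = rales_cost - standard_cost  # 236,477
        net_social_benefit = benefits - incremental_cost
        print(net_social_benefit)                      # 756,937
        print(round(benefits / incremental_cost, 2))   # ~4.2 dollars per dollar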

  18. Can broader diffusion of value-based insurance design increase benefits from US health care without increasing costs? Evidence from a computer simulation model.

    Directory of Open Access Journals (Sweden)

    R Scott Braithwaite

    2010-02-01

    Full Text Available BACKGROUND: Evidence suggests that cost sharing (i.e., copayments and deductibles) decreases health expenditures but also reduces essential care. Value-based insurance design (VBID) has been proposed to encourage essential care while controlling health expenditures. Our objective was to estimate the impact of broader diffusion of VBID on US health care benefits and costs. METHODS AND FINDINGS: We used a published computer simulation of costs and life expectancy gains from US health care to estimate the impact of broader diffusion of VBID. Two scenarios were analyzed: (1) applying VBID solely to pharmacy benefits and (2) applying VBID to both pharmacy benefits and other health care services (e.g., devices). We assumed that cost sharing would be eliminated for high-value services (<$100,000 per life-year), would remain unchanged for intermediate-value services ($100,000-$300,000 per life-year or unknown), and would be increased for low-value services (>$300,000 per life-year). All costs are provided in 2003 US dollars. Our simulation estimated that approximately 60% of health expenditures in the US are spent on low-value services, 20% are spent on intermediate-value services, and 20% are spent on high-value services. Correspondingly, the vast majority (80%) of health expenditures would have cost sharing that is impacted by VBID. With prevailing patterns of cost sharing, health care conferred 4.70 life-years at a per-capita annual expenditure of US$5,688. Broader diffusion of VBID to pharmaceuticals increased the benefit conferred by health care by 0.03 to 0.05 additional life-years, without increasing costs and without increasing out-of-pocket payments. Broader diffusion of VBID to other health care services could increase the benefit conferred by health care by 0.24 to 0.44 additional life-years, also without increasing costs and without increasing overall out-of-pocket payments. Among those without health insurance, using cost saving from VBID to subsidize insurance coverage would increase the benefit conferred by health care by 1.21 life-years, a 31% increase. CONCLUSION: Broader diffusion of VBID may amplify benefits from US health care without increasing overall costs.

  19. Can broader diffusion of value-based insurance design increase benefits from US health care without increasing costs? Evidence from a computer simulation model.

    Science.gov (United States)

    Braithwaite, R Scott; Omokaro, Cynthia; Justice, Amy C; Nucifora, Kimberly; Roberts, Mark S

    2010-02-16

    Evidence suggests that cost sharing (i.e., copayments and deductibles) decreases health expenditures but also reduces essential care. Value-based insurance design (VBID) has been proposed to encourage essential care while controlling health expenditures. Our objective was to estimate the impact of broader diffusion of VBID on US health care benefits and costs. We used a published computer simulation of costs and life expectancy gains from US health care to estimate the impact of broader diffusion of VBID. Two scenarios were analyzed: (1) applying VBID solely to pharmacy benefits and (2) applying VBID to both pharmacy benefits and other health care services (e.g., devices). We assumed that cost sharing would be eliminated for high-value services (<$100,000 per life-year), would remain unchanged for intermediate-value services ($100,000-$300,000 per life-year or unknown), and would be increased for low-value services (>$300,000 per life-year). All costs are provided in 2003 US dollars. Our simulation estimated that approximately 60% of health expenditures in the US are spent on low-value services, 20% are spent on intermediate-value services, and 20% are spent on high-value services. Correspondingly, the vast majority (80%) of health expenditures would have cost sharing that is impacted by VBID. With prevailing patterns of cost sharing, health care conferred 4.70 life-years at a per-capita annual expenditure of US$5,688. Broader diffusion of VBID to pharmaceuticals increased the benefit conferred by health care by 0.03 to 0.05 additional life-years, without increasing costs and without increasing out-of-pocket payments. Broader diffusion of VBID to other health care services could increase the benefit conferred by health care by 0.24 to 0.44 additional life-years, also without increasing costs and without increasing overall out-of-pocket payments. Among those without health insurance, using cost saving from VBID to subsidize insurance coverage would increase the benefit conferred by health care by 1.21 life-years, a 31% increase.
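
    The VBID rule described above reduces to a simple tiering function on cost per life-year; the thresholds come from the abstract, while the specific copay adjustments below are illustrative assumptions.

        def vbid_copay(cost_per_life_year, current_copay):
            """Map a service's cost per life-year to an adjusted copay."""
            if cost_per_life_year is None:       # unknown value: treated as
                return current_copay             # intermediate (unchanged)
            if cost_per_life_year < 100_000:     # high-value: no cost sharing
                return 0.0
            if cost_per_life_year <= 300_000:    # intermediate: unchanged
                return current_copay
            return current_copay * 2             # low-value: raised (assumed 2x)

        for cpl in (40_000, 150_000, 500_000, None):
            print(cpl, vbid_copay(cpl, 25.0))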

  20. Spacelab Mission Implementation Cost Assessment (SMICA)

    Science.gov (United States)

    Guynes, B. V.

    1984-01-01

    A total savings of approximately 20 percent is attainable if: (1) mission management and ground processing schedules are compressed; (2) the equipping, staffing, and operating of the Payload Operations Control Center is revised, and (3) methods of working with experiment developers are changed. The development of a new mission implementation technique, which includes mission definition, experiment development, and mission integration/operations, is examined. The Payload Operations Control Center is to relocate and utilize new computer equipment to produce cost savings. Methods of reducing costs by minimizing the Spacelab and payload processing time during pre- and post-mission operation at KSC are analyzed. The changes required to reduce costs in the analytical integration process are studied. The influence of time, requirements accountability, and risk on costs is discussed. Recommendation for cost reductions developed by the Spacelab Mission Implementation Cost Assessment study are listed.

  1. Offshore Wind Resource, Cost, and Economic Potential in the State of Maine

    Energy Technology Data Exchange (ETDEWEB)

    Musial, Walter D. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2018-02-12

    This report provides information for decision-makers about floating offshore wind technologies in the state of Maine. It summarizes research efforts performed at the National Renewable Energy Laboratory between 2015 and 2017 to analyze the resource potential, cost of offshore wind, and economic potential of offshore wind from four primary reports: Musial et al. (2016); Beiter et al. (2016, 2017); and Mone et al. (unpublished). From Musial et al. (2016), Maine's technical offshore wind resource potential ranked seventh in the nation overall with more than 411 terawatt-hours/year of offshore resource generating potential. Although 90% of this wind resource is greater than 9.0-meters-per-second average velocity, most of the resource is over deep water, where floating wind technology is needed. Levelized cost of energy and levelized avoided cost of energy were computed to estimate the unsubsidized 'economic potential' for Maine in the year 2027 (Beiter et al. 2016, 2017). The studies found that Maine may have 65 gigawatts of economic potential by 2027, the highest of any U.S. state. Bottom-line costs for the Aqua Ventus project, which is part of the U.S. Department of Energy's Advanced Technology Demonstration project, were released from a proprietary report written by NREL in 2016 for the University of Maine (Mone et al. unpublished). The report findings were that economies of scale and new technology advancements lowered the cost from $300/megawatt-hour (MWh) for the two-turbine 12-megawatt (MW) Aqua Ventus 1 project, to $126/MWh for the commercial-scale, 498-MW Aqua Ventus-2 project. Further cost reductions to $77/MWh were found when new technology advancements were applied for the 1,000-MW Aqua Ventus-3 project in 2030. No new analysis was conducted for this report.

  2. Cloud Computing Organizational Benefits : A Managerial concern

    OpenAIRE

    Mandala, Venkata Bhaskar Reddy; Chandra, Marepalli Sharat

    2012-01-01

    Context: The software industry is looking for new methods and opportunities to reduce project management problems and operational costs. The cloud computing concept provides answers to these problems. Cloud computing is made possible by the availability of high internet bandwidth and provides a wide range of services to a varied customer base. Cloud computing has some key elements such as on-demand services, a large pool of configurable computing resources, and minimal man...

  3. Inequalities in the distribution of the costs of alcohol misuse in Scotland: a cost of illness study.

    Science.gov (United States)

    Johnston, Marjorie C; Ludbrook, Anne; Jaffray, Mariesha A

    2012-01-01

    To examine the distribution of the costs of alcohol misuse across Scotland in 2009/2010, in relation to deprivation. A cost of illness approach was used. Alcohol-related harmful effects were assessed for inclusion using a literature review, based upon the following categories: direct healthcare costs, intangible health costs, social care costs, crime costs, and labour and productivity costs. An analysis of secondary data, supplemented by a literature review, was carried out to quantify each harmful effect, determine its value and provide an estimate of the distribution by deprivation. The deprivation distributions used were area measures (primarily the Scottish Index of Multiple Deprivation). The overall cost was £7457 million. Two alcohol harmful effects were not included in the overall cost by deprivation due to a lack of data: 'children's social work and hearing system' and the criminal justice system costs from 'alcohol-specific offences'. The included harmful effects demonstrated that 40.41% of the total cost arose from the 20% most deprived areas. The intangible cost category was the largest category (78.65%). The study found that the burden of alcohol-related harm is greater in deprived groups: these harms do not simply arise in deprived groups but are also experienced more by them. The study was limited by a lack of data availability in certain areas, leading to less-precise cost estimates.

  4. Quantum analogue computing.

    Science.gov (United States)

    Kendon, Vivien M; Nemoto, Kae; Munro, William J

    2010-08-13

    We briefly review what a quantum computer is, what it promises to do for us and why it is so hard to build one. Among the first applications anticipated to bear fruit is the quantum simulation of quantum systems. While most quantum computation is an extension of classical digital computation, quantum simulation differs fundamentally in how the data are encoded in the quantum computer. To perform a quantum simulation, the Hilbert space of the system to be simulated is mapped directly onto the Hilbert space of the (logical) qubits in the quantum computer. This type of direct correspondence is how data are encoded in a classical analogue computer. There is no binary encoding, and increasing precision becomes exponentially costly: an extra bit of precision doubles the size of the computer. This has important consequences for both the precision and error-correction requirements of quantum simulation, and significant open questions remain about its practicality. It also means that the quantum version of analogue computers, continuous-variable quantum computers, becomes an equally efficient architecture for quantum simulation. Lessons from past use of classical analogue computers can help us to build better quantum simulators in future.

  5. Trends in computer hardware and software.

    Science.gov (United States)

    Frankenfeld, F M

    1993-04-01

    Previously identified and current trends in the development of computer systems and in the use of computers for health care applications are reviewed. Trends identified in a 1982 article were increasing miniaturization and archival ability, increasing software costs, increasing software independence, user empowerment through new software technologies, shorter computer-system life cycles, and more rapid development and support of pharmaceutical services. Most of these trends continue today. Current trends in hardware and software include the increasing use of reduced instruction-set computing, migration to the UNIX operating system, the development of large software libraries, microprocessor-based smart terminals that allow remote validation of data, speech synthesis and recognition, application generators, fourth-generation languages, computer-aided software engineering, object-oriented technologies, and artificial intelligence. Current trends specific to pharmacy and hospitals are the withdrawal of vendors of hospital information systems from the pharmacy market, improved linkage of information systems within hospitals, and increased regulation by government. The computer industry and its products continue to undergo dynamic change. Software development continues to lag behind hardware, and its high cost is offsetting the savings provided by hardware.

  6. Cloud Computing: An Overview

    Science.gov (United States)

    Qian, Ling; Luo, Zhiguo; Du, Yujian; Guo, Leitao

    In order to support the maximum number of users and elastic service with the minimum resource, Internet service providers invented cloud computing. Within a few years, emerging cloud computing has become the hottest technology. From the publication of core papers by Google since 2003, to the commercialization of Amazon EC2 in 2006, and to the service offering of AT&T Synaptic Hosting, cloud computing has evolved from internal IT systems to public services, from a cost-saving tool to a revenue generator, and from ISP to telecom. This paper introduces the concept, history, pros and cons of cloud computing as well as the value chain and standardization effort.

  7. Factors Influencing Organization Adoption Decision On Cloud Computing

    OpenAIRE

    Ailar Rahimli

    2013-01-01

    Cloud computing is a developing field, used by organizations that require computing resources to meet their organizational computing needs. The goal of this research is to evaluate the factors that influence an organization's decision to adopt cloud computing in Malaysia. Factors related to cloud computing adoption include: need for cloud computing, cost effectiveness, security effectiveness of cloud computing, and reliability. This paper evaluated the factors that influence ado...

  8. Optimization of solar cell contacts by system cost-per-watt minimization

    Science.gov (United States)

    Redfield, D.

    1977-01-01

    New, and considerably altered, optimum dimensions for solar-cell metallization patterns are found using the recently developed procedure whose optimization criterion is the minimum cost-per-watt effect on the entire photovoltaic system. It is also found that the optimum shadow fraction by the fine grid is independent of metal cost and resistivity as well as cell size. The optimum thickness of the fine grid metal depends on all these factors, and in familiar cases it should be appreciably greater than that found by less complete analyses. The optimum bus bar thickness is much greater than those generally used. The cost-per-watt penalty due to the need for increased amounts of metal per unit area on larger cells is determined quantitatively and thereby provides a criterion for the minimum benefits that must be obtained in other process steps to make larger cells cost effective.

  9. The folly of using RCCs and RVUs for intermediate product costing.

    Science.gov (United States)

    Young, David W

    2007-04-01

    Two measures for computing the cost of intermediate products--a ratio of cost to charges and relative value units--are highly flawed and can have serious financial implications for the hospitals that use them. Full-cost accounting, using the principles of activity-based costing, enables hospitals to measure their costs more accurately, both for competitive bidding purposes and to manage them more effectively.

  10. Prospects for Computational Fluid Dynamics in Room Air Contaminant Control

    DEFF Research Database (Denmark)

    Nielsen, Peter V.

    The fluid dynamics research is strongly influenced by the increasing computer power which has been available for the last decades. This development is obvious from the curve in figure 1 which shows the computation cost as a function of years. It is obvious that the cost for a given job will decre...

  11. Harvesting systems and costs for southern pine in the 1980s

    Science.gov (United States)

    Frederick W. Cubbage; James E. Granskog

    1981-01-01

    Timber harvesting systems and their costs are a major concern for the forest products industries. In this paper, harvest costs per cord are estimated, using computer simulation, for current southern pine harvesting systems. The estimations represent a range of mechanization levels. The sensitivity of systems to factors affecting harvest costs - machine costs, fuel...

  12. Cost effective campaigning in social networks

    Science.gov (United States)

    Kotnis, Bhushan; Kuri, Joy

    2016-05-01

    Campaigners are increasingly using online social networking platforms for promoting products, ideas and information. A popular method of promoting a product or even an idea is incentivizing individuals to evangelize the idea vigorously by providing them with referral rewards in the form of discounts, cash backs, or social recognition. Due to budget constraints on scarce resources such as money and manpower, it may not be possible to provide incentives for the entire population, and hence incentives need to be allocated judiciously to appropriate individuals for ensuring the highest possible outreach size. We aim to do the same by formulating and solving an optimization problem using percolation theory. In particular, we compute the set of individuals that are provided incentives for minimizing the expected cost while ensuring a given outreach size. We also solve the problem of computing the set of individuals to be incentivized for maximizing the outreach size for given cost budget. The optimization problem turns out to be non trivial; it involves quantities that need to be computed by numerically solving a fixed point equation. Our primary contribution is, that for a fairly general cost structure, we show that the optimization problems can be solved by solving a simple linear program. We believe that our approach of using percolation theory to formulate an optimization problem is the first of its kind.

  13. How should fixed costs in the network be covered?

    International Nuclear Information System (INIS)

    1999-01-01

    The report examines how tariffs that are only meant to cover costs in the transmission network should be formulated. Should the tariffs be based on power figures, such as installed production capacity, or on an energy figure such as total annual energy production? Tariffs based on the producers' installed production capacity will in the long run cause the prices to rise under peak loads, while tariffs based on generated energy elevate the consumer price by a small amount for all load segments. Thus, tariffs based on installed power capacity cause greater price distortion and greater socio-economic losses than energy tariffs. There are good arguments that fixed costs should mainly be paid by consumers with inelastic demand.

  14. 18F-Fluorodeoxyglucose Positron Emission Tomography/Computed Tomography Accuracy in the Staging of Non-Small Cell Lung Cancer: Review and Cost-Effectiveness

    International Nuclear Information System (INIS)

    Gómez León, Nieves; Escalona, Sofía; Bandrés, Beatriz; Belda, Cristobal; Callejo, Daniel; Blasco, Juan Antonio

    2014-01-01

    Aim of the performed clinical study was to compare the accuracy and cost-effectiveness of PET/CT in the staging of non-small cell lung cancer (NSCLC). Material and Methods. Cross-sectional and prospective study including 103 patients with histologically confirmed NSCLC. All patients were examined using PET/CT with intravenous contrast medium. Those with disease stage ≤IIB underwent surgery (n = 40). Disease stage was confirmed based on histology results, which were compared with those of PET/CT and positron emission tomography (PET) and computed tomography (CT) separately. 63 patients classified with ≥IIIA disease stage by PET/CT did not undergo surgery. The cost-effectiveness of PET/CT for disease classification was examined using a decision tree analysis. Results. Compared with histology, the accuracy of PET/CT for disease staging has a positive predictive value of 80%, a negative predictive value of 95%, a sensitivity of 94%, and a specificity of 82%. For PET alone, these values are 53%, 66%, 60%, and 50%, whereas for CT alone they are 68%, 86%, 76%, and 72%, respectively. Incremental cost-effectiveness of PET/CT over CT alone was €17,412 quality-adjusted life-year (QALY). Conclusion. In our clinical study, PET/CT using intravenous contrast medium was an accurate and cost-effective method for staging of patients with NSCLC
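
    The incremental cost-effectiveness ratio quoted above is the extra cost divided by the extra effectiveness: ICER = (C_PET/CT - C_CT) / (E_PET/CT - E_CT). The inputs below are hypothetical and are chosen only so the ratio reproduces the reported €17,412 per QALY.

        def icer(cost_new, cost_old, qaly_new, qaly_old):
            return (cost_new - cost_old) / (qaly_new - qaly_old)

        # Hypothetical per-patient figures; only the resulting ratio matches
        # the study's reported value.
        print(icer(cost_new=2741.2, cost_old=1000.0,
                   qaly_new=5.10, qaly_old=5.00))   # ~17412 EUR/QALY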

  15. 18F-Fluorodeoxyglucose Positron Emission Tomography/Computed Tomography Accuracy in the Staging of Non-Small Cell Lung Cancer: Review and Cost-Effectiveness

    International Nuclear Information System (INIS)

    León, N.G.; Bandrés, B.; Escalona, S.; Callejo, D.; Belda, C.; Blasco, J.A.

    2014-01-01

    Aim of the performed clinical study was to compare the accuracy and cost-effectiveness of PET/CT in the staging of non-small cell lung cancer (NSCLC). Material and Methods. Cross-sectional and prospective study including 103 patients with histologically confirmed NSCLC. All patients were examined using PET/CT with intravenous contrast medium. Those with disease stage ≤IIB underwent surgery (n = 40). Disease stage was confirmed based on histology results, which were compared with those of PET/CT and positron emission tomography (PET) and computed tomography (CT) separately. 63 patients classified with ≥IIIA disease stage by PET/CT did not undergo surgery. The cost-effectiveness of PET/CT for disease classification was examined using a decision tree analysis. Results. Compared with histology, the accuracy of PET/CT for disease staging has a positive predictive value of 80%, a negative predictive value of 95%, a sensitivity of 94%, and a specificity of 82%. For PET alone, these values are 53%, 66%, 60%, and 50%, whereas for CT alone they are 68%, 86%, 76%, and 72%, respectively. Incremental cost-effectiveness of PET/CT over CT alone was €17,412 quality-adjusted life-year (QALY). Conclusion. In our clinical study, PET/CT using intravenous contrast medium was an accurate and cost-effective method for staging of patients with NSCLC

  16. Computer generated movies to display biotelemetry data

    International Nuclear Information System (INIS)

    White, G.C.

    1979-01-01

    The three-dimensional nature of biotelemetry data (x, y, time) makes them difficult to comprehend. Graphic displays provide a means of extracting information and analyzing biotelemetry data. The extensive computer graphics facilities at Los Alamos Scientific Laboratory have been utilized to analyze elk biotelemetry data. Fixes have been taken weekly for 14 months on 14 animals. The inadequacy of still graphic displays to portray the time dimension of these data has led to the use of computer generated movies to help grasp time relationships. A computer movie of the data from one animal demonstrates habitat use as a function of time, while a movie of 2 or more animals illustrates the correlations between the animals' movements. About 2 hours of computer time were required to generate the movies for each animal for 1 year of data. The cost of the movies is quite small relative to the cost of collecting the data, so that computer generated movies are a reasonable method to depict biotelemetry data.

  17. Studi Perbandingan Layanan Cloud Computing

    OpenAIRE

    Afdhal, Afdhal

    2013-01-01

    In the past few years, cloud computing has become a dominant topic in the IT area. Cloud computing offers hardware, infrastructure, platform and applications without requiring end-users' knowledge of the physical location and the configuration of the providers who deliver the services. It has been a good solution to increase reliability, reduce computing cost, and create opportunities for IT industries to gain more advantages. The purpose of this article is to present a better understanding of cloud d...

  18. Study of basic computer competence among public health nurses in Taiwan.

    Science.gov (United States)

    Yang, Kuei-Feng; Yu, Shu; Lin, Ming-Sheng; Hsu, Chia-Ling

    2004-03-01

    Rapid advances in information technology and media have made distance learning on the Internet possible. This new model of learning allows greater efficiency and flexibility in knowledge acquisition. Since basic computer competence is a prerequisite for this new learning model, this study was conducted to examine the basic computer competence of public health nurses in Taiwan and explore factors influencing computer competence. A national cross-sectional randomized study was conducted with 329 public health nurses. A questionnaire was used to collect data and was delivered by mail. Results indicate that the basic computer competence of public health nurses in Taiwan still needs to be improved (mean = 57.57 ± 2.83, total score range 26-130). Among the five most frequently used software programs, nurses were most knowledgeable about Word and least knowledgeable about PowerPoint. Stepwise multiple regression analysis revealed eight variables (weekly number of hours spent online at home, weekly amount of time spent online at work, weekly frequency of computer use at work, previous computer training, computer at workplace and Internet access, job position, education level, and age) that significantly influenced computer competence, together accounting for 39.0% of the variance. In conclusion, greater computer competence, broader educational programs regarding computer technology, and a greater emphasis on computers at work are necessary to increase the usefulness of distance learning via the Internet in Taiwan. Building a user-friendly environment is important in developing this new media model of learning for the future.

  19. Laboratory cost and utilization containment.

    Science.gov (United States)

    Steiner, J W; Root, J M; White, D C

    1991-01-01

    The authors analyzed laboratory costs and utilization in 3,771 cases of Medicare inpatients admitted to a New England academic medical center ("the Hospital") from October 1, 1989 to September 30, 1990. The data were derived from the Hospital's Decision Resource System comprehensive data base. The authors established a historical reference point for laboratory costs as a percentage of total inpatient costs using 1981-82 Medicare claims data and cost report information. Inpatient laboratory costs were estimated at 9.5% of total inpatient costs for pre-Diagnostic Related Groups (DRGs) Medicare discharges. Using this reference point and adjusting for the Hospital's 1990 case mix, the "expected" laboratory cost was 9.3% of total cost. In fact, the cost averaged 11.5% (i.e., 24% above the expected cost level), and costs represented an even greater percentage of DRG reimbursement at 12.9%. If we regard the reimbursement as a total cost target (to eliminate losses from Medicare), then that 12.9% is 39% above the "expected" laboratory proportion of 9.3%. The Hospital lost an average of $1,091 on each DRG inpatient. The laboratory contributed 29% to this loss per case. Compared to other large hospitals, the Hospital was slightly (3%) above the mean direct cost per on-site test and significantly (58%) above the mean number of inpatient tests per inpatient day compared to large teaching hospitals. The findings suggest that careful laboratory cost analyses will become increasingly important as the proportion of patients reimbursed in a fixed manner grows. The future may hold a prospective zero-based laboratory budgeting process based on predictable patterns of DRG admissions or other fixed-reimbursement admission and laboratory utilization patterns.

  20. Minimizing total costs of forest roads with computer-aided design ...

    Indian Academy of Sciences (India)

    …minimum total road costs, while conforming to design specifications, environmental quality requirements, and enhancing fish and wildlife habitat, an appropriate design … Soil, Water and Timber Management: Forest Engineering Solutions in Response to …

  1. Software Accelerates Computing Time for Complex Math

    Science.gov (United States)

    2014-01-01

    Ames Research Center awarded Newark, Delaware-based EM Photonics Inc. SBIR funding to utilize graphic processing unit (GPU) technology- traditionally used for computer video games-to develop high-computing software called CULA. The software gives users the ability to run complex algorithms on personal computers with greater speed. As a result of the NASA collaboration, the number of employees at the company has increased 10 percent.

  2. A maintenance and operations cost model for DSN

    Science.gov (United States)

    Burt, R. W.; Kirkbride, H. L.

    1977-01-01

    A cost model for the DSN is developed which is useful in analyzing the 10-year Life Cycle Cost of the Bent Pipe Project. The philosophy behind the development and the use made of a computer data base are detailed; the applicability of this model to other projects is discussed.

  3. Contemporary Costs Associated With Transcatheter Aortic Valve Replacement: A Propensity-Matched Cost Analysis.

    Science.gov (United States)

    Ailawadi, Gorav; LaPar, Damien J; Speir, Alan M; Ghanta, Ravi K; Yarboro, Leora T; Crosby, Ivan K; Lim, D Scott; Quader, Mohammed A; Rich, Jeffrey B

    2016-01-01

    The Placement of Aortic Transcatheter Valve (PARTNER) trial suggested an economic advantage for transcatheter aortic valve replacement (TAVR) for high-risk patients. The purpose of this study was to evaluate the cost effectiveness of TAVR in the "real world" by comparing TAVR with surgical aortic valve replacement (SAVR) in intermediate-risk and high-risk patients. A multiinstitutional database of The Society of Thoracic Surgeons (STS) (2011 to 2013) linked with estimated cost data was evaluated for isolated TAVR and SAVR operations (n = 5,578). TAVR-treated patients (n = 340) were 1:1 propensity matched with SAVR-treated patients (n = 340). Patients undergoing SAVR were further stratified into intermediate-risk (SAVR-IR: predicted risk of mortality [PROM] 4% to 8%) and high-risk (SAVR-HR: PROM >8%) cohorts. Median STS PROM for TAVR was 6.32% compared with 6.30% for SAVR (SAVR-IR 4.6% and SAVR-HR 12.4%). A transfemoral TAVR approach was most common (61%). Mortality was higher for TAVR (10%) compared with SAVR (6%), and TAVR incurred greater total costs ($69,921 vs $33,598); the increased cost of TAVR was largely driven by the cost of the valve itself, and SAVR provided cost savings versus TAVR. TAVR was associated with greater total costs and mortality compared with SAVR in intermediate-risk and high-risk patients while conferring lower major morbidity and improved resource use. Increased cost of TAVR appears largely related to the cost of the valve. Until the price of TAVR valves decreases, these data suggest that TAVR may not provide the most cost-effective strategy, particularly for intermediate-risk patients.

  4. Radiotherapy Monte Carlo simulation using cloud computing technology.

    Science.gov (United States)

    Poole, C M; Cornelius, I; Trapp, J V; Langton, C M

    2012-12-01

    Cloud computing allows for vast computational resources to be leveraged quickly and easily in bursts as and when required. Here we describe a technique that allows for Monte Carlo radiotherapy dose calculations to be performed using GEANT4 and executed in the cloud, with relative simulation cost and completion time evaluated as a function of machine count. As expected, simulation completion time decreases as 1/n for n parallel machines, and relative simulation cost is found to be optimal where n is a factor of the total simulation time in hours. Using the technique, we demonstrate the potential usefulness of cloud computing as a solution for rapid Monte Carlo simulation for radiotherapy dose calculation without the need for dedicated local computer hardware, as a proof of principle.

  5. Radiotherapy Monte Carlo simulation using cloud computing technology

    International Nuclear Information System (INIS)

    Poole, C.M.; Cornelius, I.; Trapp, J.V.; Langton, C.M.

    2012-01-01

    Cloud computing allows for vast computational resources to be leveraged quickly and easily in bursts as and when required. Here we describe a technique that allows for Monte Carlo radiotherapy dose calculations to be performed using GEANT4 and executed in the cloud, with relative simulation cost and completion time evaluated as a function of machine count. As expected, simulation completion time decreases as 1/n for n parallel machines, and relative simulation cost is found to be optimal where n is a factor of the total simulation time in hours. Using the technique, we demonstrate the potential usefulness of cloud computing as a solution for rapid Monte Carlo simulation for radiotherapy dose calculation without the need for dedicated local computer hardware, as a proof of principle.
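    The cost behaviour reported above follows from per-machine billing that rounds up to whole hours: wall-clock time falls as 1/n, but any machine that bills a partly used hour adds waste. A minimal sketch of this billing model (the hourly rate and the ceiling-billing assumption are illustrative, not the authors' code):

        import math

        def simulation_cost(total_cpu_hours, n_machines, rate_per_hour=1.0):
            """Completion time and billed cost for a workload split evenly
            across n machines, each billed per started machine-hour."""
            wall_hours = total_cpu_hours / n_machines    # decreases as 1/n
            billed = n_machines * math.ceil(wall_hours)  # whole hours per machine
            return wall_hours, billed * rate_per_hour

        # A 24-hour workload: cost is lowest when n divides the total hours,
        # because no machine then bills a partially used hour.
        for n in (1, 5, 6, 7, 24):
            t, c = simulation_cost(24, n)
            print(f"n={n:2d}  completion={t:5.2f} h  cost={c:5.1f}")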

  6. ABC estimation of unit costs for emergency department services.

    Science.gov (United States)

    Holmes, R L; Schroeder, R E

    1996-04-01

    Rapid evolution of the health care industry forces managers to make cost-effective decisions. Typical hospital cost accounting systems do not provide emergency department managers with the information needed, but emergency department settings are so complex and dynamic as to make the more accurate activity-based costing (ABC) system prohibitively expensive. Through judicious use of the available traditional cost accounting information and simple computer spreadsheets, managers may approximate the decision-guiding information that would result from the much more costly and time-consuming implementation of ABC.
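    The spreadsheet-style approximation described above amounts to dividing each activity's pooled cost by its driver volume and charging each visit for the driver events it consumes. A minimal sketch, with entirely hypothetical activities, costs, and volumes:

        activities = {
            # activity: (annual cost from traditional cost accounts, annual driver volume)
            "triage":         (120_000, 24_000),  # driver: patient arrivals
            "physician_eval": (900_000, 24_000),  # driver: patient visits
            "lab_draw":       (150_000, 10_000),  # driver: specimens drawn
        }

        # Approximate unit cost per driver event for each activity
        unit_costs = {a: cost / volume for a, (cost, volume) in activities.items()}

        def visit_cost(driver_counts):
            """Cost of one ED visit: sum of unit costs of the events consumed."""
            return sum(unit_costs[a] * n for a, n in driver_counts.items())

        # A visit with triage, a physician evaluation, and two lab draws:
        print(f"${visit_cost({'triage': 1, 'physician_eval': 1, 'lab_draw': 2}):.2f}")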

  7. Effects of computer-assisted oral anticoagulant therapy

    DEFF Research Database (Denmark)

    Rasmussen, Rune Skovgaard; Corell, Pernille; Madsen, Poul

    2012-01-01

    BACKGROUND: Computer assistance and self-monitoring lower the cost and may improve the quality of anticoagulation therapy. The main purpose of this clinical investigation was to use computer-assisted oral anticoagulant therapy to improve the time to reach, and the time spent within, the therapeutic target range. Patients randomized to computer-assisted anticoagulation and the CoaguChek® system reached the therapeutic target range after 8 days, compared to 14 days with prescriptions from physicians (p = 0.04). Time spent in the therapeutic target range did not differ between groups. The median INR value measured ... prescribed by physicians, and the total time spent within the therapeutic target range was similar. INR values measured by CoaguChek® were reliable compared to measurements ... Thus computer-assisted oral anticoagulant therapy may reduce the cost of anticoagulation therapy without lowering the quality.

  8. Hysteroscopic polypectomy prior to infertility treatment: A cost analysis and systematic review.

    Science.gov (United States)

    Mouhayar, Youssef; Yin, Ophelia; Mumford, Sunni L; Segars, James H

    2017-06-01

    Fertility treatment is expensive, and interventions that reduce cost can lead to greater efficiency and fewer embryos transferred. Endometrial polyps contribute to infertility and are frequently removed prior to infertility treatment. It is unclear whether polypectomy reduces fertility treatment cost and, if so, the magnitude of the cost reduction afforded by the procedure. The aim of this study was to determine whether performing office or operative hysteroscopic polypectomy prior to infertility treatment would be cost-effective. PubMed, Embase, and Cochrane libraries were used to identify publications reporting pregnancy rates after hysteroscopic polypectomy. Studies were required to have a polypectomy treatment group and a control group of patients with polyps that were not resected. The charges of infertility treatments and polypectomy were obtained through infertility organizations and a private healthcare cost reporting website. These charges were applied to a decision tree model over the range of pregnancy rates observed in the representative studies to calculate an average cost per clinical or ongoing pregnancy. A sensitivity analysis was conducted to assess cost savings of polypectomy over a range of pregnancy rates and polypectomy costs. Pre-treatment office or operative hysteroscopic polypectomy ultimately saved €6658 ($7480) and €728 ($818), respectively, of the average cost per clinical pregnancy in women treated with four cycles of intrauterine insemination. Polypectomy prior to intrauterine insemination was cost-effective for clinical pregnancy rates greater than 30.2% for office polypectomy and 52.6% for operative polypectomy, and for a polypectomy price <€4414 ($4959). Office polypectomy or operative polypectomy saved €15,854 ($17,813) and €6644 ($7465), respectively, from the average cost per ongoing pregnancy for in vitro fertilization/intracytoplasmic sperm injection treated women and was cost-effective for ongoing pregnancy rates ...
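    The decision-tree quantity being compared above is an average cost per pregnancy: expected spending divided by expected pregnancies over a fixed number of treatment cycles. A minimal sketch of that calculation, with hypothetical per-cycle costs and pregnancy rates (not the paper's inputs):

        def cost_per_pregnancy(cycle_cost, pregnancy_rate, upfront_cost=0.0, max_cycles=4):
            """Expected cost per pregnancy when up to max_cycles cycles are
            attempted and each succeeds independently with pregnancy_rate."""
            expected_cost = upfront_cost
            expected_pregnancies = 0.0
            p_trying = 1.0  # probability a patient is still untreated-successfully
            for _ in range(max_cycles):
                expected_cost += p_trying * cycle_cost
                expected_pregnancies += p_trying * pregnancy_rate
                p_trying *= 1.0 - pregnancy_rate
            return expected_cost / expected_pregnancies

        # Hypothetical: polypectomy (2,000 upfront) raises the per-cycle rate.
        without = cost_per_pregnancy(1_000, 0.10)
        with_polypectomy = cost_per_pregnancy(1_000, 0.18, upfront_cost=2_000)
        print(f"saving per clinical pregnancy: {without - with_polypectomy:,.0f}")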

  9. Low cost high performance uncertainty quantification

    KAUST Repository

    Bekas, C.; Curioni, A.; Fedulova, I.

    2009-01-01

    Uncertainty quantification in risk analysis has become a key application. In this context, computing the diagonal of inverse covariance matrices is of paramount importance. Standard techniques, that employ matrix factorizations, incur a cubic cost
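    The abstract is truncated, but the problem it poses, estimating diag(A⁻¹) without an O(n³) factorization, is commonly attacked with stochastic probing, which replaces explicit inversion by a modest number of linear solves. A minimal sketch of a Rademacher-probe diagonal estimator (a standard technique in this literature, not necessarily the authors' exact method):

        import numpy as np

        def diag_inverse_estimate(A, n_samples=200, rng=None):
            """Estimate diag(A^{-1}) via probe vectors:
            diag(A^{-1}) ~= (sum_k v_k * A^{-1} v_k) / (sum_k v_k * v_k)."""
            rng = rng or np.random.default_rng(0)
            n = A.shape[0]
            num = np.zeros(n)
            den = np.zeros(n)
            for _ in range(n_samples):
                v = rng.choice([-1.0, 1.0], size=n)  # Rademacher probe vector
                x = np.linalg.solve(A, v)            # stand-in for a cheap/iterative solver
                num += v * x
                den += v * v
            return num / den

        # Quick check on a small synthetic covariance-like matrix
        B = np.random.default_rng(1).standard_normal((50, 50))
        A = B @ B.T + 50.0 * np.eye(50)  # well-conditioned SPD matrix
        err = np.abs(diag_inverse_estimate(A) - np.diag(np.linalg.inv(A)))
        print(err.max())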

  10. Magnetic-fusion energy and computers

    International Nuclear Information System (INIS)

    Killeen, J.

    1982-01-01

    The application of computers to magnetic fusion energy research is essential. In the last several years the use of computers in the numerical modeling of fusion systems has increased substantially. There are several categories of computer models used to study the physics of magnetically confined plasmas. A comparable number of types of models for engineering studies are also in use. To meet the needs of the fusion program, the National Magnetic Fusion Energy Computer Center has been established at the Lawrence Livermore National Laboratory. A large central computing facility is linked to smaller computer centers at each of the major MFE laboratories by a communication network. In addition to providing cost-effective computing services, the NMFECC environment stimulates collaboration and the sharing of computer codes among the various fusion research groups.

  11. Magnetic fusion energy and computers

    International Nuclear Information System (INIS)

    Killeen, J.

    1982-01-01

    The application of computers to magnetic fusion energy research is essential. In the last several years the use of computers in the numerical modeling of fusion systems has increased substantially. There are several categories of computer models used to study the physics of magnetically confined plasmas. A comparable number of types of models for engineering studies are also in use. To meet the needs of the fusion program, the National Magnetic Fusion Energy Computer Center has been established at the Lawrence Livermore National Laboratory. A large central computing facility is linked to smaller computer centers at each of the major MFE laboratories by a communication network. In addition to providing cost-effective computing services, the NMFECC environment stimulates collaboration and the sharing of computer codes among the various fusion research groups.

  12. 47 CFR 32.6124 - General purpose computers expense.

    Science.gov (United States)

    2010-10-01

    This account shall include the costs of personnel whose principal job is the physical operation of general purpose computers and the maintenance of operating systems. ...

  13. Cost-Effectiveness of Old and New Technologies for Aneuploidy Screening.

    Science.gov (United States)

    Sinkey, Rachel G; Odibo, Anthony O

    2016-06-01

    Cost-effectiveness analyses allow assessment of whether marginal gains from new technology are worth increased costs. Several studies have examined cost-effectiveness of Down syndrome (DS) screening and found it to be cost-effective. Noninvasive prenatal screening also appears to be cost-effective among high-risk women with respect to DS screening, but not for the general population. Chromosomal microarray (CMA) is a genetic sequencing method superior to but more expensive than karyotype. In light of CMA's greater ability to detect genetic abnormalities, it is cost-effective when used for prenatal diagnosis of an anomalous fetus. This article covers methodology and salient issues of cost-effectiveness. Copyright © 2016 Elsevier Inc. All rights reserved.

  14. Conducting Computer Security Assessments at Nuclear Facilities

    International Nuclear Information System (INIS)

    2016-06-01

    Computer security is increasingly recognized as a key component in nuclear security. As technology advances, it is anticipated that computer and computing systems will be used to an even greater degree in all aspects of plant operations including safety and security systems. A rigorous and comprehensive assessment process can assist in strengthening the effectiveness of the computer security programme. This publication outlines a methodology for conducting computer security assessments at nuclear facilities. The methodology can likewise be easily adapted to provide assessments at facilities with other radioactive materials

  15. Market power and state costs of HIV/AIDS drugs.

    Science.gov (United States)

    Leibowitz, Arleen A; Sood, Neeraj

    2007-03-01

    We examine whether U.S. states can use their market power to reduce the costs of supplying prescription drugs to uninsured and underinsured persons with HIV through a public program, the AIDS Drug Assistance Program (ADAP). Among states that purchase drugs from manufacturers and distribute them directly to clients, those that purchase a greater volume pay lower average costs per prescription. Among states depending on retail pharmacies to distribute drugs and then claiming rebates from manufacturers, those that contract with smaller numbers of pharmacy networks have lower average costs. Average costs per prescription do not differ between the two purchase methods.

  16. Radioimmunoassay evaluation and quality control by use of a simple computer program for a low cost desk top calculator

    International Nuclear Information System (INIS)

    Schwarz, S.

    1980-01-01

    A simple computer program for the data processing and quality control of radioimmunoassays is presented. It is written for a low-cost programmable desk-top calculator (Hewlett-Packard 97), which smaller laboratories can afford. The untreated counts from the scintillation spectrometer are entered manually; the printout gives the following results: initial data, logit-log transformed calibration points, parameters of goodness of fit and of the position of the standard curve, and control and unknown sample dose estimates (mean value from single-dose interpolations and scatter of replicates), together with the automatic calculation of within-assay variance and, by use of magnetic cards holding the control parameters of all previous assays, between-assay variance. (orig.)
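    The logit-log transformation mentioned above linearizes the sigmoid standard curve: with y the bound fraction B/B0, logit(y) = ln(y/(1-y)) is fitted as a straight line in log(dose), and unknowns are interpolated by inverting the fit. A minimal sketch with illustrative calibration data (not the program's actual values):

        import numpy as np

        def logit(y):
            return np.log(y / (1.0 - y))

        doses = np.array([1.0, 3.0, 10.0, 30.0, 100.0])  # standard concentrations
        bound = np.array([0.85, 0.72, 0.50, 0.29, 0.14])  # B/B0 of the standards

        b, a = np.polyfit(np.log(doses), logit(bound), 1)  # least-squares line

        def dose_estimate(y_unknown):
            """Interpolate an unknown sample's dose from its bound fraction."""
            return float(np.exp((logit(y_unknown) - a) / b))

        print(f"{dose_estimate(0.40):.1f}")  # dose of a sample with B/B0 = 0.40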

  17. Adaptive pacing, cognitive behaviour therapy, graded exercise, and specialist medical care for chronic fatigue syndrome: a cost-effectiveness analysis.

    Directory of Open Access Journals (Sweden)

    Paul McCrone

    The PACE trial compared the effectiveness of adding adaptive pacing therapy (APT), cognitive behaviour therapy (CBT), or graded exercise therapy (GET) to specialist medical care (SMC) for patients with chronic fatigue syndrome. This paper reports the relative cost-effectiveness of these treatments in terms of quality adjusted life years (QALYs) and improvements in fatigue and physical function. Resource use was measured and costs calculated. Healthcare and societal costs (healthcare plus lost production and unpaid informal care) were combined with QALYs gained and changes in fatigue and disability; incremental cost-effectiveness ratios (ICERs) were computed. SMC patients had significantly lower healthcare costs than those receiving APT, CBT and GET. If society is willing to value a QALY at £30,000, there is a 62.7% likelihood that CBT is the most cost-effective therapy, a 26.8% likelihood that GET is most cost-effective, 2.6% that APT is most cost-effective and 7.9% that SMC alone is most cost-effective. Compared to SMC alone, the incremental healthcare cost per QALY was £18,374 for CBT, £23,615 for GET and £55,235 for APT. From a societal perspective CBT has a 59.5% likelihood of being the most cost-effective, GET 34.8%, APT 0.2% and SMC alone 5.5%. CBT and GET dominated SMC, while APT had a cost per QALY of £127,047. ICERs using reductions in fatigue and disability as outcomes largely mirrored these findings. Comparing the four treatments using a health care perspective, CBT had the greatest probability of being the most cost-effective, followed by GET. APT had a lower probability of being the most cost-effective option than SMC alone. The relative cost-effectiveness was even greater from a societal perspective, as additional cost savings due to reduced need for informal care were likely.
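    The ICERs quoted above are computed as incremental cost divided by incremental QALYs, and a therapy is judged worthwhile at a willingness-to-pay threshold when its incremental net monetary benefit is positive. A minimal sketch with hypothetical figures (only the £30,000 threshold is taken from the abstract):

        def icer(cost_new, cost_ref, qaly_new, qaly_ref):
            """Incremental cost-effectiveness ratio: extra cost per extra QALY."""
            return (cost_new - cost_ref) / (qaly_new - qaly_ref)

        def worthwhile(cost_new, cost_ref, qaly_new, qaly_ref, wtp=30_000):
            """Positive incremental net monetary benefit at threshold wtp."""
            return wtp * (qaly_new - qaly_ref) - (cost_new - cost_ref) > 0

        # Hypothetical: a therapy adds 0.05 QALYs for an extra 900 pounds.
        print(icer(2_400, 1_500, 0.60, 0.55))        # 18000.0 per QALY
        print(worthwhile(2_400, 1_500, 0.60, 0.55))  # True at the 30,000 threshold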

  18. Time-driven activity-based costing of low-dose-rate and high-dose-rate brachytherapy for low-risk prostate cancer.

    Science.gov (United States)

    Ilg, Annette M; Laviana, Aaron A; Kamrava, Mitchell; Veruttipong, Darlene; Steinberg, Michael; Park, Sang-June; Burke, Michael A; Niedzwiecki, Douglas; Kupelian, Patrick A; Saigal, Christopher

    Cost estimates through traditional hospital accounting systems are often arbitrary and ambiguous. We used time-driven activity-based costing (TDABC) to determine the true cost of low-dose-rate (LDR) and high-dose-rate (HDR) brachytherapy for prostate cancer and demonstrate opportunities for cost containment at an academic referral center. We implemented TDABC for patients treated with I-125 preplanned LDR and computed tomography-based HDR brachytherapy with two implants, from initial consultation through 12-month followup. We constructed detailed process maps for provision of both HDR and LDR. Personnel, space, equipment, and material costs of each step were identified and used to derive capacity cost rates, defined as price per minute. Each capacity cost rate was then multiplied by the relevant process time, and the products were summed to determine total cost of care. The calculated cost to deliver HDR was greater than that of LDR by $2,668.86 ($9,538 vs. $6,869). The first and second HDR treatment days cost $3,999.67 and $3,955.67, whereas LDR was delivered on one treatment day and cost $3,887.55. The greatest overall cost driver for both LDR and HDR was personnel, at 65.6% ($4,506.82) and 67.0% ($6,387.27) of the total cost. After personnel costs, disposable materials contributed the second most for LDR ($1,920.66, 28.0%) and for HDR ($2,295.94, 24.0%). With TDABC, the true costs to deliver LDR and HDR from the health system perspective were derived. Analysis by physicians and hospital administrators regarding the cost of care afforded redesign opportunities, including delivering HDR as one implant. Our work underscores the need to assess clinical outcomes to understand the true difference in value between these modalities. Copyright © 2016 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.
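    The TDABC arithmetic described above reduces to two steps: derive a capacity cost rate (price per minute) for each resource, then sum rate times minutes over every step in the process map. A minimal sketch with entirely hypothetical resources and times (not the study's data):

        resources = {
            # resource: (cost per available period, available minutes per period)
            "physician": (30_000, 6_000),
            "nurse":     (9_000, 9_000),
            "or_suite":  (60_000, 12_000),
        }
        rate = {r: cost / minutes for r, (cost, minutes) in resources.items()}  # $/min

        process_map = [  # (step, resource, minutes)
            ("consultation", "physician", 30),
            ("implant",      "physician", 90),
            ("implant",      "nurse",     90),
            ("implant",      "or_suite",  120),
            ("followup",     "nurse",     20),
        ]

        # Total cost of care: capacity cost rate times process time, summed
        total = sum(rate[res] * minutes for _, res, minutes in process_map)
        print(f"total cost of care: ${total:,.2f}")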

  19. Greater-confinement disposal

    International Nuclear Information System (INIS)

    Trevorrow, L.E.; Schubert, J.P.

    1989-01-01

    Greater-confinement disposal (GCD) is a general term for low-level waste (LLW) disposal technologies that employ natural and/or engineered barriers and provide a degree of confinement greater than that of shallow-land burial (SLB) but possibly less than that of a geologic repository. Thus GCD is associated with lower risk/hazard ratios than SLB. Although any number of disposal technologies might satisfy the definition of GCD, eight have been selected for consideration in this discussion. These technologies include: (1) earth-covered tumuli, (2) concrete structures, both above and below grade, (3) deep trenches, (4) augered shafts, (5) rock cavities, (6) abandoned mines, (7) high-integrity containers, and (8) hydrofracture. Each of these technologies employs several operations that are mature; however, some are at more advanced stages of development and demonstration than others. Each is defined and further described by information on design, advantages and disadvantages, special equipment requirements, and characteristic operations such as construction, waste emplacement, and closure.

  20. A plan for administrative computing at ANL FY1991 through FY1993

    Energy Technology Data Exchange (ETDEWEB)

    Caruthers, L.E. (ed.); O'Brien, D.E.; Bretscher, M.E.; Hischier, R.C.; Moore, N.J.; Slade, R.G.

    1990-10-01

    In July of 1988, Argonne National Laboratory management approved the restructuring of Computing Services into the Computing and Telecommunications Division, part of the Physical Research area of the Laboratory. One major area of the Computing and Telecommunications Division is Management Information Systems (MIS). A significant aspect of Management Information Systems' work is the development of proposals for new and enhanced administrative computing systems based on an analysis of informational needs. This document represents the outcome of the planning process for FY1991 through FY1993. The introduction of the FY1991 through FY1993 Long-Range Plan assesses the state of administrative computing at ANL and the implications of FY1991 funding recommendations, and includes a history of MIS planning for administrative data processing. This document discusses the strategy and goals that underpin administrative data processing plans for the Laboratory. It also describes the management guidelines established by the Administrative Data Processing Oversight Committee for the proposal and implementation of administrative computing systems. Summaries are given of the proposals for new or enhanced administrative computing systems presented to the Administrative Data Processing Oversight Committee by individual divisions or departments with the assistance of Management Information Systems. The detailed tables in this document show how much it will cost users to develop and implement a given system; they include development costs, computing/operations costs, software and hardware costs, and effort costs, and cover both systems funded by Laboratory General Expense and systems funded by the users themselves.