Higher-order techniques in computational electromagnetics
Graglia, Roberto D
2016-01-01
Higher-Order Techniques in Computational Electromagnetics explains higher-order techniques that can significantly improve the accuracy and reliability, and reduce the computational cost, of computational methods for high-frequency electromagnetics applications such as antennas, microwave devices and radar scattering.
Cost Efficiency in Public Higher Education.
Robst, John
This study used the frontier cost function framework to examine cost efficiency in public higher education. The frontier cost function estimates the minimum predicted cost for producing a given amount of output. Data from the annual Almanac issues of the "Chronicle of Higher Education" were used to calculate state level enrollments at two-year and…
Trombella, Jerry
2011-01-01
As concern over rapidly rising college costs and tuition sticker prices have increased, a variety of research has been conducted to determine potential causes. Most of this research has focused on factors unique to higher education. In contrast, cost disease theory attempts to create a comparative context to explain cost increases in higher…
Activity-Based Costing Systems for Higher Education.
Day, Dennis H.
1993-01-01
Examines traditional costing models utilized in higher education and pinpoints shortcomings related to proper identification of costs. Describes activity-based costing systems as a superior alternative for cost identification, measurement, and allocation. (MLF)
Implementation of cloud computing in higher education
Asniar; Budiawan, R.
2016-04-01
Cloud computing research is a new trend in distributed computing, in which service- and SOA (Service Oriented Architecture)-based applications are developed. This technology is very useful to implement, especially for higher education. This research studies the need for, and feasibility of, cloud computing in higher education, and then proposes a model of cloud computing services for higher education in Indonesia that can be implemented to support academic activities. Literature study is used as the research methodology to arrive at the proposed model. Finally, SaaS and IaaS are the cloud computing services proposed for implementation in higher education in Indonesia, and the hybrid cloud is the recommended deployment model.
A cost modelling system for cloud computing
Ajeh, Daniel; Ellman, Jeremy; Keogh, Shelagh
2014-01-01
An advance in technology unlocks new opportunities for organizations to increase their productivity, efficiency and process automation while reducing the cost of doing business as well. The emergence of cloud computing addresses these prospects through the provision of agile systems that are scalable, flexible and reliable as well as cost effective. Cloud computing has made hosting and deployment of computing resources cheaper and easier with no up-front charges but pay per-use flexible payme...
Cloud Computing-An Ultimate Technique to Minimize Computing cost for Developing Countries
Narendra Kumar; Shikha Jain
2012-01-01
This paper deals with how remotely managed computing and IT resources can benefit developing countries such as India and other countries of the Asian subcontinent. The paper not only defines the architectures and functionalities of cloud computing but also strongly indicates the current demand for cloud computing to provide organizational and personal IT support at very low cost and with great flexibility. The power of the cloud can be used to reduce the cost of IT - r...
Cost/Benefit Analysis of Leasing Versus Purchasing Computers
National Research Council Canada - National Science Library
Arceneaux, Alan
1997-01-01
.... In constructing this model, several factors were considered, including: The purchase cost of computer equipment, annual lease payments, depreciation costs, the opportunity cost of purchasing, tax revenue implications and various leasing terms...
Incremental ALARA cost/benefit computer analysis
International Nuclear Information System (INIS)
Hamby, P.
1987-01-01
Commonwealth Edison Company has developed and is testing an enhanced Fortran computer program to be used for cost/benefit analysis of radiation reduction projects at its six nuclear power facilities and corporate technical support groups. This paper describes a macro-driven IBM mainframe program comprising two different types of analyses: an abbreviated program with fixed costs and base values, and an extended engineering version for a detailed, more thorough, and more time-consuming approach. The extended engineering version breaks radiation exposure costs down into two components: health-related costs and replacement labor costs. According to user input, the program automatically adjusts these two cost components and applies the derivation to company economic analyses such as replacement power costs, carrying charges, debt interest, and capital investment cost. The results from one or more program runs using different parameters may be compared in order to determine the most appropriate ALARA dose reduction technique. Benefits of this particular cost/benefit analysis technique include the flexibility to accommodate a wide range of user data and pre-job preparation, as well as the use of proven and standardized company economic equations
Computer-Supported Collaborative Learning in Higher Education
Roberts, Tim, Ed.
2005-01-01
"Computer-Supported Collaborative Learning in Higher Education" provides a resource for researchers and practitioners in the area of computer-supported collaborative learning (also known as CSCL); particularly those working within a tertiary education environment. It includes articles of relevance to those interested in both theory and practice in…
International Nuclear Information System (INIS)
Bierschbach, M.C.; Mencinsky, G.J.
1993-10-01
With the issuance of the Decommissioning Rule (July 27, 1988), nuclear power plant licensees are required to submit to the US Nuclear Regulatory Commission (NRC) for review, decommissioning plans and cost estimates. This user's manual and the accompanying Cost Estimating Computer Program (CECP) software provide a cost-calculating methodology to the NRC staff that will assist them in assessing the adequacy of the licensee submittals. The CECP, designed to be used on a personal computer, provides estimates for the cost of decommissioning PWR power stations to the point of license termination. Such cost estimates include component, piping, and equipment removal costs; packaging costs; decontamination costs; transportation costs; burial costs; and manpower costs. In addition to costs, the CECP also calculates burial volumes, person-hours, crew-hours, and exposure person-hours associated with decommissioning
Is higher nursing home quality more costly?
Giorgio, L Di; Filippini, M; Masiero, G
2016-11-01
Widespread issues regarding quality in nursing homes call for an improved understanding of the relationship with costs. This relationship may differ in European countries, where care is mainly delivered by nonprofit providers. In accordance with the economic theory of production, we estimate a total cost function for nursing home services using data from 45 nursing homes in Switzerland between 2006 and 2010. Quality is measured by means of clinical indicators regarding process and outcome derived from the minimum data set. We consider both composite and single quality indicators. Contrary to most previous studies, we use panel data and control for omitted variables bias. This allows us to capture features specific to nursing homes that may explain differences in structural quality or cost levels. Additional analysis is provided to address simultaneity bias using an instrumental variable approach. We find evidence that poor levels of quality regarding outcome, as measured by the prevalence of severe pain and weight loss, lead to higher costs. This may have important implications for the design of payment schemes for nursing homes.
International Nuclear Information System (INIS)
Pendergrass, J.H.
1978-01-01
Three FORTRAN programs, CAPITAL, VENTURE, and INDEXER, have been developed to automate computations used in assessing the economic viability of proposed or conceptual laser fusion and other advanced-technology facilities, as well as conventional projects. The types of calculations performed by these programs are, respectively, capital cost estimation, lifetime economic performance simulation, and computation of cost indexes. The codes permit these three topics to be addressed with considerable sophistication commensurate with user requirements and available data
Cost-effectiveness analysis of computer-based assessment
Directory of Open Access Journals (Sweden)
Pauline Loewenberger
2003-12-01
The need for more cost-effective and pedagogically acceptable combinations of teaching and learning methods to sustain increasing student numbers means that the use of innovative methods, using technology, is accelerating. There is an expectation that economies of scale might provide greater cost-effectiveness whilst also enhancing student learning. The difficulties and complexities of these expectations are considered in this paper, which explores the challenges faced by those wishing to evaluate the cost-effectiveness of computer-based assessment (CBA). The paper outlines the outcomes of a survey which attempted to gather information about the costs and benefits of CBA.
Computational cost of isogeometric multi-frontal solvers on parallel distributed memory machines
Woźniak, Maciej
2015-02-01
This paper derives theoretical estimates of the computational cost of an isogeometric multi-frontal direct solver executed on parallel distributed memory machines. We show theoretically that for C^(p-1) global continuity of the isogeometric solution, both the computational cost and the communication cost of a direct solver are of order O(log(N) p^2) for the one-dimensional (1D) case, O(N p^2) for the two-dimensional (2D) case, and O(N^(4/3) p^2) for the three-dimensional (3D) case, where N is the number of degrees of freedom and p is the polynomial order of the B-spline basis functions. The theoretical estimates are verified by numerical experiments performed with three parallel multi-frontal direct solvers: MUMPS, PaStiX and SuperLU, available through the PetIGA toolkit built on top of PETSc. Numerical results confirm these theoretical estimates both in terms of p and N. For a given problem size, the strong efficiency rapidly decreases as the number of processors increases, becoming about 20% for 256 processors for a 3D example with 128^3 unknowns and linear B-splines with C^0 global continuity, and 15% for a 3D example with 64^3 unknowns and quartic B-splines with C^3 global continuity. At the same time, one cannot arbitrarily increase the problem size, since the memory required by higher-order continuity spaces is large, quickly consuming all the available memory resources even in the parallel distributed memory version. Numerical results also suggest that the use of distributed parallel machines is highly beneficial when solving higher-order continuity spaces, although the number of processors that one can efficiently employ is somewhat limited.
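The asymptotic estimates in this abstract lend themselves to a quick numerical illustration. The sketch below is hypothetical Python with all constant factors dropped; it only evaluates the stated complexity orders and shows how the 3D cost scales when N or p doubles:

```python
import math

def solver_cost(N, p, dim):
    """Theoretical multi-frontal direct solver cost orders for C^(p-1)
    isogeometric discretizations, as stated in the abstract:
    O(log(N) p^2) in 1D, O(N p^2) in 2D, O(N^(4/3) p^2) in 3D.
    Constant factors are omitted, so only ratios are meaningful."""
    if dim == 1:
        return math.log(N) * p**2
    if dim == 2:
        return N * p**2
    if dim == 3:
        return N**(4 / 3) * p**2
    raise ValueError("dim must be 1, 2 or 3")

# Doubling N in 3D multiplies the cost by 2^(4/3) (about 2.52),
# while doubling p quadruples it at fixed N.
N = 128**3
ratio_N = solver_cost(2 * N, 3, 3) / solver_cost(N, 3, 3)
ratio_p = solver_cost(N, 6, 3) / solver_cost(N, 3, 3)
```

The p^2 factor explains why memory, not arithmetic, becomes the bottleneck for the higher-continuity (large-p) spaces discussed above.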
Collaborating to Cut Costs in Higher Education
Hassett, Tracy
2017-01-01
Tuition prices at colleges and universities are high. It is also true that salaries and benefits are the single biggest chunk of every higher education institution's (HEI) budget. And one of the largest and most difficult costs to contain is group employee health insurance. The situation is particularly difficult for smaller New England HEIs…
Acute costs and predictors of higher treatment costs of trauma in New South Wales, Australia.
Curtis, Kate; Lam, Mary; Mitchell, Rebecca; Black, Deborah; Taylor, Colman; Dickson, Cara; Jan, Stephen; Palmer, Cameron S; Langcake, Mary; Myburgh, John
2014-01-01
Accurate economic data are fundamental for improving current funding models and, ultimately, for promoting the efficient delivery of services. The financial burden of a high trauma casemix on designated trauma centres in Australia has not previously been determined, and there is some evidence that the episode funding model used in Australia results in the underfunding of trauma. To describe the costs of acute trauma admissions in trauma centres and identify predictors of higher treatment costs and cost variance in New South Wales (NSW), Australia, data linkage of admitted trauma patient and financial data provided by 12 Level 1 NSW trauma centres for the 08/09 financial year was performed. Demographic and injury details and injury scores were obtained from trauma registries. Individual patient general ledger costs (actual trauma patient costs), Australian Refined Diagnosis Related Group (AR-DRG) costs and state-wide average costs (which form the basis of funding) were obtained. The actual costs incurred by the hospital were then compared with the state-wide AR-DRG average costs. Multivariable multiple linear regression was used to identify predictors of costs. There were 17,522 patients; the average per-patient cost was $10,603 and the median was $4628 (interquartile range: $2179-10,148). The actual costs incurred by trauma centres were on average $134 per bed day above the AR-DRG-determined costs. Falls, road trauma and violence were the highest causes of total cost. Motorcyclists and pedestrians had higher median costs than motor vehicle occupants. As a result of greater numbers, patients with minor injury had total costs comparable with those generated by patients with severe injury. However, the median cost of severely injured patients was nearly four times greater. The count of body regions injured, sex, length of stay, serious traumatic brain injury and admission to the Intensive Care Unit were significantly associated with increased costs (p<0.001). This
Computer Software for Life Cycle Cost.
1987-04-01
Air Command and Staff College student report. "... obsolete), physical life (utility before physically wearing out), or application life (utility in a given function)." (7:5) The costs are usually
Cloud Computing and Information Technology Resource Cost Management for SMEs
DEFF Research Database (Denmark)
Kuada, Eric; Adanu, Kwame; Olesen, Henning
2013-01-01
This paper analyzes the decision-making problem confronting SMEs considering the adoption of cloud computing as an alternative to in-house computing services provision. The economics of choosing between in-house computing and a cloud alternative is analyzed by comparing the total economic costs...... in determining the relative value of cloud computing....
Cost-effective cloud computing: a case study using the comparative genomics tool, roundup.
Kudtarkar, Parul; Deluca, Todd F; Fusaro, Vincent A; Tonellato, Peter J; Wall, Dennis P
2010-12-22
Comparative genomics resources, such as ortholog detection tools and repositories are rapidly increasing in scale and complexity. Cloud computing is an emerging technological paradigm that enables researchers to dynamically build a dedicated virtual cluster and may represent a valuable alternative for large computational tools in bioinformatics. In the present manuscript, we optimize the computation of a large-scale comparative genomics resource-Roundup-using cloud computing, describe the proper operating principles required to achieve computational efficiency on the cloud, and detail important procedures for improving cost-effectiveness to ensure maximal computation at minimal costs. Utilizing the comparative genomics tool, Roundup, as a case study, we computed orthologs among 902 fully sequenced genomes on Amazon's Elastic Compute Cloud. For managing the ortholog processes, we designed a strategy to deploy the web service, Elastic MapReduce, and maximize the use of the cloud while simultaneously minimizing costs. Specifically, we created a model to estimate cloud runtime based on the size and complexity of the genomes being compared that determines in advance the optimal order of the jobs to be submitted. We computed orthologous relationships for 245,323 genome-to-genome comparisons on Amazon's computing cloud, a computation that required just over 200 hours and cost $8,000 USD, at least 40% less than expected under a strategy in which genome comparisons were submitted to the cloud randomly with respect to runtime. Our cost savings projections were based on a model that not only demonstrates the optimal strategy for deploying RSD to the cloud, but also finds the optimal cluster size to minimize waste and maximize usage. Our cost-reduction model is readily adaptable for other comparative genomics tools and potentially of significant benefit to labs seeking to take advantage of the cloud as an alternative to local computing infrastructure.
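The abstract's strategy of ordering job submissions by predicted runtime resembles classic longest-processing-time-first (LPT) scheduling. The sketch below is a hypothetical illustration of that idea, not the authors' actual Elastic MapReduce deployment code, and the runtime predictions are taken as given rather than derived from genome size:

```python
import heapq

def schedule_lpt(runtimes, n_workers):
    """Longest-processing-time-first greedy schedule: submit the longest
    predicted jobs first so no worker sits idle at the end while a single
    long job finishes. Returns the makespan, i.e. the wall-clock hours
    the worker pool must be rented for."""
    loads = [0.0] * n_workers                  # current load per worker
    heapq.heapify(loads)
    for r in sorted(runtimes, reverse=True):   # longest job first
        least = heapq.heappop(loads)           # least-loaded worker
        heapq.heappush(loads, least + r)       # assign job to it
    return max(loads)

# Hypothetical predicted runtimes (hours) for 7 comparison jobs on 3 workers:
# ideal balance would be 36/3 = 12 h; LPT achieves 13 h, while an unlucky
# random order can be far worse.
makespan = schedule_lpt([9, 7, 6, 5, 4, 3, 2], 3)
```

Because cloud billing is per instance-hour, reducing makespan for a fixed cluster size translates directly into the kind of cost savings the study reports.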
An approximate fractional Gaussian noise model with computational cost
Sørbye, Sigrunn H.
2017-09-18
Fractional Gaussian noise (fGn) is a stationary time series model with long-memory properties, applied in various fields like econometrics, hydrology and climatology. The computational cost of fitting an fGn model of length $n$ using a likelihood-based approach is $\mathcal{O}(n^{2})$, exploiting the Toeplitz structure of the covariance matrix. In most realistic cases, we do not observe the fGn process directly but only through indirect Gaussian observations, so the Toeplitz structure is easily lost and the computational cost increases to $\mathcal{O}(n^{3})$. This paper presents an approximate fGn model of $\mathcal{O}(n)$ computational cost, both with direct or indirect Gaussian observations, with or without conditioning. This is achieved by approximating fGn with a weighted sum of independent first-order autoregressive processes, fitting the parameters of the approximation to match the autocorrelation function of the fGn model. The resulting approximation is stationary despite being Markov and gives a remarkably accurate fit using only four components. The performance of the approximate fGn model is demonstrated in simulations and two real data examples.
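The target of the AR(1)-mixture fit described above is the fGn autocorrelation function, which has a simple closed form. The sketch below only evaluates that target function; the four-component fitting procedure itself is not reproduced here:

```python
def fgn_acf(k, H):
    """Autocorrelation of fractional Gaussian noise at integer lag k for
    Hurst index H: rho(k) = 0.5*(|k+1|^(2H) - 2|k|^(2H) + |k-1|^(2H))."""
    return 0.5 * (abs(k + 1)**(2 * H)
                  - 2 * abs(k)**(2 * H)
                  + abs(k - 1)**(2 * H))

# H = 0.5 reduces fGn to white noise (zero correlation at positive lags);
# H > 0.5 gives the slowly decaying positive correlations ("long memory")
# that the weighted sum of AR(1) processes is fitted to match.
white = fgn_acf(1, 0.5)
long_memory = [fgn_acf(k, 0.9) for k in (1, 10, 100)]
```

The slow power-law decay visible for H = 0.9 is exactly what a single AR(1) process (geometric decay) cannot capture, which is why a mixture of several autoregressive components is needed.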
Costs of cloud computing for a biometry department. A case study.
Knaus, J; Hieke, S; Binder, H; Schwarzer, G
2013-01-01
"Cloud" computing providers, such as Amazon Web Services (AWS), offer stable and scalable computational resources based on hardware virtualization, with short, usually hourly, billing periods. The idea of pay-as-you-use seems appealing for biometry research units which have only limited access to university or corporate data center resources or grids. This case study compares the costs of an existing heterogeneous on-site hardware pool in a Medical Biometry and Statistics department to a comparable AWS offer. The "total cost of ownership", including all direct costs, is determined for the on-site hardware, and hourly prices are derived, based on actual system utilization during the year 2011. Indirect costs, which are difficult to quantify, are not included in this comparison, but nevertheless some rough guidance from our experience is given. To indicate the scale of costs for a methodological research project, a simulation study of a permutation-based statistical approach is performed using AWS and on-site hardware. In the presented case, with a system utilization of 25-30 percent and 3-5-year amortization, on-site hardware can result in smaller costs than hourly rental in the cloud, depending on the instance chosen. Renting cloud instances with sufficient main memory is a deciding factor in this comparison. Costs for on-site hardware may vary, depending on the specific infrastructure at a research unit, but have only moderate impact on the overall comparison and the subsequent decision for obtaining affordable scientific computing resources. Overall utilization has a much stronger impact, as it determines the actual computing hours needed per year. Taking this into account, cloud computing might still be a viable option for projects with limited maturity, or as a supplement for short peaks in demand.
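The core of such a comparison is amortizing the total cost of ownership over the busy hours the hardware actually delivers. A minimal sketch of that arithmetic, with entirely hypothetical dollar figures (the study's actual TCO is not reproduced here):

```python
def onsite_hourly_cost(total_cost_of_ownership, years, utilization):
    """Effective price of one computing hour on owned hardware: the TCO is
    amortized over the busy hours actually delivered during the
    amortization period. Low utilization inflates the effective hourly
    price, which is the crux of the cloud-vs-on-site comparison."""
    hours_per_year = 24 * 365
    busy_hours = years * hours_per_year * utilization
    return total_cost_of_ownership / busy_hours

# Hypothetical numbers: a $30,000 pool amortized over 4 years.
# At 27% utilization (within the 25-30% range the study reports), owned
# hardware is cheap per hour; at 5% utilization, pay-per-use cloud
# billing starts to win.
busy = onsite_hourly_cost(30_000, 4, 0.27)
idle = onsite_hourly_cost(30_000, 4, 0.05)
```

This makes the study's conclusion concrete: the decision hinges less on hardware list prices than on how many computing hours per year a unit actually consumes.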
Rattel, Julina A; Miedl, Stephan F; Blechert, Jens; Wilhelm, Frank H
2017-09-01
Theoretical models specifying the underlying mechanisms of the development and maintenance of anxiety and related disorders state that fear responses acquired through classical Pavlovian conditioning are maintained by repeated avoidance behaviour; thus, it is assumed that avoidance prevents fear extinction. The present study investigated behavioural avoidance decisions as a function of avoidance costs in a naturalistic fear conditioning paradigm. Ecologically valid avoidance costs - manipulated between participant groups - were represented via time-delays during a detour in a gamified computer task. After differential acquisition of shock-expectancy to a predictive conditioned stimulus (CS+), participants underwent extinction where they could either take a risky shortcut, while anticipating shock signaled by the CS+, or choose a costly avoidance option (lengthy detour); thus, they were faced with an approach-avoidance conflict. Groups with higher avoidance costs (longer detours) showed lower proportions of avoiders. Avoiders gave heightened shock-expectancy ratings post-extinction, demonstrating 'protection from extinction', i.e. failure to extinguish. Moreover, there was an indirect effect of avoidance costs on protection from extinction through avoidance behaviour. No moderating role of trait-anxiety was found. Theoretical implications of avoidance behaviour are discussed, considering the involvement of instrumental learning in the maintenance of fear responses. Copyright © 2016 Elsevier Ltd. All rights reserved.
Cost-Benefit Analysis of Computer Resources for Machine Learning
Champion, Richard A.
2007-01-01
Machine learning describes pattern-recognition algorithms - in this case, probabilistic neural networks (PNNs). These can be computationally intensive, in part because of the nonlinear optimizer, a numerical process that calibrates the PNN by minimizing a sum of squared errors. This report suggests efficiencies that are expressed as cost and benefit. The cost is computer time needed to calibrate the PNN, and the benefit is goodness-of-fit, how well the PNN learns the pattern in the data. There may be a point of diminishing returns where a further expenditure of computer resources does not produce additional benefits. Sampling is suggested as a cost-reduction strategy. One consideration is how many points to select for calibration and another is the geometric distribution of the points. The data points may be nonuniformly distributed across space, so that sampling at some locations provides additional benefit while sampling at other locations does not. A stratified sampling strategy can be designed to select more points in regions where they reduce the calibration error and fewer points in regions where they do not. Goodness-of-fit tests ensure that the sampling does not introduce bias. This approach is illustrated by statistical experiments for computing correlations between measures of roadless area and population density for the San Francisco Bay Area. The alternative to training efficiencies is to rely on high-performance computer systems. These may require specialized programming and algorithms that are optimized for parallel performance.
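The stratified sampling idea above (spend the calibration budget where it reduces error most) can be sketched with a simple proportional allocation rule. This is a hypothetical stand-in for the report's strategy, not its actual allocation scheme, and the region names and error figures are invented:

```python
def allocate_samples(strata_errors, budget):
    """Stratified sampling sketch: allocate calibration points to strata
    in proportion to each stratum's current calibration error, so regions
    where the PNN fits poorly receive more points and well-fit regions
    receive fewer. Returns a stratum -> point-count mapping."""
    total_error = sum(strata_errors.values())
    return {s: round(budget * e / total_error)
            for s, e in strata_errors.items()}

# Hypothetical per-region calibration errors and a budget of 100 points:
# the worst-fit region gets the largest share of the budget.
alloc = allocate_samples({"urban": 6.0, "suburban": 3.0, "rural": 1.0}, 100)
```

In practice one would iterate: calibrate, re-measure per-stratum error, reallocate, and apply goodness-of-fit tests to confirm the sampling has not introduced bias, as the report recommends.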
The Hidden Cost of Buying a Computer.
Johnson, Michael
1983-01-01
In order to process data in a computer, application software must be either developed or purchased. Costs for modifications of the software package and maintenance are often hidden. The decision to buy or develop software packages should be based upon factors of time and maintenance. (MLF)
Costing Principles in Higher Education and Their Application (First Revision).
Sterns, A. A.
This document provides a reason for applying known cost-accounting methodology within the realm of higher education and attempts to make the known techniques viable for sets of objectives within the university environment. The plan developed here is applied to a department, the lowest level in the university hierarchy, and demonstrates costs in…
Special issue of Higher-Order and Symbolic Computation
DEFF Research Database (Denmark)
The present issue is dedicated to Partial Evaluation and Semantics-Based Program Manipulation. Its first two articles were solicited from papers presented at PEPM 02, the 2002 ACM SIGPLAN Workshop on Partial Evaluation and Semantics-Based Program Manipulation [2], and its last two articles were solicited from papers presented at ASIA-PEPM 02, the 2002 SIGPLAN Symposium on Partial Evaluation and Semantics-Based Program Manipulation [1]. The four articles were subjected to the usual process of journal reviewing. "Cost-Augmented Partial Evaluation of Functional Logic Programs" extends previous narrowing-driven techniques of partial evaluation for functional-logic programs by the inclusion of abstract computation costs in the partial-evaluation process. A preliminary version of this work was presented at PEPM 02. "Specialization Scenarios: A Pragmatic Approach to Declaring Program Specialization...
Directory of Open Access Journals (Sweden)
Masuod Ferdosi
2016-10-01
Background: Hospital managers need accurate information about actual costs to make efficient and effective decisions. In the activity-based costing method, activities are first identified, and then direct and indirect costs are computed based on allocation methods. The aim of this study was to compute the cost price of cataract surgery by the Activity Based Costing (ABC) method at Hazrat-e-Zahra Hospital, Isfahan University of Medical Sciences. Methods: This was a cross-sectional study computing the costs of cataract surgery by the activity-based costing technique at Hazrat-e-Zahra Hospital, Isfahan University of Medical Sciences, in 2014. Data were collected through interviews and direct observation and analyzed with Excel software. Results: According to the results of this study, the total cost of cataract surgery was 8,368,978 Rials. Personnel costs comprised 62.2% (5,213,574 Rials) of the total cost of cataract surgery, the highest share of surgery costs. The cost of consumables was 7.57% (1,992,852 Rials) of surgery costs. Conclusion: Based on the results, there was a difference between the cost price of the services and the public tariff, which poses financial risks to the hospital. Therefore, it is recommended to compute costs with appropriate methods such as activity-based costing. The cost price of cataract surgery can be reduced by strategies such as decreasing the cost of consumables.
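The activity-based costing arithmetic used in such studies is a driver-weighted sum over the activities that make up a service. A minimal sketch with hypothetical figures (not the hospital's actual data; activity names, quantities and rates are invented for illustration):

```python
def activity_based_cost(activities):
    """Activity-based costing: the total cost of a service is the sum over
    its activities of (cost-driver quantity consumed) x (rate per driver
    unit). 'activities' is a list of (name, quantity, rate) tuples."""
    return sum(qty * rate for _, qty, rate in activities)

# Hypothetical cost drivers for one cataract surgery:
# (activity, driver quantity, cost per driver unit)
cataract = [
    ("surgeon time, min",   30, 120.0),
    ("nursing time, min",   45,  40.0),
    ("consumables, units",   1, 950.0),
    ("OR overhead, min",    40,  55.0),
]
total = activity_based_cost(cataract)
# Share of each activity in the total, e.g. to spot the dominant cost.
share = {name: qty * rate / total for name, qty, rate in cataract}
```

Comparing each activity's share against the tariff received for the service is what reveals the under- or over-funding the study describes.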
Low cost spacecraft computers: Oxymoron or future trend?
Manning, Robert M.
1993-01-01
Over the last few decades, application of current terrestrial computer technology in embedded spacecraft control systems has been expensive and wrought with many technical challenges. These challenges have centered on overcoming the extreme environmental constraints (protons, neutrons, gamma radiation, cosmic rays, temperature, vibration, etc.) that often preclude direct use of commercial off-the-shelf computer technology. Reliability, fault tolerance and power have also greatly constrained the selection of spacecraft control system computers. More recently, new constraints are being felt, cost and mass in particular, that have again narrowed the degrees of freedom spacecraft designers once enjoyed. This paper discusses these challenges, how they were previously overcome, how future trends in commercial computer technology will simplify (or hinder) selection of computer technology for spacecraft control applications, and what spacecraft electronic system designers can do now to circumvent them.
International Nuclear Information System (INIS)
Wee, Jung-Ho
2007-01-01
This paper compares the total cost of direct methanol fuel cell (DMFC) and lithium-ion (Li-ion) battery systems when applied as the power supply for laptop computers in the Korean environment. The average power output and operational time of the laptop computers were assumed to be 20 W and 3000 h, respectively. Considering the status of their technologies and with certain conditions assumed, the total costs were calculated to be US$140 for the Li-ion battery and US$362 for the DMFC. The manufacturing costs of the DMFC and Li-ion battery systems were calculated to be $16.65 per W and $0.77 per Wh, and the energy consumption costs to be $0.00051 per Wh and $0.00032 per Wh, respectively. The higher fuel consumption cost of the DMFC system was due to the methanol (MeOH) crossover loss. Therefore, the requirements for DMFCs to be able to compete with Li-ion batteries in terms of energy cost include reducing the crossover level to the order of magnitude of 10^-9 and the MeOH price to under $0.5 per kg. Under these conditions, if the DMFC manufacturing cost could be reduced to $6.30 per W, the DMFC system would become at least as competitive as the Li-ion battery system for powering laptop computers in Korea. (author)
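The energy-consumption side of the comparison follows directly from the stated assumptions (20 W average output over 3000 h). A minimal sketch of that arithmetic, using only figures given in the abstract:

```python
def lifetime_energy_cost(power_w, hours, cost_per_wh):
    """Energy consumption cost over the laptop's assumed service life:
    (average power in W) x (operating hours) x (cost per watt-hour)."""
    return power_w * hours * cost_per_wh

# Figures from the abstract: 20 W for 3000 h, i.e. 60 kWh delivered.
dmfc   = lifetime_energy_cost(20, 3000, 0.00051)  # MeOH, incl. crossover loss
li_ion = lifetime_energy_cost(20, 3000, 0.00032)  # grid electricity
```

The roughly $11 lifetime energy gap is small next to the manufacturing-cost gap, which is why the abstract's break-even condition targets the $16.65-per-W manufacturing cost rather than the fuel price alone.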
International Nuclear Information System (INIS)
Bierschbach, M.C.
1994-12-01
With the issuance of the Decommissioning Rule (July 27, 1988), nuclear power plant licensees are required to submit to the U.S. Nuclear Regulatory Commission (NRC) for review, decommissioning plans and cost estimates. This user's manual and the accompanying Cost Estimating Computer Program (CECP) software provide a cost-calculating methodology to the NRC staff that will assist them in assessing the adequacy of the licensee submittals. The CECP, designed to be used on a personal computer, provides estimates for the cost of decommissioning BWR power stations to the point of license termination. Such cost estimates include component, piping, and equipment removal costs; packaging costs; decontamination costs; transportation costs; burial costs; and manpower costs. In addition to costs, the CECP also calculates burial volumes, person-hours, crew-hours, and exposure person-hours associated with decommissioning
Cost-effectiveness of PET and PET/Computed Tomography
DEFF Research Database (Denmark)
Gerke, Oke; Hermansson, Ronnie; Hess, Søren
2015-01-01
measure by means of incremental cost-effectiveness ratios when considering the replacement of the standard regimen by a new diagnostic procedure. This article discusses economic assessments of PET and PET/computed tomography reported until mid-July 2014. Forty-seven studies on cancer and noncancer...
Price-Cost Ratios in Higher Education: Subsidy Structure and Policy Implications
Xie, Yan
2010-01-01
The diversity of US institutions of higher education is manifested in many ways. This study looks at that diversity from the economic perspective by studying the subsidy structure through the distribution of the institutional price-cost ratio (PCR), defined as net tuition price divided by total supplier cost, which equals one minus…
Low-cost computer mouse for the elderly or disabled in Taiwan.
Chen, C-C; Chen, W-L; Chen, B-N; Shih, Y-Y; Lai, J-S; Chen, Y-L
2014-01-01
A mouse is an important communication interface between a human and a computer, but it is still difficult for the elderly or disabled to use. The aim was to develop a low-cost computer mouse auxiliary tool. The principal structure of the low-cost mouse auxiliary tool is an IR (infrared) array module and a Wii icon sensor module, combined with reflective tape and an SQL Server database. This has several benefits, including cheap hardware cost, fluent control, prompt response, adaptive adjustment and portability. It also carries a game module with training and evaluation functions, which helps trainees improve sensory awareness and concentration. In the intervention and maintenance phases, clicking accuracy and time of use reached statistical significance (p < 0.05). The development of the low-cost adaptive computer mouse auxiliary tool was completed during the study, and it was verified as having the characteristics of low cost, easy operation and adaptability. For patients with physical disabilities who retain independent control of parts of their limbs, the mouse auxiliary tool is suitable: the user only needs to attach the reflective tape to an independently controllable part of the body to operate the tool.
Directory of Open Access Journals (Sweden)
Mohammed F. Hadi
2012-01-01
It is argued here that more accurate though more compute-intensive alternate algorithms to certain computational methods which are deemed too inefficient and wasteful when implemented within serial codes can be more efficient and cost-effective when implemented in parallel codes designed to run on today's multicore and many-core environments. This argument is most germane to methods that involve large data sets with relatively limited computational density—in other words, algorithms with small ratios of floating point operations to memory accesses. The examples chosen here to support this argument represent a variety of high-order finite-difference time-domain algorithms. It will be demonstrated that a three- to eightfold increase in floating-point operations due to higher-order finite-differences will translate to only two- to threefold increases in actual run times using either graphical or central processing units of today. It is hoped that this argument will convince researchers to revisit certain numerical techniques that have long been shelved and reevaluate them for multicore usability.
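The trade-off described, more floating-point work per point in exchange for much higher accuracy at similar memory traffic, can be seen in miniature by comparing second- and fourth-order central-difference stencils. This is a generic illustration, not the authors' FDTD code:

```python
import numpy as np

def d2_order2(u, dx):
    """Second-order approximation of u'' (3-point stencil): 3 reads, ~4 flops/point."""
    return (u[:-2] - 2.0 * u[1:-1] + u[2:]) / dx**2

def d2_order4(u, dx):
    """Fourth-order approximation of u'' (5-point stencil): 5 reads, ~8 flops/point."""
    return (-u[:-4] + 16.0 * u[1:-3] - 30.0 * u[2:-2]
            + 16.0 * u[3:-1] - u[4:]) / (12.0 * dx**2)

x = np.linspace(0.0, 2.0 * np.pi, 2001)
u = np.sin(x)
dx = x[1] - x[0]

# Both approximate u'' = -sin(x); roughly double the flops buys orders of
# magnitude more accuracy at nearly the same memory footprint.
err2 = np.max(np.abs(d2_order2(u, dx) + np.sin(x[1:-1])))
err4 = np.max(np.abs(d2_order4(u, dx) + np.sin(x[2:-2])))
assert err4 < err2
```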
Plant process computer replacements - techniques to limit installation schedules and costs
International Nuclear Information System (INIS)
Baker, M.D.; Olson, J.L.
1992-01-01
Plant process computer systems, a standard fixture in all nuclear power plants, are used to monitor and display important plant process parameters. Scanning thousands of field sensors and alarming out-of-limit values, these computer systems are heavily relied on by control room operators. The original nuclear steam supply system (NSSS) vendor for the power plant often supplied the plant process computer. Designed using sixties and seventies technology, a plant's original process computer has been obsolete for some time. Driven by increased maintenance costs and new US Nuclear Regulatory Commission regulations such as NUREG-0737, Suppl. 1, many utilities have replaced their process computers with more modern computer systems. Given that computer systems are by their nature prone to rapid obsolescence, this replacement cycle will likely repeat. A process computer replacement project can be a significant capital expenditure and must be performed during a scheduled refueling outage. The objective of the installation process is to install a working system on schedule. Experience gained by supervising several computer replacement installations has taught lessons that, if applied, will shorten the schedule and limit the risk of costly delays. Examples illustrating these techniques are given. This paper and these examples deal only with the installation process and assume that the replacement computer system has been adequately designed, developed, and factory tested.
Manual of phosphoric acid fuel cell power plant cost model and computer program
Lu, C. Y.; Alkasab, K. A.
1984-01-01
Cost analysis of a phosphoric acid fuel cell power plant includes two parts: a method for estimating system capital costs, and an economic analysis that determines the levelized annual cost of operating the system used in the capital cost estimation. A FORTRAN computer program has been developed for this cost analysis.
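The two-part structure (capital cost plus levelized annual operating cost) follows the standard levelized-cost recipe. A minimal generic sketch, with all inputs hypothetical and no connection to the original FORTRAN program:

```python
def capital_recovery_factor(rate, years):
    """Annualize a present capital cost over `years` at discount `rate`."""
    return rate * (1.0 + rate) ** years / ((1.0 + rate) ** years - 1.0)

def levelized_annual_cost(capital_cost, rate, years, annual_om_cost, annual_fuel_cost):
    """Levelized annual cost = annualized capital + O&M + fuel."""
    return capital_cost * capital_recovery_factor(rate, years) + annual_om_cost + annual_fuel_cost

# Hypothetical plant: $5M capital, 10% discount rate, 20-year life,
# $120k/yr operations and maintenance, $250k/yr fuel
lac = levelized_annual_cost(5e6, 0.10, 20, 1.2e5, 2.5e5)
```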
Hsiang, E; Little, K M; Haguma, P; Hanrahan, C F; Katamba, A; Cattamanchi, A; Davis, J L; Vassall, A; Dowdy, D
2016-09-01
Initial cost-effectiveness evaluations of Xpert(®) MTB/RIF for tuberculosis (TB) diagnosis have not fully accounted for the realities of implementation in peripheral settings. To evaluate costs and diagnostic outcomes of Xpert testing implemented at various health care levels in Uganda, we collected empirical cost data from five health centers utilizing Xpert for TB diagnosis, using an ingredients approach. We reviewed laboratory and patient records to assess outcomes at these sites and at 10 sites without Xpert. We also estimated the incremental cost-effectiveness of Xpert testing; our primary outcome was the incremental cost of Xpert testing per newly detected TB case. The mean unit cost of an Xpert test was US$21 based on a mean monthly volume of 54 tests per site, although the unit cost varied widely (US$16-58) and was primarily determined by testing volume. Total diagnostic costs were 2.4-fold higher in Xpert clinics than in non-Xpert clinics; however, Xpert increased diagnoses by only 12%. The diagnostic costs of Xpert averaged US$119 per newly detected TB case, but were as high as US$885 at the center with the lowest volume of tests. Xpert testing can detect TB cases at reasonable cost, but may double diagnostic budgets for relatively small gains, with cost-effectiveness deteriorating at lower testing volumes.
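The volume sensitivity of the unit cost reported here falls out of the ingredients approach, in which fixed site costs are spread over the monthly test volume. A toy sketch with invented cost figures (only the 54-tests-per-month figure comes from the abstract):

```python
def unit_cost(monthly_fixed_cost, per_test_cost, monthly_volume):
    """Ingredients approach: fixed site costs spread over volume, plus variable cost per test."""
    return monthly_fixed_cost / monthly_volume + per_test_cost

def cost_per_new_case(total_new_cost, total_baseline_cost, extra_cases):
    """Incremental diagnostic cost per newly detected case."""
    return (total_new_cost - total_baseline_cost) / extra_cases

# Hypothetical numbers chosen to mirror the pattern in the abstract:
# the unit cost falls sharply as monthly testing volume rises.
assert unit_cost(600.0, 10.0, 54) < unit_cost(600.0, 10.0, 12)
```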
Development of computer program for estimating decommissioning cost - 59037
International Nuclear Information System (INIS)
Kim, Hak-Soo; Park, Jong-Kil
2012-01-01
Programs for estimating decommissioning costs have been developed for many different purposes and applications. Estimating the decommissioning cost requires a large amount of data, such as unit cost factors, plant area and inventory, and waste treatment, which makes manual calculation or typical spreadsheet software such as Microsoft Excel difficult to use. The cost estimation for eventual decommissioning of nuclear power plants is a prerequisite for safe, timely and cost-effective decommissioning. To estimate the decommissioning cost more accurately and systematically, KHNP (Korea Hydro and Nuclear Power Co. Ltd) developed a decommissioning cost estimating computer program called 'DeCAT-Pro' (Decommissioning Cost Assessment Tool - Professional; hereinafter 'DeCAT'). This program allows users to easily assess the decommissioning cost under various decommissioning options. It also provides detailed reporting of decommissioning funding requirements, as well as detailed project schedules, cash flow, staffing plans and levels, and waste volumes by waste classification and type. KHNP is planning to implement functions for estimating the plant inventory using 3-D technology and for automatically classifying radwaste disposal and transportation conditions. (authors)
Addressing the computational cost of large EIT solutions
International Nuclear Information System (INIS)
Boyle, Alistair; Adler, Andy; Borsic, Andrea
2012-01-01
Electrical impedance tomography (EIT) is a soft field tomography modality based on the application of electric current to a body and measurement of voltages through electrodes at the boundary. The interior conductivity is reconstructed on a discrete representation of the domain using a finite-element method (FEM) mesh and a parametrization of that domain. The reconstruction requires a sequence of numerically intensive calculations. There is strong interest in reducing the cost of these calculations. An improvement in the compute time for current problems would encourage further exploration of computationally challenging problems such as the incorporation of time series data, wide-spread adoption of three-dimensional simulations and correlation of other modalities such as CT and ultrasound. Multicore processors offer an opportunity to reduce EIT computation times but may require some restructuring of the underlying algorithms to maximize the use of available resources. This work profiles two EIT software packages (EIDORS and NDRM) to experimentally determine where the computational costs arise in EIT as problems scale. Sparse matrix solvers, a key component for the FEM forward problem and sensitivity estimates in the inverse problem, are shown to take a considerable portion of the total compute time in these packages. A sparse matrix solver performance measurement tool, Meagre-Crowd, is developed to interface with a variety of solvers and compare their performance over a range of two- and three-dimensional problems of increasing node density. Results show that distributed sparse matrix solvers that operate on multiple cores are advantageous up to a limit that increases as the node density increases. We recommend a selection procedure to find a solver and hardware arrangement matched to the problem and provide guidance and tools to perform that selection. (paper)
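The kind of solver comparison Meagre-Crowd automates can be approximated in a few lines: assemble a sparse system of the sort an FEM forward problem produces and time competing solvers on it. This generic sketch uses SciPy rather than the tool described in the paper:

```python
import time
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def laplacian_2d(n):
    """Sparse 5-point Laplacian on an n x n grid, a stand-in for an FEM system matrix."""
    N = n * n
    main = 4.0 * np.ones(N)
    side = -1.0 * np.ones(N - 1)
    side[np.arange(1, N) % n == 0] = 0.0   # no coupling across grid rows
    updown = -1.0 * np.ones(N - n)
    return sp.diags([main, side, side, updown, updown], [0, 1, -1, n, -n], format="csc")

A = laplacian_2d(40)
b = np.ones(A.shape[0])

# Time a direct factorization against an iterative Krylov solver
for name, solve in [("direct LU", lambda: spla.spsolve(A, b)),
                    ("conjugate gradient", lambda: spla.cg(A, b)[0])]:
    t0 = time.perf_counter()
    x = solve()
    print(f"{name}: {time.perf_counter() - t0:.4f} s, "
          f"residual {np.linalg.norm(A @ x - b):.2e}")
```

As in the paper, the better choice depends on problem size and structure: direct solvers dominate for small, well-conditioned systems, while iterative and distributed solvers pay off as node density grows.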
Academic Computing Facilities and Services in Higher Education--A Survey.
Warlick, Charles H.
1986-01-01
Presents statistics about academic computing facilities based on data collected over the past six years from 1,753 institutions in the United States, Canada, Mexico, and Puerto Rico for the "Directory of Computing Facilities in Higher Education." Organizational, functional, and financial characteristics are examined as well as types of…
An approximate fractional Gaussian noise model with computational cost
Sørbye, Sigrunn H.; Myrvoll-Nilsen, Eirik; Rue, Haavard
2017-01-01
Fractional Gaussian noise (fGn) is a stationary time series model with long memory properties applied in various fields like econometrics, hydrology and climatology. The computational cost in fitting an fGn model of length $n$ using a likelihood
Kaganoff, Tessa
This document presents a review of cost-containment initiatives relevant to higher education institutions. Originally commissioned to examine cost containment initiatives carried out by institutions affiliated with the Foundation for Independent Higher Education (FIHE), the paper was expanded to include a sector-wide review of three types of…
Adaptive Radar Signal Processing-The Problem of Exponential Computational Cost
National Research Council Canada - National Science Library
Rangaswamy, Muralidhar
2003-01-01
Extensions to handle the case of non-Gaussian clutter statistics are presented. Current challenges of limited training data support, computational cost, and severely heterogeneous clutter backgrounds are outlined...
Rustemeyer, Jan; Melenberg, Alex; Sari-Rieger, Aynur
2014-12-01
This study aims to evaluate the additional costs incurred by using a computer-aided design/computer-aided manufacturing (CAD/CAM) technique for reconstructing maxillofacial defects by analyzing typical cases. The medical charts of 11 consecutive patients who underwent the CAD/CAM technique were considered, and invoices from the companies providing the CAD/CAM devices were reviewed for every case. The number of devices used was significantly correlated with cost (r = 0.880). Costs differed significantly between cases in which prebent reconstruction plates were used (€3346.00 ± €29.00) and cases in which they were not (€2534.22 ± €264.48), as did the costs of using two, three and four devices, even when ignoring the cost of reconstruction plates. Additional fees provided by statutory health insurance covered a mean of 171.5% ± 25.6% of the cost of the CAD/CAM devices. Since the additional fees provide financial compensation, we believe that the CAD/CAM technique is suited for wide application and not restricted to complex cases. Where additional fees or funds are not available, the CAD/CAM technique might be unprofitable, so whether or not to use it remains a case-by-case decision with respect to cost versus benefit.
The performance of low-cost commercial cloud computing as an alternative in computational chemistry.
Thackston, Russell; Fortenberry, Ryan C
2015-05-05
The growth of commercial cloud computing (CCC) as a viable means of computational infrastructure is largely unexplored for the purposes of quantum chemistry. In this work, the PSI4 suite of computational chemistry programs is installed on five different types of Amazon Web Services CCC platforms. The performance for a set of electronically excited state single-point energies is compared between these CCC platforms and typical, "in-house" physical machines. Further considerations are made for the number of cores or virtual CPUs (vCPUs, for the CCC platforms), but no considerations are made for full parallelization of the program (even though parallelization of the BLAS library is implemented), complete high-performance computing cluster utilization, or steal time. Even with this most pessimistic view of the computations, CCC resources are shown to be more cost effective for significant numbers of typical quantum chemistry computations. Large numbers of large computations are still best utilized by more traditional means, but smaller-scale research may be more effectively undertaken through CCC services.
Computer-aided voice training in higher education: participants ...
African Journals Online (AJOL)
The training of performance singing in a multilingual, multicultural educational context presents unique problems and requires inventive teaching strategies. Computer-aided training offers objective visual feedback on voice production that can be implemented as a teaching aid in higher education. This article reports on ...
A survey of cost accounting in service-oriented computing
de Medeiros, Robson W.A.; Rosa, Nelson S.; Campos, Glaucia M.M.; Ferreira Pires, Luis
Nowadays, companies are increasingly offering their business services through computational services on the Internet in order to attract more customers and increase their revenues. However, these services have financial costs that need to be managed in order to maximize profit. Several models and
Dennis, J. Richard; Thomson, David
This paper is concerned with a low cost alternative for providing computer experience to secondary school students. The brief discussion covers the programmable calculator and its relevance for teaching the concepts and the rudiments of computer programming and for computer problem solving. A list of twenty-five programming activities related to…
Software Requirements for a System to Compute Mean Failure Cost
Energy Technology Data Exchange (ETDEWEB)
Aissa, Anis Ben [University of Tunis, Belvedere, Tunisia; Abercrombie, Robert K [ORNL; Sheldon, Frederick T [ORNL; Mili, Ali [New Jersey Institute of Technology
2010-01-01
In earlier works, we presented a computational infrastructure that allows an analyst to estimate the security of a system in terms of the loss that each stakeholder may incur as a result of security breakdowns. We also demonstrated this infrastructure through the results of security breakdowns for an e-commerce case. In this paper, we illustrate this infrastructure with an application that supports the computation of the Mean Failure Cost (MFC) for each stakeholder.
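The MFC is commonly presented as a chain of matrix products linking stakeholder stakes to threat probabilities. The sketch below is a toy version of that idea; the dimensions and all values are invented for illustration and are not taken from the paper:

```python
import numpy as np

# Hypothetical toy dimensions: 2 stakeholders, 2 security requirements,
# 2 system components, 2 threat types.
ST = np.array([[900.0, 300.0],    # stakes: loss to each stakeholder if a requirement fails
               [400.0, 800.0]])
DP = np.array([[0.6, 0.4],        # dependency: P(requirement fails | component fails)
               [0.2, 0.7]])
IM = np.array([[0.5, 0.1],        # impact: P(component fails | threat materializes)
               [0.3, 0.4]])
PT = np.array([0.02, 0.05])       # threat emergence probabilities per unit time

MFC = ST @ DP @ IM @ PT           # mean failure cost per stakeholder
print(MFC)
```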
Directory of Open Access Journals (Sweden)
Leah P. Hollis
2015-06-01
Workplace bullying has a detrimental effect on employees, yet few studies have examined its impact on personnel in American higher education administration. Two central research questions therefore guided this study: (a) What is the extent of workplace bullying in higher education administration? and (b) What is the cost of workplace bullying specifically to higher education administration? Participants from 175 four-year colleges and universities were surveyed, revealing that 62% of higher education administrators had experienced or witnessed workplace bullying in the 18 months prior to the study. Race and gender were not parameters considered in the sample. A total of 401 (n = 401) higher education respondents from various campus departments completed the instrument: academic affairs, student affairs, athletics, development/advancement, admissions/financial aid, information technology, arts faculty, sciences faculty, and executives. Employment disengagement served as the theoretical lens to analyze the financial cost to higher education when employees mentally disengage from organizational missions and objectives. With this lens, the study examined staff hours lost through employee disengagement and the associated costs.
Hulten, Edward; Goehler, Alexander; Bittencourt, Marcio Sommer; Bamberg, Fabian; Schlett, Christopher L; Truong, Quynh A; Nichols, John; Nasir, Khurram; Rogers, Ian S; Gazelle, Scott G; Nagurney, John T; Hoffmann, Udo; Blankstein, Ron
2013-09-01
Coronary computed tomographic angiography (cCTA) allows rapid, noninvasive exclusion of obstructive coronary artery disease (CAD). However, concern exists whether implementation of cCTA in the assessment of patients presenting to the emergency department with acute chest pain will lead to increased downstream testing and costs compared with alternative strategies. Our aim was to compare observed actual costs of usual care (UC) with projected costs of a strategy including early cCTA in the evaluation of patients with acute chest pain in the Rule Out Myocardial Infarction Using Computer Assisted Tomography I (ROMICAT I) study. We compared cost and hospital length of stay of UC observed among 368 patients enrolled in the ROMICAT I study with projected costs of management based on cCTA. Costs of UC were determined by an electronic cost accounting system. Notably, UC was not influenced by cCTA results because patients and caregivers were blinded to the cCTA results. Costs after early implementation of cCTA were estimated assuming changes in management based on cCTA findings of the presence and severity of CAD. Sensitivity analysis was used to test the influence of key variables on both outcomes and costs. We determined that in comparison with UC, cCTA-guided triage, whereby patients with no CAD are discharged, could reduce total hospital costs by 23%. However, this saving diminishes as the prevalence of disease rises, such that when the prevalence of ≥50% stenosis exceeds 28% to 33%, the use of cCTA becomes more costly than UC. cCTA may thus be a cost-saving tool in acute chest pain populations with a low prevalence of potentially obstructive CAD, whereas higher cost would be anticipated in populations with a higher prevalence of disease.
A low-cost vector processor boosting compute-intensive image processing operations
Adorf, Hans-Martin
1992-01-01
Low-cost vector processing (VP) is within reach of everyone seriously engaged in scientific computing. The advent of affordable add-on VP-boards for standard workstations, complemented by mathematical/statistical libraries, is beginning to impact compute-intensive tasks such as image processing. A case in point is the restoration of distorted images from the Hubble Space Telescope. A low-cost implementation is presented of the standard Tarasko-Richardson-Lucy restoration algorithm on an Intel i860-based VP-board which is seamlessly interfaced to a commercial, interactive image processing system. First experience is reported (including some benchmarks for standalone FFTs) and some conclusions are drawn.
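The Richardson-Lucy iteration named here is a standard multiplicative update, sketched below in plain NumPy for a 1-D signal. The paper's implementation targets an i860 vector board and 2-D images; this toy version only illustrates the algorithm:

```python
import numpy as np

def richardson_lucy(observed, psf, iterations=50):
    """Richardson-Lucy deconvolution (1-D): multiplicative updates that
    preserve non-negativity of the estimate."""
    estimate = np.full_like(observed, observed.mean())
    psf_flipped = psf[::-1]
    for _ in range(iterations):
        blurred = np.convolve(estimate, psf, mode="same")
        ratio = observed / np.maximum(blurred, 1e-12)   # avoid division by zero
        estimate = estimate * np.convolve(ratio, psf_flipped, mode="same")
    return estimate

# Toy example: blur a spike train with a Gaussian PSF, then restore it
psf = np.exp(-0.5 * (np.arange(-3, 4) / 1.0) ** 2)
psf /= psf.sum()
truth = np.zeros(64)
truth[20] = 5.0
truth[40] = 3.0
blurred = np.convolve(truth, psf, mode="same")
restored = richardson_lucy(blurred, psf, iterations=200)
```

Each iteration is dominated by two convolutions, which is exactly the kind of dense, regular arithmetic that vector hardware accelerates.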
Can a Costly Intervention Be Cost-effective?
Foster, E. Michael; Jones, Damon
2009-01-01
Objectives To examine the cost-effectiveness of the Fast Track intervention, a multi-year, multi-component intervention designed to reduce violence among at-risk children. A previous report documented the favorable effect of intervention on the highest-risk group of ninth-graders diagnosed with conduct disorder, as well as self-reported delinquency. The current report addressed the cost-effectiveness of the intervention for these measures of program impact. Design Costs of the intervention were estimated using program budgets. Incremental cost-effectiveness ratios were computed to determine the cost per unit of improvement in the 3 outcomes measured in the 10th year of the study. Results Examination of the total sample showed that the intervention was not cost-effective at likely levels of policymakers' willingness to pay for the key outcomes. Subsequent analysis of those most at risk, however, showed that the intervention likely was cost-effective given specified willingness-to-pay criteria. Conclusions Results indicate that the intervention is cost-effective for the children at highest risk. From a policy standpoint, this finding is encouraging because such children are likely to generate higher costs for society over their lifetimes. However, substantial barriers to cost-effectiveness remain, such as the ability to effectively identify and recruit such higher-risk children in future implementations. PMID:17088509
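The incremental cost-effectiveness ratios used above reduce to one line of arithmetic. A sketch with invented numbers (not the Fast Track study's actual costs or effects):

```python
def icer(cost_new, cost_old, effect_new, effect_old):
    """Incremental cost-effectiveness ratio: extra dollars per extra unit of outcome."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Hypothetical: intervention costs $58,000 per child vs $1,000 for usual care,
# and raises the per-child probability of averting the outcome from 0.02 to 0.06
ratio = icer(58_000.0, 1_000.0, 0.06, 0.02)

# Decision rule: adopt only if policymakers' willingness to pay
# per unit of improvement is at least `ratio`.
```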
Aggarwal, Anju; Monsivais, Pablo; Cook, Andrea J.; Drewnowski, Adam
2014-01-01
Shopping at low-cost supermarkets has been associated with higher obesity rates. This study examined whether attitudes toward healthy eating are independently associated with diet quality among shoppers at low-cost, medium-cost, and high-cost supermarkets. Data on socioeconomic status (SES), attitudes toward healthy eating, and supermarket choice were collected using a telephone survey of a representative sample of adult residents of King County, WA. Dietary intake data were based on a food frequency questionnaire. Thirteen supermarket chains were stratified into three categories: low, medium, and high cost, based on a market basket of 100 commonly eaten foods. Diet-quality measures were energy density, mean adequacy ratio, and total servings of fruits and vegetables. The analytical sample consisted of 963 adults. Multivariable regressions with robust standard error examined relations between diet quality, supermarket type, attitudes, and SES. Shopping at higher-cost supermarkets was associated with higher-quality diets. These associations persisted after adjusting for SES, but were eliminated after taking attitudinal measures into account. Supermarket shoppers with positive attitudes toward healthy eating had equally higher-quality diets, even if they shopped at low-, medium-, or high-cost supermarkets, independent of SES and other covariates. These findings imply that shopping at low-cost supermarkets does not prevent consumers from having high-quality diets, as long as they attach importance to good nutrition. Promoting nutrition-education strategies among supermarkets, particularly those catering to low-income groups, can help to improve diet quality. PMID:23916974
Low cost highly available digital control computer
International Nuclear Information System (INIS)
Silvers, M.W.
1986-01-01
When designing digital controllers for critical plant control it is important to provide several features. Among these are reliability, availability, maintainability, environmental protection, and low cost. An examination of several applications has led to a design that can be produced for approximately $20,000 (1000 control points). This design is compatible with modern concepts in distributed and hierarchical control. The canonical controller element is a dual-redundant self-checking computer that communicates with a cross-strapped, electrically isolated input/output system. The input/output subsystem comprises multiple intelligent input/output cards. These cards accept commands from the primary processor, which are validated, executed, and acknowledged. Each card may be hot replaced to facilitate sparing. The implementation of the dual-redundant computer architecture is discussed. Called the FS-86, this computer can be used for a variety of applications. It has most recently found application in the upgrade of San Francisco's Bay Area Rapid Transit (BART) train control currently in progress, and has been proposed for feedwater control in a boiling water reactor.
Comparative cost analysis -- computed tomography vs. alternative diagnostic procedures, 1977-1980
International Nuclear Information System (INIS)
Gempel, P.A.; Harris, G.H.; Evans, R.G.
1977-12-01
In comparing the total national cost of utilizing computed tomography (CT) for medically indicated diagnoses with that of conventional x-ray, ultrasonography, nuclear medicine, and exploratory surgery, this investigation concludes that there was little, if any, added net cost from CT use in 1977, nor will there be in 1980. Computed tomography, generally recognized as a reliable and useful diagnostic modality, has the potential to reduce net costs provided that an optimal number of units can be made available to physicians and patients to achieve projected reductions in alternative procedures. This study examines the actual cost impact of CT on both cranial and body diagnostic procedures. For abdominal and mediastinal disorders, CT scanning is just beginning to emerge as a diagnostic modality. As such, clinical experience is somewhat limited, and the authors assume that no significant reduction in conventional procedures took place in 1977. It is estimated that the approximately 375,000 CT body procedures performed in 1977 represent only a 5 percent cost increase over the use of other diagnostic modalities. It is projected that 2,400,000 CT body procedures will be performed in 1980 and that, depending on the assumptions used, total body diagnostic costs will increase only slightly or be reduced. Thirty-one tables appear throughout the text presenting cost data broken down by types of diagnostic procedures used and projections by year. Appendixes present technical cost components for diagnostic procedures, the comparative efficacy of CT as revealed in abstracts of published literature, selected medical diagnoses, and references.
International Nuclear Information System (INIS)
Eikefjord, E.; Askildsen, J.E.; Roervik, J.
2008-01-01
Background: It is important to compare the cost and effectiveness of multidetector computed tomography (MDCT) and intravenous urography (IVU) to determine the most cost-effective alternative for the initial investigation of acute ureterolithiasis. Purpose: To analyze the task-specific variable costs combined with the diagnostic effect of MDCT and IVU for patients with acute flank pain, and to determine which is most cost effective. Material and Methods: 119 patients with acute flank pain suggestive of stone disease (ureterolithiasis) were examined by both MDCT and IVU. Variable costs related to medical equipment, consumption material, equipment control, and personnel were calculated. The diagnostic effect was assessed. Results: The variable costs of MDCT versus IVU were EUR 32 and EUR 117, respectively. This significant difference was mainly due to savings in examination time, higher annual examination frequency, lower material costs, and no use of contrast media. As for diagnostic effect, MDCT proved considerably more accurate in the diagnosis of stone disease than IVU and markedly more accurate concerning differential diagnoses. Conclusion: MDCT had lower differential costs and a higher capacity to determine correctly stone disease and differential diagnoses, as compared to IVU, in patients with acute flank pain. Consequently, MDCT is a dominant alternative to IVU when evaluated exclusively from a cost-effective perspective
Client-server computer architecture saves costs and eliminates bottlenecks
International Nuclear Information System (INIS)
Darukhanavala, P.P.; Davidson, M.C.; Tyler, T.N.; Blaskovich, F.T.; Smith, C.
1992-01-01
This paper reports that a workstation-based, client-server architecture saved costs and eliminated bottlenecks that BP Exploration (Alaska) Inc. had experienced with mainframe computer systems. In 1991, BP embarked on an ambitious project to change technical computing for its Prudhoe Bay, Endicott, and Kuparuk operations on Alaska's North Slope. This project promised substantial rewards, but also involved considerable risk. The project plan called for reservoir simulations (which historically had run on a Cray Research Inc. X-MP supercomputer in the company's Houston data center) to be run on small computer workstations. Additionally, the large Prudhoe Bay, Endicott, and Kuparuk production and reservoir engineering databases and related applications would also be moved to workstations, replacing a Digital Equipment Corp. VAX cluster in Anchorage.
hPIN/hTAN: Low-Cost e-Banking Secure against Untrusted Computers
Li, Shujun; Sadeghi, Ahmad-Reza; Schmitz, Roland
We propose hPIN/hTAN, a low-cost token-based e-banking protection scheme for the setting in which the adversary has full control over the user's computer. Compared with existing hardware-based solutions, hPIN/hTAN depends on neither a second trusted channel, nor a secure keypad, nor a computationally expensive encryption module.
USAGE AND MAGNETIZATION OF CLOUD COMPUTING IN HIGHER STUDIES – RAJASTHAN
Directory of Open Access Journals (Sweden)
Ranjan Upadhyaya
2013-07-01
Full Text Available Young India stands at the doorstep of another revolution, one of cloud computing technology, and the world has taken note of India's information revolution even amid the global recession. India, a vast and densely populated country (1.21 billion according to the 2011 census), counts new-age aspirants as roughly 50% to 60% of its population, of whom only about 30% are conversant with cloud computing. The uphill task ahead for the motherland is to train this new generation so that they can earn their livelihoods and connect with the wider world. The vision of the late Rajiv Gandhi and Prof. Yashpal is gradually becoming reality, but much work remains. The cloud computing revolution is exacting its cost and bringing changes never expected or thought of in India; cloud computing is a ladder to success for the nation's untrained youth. The nation is marching ahead with ubiquitous cloud computing in this era of liberalization, privatization and globalization.
Higher costs confirmed for US supercollider
Vaughan, C
1990-01-01
US Secretary of Energy James Watkins told Congress that the SSC will cost at least one to two billion dollars more than its estimated cost. He admitted that the final cost may be so high that the collider is not worth building (3 paragraphs).
Low-cost autonomous perceptron neural network inspired by quantum computation
Zidan, Mohammed; Abdel-Aty, Abdel-Haleem; El-Sadek, Alaa; Zanaty, E. A.; Abdel-Aty, Mahmoud
2017-11-01
Achieving low-cost learning with reliable accuracy is an important goal on the way to intelligent machines, saving time and energy and allowing the learning process to run on machines with limited computational resources. In this paper, we propose an efficient algorithm, inspired by quantum computing, for a perceptron neural network composed of a single neuron that classifies linearly separable applications after a single training iteration, O(1). The algorithm is applied to a real-world data set and its results outperform the other state-of-the-art algorithms.
Higher order correlations in computed particle distributions
International Nuclear Information System (INIS)
Hanerfeld, H.; Herrmannsfeldt, W.; Miller, R.H.
1989-03-01
The rms emittances calculated for beam distributions using computer simulations are frequently dominated by higher order aberrations. Thus there are substantial open areas in the phase space plots. It has long been observed that the rms emittance is not an invariant to beam manipulations. The usual emittance calculation removes the correlation between transverse displacement and transverse momentum. In this paper, we explore the possibility of defining higher order correlations that can be removed from the distribution to result in a lower limit to the realizable emittance. The intent is that by inserting the correct combinations of linear lenses at the proper position, the beam may recombine in a way that cancels the effects of some higher order forces. An example might be the non-linear transverse space charge forces which cause a beam to spread. If the beam is then refocused so that the same non-linear forces reverse the inward velocities, the resulting phase space distribution may reasonably approximate the original distribution. The approach to finding the location and strength of the proper lens to optimize the transported beam is based on work by Bruce Carlsten of Los Alamos National Laboratory. 11 refs., 4 figs
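The correlation-removal idea above can be sketched numerically: fit and subtract a polynomial correlation between displacement and momentum, then recompute the rms emittance. The example below uses a synthetic beam with an invented cubic aberration coefficient; it illustrates the general technique, not Carlsten's specific lens-placement method.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 10_000)          # transverse displacement (synthetic)
# x' carries a linear correlation plus a cubic aberration (coefficients invented)
xp = 0.5 * x + 0.2 * x**3 + rng.normal(0.0, 0.1, x.size)

def rms_emittance(x, xp):
    """Standard rms emittance sqrt(<x^2><x'^2> - <x x'>^2) about the centroid."""
    x = x - x.mean()
    xp = xp - xp.mean()
    return np.sqrt(np.mean(x**2) * np.mean(xp**2) - np.mean(x * xp)**2)

eps_raw = rms_emittance(x, xp)

# Remove the linear *and* cubic correlation by least squares: xp ~ a1*x + a3*x^3
A = np.column_stack([x, x**3])
coef, *_ = np.linalg.lstsq(A, xp, rcond=None)
eps_higher = rms_emittance(x, xp - A @ coef)   # lower bound after correlation removal
```

Here `eps_higher` comes out much smaller than `eps_raw`, mirroring the paper's point: higher-order correlations inflate the rms figure even though they could, in principle, be cancelled by suitable lenses.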
A low cost computer-controlled electrochemical measurement system for education and research
International Nuclear Information System (INIS)
Cottis, R.A.
1989-01-01
With the advent of low cost computers of significant processing power, it has become economically attractive, as well as offering practical advantages, to replace conventional electrochemical instrumentation with computer-based equipment. For example, the equipment to be described can perform all of the functions required for the measurement of a potentiodynamic polarization curve, replacing the conventional arrangement of sweep generator, potentiostat and chart recorder at a cost (based on the purchase cost of parts) which is less than that of most chart recorders alone. Additionally the use of computer control at a relatively low level provides a versatility (assuming the development of suitable software) which cannot easily be matched by conventional instruments. As a result of these considerations a simple computer-controlled electrochemical measurement system has been developed, with a primary aim being its use in teaching an MSc class in corrosion science and engineering, with additional applications in MSc and PhD research. For educational reasons the design of the user interface has tried to make the internal operation of the unit as obvious as possible, and thereby minimize the tendency for students to treat the unit as a 'black box' with incomprehensible inner workings. This has resulted in a unit in which the three main components of function generator, potentiostat and recorder are presented as independent areas on the front panel, and can be configured by the user in exactly the same way as conventional instruments. (author) 11 figs
Speklé, Erwin M; Heinrich, Judith; Hoozemans, Marco J M; Blatter, Birgitte M; van der Beek, Allard J; van Dieën, Jaap H; van Tulder, Maurits W
2010-11-11
The costs of arm, shoulder and neck symptoms are high. In order to decrease these costs, employers implement interventions aimed at reducing these symptoms. One frequently used intervention is the RSI QuickScan intervention programme. It establishes a risk profile of the target population and subsequently advises interventions following a decision tree based on that risk profile. The purpose of this study was to perform an economic evaluation, from both the societal and the company perspective, of the RSI QuickScan intervention programme for computer workers. In this study, effectiveness was defined at three levels: exposure to risk factors, prevalence of arm, shoulder and neck symptoms, and days of sick leave. The economic evaluation was conducted alongside a randomised controlled trial (RCT). Participating computer workers from 7 companies (N = 638) were assigned to either the intervention group (N = 320) or the usual care group (N = 318) by means of cluster randomisation (N = 50). The intervention consisted of a tailor-made programme, based on a previously established risk profile. At baseline, 6 and 12 month follow-up, the participants completed the RSI QuickScan questionnaire. Analyses to estimate the effect of the intervention were done according to the intention-to-treat principle. To compare costs between groups, confidence intervals for cost differences were computed by bias-corrected and accelerated bootstrapping. The mean intervention costs, paid by the employer, were 59 euro per participant in the intervention group and 28 euro in the usual care group. Mean total health care and non-health care costs per participant were 108 euro in both groups. As to cost-effectiveness, improvements in received information on healthy computer use, as well as in work posture and movement, were observed at higher costs. With regard to the other risk factors, symptoms and sick leave, only small and non-significant effects were found. In this study, the RSI QuickScan…
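The cost comparison above rests on bias-corrected and accelerated (BCa) bootstrap confidence intervals for the between-group cost difference. A minimal hand-rolled sketch follows; the lognormal cost samples are synthetic stand-ins, not the trial's data, and the group sizes merely echo the abstract.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)
# hypothetical per-participant costs in euro (skewed, as cost data tend to be)
cost_a = rng.lognormal(mean=4.0, sigma=0.8, size=320)   # intervention group
cost_b = rng.lognormal(mean=4.0, sigma=0.8, size=318)   # usual-care group

def mean_diff(a, b):
    return a.mean() - b.mean()

def bca_ci(a, b, n_boot=2000, alpha=0.05):
    """BCa bootstrap CI for the mean difference of two independent samples."""
    theta_hat = mean_diff(a, b)
    boot = np.array([mean_diff(rng.choice(a, a.size), rng.choice(b, b.size))
                     for _ in range(n_boot)])
    # bias correction: fraction of resamples below the point estimate
    z0 = norm.ppf((boot < theta_hat).mean())
    # acceleration: jackknife leave-one-out over both samples
    jack = np.concatenate([[mean_diff(np.delete(a, i), b) for i in range(a.size)],
                           [mean_diff(a, np.delete(b, i)) for i in range(b.size)]])
    d = jack.mean() - jack
    acc = (d**3).sum() / (6.0 * (d**2).sum() ** 1.5)
    z = norm.ppf([alpha / 2, 1 - alpha / 2])
    adj = norm.cdf(z0 + (z0 + z) / (1 - acc * (z0 + z)))   # adjusted percentiles
    return np.quantile(boot, adj)

low, high = bca_ci(cost_a, cost_b)
```

The BCa adjustment shifts the percentile endpoints to compensate for bias and skew in the bootstrap distribution, which matters for right-skewed cost data like these.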
Modelling the Intention to Adopt Cloud Computing Services: A Transaction Cost Theory Perspective
Directory of Open Access Journals (Sweden)
Ogan Yigitbasioglu
2014-11-01
Full Text Available This paper uses transaction cost theory to study cloud computing adoption. A model is developed and tested with data from an Australian survey. According to the results, perceived vendor opportunism and perceived legislative uncertainty around cloud computing were significantly associated with perceived cloud computing security risk. There was also a significant negative relationship between perceived cloud computing security risk and the intention to adopt cloud services. This study also reports on adoption rates of cloud computing in terms of applications, as well as the types of services used.
International Nuclear Information System (INIS)
Daniska, Vladimir; Rehak, Ivan; Vasko, Marek; Ondra, Frantisek; Bezak, Peter; Pritrsky, Jozef; Zachar, Matej; Necas, Vladimir
2011-01-01
The document 'A Proposed Standardised List of Items for Costing Purposes' was issued in 1999 by OECD/NEA, IAEA and the European Commission (EC) to promote harmonisation in decommissioning costing. It is a systematic list of decommissioning activities classified in chapters 01 to 11 with three numbered levels. Four cost groups are defined for the cost at each level. The document constitutes a standardised matrix of decommissioning activities and cost groups, with definitions of the content of each item. Knowing what is behind the items makes the comparison of costs across decommissioning projects transparent. Two approaches are identified for use of the standardised cost structure. The first approach converts cost data from existing project-specific cost structures into the standardised cost structure for the purpose of cost presentation. The second approach uses the standardised cost structure as the basis for the cost calculation structure; the calculated cost data are formatted in the standardised format directly, and several additional advantages may be identified in this approach. The paper presents the costing methodology based on the standardised cost structure and lessons learnt from the last ten years of implementing the standardised cost structure as the cost calculation structure in the computer code OMEGA. The code also includes on-line management of decommissioning waste, decay of radioactivity, evaluation of exposure, and generation and optimisation of the Gantt chart of a decommissioning project, which makes OMEGA an effective tool for planning and optimisation of decommissioning processes. (author)
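A converter into the standardised structure essentially maps every activity to a numbered item carrying per-cost-group amounts that can be rolled up chapter by chapter. The sketch below assumes illustrative item codes, names and cost-group labels (labour, investment, expenses, contingency); they are placeholders, not the official 1999 list.

```python
from dataclasses import dataclass

@dataclass
class CostItem:
    """One line of the standardised matrix: an activity with four cost groups.
    Codes, names and group labels here are illustrative placeholders."""
    code: str            # e.g. "04.0200" = chapter 04, second-level item 02
    name: str
    labour: float = 0.0
    investment: float = 0.0
    expenses: float = 0.0
    contingency: float = 0.0

    @property
    def total(self) -> float:
        return self.labour + self.investment + self.expenses + self.contingency

items = [
    CostItem("04.0100", "Drain and dry systems", labour=120.0, expenses=30.0),
    CostItem("04.0200", "Decontaminate primary circuit",
             labour=200.0, investment=50.0, contingency=25.0),
]

def chapter_total(items, chapter):
    """Roll item costs up to a chapter, as a converter into the standard structure would."""
    return sum(i.total for i in items if i.code.startswith(chapter))
```

Because every project reports against the same item codes and cost groups, a roll-up like `chapter_total(items, "04")` becomes directly comparable across projects, which is the harmonisation goal the abstract describes.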
Ismail, Mohd Nazri
2009-01-01
In the 21st century, the convergence of technologies and services in heterogeneous environments has produced multiple types of traffic. This scenario affects the computer networks underlying learning systems in higher educational institutes. Implementation of various services can produce different types of content and quality. Higher educational institutes should have a good computer network infrastructure to support the usage of various services. The capabilities of such a computer network should include i) higher bandwidth; ii) ...
Efficiency, Costs, Rankings and Heterogeneity: The Case of US Higher Education
Agasisti, Tommaso; Johnes, Geraint
2015-01-01
Among the major trends in the higher education (HE) sector, the development of rankings as a policy and managerial tool is of particular relevance. However, despite the diffusion of these instruments, it is still not clear how they relate with traditional performance measures, like unit costs and efficiency scores. In this paper, we estimate a…
Scilab software as an alternative low-cost computing in solving the linear equations problem
Agus, Fahrul; Haviluddin
2017-02-01
Numerical computation packages are widely used both in teaching and research. These packages include licensed (proprietary) and open-source (non-proprietary) software. One reason to use such a package is the complexity of mathematical functions (e.g., linear problems); moreover, the number of variables in linear or non-linear functions keeps increasing. The aim of this paper was to reflect on key aspects of method, didactics and creative praxis in the teaching of linear equations in higher education. If implemented, this could contribute to better learning in mathematics (i.e., solving simultaneous linear equations), which is essential for future engineers. The focus of this study was to introduce the numerical computation package Scilab as an alternative low-cost computing environment. In this paper, Scilab was applied to activities related to the mathematical models. In the experiment, four numerical methods were implemented: Gaussian elimination, Gauss-Jordan, inverse matrix, and lower-upper (LU) decomposition. The results of this study showed that routines for these numerical methods can be created and explored using Scilab procedures, and that such routines can serve as teaching material for a course.
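The same classroom exercise carries over to any numerical environment. The Python/NumPy sketch below solves one illustrative system by Gaussian elimination, the inverse-matrix method, and LU decomposition, and checks that all three agree; Scilab's built-in linear-algebra primitives play the analogous roles there.

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

# Small illustrative system Ax = b (values chosen for a clean solution)
A = np.array([[ 2.0,  1.0, -1.0],
              [-3.0, -1.0,  2.0],
              [-2.0,  1.0,  2.0]])
b = np.array([8.0, -11.0, -3.0])

x_solve = np.linalg.solve(A, b)       # Gaussian elimination with partial pivoting
x_inv   = np.linalg.inv(A) @ b        # inverse-matrix method (costlier, less stable)
x_lu    = lu_solve(lu_factor(A), b)   # LU decomposition; factors reusable for many b

assert np.allclose(x_solve, x_inv) and np.allclose(x_solve, x_lu)
```

Pedagogically, the LU route is the useful contrast: the O(n^3) factorisation is done once, after which each new right-hand side costs only O(n^2) triangular solves.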
Cost-effective computational method for radiation heat transfer in semi-crystalline polymers
Boztepe, Sinan; Gilblas, Rémi; de Almeida, Olivier; Le Maoult, Yannick; Schmidt, Fabrice
2018-05-01
This paper introduces a cost-effective numerical model for infrared (IR) heating of semi-crystalline polymers. For the numerical and experimental studies presented here, semi-crystalline polyethylene (PE) was used. The optical properties of PE were experimentally analyzed under varying temperature, and the results obtained were used as input to the numerical studies. The model was built on an optically homogeneous medium assumption, while the strong variation in the thermo-optical properties of semi-crystalline PE under heating was taken into account. Thus, the change in the amount of radiative energy absorbed by the PE medium, induced by its temperature-dependent thermo-optical properties, was introduced in the model. The computational study was carried out as an iterative closed-loop computation, where the absorbed radiation was computed using an in-house radiation heat transfer algorithm, RAYHEAT, and the computed results were transferred into the commercial software COMSOL Multiphysics to solve the transient heat transfer problem and predict the temperature field. The predicted temperature field was used to update the thermo-optical properties of PE, which vary under heating. In order to analyze the accuracy of the numerical model, experimental analyses were carried out by performing IR-thermographic measurements during the heating of a PE plate. The applicability of the model in terms of computational cost, number of numerical inputs and accuracy is highlighted.
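The closed-loop idea, absorbed power computed from temperature-dependent optical properties and then fed back into the thermal solve, can be illustrated with a lumped (0-D) energy balance iterated to a fixed point. All property values and the absorptivity model below are invented placeholders, not the measured PE data or the RAYHEAT/COMSOL coupling.

```python
import numpy as np

def absorptivity(T):
    """Hypothetical monotone model: absorption rises as crystallites melt."""
    return 0.6 + 0.3 / (1.0 + np.exp(-(T - 400.0) / 10.0))

q_ir  = 5.0e3    # incident IR flux, W/m^2 (assumed)
h     = 15.0     # convective loss coefficient, W/m^2/K (assumed)
T_amb = 300.0    # ambient temperature, K
T     = 300.0    # initial plate temperature, K

for _ in range(200):
    # steady-state balance with the current properties: alpha(T)*q = h*(T - T_amb)
    T_new = T_amb + absorptivity(T) * q_ir / h
    if abs(T_new - T) < 1e-6:   # properties and temperature mutually consistent
        break
    T = T_new
```

The loop mirrors the paper's scheme in miniature: solve the thermal problem with frozen optical properties, update the properties from the new temperature, and repeat until the two are self-consistent.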
Leah P. Hollis
2015-01-01
Workplace bullying has a detrimental effect on employees, yet few studies have examined its impact on personnel in American higher education administration. Therefore, two central research questions guided this study: (a) What is the extent of workplace bullying in higher education administration? and (b) What is the cost of workplace bullying specifically to higher education administration? Participants from 175 four-...
Computational cost of isogeometric multi-frontal solvers on parallel distributed memory machines
Woźniak, Maciej; Paszyński, Maciej R.; Pardo, D.; Dalcin, Lisandro; Calo, Victor M.
2015-01-01
This paper derives theoretical estimates of the computational cost for isogeometric multi-frontal direct solver executed on parallel distributed memory machines. We show theoretically that for the C^(p-1) global continuity of the isogeometric solution…
Hybrid Cloud Computing Architecture Optimization by Total Cost of Ownership Criterion
Directory of Open Access Journals (Sweden)
Elena Valeryevna Makarenko
2014-12-01
Full Text Available Achieving the goals of information security is a key factor in the decision to outsource information technology and, in particular, to decide on the migration of organizational data, applications, and other resources to infrastructure based on cloud computing. The key issue in the selection of an optimal architecture, and in the subsequent migration of business applications and data to the cloud information environment, is the total cost of ownership of the IT infrastructure. This paper focuses on solving the problem of minimizing the total cost of ownership of a hybrid cloud.
Alacbay, Armand; Barden, Danielle
2017-01-01
With recent research from the Institute for Higher Education Policy showing that college is unaffordable for as many as 70% of working- and middle-class students, concerns about college costs are mounting. The cost of operating an institution of higher education, with very few exceptions, is reflected in the price of attendance that students,…
Development of a computer program for the cost analysis of spent fuel management
International Nuclear Information System (INIS)
Choi, Heui Joo; Lee, Jong Youl; Choi, Jong Won; Cha, Jeong Hun; Whang, Joo Ho
2009-01-01
So far, a substantial amount of spent fuel has been generated from the PWR and CANDU reactors. It is being temporarily stored at the nuclear power plant sites. It is expected that the temporary storage facilities will be full of spent fuel by around 2016. The government plans to solve the problem by constructing an interim storage facility soon. The Radioactive Waste Management Act was enacted in 2008 to manage spent fuel safely in Korea. According to the act, the radioactive waste management fund, which will be used for the transportation, interim storage, and final disposal of spent fuel, has been established. The cost for the management of spent fuel is surprisingly high and carries a lot of uncertainty. KAERI and Kyunghee University have developed cost estimation tools to evaluate the cost of spent fuel management based on an engineering design and calculation. It is not easy to develop such a tool while the national policy on spent fuel management has not yet been finalized. Thus, the current version of the computer program is based on the current conceptual design of each management system. The main purpose of this paper is to introduce the computer program developed for the cost analysis of spent fuel management. In order to show the application of the program, a spent fuel management scenario is prepared, and the cost for the scenario is estimated
Hocine, Benlaria; Sofiane, Mostéfaoui
2017-01-01
This study aims to measure the social and individual rates of return for investment in higher education at Adrar University. The approach adopted looks for investigating the costs and benefits of the human capital. The study found that the economic feasibility of investment in higher education exists at both the individual and social levels, where…
Lu, Yung-Hsiang; Chen, Ku-Hsieh
2013-01-01
This paper aims at appraising the cost efficiency and technology of institutions of higher technological and vocational education. Differing from conventional literature, it considers the potential influence of inherent discrepancies in output quality and characteristics of school systems for institutes of technology (ITs) and universities of…
Directory of Open Access Journals (Sweden)
Ibnteesam Pondor
2017-09-01
Full Text Available Food price is a determining factor of food choices; however its relationship with diet quality is unclear in Malaysia. This study aimed to examine socio-economic characteristics and daily dietary cost (DDC) in relation to diet quality in the state of Selangor, Malaysia. Dietary intake was assessed using a Food Frequency Questionnaire (FFQ) and diet quality was estimated using a Malaysian Healthy Eating Index (M-HEI). DDC in Malaysian Ringgit (RM) was calculated from dietary intake and national food prices. Linear regression models were fitted to determine associations between DDC and M-HEI scores and predictors of diet quality. The mean M-HEI score of respondents was 61.31 ± 10.88 and energy adjusted DDC was RM10.71/2000 kcal (USD 2.49). The highest quintile of adjusted DDC had higher M-HEI scores for all respondents (Q1: 57.14 ± 10.07 versus Q5: 63.26 ± 11.54, p = 0.001). There were also positive associations between DDC and M-HEI scores for fruits (p < 0.001) and vegetables (p = 0.017) for all respondents. Predictors of diet quality included carbohydrate (β = 0.290; p < 0.001) and fat intakes (β = −0.242; p < 0.001) and energy adjusted DDC (β = 0.196; p < 0.001). Higher dietary cost is associated with healthy eating among Malaysian adults.
A Performance/Cost Evaluation for a GPU-Based Drug Discovery Application on Volunteer Computing
Guerrero, Ginés D.; Imbernón, Baldomero; García, José M.
2014-01-01
Bioinformatics is an interdisciplinary research field that develops tools for the analysis of large biological databases, and, thus, the use of high performance computing (HPC) platforms is mandatory for the generation of useful biological knowledge. The latest generation of graphics processing units (GPUs) has democratized the use of HPC as they push desktop computers to cluster-level performance. Many applications within this field have been developed to leverage these powerful and low-cost architectures. However, these applications still need to scale to larger GPU-based systems to enable remarkable advances in the fields of healthcare, drug discovery, genome research, etc. The inclusion of GPUs in HPC systems exacerbates power and temperature issues, increasing the total cost of ownership (TCO). This paper explores the benefits of volunteer computing to scale bioinformatics applications as an alternative to owning large GPU-based local infrastructures. We use as a benchmark a GPU-based drug discovery application called BINDSURF, whose computational requirements go beyond those of a single desktop machine. Volunteer computing is presented as a cheap and valid HPC system for those bioinformatics applications that need to process huge amounts of data and where the response time is not a critical factor. PMID:25025055
Cheaper fuel and higher health costs among the poor in rural Nepal
Energy Technology Data Exchange (ETDEWEB)
Pant, Krishna Prasad [Ministry of Agriculture and Cooperatives, Vidhya Lane, Devnagar, Kathmandu (Nepal)], email: kppant@yahoo.com
2012-03-15
Biomass fuels are used by the majority of resource-poor households in low-income countries. Though biomass fuels such as dung-briquette and firewood are apparently cheaper than modern fuels, indoor pollution from burning them incurs high health costs. These health costs, mostly being indirect, are poorly understood. To address this gap, this study develops probit regression models using survey data generated through interviews with households using either dung-briquette or biogas as the primary source of fuel for cooking. The study investigates factors affecting the use of dung-briquette, assesses its impact on human health, and estimates the associated household health costs. Analysis suggests significant effects of dung-briquette on asthma and eye diseases. Despite the perception of it being a cheap fuel, the annual health cost per household due to burning dung-briquette (US$ 16.94) is 61.3% higher than the annual cost of biogas (US$ 10.38), an alternative cleaner fuel for rural households. For reducing the use of dung-briquette and its indirect health costs, the study recommends three interventions: (1) educate women and aboriginal people, in particular, and make them aware of the benefits of switching to biogas; (2) facilitate tree planting in communal as well as private lands; and (3) create rural employment and income generation opportunities.
Casey, James B.
1998-01-01
Explains how a public library can compute the actual cost of distributing tax forms to the public by listing all direct and indirect costs and demonstrating the formulae and necessary computations. Supplies directions for calculating costs involved for all levels of staff as well as associated public relations efforts, space, and utility costs.…
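The kind of roll-up the article describes, direct staff costs plus allocated indirect costs divided by the number of forms distributed, might look like the following; every figure is a made-up placeholder.

```python
# Direct labour: hours spent on the tax-form service per staff level (assumed)
staff_hours = {"librarian": 40.0, "clerk": 120.0}
hourly_cost = {"librarian": 32.0, "clerk": 18.0}   # wage + benefits, assumed
direct_labor = sum(staff_hours[k] * hourly_cost[k] for k in staff_hours)

# Indirect costs allocated to the service (all figures illustrative)
space_cost = 0.15 * 2000.0   # 15% share of a 2000 USD/yr floor-space cost
utilities  = 180.0
publicity  = 250.0           # public relations effort
indirect = space_cost + utilities + publicity

forms_distributed = 12_000
cost_per_form = (direct_labor + indirect) / forms_distributed
```

The point of the exercise is that the indirect lines (space, utilities, publicity) roughly match the direct labour in scale, so omitting them would understate the true unit cost substantially.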
Computational cost estimates for parallel shared memory isogeometric multi-frontal solvers
Woźniak, Maciej; Kuźnik, Krzysztof M.; Paszyński, Maciej R.; Calo, Victor M.; Pardo, D.
2014-01-01
In this paper we present computational cost estimates for parallel shared memory isogeometric multi-frontal solvers. The estimates show that the ideal isogeometric shared memory parallel direct solver scales as O(p^2 log(N/p)) for one-dimensional problems, O(N p^2) for two-dimensional problems, and O(N^(4/3) p^2) for three-dimensional problems, where N is the number of degrees of freedom and p is the polynomial order of approximation. The computational costs of the shared memory parallel isogeometric direct solver are compared with those of the sequential isogeometric direct solver, the latter being equal to O(N p^2) for the one-dimensional case, O(N^1.5 p^3) for the two-dimensional case, and O(N^2 p^3) for the three-dimensional case. The shared memory version thus significantly improves the scaling in terms of both N and p. Theoretical estimates are compared with numerical experiments performed with linear, quadratic, cubic, quartic, and quintic B-splines, in one and two spatial dimensions. © 2014 Elsevier Ltd. All rights reserved.
A nearly-linear computational-cost scheme for the forward dynamics of an N-body pendulum
Chou, Jack C. K.
1989-01-01
The dynamic equations of motion of an n-body pendulum with spherical joints are derived as a mixed system of differential and algebraic equations (DAEs). The DAEs are kept in implicit form to save arithmetic and preserve the sparsity of the system, and are solved by the robust implicit integration method. At each solution point, the predicted solution is corrected to its exact solution within a given tolerance using Newton's iterative method. For each iteration, a linear system of the form J ΔX = E has to be solved. The computational cost of solving this linear system directly by LU factorization is O(n^3), and it can be reduced significantly by exploiting the structure of J. It is shown that by recognizing the recursive patterns and exploiting the sparsity of the system, the multiplicative and additive computational costs for solving J ΔX = E are O(n) and O(n^2), respectively. The formulation and solution method for an n-body pendulum is presented. The computational cost is shown to be nearly linearly proportional to the number of bodies.
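The corrector step, solving J ΔX = E with a sparse, structured J, is where the near-linear cost arises. The sketch below uses a toy tridiagonal nonlinear system in place of the pendulum DAE Jacobian; the banded structure, not the physics, is the point, and a sparse LU on such a band costs O(n) rather than the dense O(n^3).

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import splu

n = 50

def F(x):
    """Toy nonlinear residual with tridiagonal coupling (stand-in for the DAE)."""
    Fx = np.empty(n)
    Fx[0]    = 2*x[0]    - x[1]            + x[0]**3    - 1.0
    Fx[1:-1] = -x[:-2]   + 2*x[1:-1] - x[2:] + x[1:-1]**3 - 1.0
    Fx[-1]   = -x[-2]    + 2*x[-1]         + x[-1]**3   - 1.0
    return Fx

def J(x):
    """Exact Jacobian of F: tridiagonal, so sparse LU runs in O(n)."""
    main = 2.0 + 3.0 * x**2
    return diags([-np.ones(n-1), main, -np.ones(n-1)], [-1, 0, 1], format="csc")

x = np.zeros(n)
for _ in range(20):                      # Newton corrector: solve J dx = -F
    dx = splu(J(x)).solve(-F(x))         # band structure keeps this cheap
    x += dx
    if np.linalg.norm(dx) < 1e-12:
        break
```

The pendulum Jacobian in the paper is block-banded rather than scalar tridiagonal, but the mechanism is the same: a factorisation that follows the recursive sparsity pattern instead of treating J as dense.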
What Does It Cost a University to Educate One Student
Directory of Open Access Journals (Sweden)
Maria Andrea Lotho Santiago
2007-02-01
Full Text Available A dilemma administrators continually face is whether to continue offering degree programs despite low student uptake, especially because producing reliable cost data to aid decision making can prove difficult. Often, a university determines a standard cost per credit or unit and uses this figure as a basis for computing the total cost of running a degree program. This is then compared to a revenue stream and the difference, whether positive or negative, is used in decision making. However, this method of computing costs, although appealing for its simplicity, may fail to capture the economies that may arise as one school or college services another. In this paper, we use a basic cost accounting methodology, applied to the higher education system of the Philippines, to compute a cost per degree per student for a sample of public and private universities. Although the methodology is more time consuming, the computed figures are deemed closer to actual costs and thus, we argue, more reliable as inputs to financial decision making.
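A step-down cost allocation, charging a service unit's cost to degree programs in proportion to the credits it actually teaches for them, captures the cross-servicing economies the authors contrast with a flat cost per credit. The numbers below are invented for illustration.

```python
# All figures are invented. Each program has its own direct cost; one service
# unit's cost (e.g. a math department teaching service courses) is allocated
# by the credit hours it teaches for each program.
own_cost = {"BS Physics": 900_000.0, "BA Economics": 750_000.0}
students = {"BS Physics": 200, "BA Economics": 300}

service_cost = 300_000.0
service_credits = {"BS Physics": 1000.0, "BA Economics": 500.0}
total_service_credits = sum(service_credits.values())

cost_per_student = {
    p: (own_cost[p] + service_cost * service_credits[p] / total_service_credits)
       / students[p]
    for p in own_cost
}
```

Under a flat university-wide cost per credit both programs would absorb the service cost at the same rate; the allocation above instead follows actual teaching loads, which is why the per-student figures diverge.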
The Ability of implementing Cloud Computing in Higher Education - KRG
Directory of Open Access Journals (Sweden)
Zanyar Ali Ahmed
2017-06-01
Full Text Available Cloud computing is a new technology. CC is an online service that can store and retrieve information without requiring physical access to the files on hard drives; the information is held on a server where clients can access it when needed. Despite the lack of ICT infrastructure at the universities of the Kurdistan Regional Government (KRG), these universities can use this new technology because of its economic advantages, enhanced data management, better maintenance, high performance, and improved availability and accessibility, thereby achieving easy maintenance for organizational institutes. The aim of this research is to assess the ability and possibility of implementing cloud computing in higher education in the KRG. This research will help the universities start establishing cloud computing in their services. A survey was conducted to evaluate the cloud computing services that KRG universities have applied. The results showed that most KRG universities are using SaaS, and that MHE-KRG universities and institutions confront many challenges and concerns in terms of security, user privacy, lack of integration with current systems, and data and document ownership.
The cognitive dynamics of computer science cost-effective large scale software development
De Gyurky, Szabolcs Michael; John Wiley & Sons
2006-01-01
This book has three major objectives: To propose an ontology for computer software; To provide a methodology for development of large software systems to cost and schedule that is based on the ontology; To offer an alternative vision regarding the development of truly autonomous systems.
Fixed-point image orthorectification algorithms for reduced computational cost
French, Joseph Clinton
Imaging systems have been applied to many new applications in recent years. With the advent of low-cost, low-power focal planes and more powerful, lower cost computers, remote sensing applications have become more widespread. Many of these applications require some form of geolocation, especially when relative distances are desired. However, when greater global positional accuracy is needed, orthorectification becomes necessary. Orthorectification is the process of projecting an image onto a Digital Elevation Map (DEM), which removes terrain distortions and corrects the perspective distortion by changing the viewing angle to be perpendicular to the projection plane. Orthorectification is used in disaster tracking, landscape management, wildlife monitoring and many other applications. However, orthorectification is a computationally expensive process due to floating point operations and divisions in the algorithm. To reduce the computational cost of on-board processing, two novel algorithm modifications are proposed. One modification is projection utilizing fixed-point arithmetic. Fixed-point arithmetic removes the floating point operations and reduces the processing time by operating only on integers. The second modification is replacement of the division inherent in projection with a multiplication of the inverse. The inverse must operate iteratively. Therefore, the inverse is replaced with a linear approximation. As a result of these modifications, the processing time of projection is reduced by a factor of 1.3x with an average pixel position error of 0.2% of a pixel size for 128-bit integer processing and over 4x with an average pixel position error of less than 13% of a pixel size for 64-bit integer processing. A secondary inverse function approximation is also developed that replaces the linear approximation with a quadratic. The quadratic approximation produces a more accurate approximation of the inverse, allowing for an integer multiplication calculation
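A minimal sketch of the two modifications the abstract describes: fixed-point arithmetic, and division replaced by multiplication with a linearly approximated reciprocal. The Q16 format and the interval [b_lo, b_hi] are illustrative assumptions, not the author's implementation:

```python
# A minimal sketch (not the thesis code) of the two proposed modifications:
# Q16 fixed-point arithmetic, and division replaced by multiplication with a
# linearly approximated reciprocal fitted over a known operand range.

SHIFT = 16            # Q16: a raw integer r represents the value r / 2**16
ONE = 1 << SHIFT

def to_fixed(x: float) -> int:
    return int(round(x * ONE))

def approx_inv(b_fixed: int, b_lo: float, b_hi: float) -> int:
    """Fixed-point linear approximation of 1/b over [b_lo, b_hi]:
    1/b ~= m*b + c, with m and c fitted through the interval endpoints."""
    m = (1.0 / b_hi - 1.0 / b_lo) / (b_hi - b_lo)
    c = 1.0 / b_lo - m * b_lo
    return ((to_fixed(m) * b_fixed) >> SHIFT) + to_fixed(c)

def fixed_div(a_fixed: int, b_fixed: int, b_lo: float, b_hi: float) -> int:
    # a / b computed as a * approx(1/b): integer multiplies and shifts only
    return (a_fixed * approx_inv(b_fixed, b_lo, b_hi)) >> SHIFT

q = fixed_div(to_fixed(3.0), to_fixed(2.0), 1.5, 2.5)
print(q / ONE)  # ~1.6 versus the exact 1.5: the error comes from the linear fit
```

The position error the abstract reports comes from exactly this kind of reciprocal fit: a quadratic fit (the paper's secondary approximation) shrinks that error while keeping the computation an integer multiply.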
Cost-effective computations with boundary interface operators in elliptic problems
International Nuclear Information System (INIS)
Khoromskij, B.N.; Mazurkevich, G.E.; Nikonov, E.G.
1993-01-01
The numerical algorithm for fast computations with interface operators associated with the elliptic boundary value problems (BVP) defined on step-type domains is presented. The algorithm is based on the asymptotically almost optimal technique developed for treatment of the discrete Poincare-Steklov (PS) operators associated with the finite-difference Laplacian on rectangles when using the uniform grid with a 'displacement by h/2'. The approach can be regarded as an extension of the method proposed for the partial solution of the finite-difference Laplace equation to the case of displaced grids and mixed boundary conditions. It is shown that the action of the PS operator for the Dirichlet problem and mixed BVP can be computed with expenses of the order of O(N log² N) both for arithmetical operations and computer memory needs, where N is the number of unknowns on the rectangle boundary. The single domain algorithm is applied to solving the multidomain elliptic interface problems with piecewise constant coefficients. The numerical experiments presented confirm almost linear growth of the computational costs and memory needs with respect to the dimension of the discrete interface problem. 14 refs., 3 figs., 4 tabs
A Performance/Cost Evaluation for a GPU-Based Drug Discovery Application on Volunteer Computing
Directory of Open Access Journals (Sweden)
Ginés D. Guerrero
2014-01-01
Full Text Available Bioinformatics is an interdisciplinary research field that develops tools for the analysis of large biological databases, and, thus, the use of high performance computing (HPC) platforms is mandatory for the generation of useful biological knowledge. The latest generation of graphics processing units (GPUs) has democratized the use of HPC as they push desktop computers to cluster-level performance. Many applications within this field have been developed to leverage these powerful and low-cost architectures. However, these applications still need to scale to larger GPU-based systems to enable remarkable advances in the fields of healthcare, drug discovery, genome research, etc. The inclusion of GPUs in HPC systems exacerbates power and temperature issues, increasing the total cost of ownership (TCO). This paper explores the benefits of volunteer computing to scale bioinformatics applications as an alternative to owning large GPU-based local infrastructures. We use as a benchmark a GPU-based drug discovery application called BINDSURF whose computational requirements go beyond a single desktop machine. Volunteer computing is presented as a cheap and valid HPC system for those bioinformatics applications that need to process huge amounts of data and where the response time is not a critical factor.
Low-cost addition-subtraction sequences for the final exponentiation computation in pairings
DEFF Research Database (Denmark)
Guzmán-Trampe, Juan E; Cruz-Cortéz, Nareli; Dominguez Perez, Luis
2014-01-01
In this paper, we address the problem of finding low cost addition–subtraction sequences for situations where a doubling step is significantly cheaper than a non-doubling one. One application of this setting appears in the computation of the final exponentiation step of the reduced Tate pairing d...
2010-01-01
... Appendix K to Part 226—Total Annual Loan Cost Rate Computations for Reverse Mortgage Transactions (a... loan cost rate for various transactions, as well as instructions, explanations, and examples for.... (2) Term of the transaction. For purposes of total annual loan cost disclosures, the term of a...
User manual for PACTOLUS: a code for computing power costs.
Energy Technology Data Exchange (ETDEWEB)
Huber, H.D.; Bloomster, C.H.
1979-02-01
PACTOLUS is a computer code for calculating the cost of generating electricity. Through appropriate definition of the input data, PACTOLUS can calculate the cost of generating electricity from a wide variety of power plants, including nuclear, fossil, geothermal, solar, and other types of advanced energy systems. The purpose of PACTOLUS is to develop cash flows and calculate the unit busbar power cost (mills/kWh) over the entire life of a power plant. The cash flow information is calculated by two principal models: the Fuel Model and the Discounted Cash Flow Model. The Fuel Model is an engineering cost model which calculates the cash flow for the fuel cycle costs over the project lifetime based on input data defining the fuel material requirements, the unit costs of fuel materials and processes, the process lead and lag times, and the schedule of the capacity factor for the plant. For nuclear plants, the Fuel Model calculates the cash flow for the entire nuclear fuel cycle. For fossil plants, the Fuel Model calculates the cash flow for the fossil fuel purchases. The Discounted Cash Flow Model combines the fuel costs generated by the Fuel Model with input data on the capital costs, capital structure, licensing time, construction time, rates of return on capital, tax rates, operating costs, and depreciation method of the plant to calculate the cash flow for the entire lifetime of the project. The financial and tax structure for both investor-owned utilities and municipal utilities can be simulated through varying the rates of return on equity and debt, the debt-equity ratios, and tax rates. The Discounted Cash Flow Model uses the principle that the present worth of the revenues will be equal to the present worth of the expenses including the return on investment over the economic life of the project. This manual explains how to prepare the input data, execute cases, and interpret the output results. (RWR)
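The present-worth principle the manual states, that the present worth of revenues must equal the present worth of expenses over the plant's economic life, can be sketched as a levelized busbar cost calculation. All plant parameters below are invented, and PACTOLUS itself models fuel cycles, taxes and capital structure in far more detail:

```python
# Hypothetical levelized busbar cost from the present-worth principle:
# PW(revenues) = PW(expenses) over the plant's economic life.
# All plant parameters below are invented for illustration.

def levelized_cost_mills_per_kwh(capital, annual_expenses, annual_kwh, rate):
    """Solve for the constant price p satisfying
    sum_t p * E / (1+r)^t  ==  capital + sum_t expense_t / (1+r)^t."""
    pw_factor = sum(1.0 / (1.0 + rate) ** t
                    for t in range(1, len(annual_expenses) + 1))
    pw_expenses = capital + sum(e / (1.0 + rate) ** t
                                for t, e in enumerate(annual_expenses, start=1))
    return pw_expenses / (annual_kwh * pw_factor) * 1000.0  # $/kWh -> mills/kWh

busbar_cost = levelized_cost_mills_per_kwh(
    capital=2.0e9,                  # assumed capital cost, $
    annual_expenses=[1.5e8] * 30,   # assumed fuel + O&M per year, $
    annual_kwh=1.0e6 * 8760 * 0.9,  # 1000 MW at an assumed 90% capacity factor
    rate=0.07,                      # assumed discount rate
)
print(f"{busbar_cost:.1f} mills/kWh")
```

Because revenues are priced at a constant rate per kWh, equating the two present worths reduces to dividing discounted expenses by discounted energy output, which is what the function does.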
User manual for PACTOLUS: a code for computing power costs
International Nuclear Information System (INIS)
Huber, H.D.; Bloomster, C.H.
1979-02-01
PACTOLUS is a computer code for calculating the cost of generating electricity. Through appropriate definition of the input data, PACTOLUS can calculate the cost of generating electricity from a wide variety of power plants, including nuclear, fossil, geothermal, solar, and other types of advanced energy systems. The purpose of PACTOLUS is to develop cash flows and calculate the unit busbar power cost (mills/kWh) over the entire life of a power plant. The cash flow information is calculated by two principal models: the Fuel Model and the Discounted Cash Flow Model. The Fuel Model is an engineering cost model which calculates the cash flow for the fuel cycle costs over the project lifetime based on input data defining the fuel material requirements, the unit costs of fuel materials and processes, the process lead and lag times, and the schedule of the capacity factor for the plant. For nuclear plants, the Fuel Model calculates the cash flow for the entire nuclear fuel cycle. For fossil plants, the Fuel Model calculates the cash flow for the fossil fuel purchases. The Discounted Cash Flow Model combines the fuel costs generated by the Fuel Model with input data on the capital costs, capital structure, licensing time, construction time, rates of return on capital, tax rates, operating costs, and depreciation method of the plant to calculate the cash flow for the entire lifetime of the project. The financial and tax structure for both investor-owned utilities and municipal utilities can be simulated through varying the rates of return on equity and debt, the debt-equity ratios, and tax rates. The Discounted Cash Flow Model uses the principle that the present worth of the revenues will be equal to the present worth of the expenses including the return on investment over the economic life of the project. This manual explains how to prepare the input data, execute cases, and interpret the output results with the updated version of PACTOLUS. 11 figures, 2 tables
12 CFR 219.3 - Cost reimbursement.
2010-01-01
... that the financial institution use programming or other higher level technical services of a computer... (private sector) set out in the Employment Cost Trends section of the National Compensation Survey (http... PROVIDING FINANCIAL RECORDS; RECORDKEEPING REQUIREMENTS FOR CERTAIN FINANCIAL RECORDS (REGULATION S...
Nanyonjo, Agnes; Bagorogoza, Benson; Kasteng, Frida; Ayebale, Godfrey; Makumbi, Fredrick; Tomson, Göran; Källander, Karin
2015-08-28
Integrated community case management (iCCM) relies on community health workers (CHWs) managing children with malaria, pneumonia, diarrhoea, and referring children when management is not possible. This study sought to establish the cost per sick child referred to seek care from a higher-level health facility by a CHW and to estimate caregivers' willingness to pay (WTP) for referral. Caregivers of 203 randomly selected children referred to higher-level health facilities by CHWs were interviewed in four Midwestern Uganda districts. Questionnaires and document reviews were used to capture direct, indirect and opportunity costs incurred by caregivers, CHWs and health facilities managing referred children. WTP for referral was assessed through the 'bidding game' approach followed by an open-ended question on maximum WTP. Descriptive analysis was conducted for factors associated with referral completion and WTP using logistic and linear regression methods, respectively. The cost per case referred to higher-level health facilities was computed from a societal perspective. Reasons for referral included having fever with a negative malaria test (46.8%), danger signs (29.6%) and drug shortage (37.4%). Among the referred, less than half completed referral (45.8%). Referral completion was 2.8 times higher among children with danger signs (p = 0.004) relative to those without danger signs, and 0.27 times lower among children who received pre-referral treatment. The average cost per case referred was US$4.89 and US$7.35 per case completing referral. For each unit cost per case referred, caregiver out-of-pocket expenditure contributed 33.7%, caregivers' and CHWs' opportunity costs contributed 29.2% and 5.1% respectively and health facility costs contributed 39.6%. The mean (SD) out-of-pocket expenditure was US$1.65 (3.25). The mean WTP for referral was US$8.25 (14.70) and was positively associated with having received pre-referral treatment, completing referral and increasing
International Nuclear Information System (INIS)
Pai, Archana; Bose, Sukanta; Dhurandhar, Sanjeev
2002-01-01
We extend a coherent network data-analysis strategy developed earlier for detecting Newtonian waveforms to the case of post-Newtonian (PN) waveforms. Since the PN waveform depends on the individual masses of the inspiralling binary, the parameter-space dimension increases by one from that of the Newtonian case. We obtain the number of templates and estimate the computational costs for PN waveforms: for a lower mass limit of 1 M☉, for LIGO-I noise and with 3% maximum mismatch, the online computational speed requirement for a single detector is a few Gflops; for a two-detector network it is hundreds of Gflops and for a three-detector network it is tens of Tflops. Apart from idealistic networks, we obtain results for realistic networks comprising LIGO and VIRGO. Finally, we compare costs incurred in a coincidence detection strategy with those incurred in the coherent strategy detailed above
International Nuclear Information System (INIS)
Saber, Ahmed Yousuf; Chakraborty, Shantanu; Abdur Razzak, S.M.; Senjyu, Tomonobu
2009-01-01
This paper presents a modified particle swarm optimization (MPSO) for the constrained economic load dispatch (ELD) problem. Real cost functions are more complex than conventional second-order cost functions when multi-fuel operations, valve-point effects, accurate curve fitting, etc., are considered in a deregulated, changing market. The proposed modified particle swarm optimization (PSO) consists of a problem-dependent variable number of promising values (in the velocity vector), a unit vector, and an error- and iteration-dependent step length. It reliably and accurately tracks a continuously changing solution of the complex cost function, and no extra effort is needed for the complex higher-order cost polynomials in ELD. Constraint management is incorporated in the modified PSO. The modified PSO balances local and global searching abilities, and an appropriate fitness function helps it converge quickly. To keep the method from freezing, stagnated or idle particles are reset. The sensitivity of the higher-order cost polynomials is also analyzed visually to show their importance for the optimization of ELD. Finally, benchmark data sets and methods are used to show the effectiveness of the proposed method. (author)
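As a rough illustration of swarm-based dispatch with higher-order cost polynomials, here is a generic PSO sketch for a two-generator toy ELD problem. It is not the paper's MPSO (no variable promising values, unit vector, or particle resetting), the power balance is enforced by a simple quadratic penalty, and the cubic cost coefficients, unit limits and demand are invented:

```python
import random

# Generic PSO for a toy ELD problem: minimize total fuel cost of two
# generators while meeting a demand, enforced by a quadratic penalty.
# All coefficients, limits and the demand are invented.

def fuel_cost(p):
    p1, p2 = p
    # assumed higher-order (cubic) cost polynomials, $/h
    return (0.001*p1**3 + 0.1*p1**2 + 20*p1) + (0.002*p2**3 + 0.08*p2**2 + 25*p2)

def fitness(p, demand=300.0, penalty=1e4):
    return fuel_cost(p) + penalty * (p[0] + p[1] - demand) ** 2

random.seed(0)
n, lo, hi = 30, 50.0, 250.0                        # swarm size and unit limits, MW
pos = [[random.uniform(lo, hi) for _ in range(2)] for _ in range(n)]
vel = [[0.0, 0.0] for _ in range(n)]
pbest = [p[:] for p in pos]

for _ in range(200):
    gbest = min(pbest, key=fitness)
    for i in range(n):
        for d in range(2):
            r1, r2 = random.random(), random.random()
            vel[i][d] = (0.7 * vel[i][d]
                         + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                         + 1.5 * r2 * (gbest[d] - pos[i][d]))
            pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))  # clamp to limits
        if fitness(pos[i]) < fitness(pbest[i]):
            pbest[i] = pos[i][:]

gbest = min(pbest, key=fitness)
print(gbest, sum(gbest))  # total generation should sit close to the demand
```

The penalty term plays the role of the paper's constraint management: any dispatch that misses the demand is heavily punished, so the swarm settles on a feasible split between the two units.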
High School Computer Science Education Paves the Way for Higher Education: The Israeli Case
Armoni, Michal; Gal-Ezer, Judith
2014-01-01
The gap between enrollments in higher education computing programs and the high-tech industry's demands is widely reported, and is especially prominent for women. Increasing the availability of computer science education in high school is one of the strategies suggested in order to address this gap. We look at the connection between exposure to…
Cost-Effectiveness of Computed Tomographic Colonography: A Prospective Comparison with Colonoscopy
International Nuclear Information System (INIS)
Arnesen, R.B.; Ginnerup-Pedersen, B.; Poulsen, P.B.; Benzon, K. von; Adamsen, S.; Laurberg, S.; Hart-Hansen, O.
2007-01-01
Purpose: To estimate the cost-effectiveness of detecting colorectal polyps with computed tomographic colonography (CTC) and subsequent polypectomy with primary colonoscopy (CC), using CC as the alternative strategy. Material and Methods: A marginal analysis was performed regarding 103 patients who had had CTC prior to same-day CC at two hospitals, H-I (n = 53) and H-II (n = 50). The patients were randomly chosen from surveillance and symptomatic study populations (148 at H-I and 231 at H-II). Populations, organizations, and procedures were compared. Cost data on time consumption, medication, and minor equipment were collected prospectively, while data on salaries and major equipment were collected retrospectively. The effect was the (previously published) sensitivities of CTC and CC for detection of colorectal polyps ≥6 mm (H-I, n = 148) or ≥5 mm (H-II, n = 231). Results: Thirteen patients at each center had at least one colorectal polyp ≥6 mm or ≥5 mm. CTC was the cost-effective alternative at H-I (Euro 187 vs. Euro 211), while CC was the cost-effective alternative at H-II (Euro 239 vs. Euro 192). The cost-effectiveness (costs per finding) mainly depended on the sensitivity of CTC and CC, but the depreciation of equipment and the staff's use of time were highly influential as well. Conclusion: Detection of colorectal polyps ≥6 mm or ≥5 mm with CTC, followed by polypectomy by CC, can be performed cost-effectively at some institutions with the appropriate hardware and organization.
The Optimal Pricing of Computer Software and Other Products with High Switching Costs
Pekka Ahtiala
2004-01-01
The paper studies the determinants of the optimum prices of computer programs and their upgrades. It is based on the notion that because of the human capital invested in the use of a computer program by its user, this product has high switching costs, and on the finding that pirates are responsible for generating over 80 per cent of new software sales. A model to maximize the present value of the program to the program house is constructed to determine the optimal prices of initial programs a...
2011-01-01
Background Single reading with computer aided detection (CAD) is an alternative to double reading for detecting cancer in screening mammograms. The aim of this study is to investigate whether the use of a single reader with CAD is more cost-effective than double reading. Methods Based on data from the CADET II study, the cost-effectiveness of single reading with CAD versus double reading was measured in terms of cost per cancer detected. Cost (Pound (£), year 2007/08) of single reading with CAD versus double reading was estimated assuming a health and social service perspective and a 7 year time horizon. As the equipment cost varies according to the unit size a separate analysis was conducted for high, average and low volume screening units. One-way sensitivity analyses were performed by varying the reading time, equipment and assessment cost, recall rate and reader qualification. Results CAD is cost-increasing for all sizes of screening unit. The introduction of CAD is cost-increasing compared to double reading because the cost of CAD equipment, staff training and the higher assessment cost associated with CAD are greater than the saving in reading costs. The introduction of single reading with CAD, in place of double reading, would produce an additional cost of £227 and £253 per 1,000 women screened in high and average volume units respectively. In low volume screening units, the high cost of purchasing the equipment will result in an additional cost of £590 per 1,000 women screened. One-way sensitivity analysis showed that the factors having the greatest effect on the cost-effectiveness of CAD with single reading compared with double reading were the reading time and the reader's professional qualification (radiologist versus advanced practitioner). Conclusions Without improvements in CAD effectiveness (e.g. a decrease in the recall rate) CAD is unlikely to be a cost-effective alternative to double reading for mammography screening in the UK. This study
Higher Education Cloud Computing in South Africa: Towards Understanding Trust and Adoption issues
Directory of Open Access Journals (Sweden)
Karl Van Der Schyff
2014-12-01
Full Text Available This paper sets out to study the views of key stakeholders on the issue of cloud information security within institutions of Higher Education. A specific focus is on understanding trust and the adoption of cloud computing in the context of the unique operational requirements of South African universities. Contributions are made on both a methodological and theoretical level. Methodologically the study contributes by employing an Interpretivist approach and using Thematic Analysis in a topic area often studied quantitatively, thus affording researchers the opportunity to gain the necessary in-depth insight into how key stakeholders view cloud security and trust. A theoretical contribution is made in the form of a trust-centric conceptual framework that illustrates how the qualitative data relates to concepts innate to cloud computing trust and adoption. Both these contributions lend credence to the fact that there is a need to address cloud information security with a specific focus on the contextual elements that surround South African universities. The paper concludes with some considerations for implementing and investigating cloud computing services in Higher Education contexts in South Africa.
Pai, A; Dhurandhar, S V
2002-01-01
We extend a coherent network data-analysis strategy developed earlier for detecting Newtonian waveforms to the case of post-Newtonian (PN) waveforms. Since the PN waveform depends on the individual masses of the inspiralling binary, the parameter-space dimension increases by one from that of the Newtonian case. We obtain the number of templates and estimate the computational costs for PN waveforms: for a lower mass limit of 1 M☉, for LIGO-I noise and with 3% maximum mismatch, the online computational speed requirement for a single detector is a few Gflops; for a two-detector network it is hundreds of Gflops and for a three-detector network it is tens of Tflops. Apart from idealistic networks, we obtain results for realistic networks comprising LIGO and VIRGO. Finally, we compare costs incurred in a coincidence detection strategy with those incurred in the coherent strategy detailed above.
Energy- and cost-efficient lattice-QCD computations using graphics processing units
Energy Technology Data Exchange (ETDEWEB)
Bach, Matthias
2014-07-01
Quarks and gluons are the building blocks of all hadronic matter, like protons and neutrons. Their interaction is described by Quantum Chromodynamics (QCD), a theory under test by large scale experiments like the Large Hadron Collider (LHC) at CERN and in the future at the Facility for Antiproton and Ion Research (FAIR) at GSI. However, perturbative methods can only be applied to QCD for high energies. Studies from first principles are possible via a discretization onto an Euclidean space-time grid. This discretization of QCD is called Lattice QCD (LQCD) and is the only ab-initio option outside of the high-energy regime. LQCD is extremely compute and memory intensive. In particular, it is by definition always bandwidth limited. Thus - despite the complexity of LQCD applications - it led to the development of several specialized compute platforms and influenced the development of others. However, in recent years General-Purpose computation on Graphics Processing Units (GPGPU) came up as a new means for parallel computing. Contrary to machines traditionally used for LQCD, graphics processing units (GPUs) are a mass-market product. This promises advantages in both the pace at which higher-performing hardware becomes available and its price. CL2QCD is an OpenCL based implementation of LQCD using Wilson fermions that was developed within this thesis. It operates on GPUs by all major vendors as well as on central processing units (CPUs). On the AMD Radeon HD 7970 it provides the fastest double-precision D kernel for a single GPU, achieving 120 GFLOPS. D, the most compute-intensive kernel in LQCD simulations, is commonly used to compare LQCD platforms. This performance is enabled by an in-depth analysis of optimization techniques for bandwidth-limited codes on GPUs. Further, analysis of the communication between GPU and CPU, as well as between multiple GPUs, enables high-performance Krylov space solvers and linear scaling to multiple GPUs within a single system. LQCD
Energy- and cost-efficient lattice-QCD computations using graphics processing units
International Nuclear Information System (INIS)
Bach, Matthias
2014-01-01
Quarks and gluons are the building blocks of all hadronic matter, like protons and neutrons. Their interaction is described by Quantum Chromodynamics (QCD), a theory under test by large scale experiments like the Large Hadron Collider (LHC) at CERN and in the future at the Facility for Antiproton and Ion Research (FAIR) at GSI. However, perturbative methods can only be applied to QCD for high energies. Studies from first principles are possible via a discretization onto an Euclidean space-time grid. This discretization of QCD is called Lattice QCD (LQCD) and is the only ab-initio option outside of the high-energy regime. LQCD is extremely compute and memory intensive. In particular, it is by definition always bandwidth limited. Thus - despite the complexity of LQCD applications - it led to the development of several specialized compute platforms and influenced the development of others. However, in recent years General-Purpose computation on Graphics Processing Units (GPGPU) came up as a new means for parallel computing. Contrary to machines traditionally used for LQCD, graphics processing units (GPUs) are a mass-market product. This promises advantages in both the pace at which higher-performing hardware becomes available and its price. CL2QCD is an OpenCL based implementation of LQCD using Wilson fermions that was developed within this thesis. It operates on GPUs by all major vendors as well as on central processing units (CPUs). On the AMD Radeon HD 7970 it provides the fastest double-precision D kernel for a single GPU, achieving 120 GFLOPS. D, the most compute-intensive kernel in LQCD simulations, is commonly used to compare LQCD platforms. This performance is enabled by an in-depth analysis of optimization techniques for bandwidth-limited codes on GPUs. Further, analysis of the communication between GPU and CPU, as well as between multiple GPUs, enables high-performance Krylov space solvers and linear scaling to multiple GPUs within a single system. LQCD
Business Models of High Performance Computing Centres in Higher Education in Europe
Eurich, Markus; Calleja, Paul; Boutellier, Roman
2013-01-01
High performance computing (HPC) service centres are a vital part of the academic infrastructure of higher education organisations. However, despite their importance for research and the necessary high capital expenditures, business research on HPC service centres is mostly missing. From a business perspective, it is important to find an answer to…
Identifying Benefits and risks associated with utilizing cloud computing
Shayan, Jafar; Azarnik, Ahmad; Chuprat, Suriayati; Karamizadeh, Sasan; Alizadeh, Mojtaba
2014-01-01
Cloud computing is an emerging computing model where IT and computing operations are delivered as services in a highly scalable and cost-effective manner. Recently, embarking on this new model in business has become popular. Companies in diverse sectors intend to leverage cloud computing architecture, platforms and applications in order to gain higher competitive advantages. Like other models, cloud computing brought advantages to attract business, but meanwhile fostering cloud has led to some ...
International Nuclear Information System (INIS)
Lucidarme, Olivier; Cadi, Mehdi; Berger, Genevieve; Taieb, Julien; Poynard, Thierry; Grenier, Philippe; Beresniak, Ariel
2012-01-01
Objectives: To assess the cost-effectiveness of three colorectal-cancer (CRC) screening strategies in France: fecal-occult-blood tests (FOBT), computed-tomography-colonography (CTC) and optical-colonoscopy (OC). Methods: Ten-year simulation modeling was used to assess a virtual asymptomatic, average-risk population 50–74 years old. Negative OC was repeated 10 years later, and OC positive for advanced or non-advanced adenoma 3 or 5 years later, respectively. FOBT was repeated biennially. Negative CTC was repeated 5 years later. Positive CTC and FOBT led to triennial OC. Total cost and CRC rate after 10 years for each screening strategy and 0–100% adherence rates with 10% increments were computed. Transition probabilities were programmed using distribution ranges to account for uncertainty parameters. Direct medical costs were estimated using the French national health insurance prices. Probabilistic sensitivity analyses used 5000 Monte Carlo simulations generating model outcomes and standard deviations. Results: For a given adherence rate, CTC screening was always the most effective but not the most cost-effective. FOBT was the least effective but most cost-effective strategy. OC was of intermediate efficacy and the least cost-effective strategy. Without screening, treatment of 123 CRC per 10,000 individuals would cost €3,444,000. For 60% adherence, the respective costs of preventing and treating, respectively 49 and 74 FOBT-detected, 73 and 50 CTC-detected and 63 and 60 OC-detected CRC would be €2,810,000, €6,450,000 and €9,340,000. Conclusion: Simulation modeling helped to identify what would be the most effective (CTC) and cost-effective screening (FOBT) strategy in the setting of mass CRC screening in France.
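The probabilistic sensitivity analysis described above can be sketched generically: draw uncertain parameters from ranges over many Monte Carlo runs, then average cost and effect. All rates and costs below are invented, and the model is far simpler than the authors' ten-year transition model:

```python
import random

# Toy Monte Carlo probabilistic sensitivity analysis for one screening
# strategy: draw an uncertain effectiveness from a range, then average total
# cost and cancers prevented over many runs. All rates and costs are invented.

def simulate(cost_per_screen, eff_range, adherence, population=10_000,
             base_crc=123, treatment_cost=28_000.0, runs=5_000, seed=1):
    random.seed(seed)
    cost_sum = prevented_sum = 0.0
    for _ in range(runs):
        eff = random.uniform(*eff_range)          # fraction of CRC prevented
        prevented = base_crc * eff * adherence
        cost = (population * adherence * cost_per_screen      # screening cost
                + (base_crc - prevented) * treatment_cost)    # treatment cost
        cost_sum += cost
        prevented_sum += prevented
    return cost_sum / runs, prevented_sum / runs

mean_cost, mean_prevented = simulate(cost_per_screen=20.0,
                                     eff_range=(0.5, 0.8), adherence=0.6)
print(f"mean cost {mean_cost:,.0f}, CRC prevented ~{mean_prevented:.1f}")
```

Running the same function with different per-screen costs and effectiveness ranges reproduces the kind of strategy comparison reported above, where a cheaper test can be the most cost-effective even when a more sensitive one prevents more cancers.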
Multi-Product Total Cost of Function for Higher Education: A Case of Bible Colleges.
Koshal, Rajindar K.; Koshal, Manjulika; Gupta, Ashok
2001-01-01
This study empirically estimates a multiproduct total cost function and output relationship for comprehensive U.S. universities. Statistical results for 184 Bible colleges suggest that there are both economies of scale and of scope in higher education. Additionally, product-specific economies of scope exist for all output levels and activities.…
Computer Aided Design of a Low-Cost Painting Robot
Directory of Open Access Journals (Sweden)
SYEDA MARIA KHATOON ZAIDI
2017-10-01
Full Text Available The application of robots or robotic systems for painting parts is becoming increasingly conventional; to improve reliability, productivity, consistency and to decrease waste. However, in Pakistan only high-end industries are able to afford the luxury of a robotic system for various purposes. In this study we propose an economical Painting Robot that a small-scale industry can install in their plant with ease. The importance of this robot is that being cost effective, it can easily be replaced in small manufacturing industries and therefore, eliminate health problems occurring to the individual in charge of painting parts on an everyday basis. To achieve this aim, the robot is made with local parts with only few exceptions, to cut costs; and the programming language is kept at a moderate level. Image processing is used to establish object recognition and it can be programmed to paint various simple geometries. The robot is placed on a conveyor belt to maximize productivity. A four DoF (Degree of Freedom) arm increases the working envelope and accessibility of painting different shaped parts with ease. This robot is capable of painting up, front, back, left and right sides of the part with a single colour. Initially CAD (Computer Aided Design) models of the robot were developed which were analyzed, modified and improved to withstand loading condition and perform its task efficiently. After design selection, appropriate motors and materials were selected and the robot was developed. Throughout the development phase, minor problems and errors were fixed accordingly as they arose. Lastly the robot was integrated with the computer and image processing for autonomous control. The final results demonstrated that the robot is economical and reduces paint wastage.
Computer aided design of a low-cost painting robot
International Nuclear Information System (INIS)
Zaidi, S.M.; Janejo, F.; Mujtaba, S.B.
2017-01-01
The application of robots or robotic systems for painting parts is becoming increasingly conventional; to improve reliability, productivity, consistency and to decrease waste. However, in Pakistan only high-end industries are able to afford the luxury of a robotic system for various purposes. In this study we propose an economical Painting Robot that a small-scale industry can install in their plant with ease. The importance of this robot is that being cost effective, it can easily be replaced in small manufacturing industries and therefore, eliminate health problems occurring to the individual in charge of painting parts on an everyday basis. To achieve this aim, the robot is made with local parts with only a few exceptions, to cut costs; and the programming language is kept at a mediocre level. Image processing is used to establish object recognition and it can be programmed to paint various simple geometries. The robot is placed on a conveyor belt to maximize productivity. A four DoF (Degree of Freedom) arm increases the working envelope and accessibility of painting different shaped parts with ease. This robot is capable of painting up, front, back, left and right sides of the part with a single colour. Initially CAD (Computer Aided Design) models of the robot were developed which were analyzed, modified and improved to withstand loading conditions and perform its task efficiently. After design selection, appropriate motors and materials were selected and the robot was developed. Throughout the development phase, minor problems and errors were fixed accordingly as they arose. Lastly the robot was integrated with the computer and image processing for autonomous control. The final results demonstrated that the robot is economical and reduces paint wastage. (author)
Controlling costs without compromising quality: paying hospitals for total knee replacement.
Pine, Michael; Fry, Donald E; Jones, Barbara L; Meimban, Roger J; Pine, Gregory J
2010-10-01
Unit costs of health services are substantially higher in the United States than in any other developed country in the world, without a correspondingly healthier population. An alternative payment structure, especially for high volume, high cost episodes of care (eg, total knee replacement), is needed to reward high quality care and reduce costs. The National Inpatient Sample of administrative claims data was used to measure risk-adjusted mortality, postoperative length-of-stay, costs of routine care, adverse outcome rates, and excess costs of adverse outcomes for total knee replacements performed between 2002 and 2005. Empirically identified inefficient and ineffective hospitals were then removed to create a reference group of high-performance hospitals. Predictive models for outcomes and costs were recalibrated to the reference hospitals and used to compute risk-adjusted outcomes and costs for all hospitals. Per case predicted costs were computed and compared with observed costs. Of the 688 hospitals with acceptable data, 62 failed to meet effectiveness criteria and 210 were identified as inefficient. The remaining 416 high-performance hospitals had 13.4% fewer risk-adjusted adverse outcomes (4.56% vs 3.95%; P < 0.001) and lower risk-adjusted routine costs ($12,773 vs $11,512; P < 0.001) than the other hospitals. A payment system based on the demonstrated performance of effective, efficient hospitals can produce sizable cost savings without jeopardizing quality. In this study, 96% of total excess hospital costs resulted from higher routine costs at inefficient hospitals, whereas only 4% was associated with ineffective care.
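The benchmarking step described in this abstract, recalibrating a predictive cost model to a reference group of effective, efficient hospitals and comparing observed with expected costs, can be sketched numerically. The toy example below uses synthetic data and made-up coefficients, not the study's models: it fits cost versus case-mix severity on the reference hospitals only, then scores every hospital's observed cost against its risk-adjusted benchmark.

```python
import numpy as np

rng = np.random.default_rng(0)
n_hosp = 200
severity = rng.uniform(0.0, 1.0, n_hosp)        # case-mix severity index
is_reference = rng.random(n_hosp) < 0.6         # hospitals passing both screens
cost = 9000.0 + 6000.0 * severity + rng.normal(0.0, 300.0, n_hosp)
cost[~is_reference] += 1500.0                   # inefficiency premium (synthetic)

# recalibrate the cost model to the reference hospitals only
X_ref = np.column_stack([np.ones(is_reference.sum()), severity[is_reference]])
beta, *_ = np.linalg.lstsq(X_ref, cost[is_reference], rcond=None)

expected = beta[0] + beta[1] * severity         # risk-adjusted benchmark cost
excess = cost - expected                        # positive: above benchmark
```

By construction the reference hospitals cluster around zero excess cost, while the inefficient ones show roughly the injected $1,500 premium; the study's analogue of `excess` is what separates routine-cost inefficiency from the cost of ineffective care.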
Directory of Open Access Journals (Sweden)
Karol Wajszczyk
2009-01-01
The study comprised research, development and programming work on a concept for an IT tool for identifying and analysing logistics costs in agricultural enterprises from a process-based perspective. As a result of the research and programming work, an overall functional and IT concept of software for the identification and analysis of logistics costs in agricultural enterprises was developed.
A novel cost based model for energy consumption in cloud computing.
Horri, A; Dastghaibyfard, Gh
2015-01-01
Cloud data centers consume enormous amounts of electrical energy. To support green cloud computing, providers need to minimize cloud infrastructure energy consumption while maintaining the required QoS. In this study, an energy consumption model is proposed for the time-shared scheduling policy in the virtualization layer of cloud environments. The cost and energy usage of the time-shared policy were modeled in the CloudSim simulator, based on results obtained from a real system, and the proposed model was then evaluated under different scenarios. The model accounts for cache-interference costs, which depend on the size of the data. The model was implemented in the CloudSim simulator, and the simulation results indicate that this energy consumption can be considerable and that it varies with parameters such as the quantum length, the data size, and the number of VMs on a host. The measured results validate the model and demonstrate a tradeoff between energy consumption and QoS in the cloud environment.
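A common form for such models in the CloudSim literature is a linear host power model plus a context-switch overhead that grows with the VM working-set size. The sketch below uses that form; all coefficients are invented for illustration, since the paper's fitted model is not reproduced in the abstract.

```python
def host_power(u, p_idle=170.0, p_max=250.0):
    """Linear server power model (watts) at CPU utilisation u in [0, 1]."""
    return p_idle + (p_max - p_idle) * u

def timeshare_energy(busy_seconds, n_vms, data_mb,
                     quantum=0.05, refill_j_per_mb=0.002):
    """Energy (joules) for a fully utilised core under time-sharing.
    When several VMs share the core, each expired quantum triggers a
    context switch whose cache-refill cost grows with the working-set
    size (data_mb). Coefficients are illustrative assumptions, not
    values from the paper."""
    base = host_power(1.0) * busy_seconds
    switches = busy_seconds / quantum if n_vms > 1 else 0.0
    return base + switches * refill_j_per_mb * data_mb
```

With these made-up coefficients, four VMs time-sharing 100 busy seconds with 512 MB working sets consume about 8% more energy than a single VM, reproducing the qualitative dependence on quantum length, data size and VM count that the abstract reports.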
Omniscopes: Large area telescope arrays with only NlogN computational cost
International Nuclear Information System (INIS)
Tegmark, Max; Zaldarriaga, Matias
2010-01-01
We show that the class of antenna layouts for telescope arrays allowing cheap analysis hardware (with correlator cost scaling as N log N rather than N² with the number of antennas N) is encouragingly large, including not only previously discussed rectangular grids but also arbitrary hierarchies of such grids, with arbitrary rotations and shears at each level. We show that all correlations for such a 2D array with an n-level hierarchy can be efficiently computed via a fast Fourier transform in not two but 2n dimensions. This can allow major correlator cost reductions for science applications requiring exquisite sensitivity at widely separated angular scales, for example, 21 cm tomography (where short baselines are needed to probe the cosmological signal and long baselines are needed for point source removal), helping enable future 21 cm experiments with thousands or millions of cheap dipole-like antennas. Such hierarchical grids combine the angular resolution advantage of traditional array layouts with the cost advantage of a rectangular fast Fourier transform telescope. We also describe an algorithm for how a subclass of hierarchical arrays can efficiently use rotation synthesis to produce global sky maps with minimal noise and a well-characterized synthesized beam.
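The N log N correlator cost rests on a standard fact: for antennas on a regular grid, the pairwise correlations grouped by baseline form an autocorrelation, which an FFT evaluates in O(N log N) instead of the O(N²) of direct pairwise multiplication. A minimal one-dimensional sketch of that equivalence (illustrative only, not the authors' code):

```python
import numpy as np

def baseline_correlations_direct(v):
    """All pairwise correlations, grouped by baseline b = i - j: O(N^2)."""
    n = len(v)
    out = np.zeros(2 * n - 1, dtype=complex)
    for i in range(n):
        for j in range(n):
            out[(i - j) + n - 1] += v[i] * np.conj(v[j])
    return out

def baseline_correlations_fft(v):
    """Same quantities via a zero-padded FFT autocorrelation: O(N log N)."""
    n = len(v)
    m = 2 * n - 1                        # pad so circular == linear correlation
    spectrum = np.abs(np.fft.fft(v, m)) ** 2
    acf = np.fft.ifft(spectrum)          # acf[b] = sum_i v[i] * conj(v[i-b])
    # reorder from [0..n-1, -(n-1)..-1] to baselines -(n-1)..(n-1)
    return np.concatenate([acf[n:], acf[:n]])
```

In the paper's hierarchical 2-D layouts the same equivalence is exploited as an FFT in 2n dimensions rather than one.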
International Nuclear Information System (INIS)
Capone, V; Esposito, R; Pardi, S; Taurino, F; Tortone, G
2012-01-01
Over the last few years we have seen an increasing number of services and applications needed to manage and maintain cloud computing facilities. This is particularly true for computing in high energy physics, which often requires complex configurations and distributed infrastructures. In this scenario a cost-effective rationalization and consolidation strategy is the key to success in terms of scalability and reliability. In this work we describe an IaaS (Infrastructure as a Service) cloud computing system, with high availability and redundancy features, which is currently in production at the INFN-Naples and ATLAS Tier-2 data centre. The main goal we intended to achieve was a simplified method to manage our computing resources and deliver reliable user services, reusing existing hardware without incurring heavy costs. A combined usage of virtualization and clustering technologies allowed us to consolidate our services on a small number of physical machines, reducing electric power costs. As a result of our efforts we developed a complete solution for data and computing centres that can be easily replicated using commodity hardware. Our architecture consists of two main subsystems: a clustered storage solution, built on top of disk servers running the GlusterFS file system, and a virtual machine execution environment. GlusterFS is a network file system able to perform parallel writes on multiple disk servers, thereby providing live replication of data. High availability is also achieved via a network configuration using redundant switches and multiple paths between hypervisor hosts and disk servers. We also developed a set of management scripts to easily perform basic system administration tasks such as automatic deployment of new virtual machines, adaptive scheduling of virtual machines on hypervisor hosts, live migration and automated restart in case of hypervisor failures.
On the role of cost-sensitive learning in multi-class brain-computer interfaces.
Devlaminck, Dieter; Waegeman, Willem; Wyns, Bart; Otte, Georges; Santens, Patrick
2010-06-01
Brain-computer interfaces (BCIs) present an alternative way of communication for people with severe disabilities. One of the shortcomings in current BCI systems, recently put forward in the fourth BCI competition, is the asynchronous detection of motor imagery versus resting state. We investigated this extension to the three-class case, in which the resting state is considered virtually lying between two motor classes, resulting in a large penalty when one motor task is misclassified into the other motor class. We particularly focus on the behavior of different machine-learning techniques and on the role of multi-class cost-sensitive learning in such a context. To this end, four different kernel methods are empirically compared, namely pairwise multi-class support vector machines (SVMs), two cost-sensitive multi-class SVMs and kernel-based ordinal regression. The experimental results illustrate that ordinal regression performs better than the other three approaches when a cost-sensitive performance measure such as the mean-squared error is considered. By contrast, multi-class cost-sensitive learning enables us to control the number of large errors made between two motor tasks.
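The cost-sensitive element can be illustrated independently of the kernel machinery: given class posteriors and a cost matrix that penalises motor-vs-motor confusions most heavily (with rest lying between the two motor classes), the decision rule minimises expected cost rather than maximising posterior probability. The cost values below are illustrative, not those used in the competition.

```python
# COST[true][predicted]: confusing one motor class with the other (the
# corner entries, 4) is penalised far more than confusing either motor
# class with the resting state (1). Values are illustrative assumptions.
CLASSES = ["left", "rest", "right"]
COST = [
    [0, 1, 4],   # true: left-hand motor imagery
    [1, 0, 1],   # true: resting state
    [4, 1, 0],   # true: right-hand motor imagery
]

def min_expected_cost_class(posteriors):
    """Bayes decision: pick the class minimising expected misclassification
    cost under the posterior distribution, not the most probable class."""
    expected = [sum(p * COST[t][k] for t, p in enumerate(posteriors))
                for k in range(len(CLASSES))]
    return CLASSES[expected.index(min(expected))]

# Posteriors slightly favour "left" (0.45) over "right" (0.40), but the
# cost-sensitive rule prefers the safe middle class:
print(min_expected_cost_class([0.45, 0.15, 0.40]))  # prints "rest"
```

This is the behaviour that makes ordinal regression attractive here: its errors tend to land on adjacent classes, which is exactly what a mean-squared-error-style cost rewards.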
Specialized computer architectures for computational aerodynamics
Stevenson, D. K.
1978-01-01
In recent years, computational fluid dynamics has made significant progress in modelling aerodynamic phenomena. Currently, one of the major barriers to further development lies in the compute-intensive nature of the numerical formulations and the relatively high cost of performing these computations on commercially available general-purpose computers, in terms of both dollar expenditure and elapsed time. Today's computing technology will support a program designed to create specialized computing facilities dedicated to the important problems of computational aerodynamics. One of the still unresolved questions is the organization of the computing components in such a facility. The characteristics of fluid dynamic problems which will have a significant impact on the choice of computer architecture for a specialized facility are reviewed.
Public Concepts of the Values and Costs of Higher Education, 1963-1974. A Preliminary Analysis.
Minor, Michael J.; Murray, James R.
Statistical data are presented on interviews conducted through the Continuous National Survey (CNS) at the National Opinion Research Center in Chicago and based on results reprinted from "Public Concepts of the Values and Costs of Higher Education," by Angus Campbell and William C. Eckerman. The CNS results presented in this report are…
Cost-effectiveness of implementing computed tomography screening for lung cancer in Taiwan.
Yang, Szu-Chun; Lai, Wu-Wei; Lin, Chien-Chung; Su, Wu-Chou; Ku, Li-Jung; Hwang, Jing-Shiang; Wang, Jung-Der
2017-06-01
A screening program for lung cancer requires more empirical evidence. Based on the experience of the National Lung Screening Trial (NLST), we developed a method to adjust lead-time bias and quality-of-life changes for estimating the cost-effectiveness of implementing computed tomography (CT) screening in Taiwan. The target population was high-risk (≥30 pack-years) smokers between 55 and 75 years of age. From a nation-wide, 13-year follow-up cohort, we estimated quality-adjusted life expectancy (QALE), loss-of-QALE, and lifetime healthcare expenditures per case of lung cancer stratified by pathology and stage. Cumulative stage distributions for CT-screening and no-screening were assumed equal to those for CT-screening and radiography-screening in the NLST to estimate the savings of loss-of-QALE and additional costs of lifetime healthcare expenditures after CT screening. Costs attributable to screen-negative subjects, false-positive cases and radiation-induced lung cancer were included to obtain the incremental cost-effectiveness ratio from the public payer's perspective. The incremental costs were US$22,755 per person. After dividing this by savings of loss-of-QALE (1.16 quality-adjusted life year (QALY)), the incremental cost-effectiveness ratio was US$19,683 per QALY. This ratio would fall to US$10,947 per QALY if the stage distribution for CT-screening was the same as that of screen-detected cancers in the NELSON trial. Low-dose CT screening for lung cancer among high-risk smokers would be cost-effective in Taiwan. As only about 5% of our women are smokers, future research is necessary to identify the high-risk groups among non-smokers and increase the coverage. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.
DEFF Research Database (Denmark)
Mishnaevsky, Leon; Freere, Peter; Sharma, Ranjan
2009-01-01
This paper reports the latest results of a comprehensive programme of experimental and computational analysis of the strength and reliability of wooden parts of low-cost wind turbines. The possibilities of predicting the strength and reliability of different types of wood are studied in a series of experiments and computational investigations. Low-cost testing machines have been designed and employed for the systematic analysis of different sorts of Nepali wood to be used for wind turbine construction. At the same time, computational micromechanical models of the deformation and strength of wood are being developed, which should provide the basis for microstructure-based correlation of the observable and service properties of wood. Some correlations between the microstructure, strength and service properties of wood have already been established.
Cressman, Sonya; Lam, Stephen; Tammemagi, Martin C; Evans, William K; Leighl, Natasha B; Regier, Dean A; Bolbocean, Corneliu; Shepherd, Frances A; Tsao, Ming-Sound; Manos, Daria; Liu, Geoffrey; Atkar-Khattra, Sukhinder; Cromwell, Ian; Johnston, Michael R; Mayo, John R; McWilliams, Annette; Couture, Christian; English, John C; Goffin, John; Hwang, David M; Puksa, Serge; Roberts, Heidi; Tremblay, Alain; MacEachern, Paul; Burrowes, Paul; Bhatia, Rick; Finley, Richard J; Goss, Glenwood D; Nicholas, Garth; Seely, Jean M; Sekhon, Harmanjatinder S; Yee, John; Amjadi, Kayvan; Cutz, Jean-Claude; Ionescu, Diana N; Yasufuku, Kazuhiro; Martel, Simon; Soghrati, Kamyar; Sin, Don D; Tan, Wan C; Urbanski, Stefan; Xu, Zhaolin; Peacock, Stuart J
2014-10-01
It is estimated that millions of North Americans would qualify for lung cancer screening and that billions of dollars of national health expenditures would be required to support population-based computed tomography lung cancer screening programs. The decision to implement such programs should be informed by data on resource utilization and costs. Resource utilization data were collected prospectively from 2059 participants in the Pan-Canadian Early Detection of Lung Cancer Study using low-dose computed tomography (LDCT). Participants who had 2% or greater lung cancer risk over 3 years using a risk prediction tool were recruited from seven major cities across Canada. A cost analysis was conducted from the Canadian public payer's perspective for resources that were used for the screening and treatment of lung cancer in the initial years of the study. The average per-person cost for screening individuals with LDCT was $453 (95% confidence interval [CI], $400-$505) for the initial 18-months of screening following a baseline scan. The screening costs were highly dependent on the detected lung nodule size, presence of cancer, screening intervention, and the screening center. The mean per-person cost of treating lung cancer with curative surgery was $33,344 (95% CI, $31,553-$34,935) over 2 years. This was lower than the cost of treating advanced-stage lung cancer with chemotherapy, radiotherapy, or supportive care alone, ($47,792; 95% CI, $43,254-$52,200; p = 0.061). In the Pan-Canadian study, the average cost to screen individuals with a high risk for developing lung cancer using LDCT and the average initial cost of curative intent treatment were lower than the average per-person cost of treating advanced stage lung cancer which infrequently results in a cure.
Barlow, Brian T; McLawhorn, Alexander S; Westrich, Geoffrey H
2017-05-03
Dislocation remains a clinically important problem following primary total hip arthroplasty, and it is a common reason for revision total hip arthroplasty. Dual mobility (DM) implants decrease the risk of dislocation but can be more expensive than conventional implants and have idiosyncratic failure mechanisms. The purpose of this study was to investigate the cost-effectiveness of DM implants compared with conventional bearings for primary total hip arthroplasty. Markov model analysis was conducted from the societal perspective with use of direct and indirect costs. Costs, expressed in 2013 U.S. dollars, were derived from the literature, the National Inpatient Sample, and the Centers for Medicare & Medicaid Services. Effectiveness was expressed in quality-adjusted life years (QALYs). The model was populated with health state utilities and state transition probabilities derived from previously published literature. The analysis was performed for a patient's lifetime, and costs and effectiveness were discounted at 3% annually. The principal outcome was the incremental cost-effectiveness ratio (ICER), with a willingness-to-pay threshold of $100,000/QALY. Sensitivity analyses were performed to explore relevant uncertainty. In the base case, DM total hip arthroplasty showed absolute dominance over conventional total hip arthroplasty, with lower accrued costs ($39,008 versus $40,031 U.S. dollars) and higher accrued utility (13.18 versus 13.13 QALYs) indicating cost-savings. DM total hip arthroplasty ceased being cost-saving when its implant costs exceeded those of conventional total hip arthroplasty by $1,023, and the cost-effectiveness threshold for DM implants was $5,287 greater than that for conventional implants. DM was not cost-effective when the annualized incremental probability of revision from any unforeseen failure mechanism or mechanisms exceeded 0.29%. The probability of intraprosthetic dislocation exerted the most influence on model results. This model
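The decision logic applied to the model's output can be made concrete: a strategy that accrues lower costs and more QALYs is dominant, and otherwise the ICER (incremental cost divided by incremental QALYs) is compared against the willingness-to-pay threshold. A minimal sketch using the base-case figures quoted in the abstract:

```python
def compare(cost_new, qaly_new, cost_old, qaly_old, wtp=100_000):
    """Classify a new strategy against a comparator: dominance checks
    first, then the ICER against the willingness-to-pay threshold."""
    d_cost = cost_new - cost_old
    d_qaly = qaly_new - qaly_old
    if d_cost <= 0 and d_qaly >= 0:
        return "dominant (cost-saving)"
    if d_cost >= 0 and d_qaly <= 0:
        return "dominated"
    icer = d_cost / d_qaly
    return "cost-effective" if icer <= wtp else "not cost-effective"

# Base-case figures from the abstract: DM accrues lower cost ($39,008 vs
# $40,031) and higher utility (13.18 vs 13.13 QALYs) than conventional THA.
print(compare(39_008, 13.18, 40_031, 13.13))  # prints "dominant (cost-saving)"
```

The sensitivity analyses in the abstract probe exactly when this classification flips, for example when the DM implant premium exceeds $1,023.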
Energy Technology Data Exchange (ETDEWEB)
Travassos, Paulo Cesar B.; Magalhaes, Luis Alexandre G., E-mail: pctravassos@ufrj.br [Universidade do Estado do Rio de Janeiro (IBRGA/UERJ), RJ (Brazil). Laboratorio de Ciencias Radiologicas; Augusto, Fernando M.; Sant' Yves, Thalis L.A.; Goncalves, Elicardo A.S. [Instituto Nacional de Cancer (INCA), Rio de Janeiro, RJ (Brazil); Botelho, Marina A. [Hospital Universitario Pedro Ernesto (UERJ), Rio de Janeiro, RJ (Brazil)
2012-08-15
This article presents the results obtained with a low-cost phantom used to analyze computed radiography (CR) equipment. The phantom was constructed to test several parameters related to image quality, as described in [1-9]. Readily available materials were used in its construction, at a total cost of approximately US$ 100.00. A bar pattern was included only to verify the efficacy of the grids in determining spatial resolution; it was not included in the budget because the relevant data were acquired from the grids. (author)
Matching Cost Filtering for Dense Stereo Correspondence
Directory of Open Access Journals (Sweden)
Yimin Lin
2013-01-01
Dense stereo correspondence enabling reconstruction of depth information in a scene is of great importance in the field of computer vision. Recently, some local solutions based on matching cost filtering with an edge-preserving filter have been proved to be capable of achieving more accuracy than global approaches. Unfortunately, the computational complexity of these algorithms is quadratically related to the window size used to aggregate the matching costs. The recent trend has been to pursue higher accuracy with greater efficiency in execution. Therefore, this paper proposes a new cost-aggregation module to compute the matching responses for all the image pixels at a set of sampling points generated by a hierarchical clustering algorithm. The complexity of this implementation is linear both in the number of image pixels and the number of clusters. Experimental results demonstrate that the proposed algorithm outperforms state-of-the-art local methods in terms of both accuracy and speed. Moreover, performance tests indicate that parameters such as the height of the hierarchical binary tree and the spatial and range standard deviations have a significant influence on time consumption and the accuracy of disparity maps.
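The window-size dependence that motivates this work is easiest to see against the classic constant-time alternative: aggregating each matching-cost slice with an integral image makes the per-pixel cost O(1), independent of window size (though without the edge-preserving behaviour of the filters discussed above). A small illustrative sketch, not the authors' cluster-based method:

```python
import numpy as np

def box_filter(img, r):
    """Mean over a (2r+1) x (2r+1) window at every pixel, via an integral
    image: cost per pixel is O(1), independent of the window radius r."""
    p = np.pad(np.asarray(img, dtype=float), r, mode='edge')
    s = np.zeros((p.shape[0] + 1, p.shape[1] + 1))
    s[1:, 1:] = p.cumsum(axis=0).cumsum(axis=1)    # integral image
    n = 2 * r + 1
    win = s[n:, n:] - s[:-n, n:] - s[n:, :-n] + s[:-n, :-n]
    return win / n ** 2
```

Aggregating a stereo cost volume then amounts to applying `box_filter` to each disparity slice before the winner-take-all argmin over disparities.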
Thangasamy, Andrew; Horan, Deborah
2016-01-01
Undocumented students, many of Hispanic origin, face among the strictest cost barriers to higher education in the United States. Lack of legal status excludes them from most state and all federal financial aid programs. Furthermore, most states require them to pay out-of-state tuition rates at publicly supported institutions. In a new direction,…
Cloud Computing Adoption and Usage in Community Colleges
Behrend, Tara S.; Wiebe, Eric N.; London, Jennifer E.; Johnson, Emily C.
2011-01-01
Cloud computing is gaining popularity in higher education settings, but the costs and benefits of this tool have gone largely unexplored. The purpose of this study was to examine the factors that lead to technology adoption in a higher education setting. Specifically, we examined a range of predictors and outcomes relating to the acceptance of a…
Boevé, Anja J; Meijer, Rob R; Albers, Casper J; Beetsma, Yta; Bosker, Roel J
2015-01-01
The introduction of computer-based testing in high-stakes examining in higher education is developing rather slowly, due to institutional barriers (the need for extra facilities, ensuring test security) and teacher and student acceptance. From the existing literature it is unclear whether computer-based exams yield results similar to paper-based exams, and whether student acceptance can change as a result of administering computer-based exams. In this study, we compared results from a computer-based and a paper-based exam in a sample of psychology students and found no differences in total scores across the two modes. Furthermore, we investigated student acceptance of, and change in acceptance of, computer-based examining. After taking the computer-based exam, fifty percent of the students preferred paper-and-pencil exams over computer-based exams and about a quarter preferred a computer-based exam. We conclude that computer-based exam total scores are similar to paper-based exam scores, but that for the acceptance of high-stakes computer-based exams it is important that students practice and become familiar with this new mode of test administration.
Velasco, Cesar; Pérez, Inaki; Podzamczer, Daniel; Llibre, Josep Maria; Domingo, Pere; González-García, Juan; Puig, Inma; Ayala, Pilar; Martín, Mayte; Trilla, Antoni; Lázaro, Pablo; Gatell, Josep Maria
2016-03-01
The financing of antiretroviral therapy (ART) is generally determined by the cost incurred in the previous year, the number of patients on treatment, and the evidence-based recommendations, but not by the clinical characteristics of the population. The aim was to establish a score relating the cost of ART to patient clinical complexity, in order to understand the costing differences between hospitals in the region that could be explained by the clinical complexity of their populations. Retrospective analysis of patients receiving ART in a tertiary hospital between 2009 and 2011. Factors potentially associated with a higher cost of ART were assessed by bivariate and multivariate analysis. Two predictive models of "high cost" were developed. The normalized estimated costs (adjusted for the complexity scores) were calculated and compared with the normalized real costs. In the index hospital, 631 (16.8%) of the 3758 patients receiving ART constituted the "high-cost" subgroup, defined as the top 25% of spending on ART. Baseline variables that were significant predictors of high cost in the Clinic-B model in the multivariate analysis were: route of transmission of HIV, AIDS criteria, Spanish nationality, year of initiation of ART, CD4+ lymphocyte count nadir, and number of hospital admissions. The Clinic-B score ranged from 0 to 13, and the mean value (5.97) was lower than the overall mean value of the four hospitals (6.16). The clinical complexity of the HIV patient influences the cost of ART. The Clinic-B and Clinic-BF scores predicted patients with a high cost of ART and could be used to compare and allocate costs corrected for patient clinical complexity. Copyright © 2015 Elsevier España, S.L.U. y Sociedad Española de Enfermedades Infecciosas y Microbiología Clínica. All rights reserved.
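An additive clinical-complexity score of this kind is straightforward to sketch. The predictors below follow the list in the abstract, but the point values and cutoff are invented for illustration; the published Clinic-B weights are not given there.

```python
# Invented point values for the predictors named in the abstract; the
# resulting 0-13 range matches the abstract's description of the score.
POINTS = {
    "transmission_idu": 2,       # HIV acquired via injection drug use
    "aids_criteria": 2,
    "non_spanish_national": 1,
    "art_started_early": 3,      # early calendar year of ART initiation
    "low_nadir_cd4": 2,          # nadir CD4+ count below 200 cells/uL
}

def clinic_b_like_score(patient, admissions):
    """Sum the points for the flags present, plus a capped admissions term."""
    score = sum(pts for flag, pts in POINTS.items() if patient.get(flag))
    return score + min(admissions, 3)

def is_high_cost(patient, admissions, cutoff=8):
    """Flag patients likely to fall in the top quartile of ART spending
    (the cutoff is illustrative, not a published threshold)."""
    return clinic_b_like_score(patient, admissions) >= cutoff
```

The study's use case is the last step: normalizing each hospital's ART spending by the score distribution of its population before comparing hospitals.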
Directory of Open Access Journals (Sweden)
Georges SARAFOPOULOS
2017-07-01
In this study we investigate the dynamics of a nonlinear Cournot-type duopoly game with differentiated goods, linear demand and a cost function that includes emission costs. The game is modeled with a system of two difference equations. The existence and stability of the equilibria of this system are studied. We show that the model yields more complex, chaotic and unpredictable trajectories as a consequence of a change in the horizontal product differentiation parameter, and that a higher (lower) degree of product differentiation (weaker or fiercer competition) destabilizes (stabilizes) the economy. The chaotic features are justified numerically by computing Lyapunov numbers and sensitive dependence on initial conditions. We also show that in this case there are stable trajectories, and that a higher (lower) degree of product differentiation does not tend to destabilize the economy.
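The numerical diagnostic mentioned here, estimating Lyapunov exponents to confirm chaos, can be sketched on any one-dimensional map. Since the duopoly map itself is not reproduced in the abstract, the logistic map serves as a stand-in for the technique:

```python
import math

def logistic(x, r):
    return r * x * (1 - x)

def lyapunov_exponent(r, x0=0.3, transient=500, n=5000):
    """Largest Lyapunov exponent of the logistic map, estimated as the
    orbit average of log|f'(x)|; a positive value indicates chaos."""
    x = x0
    for _ in range(transient):                    # let the orbit settle
        x = logistic(x, r)
    total = 0.0
    for _ in range(n):
        total += math.log(abs(r * (1 - 2 * x)))   # log|f'(x)|
        x = logistic(x, r)
    return total / n

# r = 3.2: stable 2-cycle, negative exponent (stable trajectories);
# r = 4.0: chaos, exponent near ln 2 (sensitive dependence).
```

For the two-dimensional duopoly map of the paper, the same idea applies with the Jacobian of the system in place of the scalar derivative.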
Gekas, Jean; Gagné, Geneviève; Bujold, Emmanuel; Douillard, Daniel; Forest, Jean-Claude; Reinharz, Daniel; Rousseau, François
2009-02-13
To assess and compare the cost effectiveness of three different strategies for prenatal screening for Down's syndrome (integrated test, sequential screening, and contingent screenings) and to determine the most useful cut-off values for risk. Computer simulations to study integrated, sequential, and contingent screening strategies with various cut-offs leading to 19 potential screening algorithms. The computer simulation was populated with data from the Serum Urine and Ultrasound Screening Study (SURUSS), real unit costs for healthcare interventions, and a population of 110 948 pregnancies from the province of Québec for the year 2001. Cost effectiveness ratios, incremental cost effectiveness ratios, and screening options' outcomes. The contingent screening strategy dominated all other screening options: it had the best cost effectiveness ratio ($C26,833 per case of Down's syndrome) with fewer procedure related euploid miscarriages and unnecessary terminations (respectively, 6 and 16 per 100,000 pregnancies). It also outperformed serum screening at the second trimester. In terms of the incremental cost effectiveness ratio, contingent screening was still dominant: compared with screening based on maternal age alone, the savings were $C30,963 per additional birth with Down's syndrome averted. Contingent screening was the only screening strategy that offered early reassurance to the majority of women (77.81%) in first trimester and minimised costs by limiting retesting during the second trimester (21.05%). For the contingent and sequential screening strategies, the choice of cut-off value for risk in the first trimester test significantly affected the cost effectiveness ratios (respectively, from $C26,833 to $C37,260 and from $C35,215 to $C45,314 per case of Down's syndrome), the number of procedure related euploid miscarriages (from 6 to 46 and from 6 to 45 per 100,000 pregnancies), and the number of unnecessary terminations (from 16 to 26 and from 16 to 25 per 100,000 pregnancies).
Directory of Open Access Journals (Sweden)
Nataliia A. Khmil
2016-01-01
In the present article, foreign and domestic experience of integrating cloud computing into the pedagogical process of higher educational establishments (H.E.E.) has been generalized. It has been stated that nowadays many educational services are hosted in the cloud, e.g. infrastructure as a service (IaaS), platform as a service (PaaS) and software as a service (SaaS). The peculiarities of implementing cloud technologies by H.E.E. in Ukraine and abroad have been singled out, and the products developed by the leading IT companies for using cloud computing in the higher education system, such as Microsoft for Education, Google Apps for Education and Amazon AWS Educate, have been reviewed. Examples of concrete types, methods and forms of learning and research work based on cloud services have been provided.
National Research Council Canada - National Science Library
Arbulu, Timothy D; Vosberg, Brian J
2007-01-01
The purpose of this MBA project was to conduct a feasibility study and a cost benefit analysis of using thin-client computer systems instead of traditional networks onboard United States Navy ships...
Chapman, Kathryn; Goldsbury, David; Watson, Wendy; Havill, Michelle; Wellard, Lyndal; Hughes, Clare; Bauman, Adrian; Allman-Farinelli, Margaret
2017-06-01
Fruit and vegetable (F&V) consumption is below recommendations, and cost may be a barrier to meeting recommendations. Limited evidence exists on individual perceptions about the cost, actual spending and consumption of F&V. This study investigated perceptions and beliefs about cost of F&V and whether this is a barrier to higher consumption. An online survey of Australian adults (n = 2474) measured F&V consumption; expenditure on F&V and food; and perceived barriers to consumption. Multivariable logistic regression examined associations between participants' responses about cost of F&V and demographic factors, and with actual consumption and expenditure on F&V. Cost was identified as a barrier for 29% of people not meeting recommended fruit servings and for 14% of people not meeting recommendations for vegetables. Cost was a more common barrier for those on lower incomes (fruit aOR 1.89; 95% CI 1.20-2.98 and vegetables aOR 2.94; 95% CI 1.97-4.39) and less common for older participants (fruit aOR 0.33; 95% CI 0.17-0.62 and vegetables aOR 0.31; 95% CI 0.18-0.52). There was no association between the perceived barriers and actual F&V spending. Twenty percent of participants said F&V were not affordable; 39% said cost made it difficult to buy F&V, and for 23% the cost of F&V meant they bought less than desired. A minority reported F&V were not affordable where they shopped and that cost was a barrier to higher consumption. However, it is apparent that young adults and those on low incomes eat less than they would like because of cost. Strategies that remove financial impediments to consumption are indicated for these population sub-groups. Copyright © 2017 Elsevier Ltd. All rights reserved.
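The adjusted odds ratios (aOR) and 95% confidence intervals above come from multivariable logistic regression. The mapping from a fitted coefficient to a reported aOR can be sketched as follows; the coefficient and standard error below are illustrative values chosen so the output lands near the reported low-income fruit aOR of 1.89 (95% CI 1.20-2.98), not the study's actual estimates.

```python
import math

def adjusted_or(beta, se, z=1.96):
    """Convert a logistic-regression coefficient and its standard error
    into an odds ratio with a 95% confidence interval."""
    point = math.exp(beta)
    lower = math.exp(beta - z * se)
    upper = math.exp(beta + z * se)
    return point, lower, upper

# Hypothetical coefficient for "low income" predicting a cost barrier:
beta, se = 0.637, 0.232   # illustrative values only
print(adjusted_or(beta, se))  # ≈ (1.89, 1.20, 2.98)
```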
Computational sensing of herpes simplex virus using a cost-effective on-chip microscope
Ray, Aniruddha
2017-07-03
Caused by the herpes simplex virus (HSV), herpes is a viral infection that is one of the most widespread diseases worldwide. Here we present a computational sensing technique for specific detection of HSV using both viral immuno-specificity and the physical size range of the viruses. This label-free approach involves a compact and cost-effective holographic on-chip microscope and a surface-functionalized glass substrate prepared to specifically capture the target viruses. To enhance the optical signatures of individual viruses and increase their signal-to-noise ratio, self-assembled polyethylene glycol based nanolenses are rapidly formed around each virus particle captured on the substrate using a portable interface. Holographic shadows of specifically captured viruses that are surrounded by these self-assembled nanolenses are then reconstructed, and the phase image is used for automated quantification of the size of each particle within our large field-of-view, ~30 mm2. The combination of viral immuno-specificity due to surface functionalization and the physical size measurements enabled by holographic imaging is used to sensitively detect and enumerate HSV particles using our compact and cost-effective platform. This computational sensing technique can find numerous uses in global health related applications in resource-limited environments.
Semushin, I. V.; Tsyganova, J. V.; Ugarov, V. V.; Afanasova, A. I.
2018-05-01
Russian higher education institutions' tradition of teaching large-enrollment classes impairs students' striving for individual prominence, one-upmanship, and hopes for originality. Intending to convert these drawbacks into benefits, a Project-Centred Education Model (PCEM) has been introduced to deliver Computational Mathematics and Information Science courses. The model combines a Frontal Competitive Approach and a Project-Driven Learning (PDL) framework. The PDL framework has been developed by stating and solving three design problems: (i) enhance the diversity of project assignments on specific computational methods and algorithmic approaches, (ii) balance similarity and dissimilarity of the project assignments, and (iii) develop a software assessment tool suitable for evaluating the technological maturity of students' project deliverables, thus reducing the instructor's workload and the risk of oversight. The positive experience accumulated over 15 years shows that implementing the PCEM keeps students motivated to strive for success in rising to higher levels of their computational and software engineering skills.
A Comprehensive and Cost-Effective Computer Infrastructure for K-12 Schools
Warren, G. P.; Seaton, J. M.
1996-01-01
Since 1993, NASA Langley Research Center has been developing and implementing a low-cost Internet connection model, including system architecture, training, and support, to provide Internet access for an entire network of computers. This infrastructure allows local area networks which exceed 50 machines per school to independently access the complete functionality of the Internet by connecting to a central site, using state-of-the-art commercial modem technology, through a single standard telephone line. By locating high-cost resources at this central site and sharing these resources and their costs among the school districts throughout a region, a practical, efficient, and affordable infrastructure for providing scalable Internet connectivity has been developed. As the demand for faster Internet access grows, the model has a simple expansion path that eliminates the need to replace major system components and re-train personnel. Observations of typical Internet usage within an environment, particularly school classrooms, have shown that after an initial period of 'surfing,' the Internet traffic becomes repetitive. By automatically storing requested Internet information on a high-capacity networked disk drive at the local site (network based disk caching), then updating this information only when it changes, well over 80 percent of the Internet traffic that leaves a location can be eliminated by retrieving the information from the local disk cache.
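The claimed elimination of over 80 percent of outbound traffic follows directly from request repetitiveness: once a page is on the local disk cache, only its first fetch crosses the shared telephone line. A toy simulation with made-up page names:

```python
def cache_savings(requests):
    """Fraction of upstream traffic avoided by caching each URL after
    its first fetch (unbounded cache, content assumed unchanged)."""
    cache, upstream = set(), 0
    for url in requests:
        if url not in cache:
            upstream += 1      # miss: fetch over the shared line
            cache.add(url)     # then store on the local disk cache
    return 1 - upstream / len(requests)

# A repetitive classroom session: 20 requests over only 4 distinct pages.
session = ["home", "lesson1", "home", "quiz", "lesson1", "home",
           "quiz", "lesson1", "map", "home", "quiz", "home",
           "lesson1", "map", "home", "quiz", "lesson1", "home",
           "map", "quiz"]
print(cache_savings(session))  # 0.8: 16 of 20 requests served locally
```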
A practical technique for benefit-cost analysis of computer-aided design and drafting systems
International Nuclear Information System (INIS)
Shah, R.R.; Yan, G.
1979-03-01
Analysis of benefits and costs associated with the operation of Computer-Aided Design and Drafting Systems (CADDS) are needed to derive economic justification for acquiring new systems, as well as to evaluate the performance of existing installations. In practice, however, such analyses are difficult to perform since most technical and economic advantages of CADDS are 'irreducibles', i.e. cannot be readily translated into monetary terms. In this paper, a practical technique for economic analysis of CADDS in a drawing office environment is presented. A 'worst case' approach is taken since increase in productivity of existing manpower is the only benefit considered, while all foreseen costs are taken into account. Methods of estimating benefits and costs are described. The procedure for performing the analysis is illustrated by a case study based on the drawing office activities at Atomic Energy of Canada Limited. (auth)
An Alternative Method for Computing Unit Costs and Productivity Ratios. AIR 1984 Annual Forum Paper.
Winstead, Wayland H.; And Others
An alternative measure for evaluating the performance of academic departments was studied. A comparison was made with the traditional manner for computing unit costs and productivity ratios: prorating the salary and effort of each faculty member to each course level based on the personal mix of courses taught. The alternative method used averaging…
Boevé, Anja J.; Meijer, Rob R.; Albers, Casper J.; Beetsma, Yta; Bosker, Roel J.
2015-01-01
The introduction of computer-based testing in high-stakes examining in higher education is developing rather slowly due to institutional barriers (the need for extra facilities, ensuring test security) and teacher and student acceptance. From the existing literature it is unclear whether computer-based exams will yield similar results to paper-based exams and whether student acceptance can change as a result of administering computer-based exams. In this study, we compared results from a computer-based and a paper-based exam in a sample of psychology students and found no differences in total scores across the two modes. Furthermore, we investigated student acceptance and change in acceptance of computer-based examining. After taking the computer-based exam, fifty percent of the students preferred paper-and-pencil exams over computer-based exams and about a quarter preferred a computer-based exam. We conclude that computer-based exam total scores are similar to paper-based exam scores, but that for the acceptance of high-stakes computer-based exams it is important that students practice and get familiar with this new mode of test administration. PMID:26641632
Directory of Open Access Journals (Sweden)
Kevin Ten Haaf
2017-02-01
The National Lung Screening Trial (NLST) results indicate that computed tomography (CT) lung cancer screening for current and former smokers with three annual screens can be cost-effective in a trial setting. However, the cost-effectiveness in a population-based setting with >3 screening rounds is uncertain. Therefore, the objective of this study was to estimate the cost-effectiveness of lung cancer screening in a population-based setting in Ontario, Canada, and evaluate the effects of screening eligibility criteria. This study used microsimulation modeling informed by various data sources, including the Ontario Health Insurance Plan (OHIP), Ontario Cancer Registry, smoking behavior surveys, and the NLST. Persons born between 1940 and 1969 were examined from a third-party health care payer perspective across a lifetime horizon. Starting in 2015, 576 CT screening scenarios were examined, varying by age to start and end screening, smoking eligibility criteria, and screening interval. Among the examined outcome measures were lung cancer deaths averted, life-years gained, percentage ever screened, costs (in 2015 Canadian dollars), and overdiagnosis. The results of the base-case analysis indicated that annual screening was more cost-effective than biennial screening. Scenarios with eligibility criteria that required as few as 20 pack-years were dominated by scenarios that required higher numbers of accumulated pack-years. In general, scenarios that applied stringent smoking eligibility criteria (i.e., requiring higher levels of accumulated smoking exposure) were more cost-effective than scenarios with less stringent smoking eligibility criteria, with modest differences in life-years gained. Annual screening between ages 55-75 for persons who smoked ≥40 pack-years and who currently smoke or quit ≤10 years ago yielded an incremental cost-effectiveness ratio of $41,136 Canadian dollars ($33,825 in May 1, 2015, United States dollars) per life-year gained
Low-cost, high-performance and efficiency computational photometer design
Siewert, Sam B.; Shihadeh, Jeries; Myers, Randall; Khandhar, Jay; Ivanov, Vitaly
2014-05-01
Researchers at the University of Alaska Anchorage and University of Colorado Boulder have built a low-cost, high-performance and high-efficiency drop-in-place Computational Photometer (CP) to test in field applications ranging from port security and safety monitoring to environmental compliance monitoring and surveying. The CP integrates off-the-shelf visible spectrum cameras with near to long wavelength infrared detectors and high resolution digital snapshots in a single device. The proof of concept combines three or more detectors into a single multichannel imaging system that can time-correlate read-out, capture, and image-process all of the channels concurrently with high performance and energy efficiency. The dual-channel continuous read-out is combined with a third high definition digital snapshot capability and has been designed using an FPGA (Field Programmable Gate Array) to capture, decimate, down-convert, re-encode, and transform images from two standard definition CCD (Charge Coupled Device) cameras at 30 Hz. The continuous stereo vision can be time-correlated to megapixel high definition snapshots. This proof of concept has been fabricated as a four-layer PCB (Printed Circuit Board) suitable for use in education and research for low-cost, high-efficiency field monitoring applications that need multispectral and three-dimensional imaging capabilities. Initial testing is in progress and includes field testing in ports, potential test flights in unmanned aerial systems, and future planned missions to image harsh environments in the arctic including volcanic plumes, ice formation, and arctic marine life.
Zaidel, Mark; Luo, XiaoHui
2010-01-01
This study investigates the efficiency of multimedia instruction at the college level by comparing the effectiveness of multimedia elements used in the computer supported learning with the cost of their preparation. Among the various technologies that advance learning, instructors and students generally identify interactive multimedia elements as…
Feeney, James M; Montgomery, Stephanie C; Wolf, Laura; Jayaraman, Vijay; Twohig, Michael
2016-09-01
Among transferred trauma patients, challenges with the transfer of radiographic studies include problems loading or viewing the studies at the receiving hospitals, and problems manipulating, reconstructing, or evaluating the transferred images. Cloud-based image transfer systems may address some of these problems. We reviewed the charts of patients transferred during one year surrounding the adoption of a cloud computing data transfer system. We compared the rates of repeat imaging before (precloud) and after (postcloud) the adoption of the cloud-based data transfer system. During the precloud period, 28 out of 100 patients required 90 repeat studies. With the cloud computing transfer system in place, three out of 134 patients required seven repeat films. There was a statistically significant decrease in the proportion of patients requiring repeat films (28% to 2.2%, P < .0001). Based on an annualized volume of 200 trauma patient transfers, the cost savings estimated using three methods of cost analysis is between $30,272 and $192,453.
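The headline comparison (28/100 vs. 3/134 patients, P < .0001) can be reproduced with a standard two-proportion test; the abstract does not name the test used, so a pooled z-test is assumed here as one reasonable choice.

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided two-proportion z-test using the pooled standard error."""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal tail
    return z, p_value

# Repeat imaging: 28 of 100 patients pre-cloud vs. 3 of 134 post-cloud.
z, p = two_proportion_z(28, 100, 3, 134)
print(f"z = {z:.2f}, p = {p:.1e}")  # consistent with the reported P < .0001
```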
Takakuwa, Kevin M; Halpern, Ethan J; Shofer, Frances S
2011-02-01
The study aimed to examine time and imaging costs of 2 different imaging strategies for low-risk emergency department (ED) observation patients with acute chest pain or symptoms suggestive of acute coronary syndrome. We compared a "triple rule-out" (TRO) 64-section multidetector computed tomography protocol with nuclear stress testing. This was a prospective observational cohort study of consecutive ED patients who were enrolled in our chest pain observation protocol during a 16-month period. Our standard observation protocol included a minimum of 2 sets of cardiac enzymes at least 6 hours apart followed by a nuclear stress test. Once a week, observation patients were offered a TRO (to evaluate for coronary artery disease, thoracic dissection, and pulmonary embolus) multidetector computed tomography with the option of further stress testing for those patients found to have evidence of coronary artery disease. We analyzed 832 consecutive observation patients including 214 patients who underwent the TRO protocol. Mean total length of stay was 16.1 hours for TRO patients, 16.3 hours for TRO plus other imaging test, 22.6 hours for nuclear stress testing, 23.3 hours for nuclear stress testing plus other imaging tests, and 23.7 hours for nuclear stress testing plus TRO (P < .0001 for TRO and TRO + other test compared to stress test ± other test). Mean imaging times were 3.6, 4.4, 5.9, 7.5, and 6.6 hours, respectively (P < .05 for TRO and TRO + other test compared to stress test ± other test). Mean imaging costs were $1307 for TRO patients vs $945 for nuclear stress testing. Triple rule-out reduced total length of stay and imaging time but incurred higher imaging costs. A per-hospital analysis would be needed to determine if patient time savings justify the higher imaging costs. Copyright © 2011 Elsevier Inc. All rights reserved.
Dorenkamp, Marc; Bonaventura, Klaus; Sohns, Christian; Becker, Christoph R; Leber, Alexander W
2012-03-01
The study aims to determine the direct costs and comparative cost-effectiveness of latest-generation dual-source computed tomography (DSCT) and invasive coronary angiography for diagnosing coronary artery disease (CAD) in patients suspected of having this disease. The study was based on a previously elaborated cohort with an intermediate pretest likelihood for CAD and on complementary clinical data. Cost calculations were based on a detailed analysis of direct costs, and generally accepted accounting principles were applied. Based on Bayes' theorem, a mathematical model was used to compare the cost-effectiveness of both diagnostic approaches. Total costs included direct costs, induced costs and costs of complications. Effectiveness was defined as the ability of a diagnostic test to accurately identify a patient with CAD. Direct costs amounted to €98.60 for DSCT and to €317.75 for invasive coronary angiography. Analysis of model calculations indicated that cost-effectiveness grew hyperbolically with increasing prevalence of CAD. Given the prevalence of CAD in the study cohort (24%), DSCT was found to be more cost-effective than invasive coronary angiography (€970 vs €1354 for one patient correctly diagnosed as having CAD). At a disease prevalence of 49%, DSCT and invasive angiography were equally effective with costs of €633. Above a threshold value of disease prevalence of 55%, proceeding directly to invasive coronary angiography was more cost-effective than DSCT. With proper patient selection and consideration of disease prevalence, DSCT coronary angiography is cost-effective for diagnosing CAD in patients with an intermediate pretest likelihood for it. However, the range of eligible patients may be smaller than previously reported.
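The Bayes-based dependence on prevalence can be sketched in a few lines: expected cost per patient divided by the probability of a correct CAD diagnosis gives a prevalence-dependent cost-effectiveness, and a break-even prevalence falls out numerically. The per-test costs below are the abstract's direct costs, but the sensitivity and specificity are assumed for illustration; the paper's thresholds of 49% and 55% come from its full model, which also prices induced costs and complications.

```python
def cost_per_cad_diagnosis_dsct(prev, c_ct=98.60, c_angio=317.75,
                                sens=0.96, spec=0.90):
    """Cost per correctly diagnosed CAD patient for a DSCT-first strategy
    in which every positive DSCT is confirmed by invasive angiography.
    Sensitivity and specificity are illustrative assumptions."""
    p_positive = sens * prev + (1 - spec) * (1 - prev)
    expected_cost = c_ct + p_positive * c_angio
    return expected_cost / (sens * prev)

def cost_per_cad_diagnosis_invasive(prev, c_angio=317.75):
    """Direct-to-angiography comparator, assumed perfectly accurate."""
    return c_angio / prev

# DSCT-first wins at the cohort's 24% pretest likelihood...
assert cost_per_cad_diagnosis_dsct(0.24) < cost_per_cad_diagnosis_invasive(0.24)

# ...but a break-even prevalence exists; locate it by bisection.
p_lo, p_hi = 0.3, 0.9
for _ in range(60):
    mid = (p_lo + p_hi) / 2
    if cost_per_cad_diagnosis_dsct(mid) > cost_per_cad_diagnosis_invasive(mid):
        p_hi = mid
    else:
        p_lo = mid
print(f"break-even prevalence ≈ {p_lo:.2f}")  # 0.64 under these assumed accuracies
```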
Halligan, Brian D; Geiger, Joey F; Vallejos, Andrew K; Greene, Andrew S; Twigger, Simon N
2009-06-01
One of the major difficulties for many laboratories setting up proteomics programs has been obtaining and maintaining the computational infrastructure required for the analysis of the large flow of proteomics data. We describe a system that combines distributed cloud computing and open source software to allow laboratories to set up scalable virtual proteomics analysis clusters without the investment in computational hardware or software licensing fees. Additionally, the pricing structure of distributed computing providers, such as Amazon Web Services, allows laboratories or even individuals to have large-scale computational resources at their disposal at a very low cost per run. We provide detailed step-by-step instructions on how to implement the virtual proteomics analysis clusters as well as a list of current available preconfigured Amazon machine images containing the OMSSA and X!Tandem search algorithms and sequence databases on the Medical College of Wisconsin Proteomics Center Web site ( http://proteomics.mcw.edu/vipdac ).
Olmstead, Todd A; Ostrow, Cary D; Carroll, Kathleen M
2010-08-01
To determine the cost-effectiveness, from clinic and patient perspectives, of a computer-based version of cognitive-behavioral therapy (CBT4CBT) as an addition to regular clinical practice for substance dependence. PARTICIPANTS, DESIGN AND MEASUREMENTS: This cost-effectiveness study is based on a randomized clinical trial in which 77 individuals seeking treatment for substance dependence at an outpatient community setting were randomly assigned to treatment as usual (TAU) or TAU plus biweekly access to computer-based training in CBT (TAU plus CBT4CBT). The primary patient outcome measure was the total number of drug-free specimens provided during treatment. Incremental cost-effectiveness ratios (ICERs) and cost-effectiveness acceptability curves (CEACs) were used to determine the cost-effectiveness of TAU plus CBT4CBT relative to TAU alone. Results are presented from both the clinic and patient perspectives and are shown to be robust to (i) sensitivity analyses and (ii) a secondary objective patient outcome measure. The per patient cost of adding CBT4CBT to standard care was $39 ($27) from the clinic (patient) perspective. From the clinic (patient) perspective, TAU plus CBT4CBT is likely to be cost-effective when the threshold value to decision makers of an additional drug-free specimen is greater than approximately $21 ($15), and TAU alone is likely to be cost-effective when the threshold value is less than approximately $21 ($15). The ICERs for TAU plus CBT4CBT also compare favorably to ICERs reported elsewhere for other empirically validated therapies, including contingency management. TAU plus CBT4CBT appears to be a good value from both the clinic and patient perspectives. Copyright (c) 2010 Elsevier Ireland Ltd. All rights reserved.
Directory of Open Access Journals (Sweden)
Bergman Gert JD
2010-09-01
Abstract Background Shoulder complaints are common in primary care and have an unfavourable long-term prognosis. Our objective was to evaluate the clinical effectiveness of manipulative therapy of the cervicothoracic spine and the adjacent ribs in addition to usual medical care (UMC) by the general practitioner in the treatment of shoulder complaints. Methods This economic evaluation was conducted alongside a randomized trial in primary care. Included were 150 patients with shoulder complaints and a dysfunction of the cervicothoracic spine and adjacent ribs. Patients were treated with UMC (NSAIDs, corticosteroid injection or referral to physical therapy) and were allocated at random (yes/no) to additional manipulative therapy (manipulation and mobilization). Patient-perceived recovery, severity of the main complaint, shoulder pain, disability and general health were outcome measures. Data about direct and indirect costs were collected by means of a cost diary. Results Manipulative therapy as add-on to UMC accelerated recovery on all outcome measures included. At 26 weeks after randomization, both groups reported similar recovery rates (41% vs. 38%), but the difference between groups in improvement of severity of the main complaint, shoulder pain and disability persisted. Compared to the UMC group, the total costs were higher in the manipulative group (€1167 vs. €555). This is explained mainly by the costs of the manipulative therapy itself and the higher costs due to sick leave from work. The cost-effectiveness ratio showed that additional manipulative treatment is more costly but also more effective than UMC alone. The cost-effectiveness acceptability curve shows that a 50% probability of recovery with additional manipulative therapy (AMT) within 6 months after initiation of treatment is achieved at €2876. Conclusion Manipulative therapy in addition to UMC accelerates recovery and is more effective than UMC alone in the long term, but is associated with higher costs. International Standard
Anzai, Yoshimi; Heilbrun, Marta E; Haas, Derek; Boi, Luca; Moshre, Kirk; Minoshima, Satoshi; Kaplan, Robert; Lee, Vivian S
2017-02-01
The lack of understanding of the real costs (not charge) of delivering healthcare services poses tremendous challenges in the containment of healthcare costs. In this study, we applied an established cost accounting method, the time-driven activity-based costing (TDABC), to assess the costs of performing an abdomen and pelvis computed tomography (AP CT) in an academic radiology department and identified opportunities for improved efficiency in the delivery of this service. The study was exempt from an institutional review board approval. TDABC utilizes process mapping tools from industrial engineering and activity-based costing. The process map outlines every step of discrete activity and duration of use of clinical resources, personnel, and equipment. By multiplying the cost per unit of capacity by the required task time for each step, and summing each component cost, the overall costs of AP CT is determined for patients in three settings, inpatient (IP), outpatient (OP), and emergency departments (ED). The component costs to deliver an AP CT study were as follows: radiologist interpretation: 40.1%; other personnel (scheduler, technologist, nurse, pharmacist, and transporter): 39.6%; materials: 13.9%; and space and equipment: 6.4%. The cost of performing CT was 13% higher for ED patients and 31% higher for inpatients (IP), as compared to that for OP. The difference in cost was mostly due to non-radiologist personnel costs. Approximately 80% of the direct costs of AP CT to the academic medical center are related to labor. Potential opportunities to reduce the costs include increasing the efficiency of utilization of CT, substituting lower cost resources when appropriate, and streamlining the ordering system to clarify medical necessity and clinical indications. Copyright © 2017 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
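The TDABC mechanics reduce to multiplying each resource's capacity cost rate by the minutes it spends on each mapped step and summing. The rates and times below are invented for illustration and are not the study's figures.

```python
# Each step of a hypothetical AP CT process map:
# (resource, cost per minute in $, minutes used).
# TDABC derives each per-minute rate as resource cost / practical
# capacity in minutes; here the rates are simply assumed.
process_map = [
    ("scheduler",    0.50,  6),
    ("technologist", 0.90, 20),
    ("nurse",        1.10,  5),
    ("ct_scanner",   2.40, 15),
    ("radiologist",  4.00, 12),
]

def tdabc_cost(steps):
    """Sum of capacity-cost-rate x task-time over every mapped step."""
    return sum(rate * minutes for _, rate, minutes in steps)

total = tdabc_cost(process_map)
print(f"${total:.2f}")  # → $110.50
```

The same map, re-timed per setting, is what lets the study attribute the higher ED and inpatient costs to non-radiologist personnel time rather than to the scanner itself.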
Pyshkin, P V; Luo, Da-Wei; Jing, Jun; You, J Q; Wu, Lian-Ao
2016-11-25
Holonomic quantum computation (HQC) may not show its full potential in quantum speedup due to the prerequisite of a long coherent runtime imposed by the adiabatic condition. Here we show that the conventional HQC can be dramatically accelerated by using external control fields, of which the effectiveness is exclusively determined by the integral of the control fields in the time domain. This control scheme can be realized with net zero energy cost and it is fault-tolerant against fluctuation and noise, significantly relaxing the experimental constraints. We demonstrate how to realize the scheme via decoherence-free subspaces. In this way we unify quantum robustness merits of this fault-tolerant control scheme, the conventional HQC and decoherence-free subspace, and propose an expedited holonomic quantum computation protocol.
Operating Dedicated Data Centers - Is It Cost-Effective?
Ernst, M.; Hogue, R.; Hollowell, C.; Strecker-Kellog, W.; Wong, A.; Zaytsev, A.
2014-06-01
The advent of cloud computing centres such as Amazon's EC2 and Google's Computing Engine has elicited comparisons with dedicated computing clusters. Discussions on appropriate usage of cloud resources (both academic and commercial) and costs have ensued. This presentation discusses a detailed analysis of the costs of operating and maintaining the RACF (RHIC and ATLAS Computing Facility) compute cluster at Brookhaven National Lab and compares them with the cost of cloud computing resources under various usage scenarios. An extrapolation of likely future cost effectiveness of dedicated computing resources is also presented.
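The core of such a comparison is the amortized cost of a usefully consumed core-hour on owned hardware versus the cloud's pay-per-use price, which makes utilization the pivotal variable. A minimal sketch with invented prices (not RACF's actual figures):

```python
def dedicated_cost_per_core_hour(capex, annual_opex, cores, years, utilization):
    """Amortized cost of a usefully consumed core-hour on owned hardware."""
    total_cost = capex + annual_opex * years
    total_useful_hours = cores * 24 * 365 * years * utilization
    return total_cost / total_useful_hours

# Hypothetical cluster: $2.0M hardware, $600k/yr power+staff, 10,000 cores,
# 4-year life. Hypothetical cloud on-demand price: $0.05 per core-hour.
cloud_price = 0.05
for util in (0.3, 0.6, 0.9):
    print(util, round(dedicated_cost_per_core_hour(
        2_000_000, 600_000, 10_000, 4, util), 4))
```

At low utilization the dedicated cluster's cost per useful core-hour approaches the cloud price, while a well-utilized cluster undercuts it, which is the crux of the dedicated-versus-cloud debate the presentation analyzes.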
Accomplish the Application Area in Cloud Computing
Bansal, Nidhi; Awasthi, Amit
2012-01-01
In surveying the application areas of cloud computing, we find that the breadth of areas it covers is its main asset. At a top level, it is an approach to IT where many users, some even from different companies, get access to shared IT resources such as servers, routers and various file extensions, instead of each having their own dedicated servers. This offers many advantages like lower costs and higher efficiency. Unfortunately there have been some high profile incidents whe...
Enabling Earth Science Through Cloud Computing
Hardman, Sean; Riofrio, Andres; Shams, Khawaja; Freeborn, Dana; Springer, Paul; Chafin, Brian
2012-01-01
Cloud Computing holds tremendous potential for missions across the National Aeronautics and Space Administration. Several flight missions are already benefiting from an investment in cloud computing for mission critical pipelines and services through faster processing time, higher availability, and drastically lower costs available on cloud systems. However, these processes do not currently extend to general scientific algorithms relevant to earth science missions. The members of the Airborne Cloud Computing Environment task at the Jet Propulsion Laboratory have worked closely with the Carbon in Arctic Reservoirs Vulnerability Experiment (CARVE) mission to integrate cloud computing into their science data processing pipeline. This paper details the efforts involved in deploying a science data system for the CARVE mission, evaluating and integrating cloud computing solutions with the system and porting their science algorithms for execution in a cloud environment.
Do Clouds Compute? A Framework for Estimating the Value of Cloud Computing
Klems, Markus; Nimis, Jens; Tai, Stefan
On-demand provisioning of scalable and reliable compute services, along with a cost model that charges consumers based on actual service usage, has been an objective in distributed computing research and industry for a while. Cloud Computing promises to deliver on this objective: consumers are able to rent infrastructure in the Cloud as needed, deploy applications and store data, and access them via Web protocols on a pay-per-use basis. The acceptance of Cloud Computing, however, depends on the ability for Cloud Computing providers and consumers to implement a model for business value co-creation. Therefore, a systematic approach to measure costs and benefits of Cloud Computing is needed. In this paper, we discuss the need for valuation of Cloud Computing, identify key components, and structure these components in a framework. The framework assists decision makers in estimating Cloud Computing costs and to compare these costs to conventional IT solutions. We demonstrate by means of representative use cases how our framework can be applied to real world scenarios.
Pondor, Ibnteesam; Gan, Wan Ying; Appannah, Geeta
2017-09-16
Food price is a determining factor of food choices; however its relationship with diet quality is unclear in Malaysia. This study aimed to examine socio-economic characteristics and daily dietary cost (DDC) in relation to diet quality in the state of Selangor, Malaysia. Dietary intake was assessed using a Food Frequency Questionnaire (FFQ) and diet quality was estimated using a Malaysian Healthy Eating Index (M-HEI). DDC in Malaysian Ringgit (RM) was calculated from dietary intake and national food prices. Linear regression models were fitted to determine associations between DDC and M-HEI scores and predictors of diet quality. The mean M-HEI score of respondents was 61.31 ± 10.88 and energy adjusted DDC was RM10.71/2000 kcal (USD 2.49). The highest quintile of adjusted DDC had higher M-HEI scores for all respondents (Q1: 57.14 ± 10.07 versus Q5: 63.26 ± 11.54, p = 0.001). There were also positive associations between DDC and M-HEI scores for fruits ( p diet quality included carbohydrate (β = 0.290; p healthy eating among Malaysian adults.
Low rank approach to computing first and higher order derivatives using automatic differentiation
International Nuclear Information System (INIS)
Reed, J. A.; Abdel-Khalik, H. S.; Utke, J.
2012-01-01
This manuscript outlines a new approach for increasing the efficiency of applying automatic differentiation (AD) to large scale computational models. By using the principles of the Efficient Subspace Method (ESM), low rank approximations of the derivatives for first and higher orders can be calculated using minimized computational resources. The output obtained from nuclear reactor calculations typically has a much smaller numerical rank compared to the number of inputs and outputs. This rank deficiency can be exploited to reduce the number of derivatives that need to be calculated using AD. The effective rank can be determined according to ESM by computing derivatives with AD at random inputs. Reduced or pseudo variables are then defined and new derivatives are calculated with respect to the pseudo variables. Two different AD packages are used: OpenAD and Rapsodia. OpenAD is used to determine the effective rank and the subspace that contains the derivatives. Rapsodia is then used to calculate derivatives with respect to the pseudo variables for the desired order. The overall approach is applied to two simple problems and to MATWS, a safety code for sodium cooled reactors. (authors)
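The rank-detection step of this approach can be illustrated with NumPy: evaluate derivatives of a synthetic rank-deficient model at a random input (a finite-difference Jacobian stands in here for the AD evaluation), then read the effective rank off the singular-value spectrum. The model, seed, and tolerance are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic model with 20 inputs and 15 outputs whose Jacobian has rank <= 3,
# mimicking the rank deficiency of reactor-code input/output maps.
A = rng.standard_normal((15, 3))
B = rng.standard_normal((3, 20))
f = lambda x: A @ np.tanh(0.1 * (B @ x))

def jacobian_fd(f, x, eps=1e-6):
    """Forward-difference Jacobian, standing in for an AD sweep."""
    fx = f(x)
    J = np.empty((fx.size, x.size))
    for i in range(x.size):
        xp = x.copy()
        xp[i] += eps
        J[:, i] = (f(xp) - fx) / eps
    return J

J = jacobian_fd(f, rng.standard_normal(20))
s = np.linalg.svd(J, compute_uv=False)
effective_rank = int((s > 1e-4 * s[0]).sum())
print(effective_rank)  # 3: the derivatives live in a 3-dimensional subspace
```

Once the effective rank r is known, ESM differentiates with respect to r pseudo variables instead of all 20 inputs, cutting the number of required derivative evaluations accordingly.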
Endoscopic third ventriculostomy has no higher costs than ventriculoperitoneal shunt
Directory of Open Access Journals (Sweden)
Benicio Oton de Lima
2014-07-01
Objective: To evaluate the cost of endoscopic third ventriculostomy (ETV) compared to ventriculoperitoneal shunt (VPS) in the treatment of hydrocephalus in children. Method: We studied 103 children with hydrocephalus, 52 of whom were treated with ETV and 51 with VPS, in a prospective cohort. Treatment costs were compared within the first year after surgery, including subsequent surgery or hospitalization. Results: Twenty (38.4%) of the 52 children treated with VPS needed another procedure due to shunt failure, compared to 11 (21.5%) of 51 children in the ETV group. The average cost per patient was USD 2,177.66 ± 517.73 in the ETV group, compared to USD 2,890.68 ± 2,835.02 in the VPS group. Conclusions: In this series there was no significant difference in costs between the ETV and VPS groups.
Solving computationally expensive engineering problems
Leifsson, Leifur; Yang, Xin-She
2014-01-01
Computational complexity is a serious bottleneck for the design process in virtually any engineering area. While migration from prototyping and experimental-based design validation to verification using computer simulation models is inevitable and has a number of advantages, high computational costs of accurate, high-fidelity simulations can be a major issue that slows down the development of computer-aided design methodologies, particularly those exploiting automated design improvement procedures, e.g., numerical optimization. The continuous increase of available computational resources does not always translate into shortening of the design cycle because of the growing demand for higher accuracy and necessity to simulate larger and more complex systems. Accurate simulation of a single design of a given system may be as long as several hours, days or even weeks, which often makes design automation using conventional methods impractical or even prohibitive. Additional problems include numerical noise often pr...
POPCYCLE: a computer code for calculating nuclear and fossil plant levelized life-cycle power costs
International Nuclear Information System (INIS)
Hardie, R.W.
1982-02-01
POPCYCLE, a computer code designed to calculate levelized life-cycle power costs for nuclear and fossil electrical generating plants, is described. Included are (1) derivations of the equations and a discussion of the methodology used by POPCYCLE, (2) a description of the input required by the code, (3) a listing of the input for a sample case, and (4) the output for a sample case.
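The core levelized-cost calculation can be illustrated in a few lines. This is a deliberately simplified textbook form (discounted total cost divided by discounted energy), not POPCYCLE's actual equations, and every input number below is hypothetical.

```python
def levelized_cost(capital, annual_om, annual_fuel, annual_mwh, rate, years):
    """Simplified levelized life-cycle power cost in $/MWh:
    present value of all costs divided by present value of energy."""
    disc_cost = capital + sum(
        (annual_om + annual_fuel) / (1 + rate) ** t
        for t in range(1, years + 1))
    disc_energy = sum(
        annual_mwh / (1 + rate) ** t for t in range(1, years + 1))
    return disc_cost / disc_energy

# Hypothetical plant: $2B capital, $60M O&M + $30M fuel per year,
# 7 TWh/year output, 5% discount rate, 30-year life
lcoe = levelized_cost(2.0e9, 6.0e7, 3.0e7, 7.0e6, 0.05, 30)
print(round(lcoe, 2))
```

The real code additionally handles fuel-cycle timing, escalation, and taxes, which is why its equation derivations occupy a chapter of the report.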
Cloud Computing in Higher Education Sector for Sustainable Development
Duan, Yuchao
2016-01-01
Cloud computing is considered a new frontier in the field of computing, as this technology comprises three major entities namely: software, hardware and network. The collective nature of all these entities is known as the Cloud. This research aims to examine the impacts of various aspects namely: cloud computing, sustainability, performance…
Towards higher reliability of CMS computing facilities
International Nuclear Information System (INIS)
Bagliesi, G; Bloom, K; Brew, C; Flix, J; Kreuzer, P; Sciabà, A
2012-01-01
The CMS experiment has adopted a computing system where resources are distributed worldwide in more than 50 sites. The operation of the system requires a stable and reliable behaviour of the underlying infrastructure. CMS has established procedures to extensively test all relevant aspects of a site and its capability to sustain the various CMS computing workflows at the required scale. The Site Readiness monitoring infrastructure has been instrumental in understanding how the system as a whole was improving towards LHC operations, measuring the reliability of sites when running CMS activities, and providing sites with the information they need to troubleshoot any problem. This contribution reviews the complete automation of the Site Readiness program, with the description of monitoring tools and their inclusion into the Site Status Board (SSB), the performance checks, the use of tools like HammerCloud, and the impact in improving the overall reliability of the Grid from the point of view of the CMS computing system. These results are used by CMS to select good sites to conduct workflows, in order to maximize workflow efficiency. The performance of the sites against these tests during the first years of LHC running is also reviewed.
Dong, Hengjin; Buxton, Martin
2006-01-01
The objective of this study is to apply a Markov model to compare the cost-effectiveness of total knee replacement (TKR) using computer-assisted surgery (CAS) with that of TKR using a conventional manual method, in the absence of formal clinical trial evidence. A structured search was carried out to identify evidence relating to the clinical outcome, cost, and effectiveness of TKR. Nine Markov states were identified based on the progress of the disease after TKR. Effectiveness was expressed in quality-adjusted life years (QALYs). The simulation was carried out initially for 120 cycles of one month each, starting with 1,000 TKRs. A discount rate of 3.5 percent was used for both cost and effectiveness in the incremental cost-effectiveness analysis. A probabilistic sensitivity analysis was then carried out using a Monte Carlo approach with 10,000 iterations. Computer-assisted TKR was a long-term cost-effective technology, but the QALYs gained were small. After the first 2 years, computer-assisted TKR was dominant, as it was both cheaper and yielded more QALYs. The incremental cost-effectiveness ratio (ICER) was sensitive to the "effect of CAS," to the CAS extra cost, and to the utility of the state "Normal health after primary TKR," but it was not sensitive to the utilities of other Markov states. Both probabilistic and deterministic analyses produced similar cumulative serious or minor complication rates and complex or simple revision rates. They also produced similar ICERs. Compared with conventional TKR, computer-assisted TKR is a cost-saving technology in the long term and may offer small additional QALYs. The "effect of CAS" is to reduce revision rates and complications through more accurate and precise alignment, and although the conclusions from the model, even when allowing for a full probabilistic analysis of uncertainty, are clear, the "effect of CAS" on the rate of revisions awaits long-term clinical evidence.
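The Markov cohort mechanics behind such an analysis can be sketched compactly. The model below is a toy three-state stand-in (Well, Revision, Dead) for the study's nine-state model; every transition probability, cost, and utility is invented for illustration, including the "effect of CAS" modelled as a halved revision rate plus a per-cycle surcharge.

```python
import numpy as np

def run_cohort(P, cost, utility, cycles=120, disc=0.035):
    """Discounted cost and QALYs for a monthly Markov cohort model."""
    state = np.array([1.0, 0.0, 0.0])           # everyone starts in "Well"
    total_cost = total_qaly = 0.0
    for t in range(cycles):
        d = (1 + disc) ** (-(t / 12))           # annual discount, monthly cycles
        total_cost += d * state @ cost
        total_qaly += d * state @ (utility / 12)  # utilities are annual
        state = state @ P                       # advance one cycle
    return total_cost, total_qaly

P_conv = np.array([[0.995, 0.004, 0.001],       # Well -> Well/Revision/Dead
                   [0.600, 0.399, 0.001],       # Revision -> ...
                   [0.000, 0.000, 1.000]])      # Dead is absorbing
P_cas = P_conv.copy()
P_cas[0, 1] = 0.002                             # CAS halves the revision rate
P_cas[0, 0] = 0.997

cost = np.array([20.0, 3000.0, 0.0])            # per-cycle cost by state
util = np.array([0.80, 0.55, 0.0])              # annual utility by state

c0, q0 = run_cohort(P_conv, cost, util)
c1, q1 = run_cohort(P_cas, cost + np.array([5.0, 0.0, 0.0]), util)  # CAS surcharge
icer = (c1 - c0) / (q1 - q0)
print(f"ICER = {icer:.0f} per QALY")
```

With these made-up numbers the CAS arm is dominant (cheaper, more QALYs, so the ICER is negative), mirroring the qualitative finding reported above.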
Pratt, H M; Langlotz, C P; Feingold, E R; Schwartz, J S; Kundel, H L
1998-01-01
To determine the incremental cash flows associated with department-wide implementation of a picture archiving and communication system (PACS) and computed radiography (CR) at a large academic medical center. The authors determined all capital and operational costs associated with PACS implementation during an 8-year time horizon. Economic effects were identified, adjusted for time value, and used to calculate net present values (NPVs) for each section of the department of radiology and for the department as a whole. The chest-bone section used the most resources. Changes in cost assumptions for the chest-bone section had a dominant effect on the department-wide NPV. The base-case NPV (i.e., that determined by using the initial assumptions) was negative, indicating that additional net costs are incurred by the radiology department from PACS implementation. PACS and CR provide cost savings only when a 12-year hardware life span is assumed, when CR equipment is removed from the analysis, or when digitized long-term archives are compressed at a rate of 10:1. Full PACS-CR implementation would not provide cost savings for a large, subspecialized department. However, institutions that are committed to CR implementation (for whom CR implementation would represent a sunk cost) or institutions that are able to archive images by using image compression will experience cost savings from PACS.
DEFF Research Database (Denmark)
Kruse, Marie
2015-01-01
[…] assessed using Danish national healthcare registers. Productivity costs were computed using duration analysis (Cox regression models). In a subanalysis, cost per severe traffic injury was computed for the 12 995 individuals that experienced a severe injury. RESULTS: The socioeconomic cost of a traffic injury was €1406 (2009 price level) in the first year, and €8950 over a 10-year period. Per 100 000 population, the 10-year cost was €6 565 668. A severe traffic injury costs €4969 per person in the first year, and €4 006 685 per 100 000 population over a 10-year period. Victims of traffic injuries […]
Torres-Pomales, Wilfredo
2014-01-01
This report presents an example of the application of multi-criteria decision analysis to the selection of an architecture for a safety-critical distributed computer system. The design problem includes constraints on minimum system availability and integrity, and the decision is based on the optimal balance of power, weight and cost. The analysis process includes the generation of alternative architectures, evaluation of individual decision criteria, and the selection of an alternative based on overall value. In this example presented here, iterative application of the quantitative evaluation process made it possible to deliberately generate an alternative architecture that is superior to all others regardless of the relative importance of cost.
Operating dedicated data centers – is it cost-effective?
International Nuclear Information System (INIS)
Ernst, M; Hogue, R; Hollowell, C; Strecker-Kellog, W; Wong, A; Zaytsev, A
2014-01-01
The advent of cloud computing centres such as Amazon's EC2 and Google's Computing Engine has elicited comparisons with dedicated computing clusters. Discussions on appropriate usage of cloud resources (both academic and commercial) and costs have ensued. This presentation discusses a detailed analysis of the costs of operating and maintaining the RACF (RHIC and ATLAS Computing Facility) compute cluster at Brookhaven National Lab and compares them with the cost of cloud computing resources under various usage scenarios. An extrapolation of likely future cost effectiveness of dedicated computing resources is also presented.
The Cost-Accounting Mechanism in Higher Educational Institutions.
Lukoshkin, A. P.; Min'ko, E. V.
1990-01-01
Examines the need to increase expenditures per student at Soviet technical institutes. Proposes seeking financial assistance from enterprises employing technical specialists. Outlines an experimental program in cost accounting. Suggests stipend and wage allotments and explains some of the contractual obligations involved. (CH)
Energy Technology Data Exchange (ETDEWEB)
Stephan, G.; Van Nieuwkoop, R.; Wiedmer, T. (Institute for Applied Microeconomics, Univ. of Bern (Switzerland))
1992-01-01
Both distributional and allocational effects of limiting carbon dioxide emissions in a small and open economy are discussed. It starts from the assumption that Switzerland attempts to stabilize its greenhouse gas emissions over the next 25 years, and evaluates costs and benefits of the respective reduction programme. From a methodological viewpoint, it is illustrated how a computable general equilibrium approach can be adopted for identifying economic effects of cutting greenhouse gas emissions on the national level. From a political economy point of view it considers the social incidence of a greenhouse policy. It shows in particular that public acceptance can be increased and economic costs of greenhouse policies can be reduced, if carbon taxes are accompanied by revenue redistribution. 8 tabs., 1 app., 17 refs.
Semushin, I. V.; Tsyganova, J. V.; Ugarov, V. V.; Afanasova, A. I.
2018-01-01
Russian higher education institutions' tradition of teaching large-enrolled classes is impairing students' striving for individual prominence, one-upmanship, and hopes for originality. Intending to convert these drawbacks into benefits, a Project-Centred Education Model (PCEM) has been introduced to deliver Computational Mathematics and…
Zachariadou, K.; Yiasemides, K.; Trougkakos, N.
2012-01-01
We present a low-cost, fully computer-controlled, Arduino-based, educational laboratory (SolarInsight) to be used in undergraduate university courses concerned with electrical engineering and physics. The major goal of the system is to provide students with the necessary instrumentation, software tools and methodology in order to learn fundamental…
Compiler-Directed Transformation for Higher-Order Stencils
Energy Technology Data Exchange (ETDEWEB)
Basu, Protonu [Univ. of Utah, Salt Lake City, UT (United States); Hall, Mary [Univ. of Utah, Salt Lake City, UT (United States); Williams, Samuel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Straalen, Brian Van [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Oliker, Leonid [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Colella, Phillip [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)
2015-07-20
As the cost of data movement increasingly dominates performance, developers of finite-volume and finite-difference solutions for partial differential equations (PDEs) are exploring novel higher-order stencils that increase numerical accuracy and computational intensity. This paper describes a new compiler reordering transformation applied to stencil operators that performs partial sums in buffers and reuses the partial sums in computing multiple results. This optimization has multiple effects on stencil performance that are particularly important to higher-order stencils: it exploits data reuse, reduces floating-point operations, and exposes efficient SIMD parallelism to backend compilers. We study the benefit of this optimization in the context of Geometric Multigrid (GMG), a widely used method to solve PDEs, using four different Jacobi smoothers built from 7-, 13-, 27-, and 125-point stencils. We quantify performance, speedup, and numerical accuracy, and use the Roofline model to qualify our results. Ultimately, we obtain over 4× speedup on the smoothers themselves and up to a 3× speedup on the multigrid solver. Finally, we demonstrate that high-order multigrid solvers have the potential of reducing total data movement and energy by several orders of magnitude.
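The partial-sums idea can be demonstrated on a simple case. The sketch below uses a box-average stencil rather than the paper's Jacobi smoothers, and is an illustration of the reuse pattern only, not the compiler transformation itself: per-column vertical sums are computed once into a buffer and then shared by every output on a row, instead of re-reading the full neighbourhood for each point.

```python
import numpy as np

def box_stencil_naive(a, r):
    """(2r+1)x(2r+1) average: recomputes every neighbourhood sum from scratch."""
    n, m = a.shape
    out = np.zeros((n - 2 * r, m - 2 * r))
    for i in range(r, n - r):
        for j in range(r, m - r):
            out[i - r, j - r] = a[i - r:i + r + 1, j - r:j + r + 1].sum()
    return out / (2 * r + 1) ** 2

def box_stencil_partial_sums(a, r):
    """Buffers vertical partial sums, then reuses them across the row sweep,
    mirroring the partial-sum reuse described in the abstract."""
    n, m = a.shape
    col = np.zeros((n - 2 * r, m))
    for i in range(r, n - r):                 # one vertical sum per column...
        col[i - r] = a[i - r:i + r + 1, :].sum(axis=0)
    out = np.zeros((n - 2 * r, m - 2 * r))
    for j in range(r, m - r):                 # ...shared by all outputs in a row
        out[:, j - r] = col[:, j - r:j + r + 1].sum(axis=1)
    return out / (2 * r + 1) ** 2

a = np.random.rand(32, 32)
b1 = box_stencil_naive(a, 2)
b2 = box_stencil_partial_sums(a, 2)
print(np.allclose(b1, b2))  # -> True
```

The buffered version performs O(2r+1) additions per output instead of O((2r+1)²), which is exactly why the benefit grows with stencil order.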
Kankaanpää, Irja; Isomäki, Hannakaisa
2013-01-01
This paper reviews research literature on the production and commercialization of IT-enabled higher education in computer science. Systematic literature review (SLR) was carried out in order to find out to what extent this area has been studied, more specifically how much it has been studied and to what detail. The results of this paper make a…
Measuring the Cost of Quality in Higher Education: A Faculty Perspective
Ruhupatty, LeRoy; Maguad, Ben A.
2015-01-01
Most critical activities in colleges and universities are driven by financial considerations. It is thus important that revenues are found to support these activities or ways identified to streamline costs. One way to cut cost is to improve the efficiency of schools to address the issue of poor quality. In this paper, the cost of poor quality in…
Perez Montes, Diego A.; Añel Cabanelas, Juan A.; Wallom, David C. H.; Arribas, Alberto; Uhe, Peter; Caderno, Pablo V.; Pena, Tomas F.
2017-04-01
Cloud Computing is a technological option that offers great possibilities for modelling in geosciences. We have studied how two different climate models, HadAM3P-HadRM3P and CESM-WACCM, can be adapted in two different ways to run on Cloud Computing environments from three different vendors: Amazon, Google and Microsoft. We have also evaluated qualitatively how the use of Cloud Computing can affect the allocation of resources by funding bodies, as well as issues related to computing security, including scientific reproducibility. Our first experiments used the well-known ClimatePrediction.net (CPDN), which uses BOINC, over the infrastructure of two cloud providers, namely Microsoft Azure and Amazon Web Services (hereafter AWS). For this comparison we ran a set of thirteen-month climate simulations for CPDN on Azure and AWS using a range of different virtual machines (VMs) for HadRM3P (50 km resolution over the South America CORDEX region) nested in the global atmosphere-only model HadAM3P. These simulations were run on a single processor and took between 3 and 5 days to compute depending on the VM type. The last part of our experiments consisted of running WACCM on different VMs on the Google Compute Engine (GCE) and comparing the results with the supercomputer (SC) Finisterrae1 of the Centro de Supercomputacion de Galicia. GCE gives better performance than the SC for smaller numbers of cores/MPI tasks, but the model throughput shows clearly how the SC performance is better beyond approximately 100 cores (related to differences in network speed and latency). From a cost point of view, Cloud Computing moves researchers from a traditional approach, where experiments were limited by the available hardware resources, to one limited by monetary resources (how many resources can be afforded). As there is an increasing movement and recommendation for budgeting HPC projects on this technology (budgets can be calculated in a more realistic way) we could see a shift on…
Benefit-cost assessment programs: Costa Rica case study
International Nuclear Information System (INIS)
Clark, A.L.; Trocki, L.K.
1991-01-01
An assessment of mineral potential, in terms of types and numbers of deposits, approximate location and associated tonnage and grades, is a valuable input to a nation's economic planning and mineral policy development. This study provides a methodology for applying benefit-cost analysis to mineral resource assessment programs, both to determine the cost effectiveness of resource assessments and to ascertain future benefits to the nation. In a case study of Costa Rica, the benefit-cost ratio of a resource assessment program was computed to be a minimum of 4:1 ($10.6 million to $2.5 million), not including the economic benefits accruing from the creation of 800 mining sector and 1,200 support services jobs. The benefit-cost ratio would be considerably higher if presently proposed revisions of mineral policy were implemented and benefits could be defined for Costa Rica.
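The headline ratio is simple arithmetic on the two figures quoted in the abstract; the rounding down to "a minimum of 4:1" is the study's own conservative reporting.

```python
benefits = 10.6e6   # estimated programme benefits (USD), from the abstract
costs = 2.5e6       # programme cost (USD), from the abstract
bcr = benefits / costs
print(round(bcr, 2))  # -> 4.24, reported conservatively as "a minimum of 4:1"
```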
Garrido, Gemma; Penadés, Rafael; Barrios, Maite; Aragay, Núria; Ramos, Irene; Vallès, Vicenç; Faixa, Carlota; Vendrell, Josep M
2017-08-01
The durability of computer-assisted cognitive remediation (CACR) therapy over time and the cost-effectiveness of treatment remain unclear. The aim of the current study is to investigate the effectiveness of CACR and to examine the use and cost of acute psychiatric admissions before and after CACR. Sixty-seven participants were initially recruited. For the follow-up study a total of 33 participants were enrolled, 20 in the CACR condition group and 13 in the active control condition group. All participants were assessed at baseline, post-therapy and 12 months post-therapy on neuropsychological, QoL and self-esteem measurements. The use and cost of acute psychiatric admissions were collected retrospectively at four assessment points: baseline, 12 months post-therapy, 24 months post-therapy, and 36 months post-therapy. The results indicated that treatment effectiveness persisted in the CACR group one year post-therapy on neuropsychological and well-being outcomes. The CACR group showed a clear decrease in the use of acute psychiatric admissions at 12, 24 and 36 months post-therapy, which lowered the global cost of acute psychiatric admissions at those time points. The effects of CACR are durable over at least a 12-month period, and CACR may help to reduce health care costs for patients with schizophrenia. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.
Sosinsky, Barrie
2010-01-01
The complete reference guide to the hot technology of cloud computing. Its potential for lowering IT costs makes cloud computing a major force for both IT vendors and users; it is expected to gain momentum rapidly with the launch of Office Web Apps later this year. Because cloud computing involves various technologies, protocols, platforms, and infrastructure elements, this comprehensive reference is just what you need if you'll be using or implementing cloud computing. Cloud computing offers significant cost savings by eliminating upfront expenses for hardware and software; its growing popularit…
Unenhanced computed tomography in acute renal colic reduces cost outside radiology department
DEFF Research Database (Denmark)
Lauritsen, J.; Andersen, J.R.; Nordling, J.
2008-01-01
BACKGROUND: Unenhanced multidetector computed tomography (UMDCT) is well established as the procedure of choice for radiologic evaluation of patients with renal colic. The procedure has both clinical and financial consequences for departments of surgery and radiology. However, the financial effect outside the radiology department is poorly elucidated. PURPOSE: To evaluate the financial consequences outside of the radiology department, a retrospective study comparing the ward occupation of patients examined with UMDCT to that of intravenous urography (IVU) was performed. MATERIAL AND METHODS: […] saved the hospital USD 265,000 every 6 months compared to the use of IVU. CONCLUSION: Use of UMDCT compared to IVU in patients with renal colic leads to cost savings outside the radiology department. (Publication date: 2008/12)
From computers to ubiquitous computing by 2010: health care.
Aziz, Omer; Lo, Benny; Pansiot, Julien; Atallah, Louis; Yang, Guang-Zhong; Darzi, Ara
2008-10-28
Over the past decade, miniaturization and cost reduction in semiconductors have led to computers smaller in size than a pinhead with powerful processing abilities that are affordable enough to be disposable. Similar advances in wireless communication, sensor design and energy storage have meant that the concept of a truly pervasive 'wireless sensor network', used to monitor environments and objects within them, has become a reality. The need for a wireless sensor network designed specifically for human body monitoring has led to the development of wireless 'body sensor network' (BSN) platforms composed of tiny integrated microsensors with on-board processing and wireless data transfer capability. The ubiquitous computing abilities of BSNs offer the prospect of continuous monitoring of human health in any environment, be it home, hospital, outdoors or the workplace. This pervasive technology comes at a time when Western world health care costs have sharply risen, reflected by increasing expenditure on health care as a proportion of gross domestic product over the last 20 years. Drivers of this rise include an ageing post 'baby boom' population, higher incidence of chronic disease and the need for earlier diagnosis. This paper outlines the role of pervasive health care technologies in providing more efficient health care.
Directory of Open Access Journals (Sweden)
Alejandro Álvarez-Navarro
2016-01-01
The process of sugar production is complex, and anything that affects this chain has direct repercussions on the cost of sugar production, a synthetic and decisive indicator for decision making. Currently, Cuban sugar factories determine this cost weekly, which hampers their decision-making process. Looking for solutions to this problem, the present work, part of a territorial project approved by CITMA, calculates the cost of production daily, weekly, monthly, and cumulatively up to a given date, following an adaptation of the methodology of the National Sugarcane Costs System created by MINAZ, supported by a computer system named SACODI. This adaptation registers the physical and economic indicators of all direct and indirect expenses of sugarcane and, from this information, generates a goal-programming economic-mathematical model whose solution indicates the best short-term balance of sugar output across the entities of the sugar factory. The implementation of the system in the «Julio A. Mella» sugar factory in Santiago de Cuba during the 08-09 harvest yielded an estimated cost decrease of up to 3.5% through better decision making.
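A goal program of the kind mentioned here minimizes deviations from targets rather than a cost directly. The sketch below is a hypothetical miniature, not SACODI's model: it splits a daily sugar target between two mills of limited capacity, minimizing under- and over-achievement deviation variables, using SciPy's linear programming solver.

```python
from scipy.optimize import linprog

# Hypothetical goal program: variables x1, x2 (tonnes per mill) and
# deviations d_minus, d_plus from a daily production target.
target = 900.0
c = [0, 0, 1, 1]                    # minimize d_minus + d_plus
A_eq = [[1, 1, 1, -1]]              # x1 + x2 + d_minus - d_plus == target
b_eq = [target]
bounds = [(0, 500), (0, 500), (0, None), (0, None)]  # mill capacities

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
print(res.fun, res.x[:2].sum())     # zero deviation; target met exactly
```

Since the target (900) is within combined capacity (1000), both deviation variables come out zero; tightening the capacities would force a positive `d_minus`, signalling an unreachable goal.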
Computer performance evaluation of FACOM 230-75 computer system, (2)
International Nuclear Information System (INIS)
Fujii, Minoru; Asai, Kiyoshi
1980-08-01
This report describes computer performance evaluations of the FACOM 230-75 computer systems at JAERI. The evaluations cover the following items: (1) cost/benefit analysis of timesharing terminals, (2) analysis of the response time of timesharing terminals, (3) analysis of throughput time for batch job processing, (4) estimation of current potential demand for computer time, and (5) determination of the appropriate number of card readers and line printers. These evaluations are made mainly from the standpoint of reducing the cost of the computing facilities, and the techniques adopted are practical ones. This report will be useful for those concerned with the management of a computing installation. (author)
Costs of fire suppression forces based on cost-aggregation approach
Armando González-Cabán; Charles W. McKetta; Thomas J. Mills
1984-01-01
A cost-aggregation approach has been developed for determining the cost of Fire Management Inputs (FMIs), the direct fireline production units (personnel and equipment) used in initial attack and large-fire suppression activities. All components contributing to an FMI are identified, computed, and summed to estimate hourly costs. This approach can be applied to any FMI…
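The aggregation itself is a straightforward identify-compute-sum over cost components. The component names and dollar figures below are invented for illustration; the report's actual FMI component breakdown differs.

```python
# Hypothetical hourly cost of one Fire Management Input (a hand crew):
# each contributing component is identified, computed, and summed.
components = {
    "base_salaries": 310.0,           # $/h, crew wages
    "overtime_and_benefits": 96.0,
    "transport_vehicle": 22.5,        # amortized over operating hours
    "equipment_depreciation": 14.0,
    "support_and_overhead": 41.5,
}
hourly_cost = sum(components.values())
print(f"FMI hourly cost: ${hourly_cost:.2f}")  # -> FMI hourly cost: $484.00
```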
Directory of Open Access Journals (Sweden)
Rizki Yudhi Dewantara
2017-06-01
Computer crime is growing rapidly along with the development of the digital world, which has touched almost all aspects of human life, and institutions of higher education are not exempt from it. This paper analyses the implementation of the Indonesian Computer Crime Act (UU ITE No. 11, 2008) in higher education institutions in Indonesia. It aims to investigate the level of computer crime occurring in the higher education environment and whether the act has been successfully applied to prevent such crime. The analysis uses descriptive statistics and binary logistic regression. The paper also describes the successful implementation of an Information System Security Policy (ISSP) as a computer-crime prevention policy in Indonesian higher education institutions. Regarding the act itself, clarity of the objectives and purpose of UU ITE No. 11, 2008 was low; communication and socialization activities towards society, and higher education institutions in particular, remain limited; and although control processes under the act are running, they operate at a low level. Keywords: computer crime, computer crime act, public policy implementation
Special issue of Higher-Order and Symbolic Computation
DEFF Research Database (Denmark)
Danvy, Olivier
[…] they should have a large range of applicability for a large class of specifications or programs. Only general ideas could become the basis for an automatic system for program development. Bob's APTS system is indeed the incarnation of most of the techniques he proposed (cf. Leonard and Heitmeyer […] specification, expressed in SCR notation, into C. Two translation strategies are discussed in the paper. Both were implemented using Bob Paige's APTS program-transformation system. "Computational Divided Differencing and Divided-Difference Arithmetics" uses an approach conceptually similar to the Computational…
Using Amazon's Elastic Compute Cloud to dynamically scale CMS computational resources
International Nuclear Information System (INIS)
Evans, D; Fisk, I; Holzman, B; Pordes, R; Tiradani, A; Melo, A; Sheldon, P; Metson, S
2011-01-01
Large international scientific collaborations such as the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider have traditionally addressed their data reduction and analysis needs by building and maintaining dedicated computational infrastructure. Emerging cloud computing services such as Amazon's Elastic Compute Cloud (EC2) offer short-term CPU and storage resources with costs based on usage. These services allow experiments to purchase computing resources as needed, without significant prior planning and without long-term investments in facilities and their management. We have demonstrated that services such as EC2 can successfully be integrated into the production-computing model of CMS, and find that they work very well as worker nodes. The cost structure and transient nature of EC2 services make them inappropriate for some CMS production services and functions. We also found that the resources are not truly 'on-demand', as limits and caps on usage are imposed. Our trial workflows allow us to make a cost comparison between EC2 resources and dedicated CMS resources at a university, and conclude that it is most cost effective to purchase dedicated resources for the 'base-line' needs of experiments such as CMS. However, if the ability to use cloud computing resources is built into an experiment's software framework before demand requires their use, cloud computing resources make sense for bursting during times when spikes in usage are required.
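The "dedicated for base-line, cloud for bursts" conclusion follows from a simple asymmetry: dedicated cost scales with provisioned capacity (idle or not), while cloud cost scales with hours actually consumed. The rates and workload figures below are illustrative assumptions, not measured CMS or EC2 prices.

```python
def yearly_cost(capacity_cores, burst_core_hours,
                dedicated_per_core_year=300.0, cloud_per_core_hour=0.10):
    """Dedicated cost scales with provisioned capacity; cloud cost scales
    with usage. All rates here are made-up illustrative numbers."""
    return (capacity_cores * dedicated_per_core_year
            + burst_core_hours * cloud_per_core_hour)

# Hypothetical load: 1000 cores needed year-round, 1500 during a
# 500-hour reprocessing spike.
baseline_cores, peak_cores, peak_hours = 1000, 1500, 500

size_for_peak = yearly_cost(peak_cores, 0)   # own enough for the spike
size_for_base = yearly_cost(baseline_cores,  # own the base, burst to cloud
                            (peak_cores - baseline_cores) * peak_hours)
print(size_for_peak, size_for_base)
```

With these numbers the hybrid strategy wins (325,000 vs 450,000), because the extra 500 cores would otherwise sit idle for most of the year; shrink the spike or raise the cloud rate and the balance shifts back.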
Reduced computational cost in the calculation of worst case response time for real time systems
Urriza, José M.; Schorb, Lucas; Orozco, Javier D.; Cayssials, Ricardo
2009-01-01
Modern Real Time Operating Systems require reducing computational costs even as microprocessors become more powerful each day. It is usual for Real Time Operating Systems for embedded systems to have advanced features to administer the resources of the applications they support. In order to guarantee either the schedulability of the system or the schedulability of a new task in a dynamic Real Time System, it is necessary to know the Worst Case Response Time of the Real Time tasks…
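The Worst Case Response Time referred to here is classically obtained by a fixed-point iteration over the interference from higher-priority tasks. The sketch below implements that standard recurrence (not the paper's reduced-cost variant, which is the contribution of the work above), assuming deadlines equal to periods; the task set is hypothetical.

```python
import math

def worst_case_response_time(tasks):
    """Fixed-point iteration for WCRT under fixed-priority preemptive
    scheduling:  R_i = C_i + sum over higher-priority j of ceil(R_i/T_j)*C_j.
    `tasks` is a list of (C, T) pairs sorted by decreasing priority."""
    responses = []
    for i, (ci, ti) in enumerate(tasks):
        r = ci
        while True:
            r_next = ci + sum(math.ceil(r / tj) * cj for cj, tj in tasks[:i])
            if r_next == r:          # converged: r is the WCRT
                break
            if r_next > ti:          # exceeds the period: deadline miss
                r_next = None
                break
            r = r_next
        responses.append(r_next)
    return responses

# Hypothetical task set: (C, T) pairs, highest priority first
print(worst_case_response_time([(1, 4), (2, 6), (3, 13)]))  # -> [1, 3, 10]
```

The iteration count of this recurrence is exactly the computational cost that schedulability tests in dynamic systems try to reduce, which motivates the optimizations discussed in the abstract.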
Computer architecture fundamentals and principles of computer design
Dumas II, Joseph D
2005-01-01
Introduction to Computer Architecture; What is Computer Architecture?; Architecture vs. Implementation; Brief History of Computer Systems; The First Generation; The Second Generation; The Third Generation; The Fourth Generation; Modern Computers - The Fifth Generation; Types of Computer Systems; Single Processor Systems; Parallel Processing Systems; Special Architectures; Quality of Computer Systems; Generality and Applicability; Ease of Use; Expandability; Compatibility; Reliability; Success and Failure of Computer Architectures and Implementations; Quality and the Perception of Quality; Cost Issues; Architectural Openness, Market Timi…
Adaptive grouping for the higher-order multilevel fast multipole method
DEFF Research Database (Denmark)
Borries, Oscar Peter; Jørgensen, Erik; Meincke, Peter
2014-01-01
An alternative parameter-free adaptive approach for the grouping of the basis function patterns in the multilevel fast multipole method is presented, yielding significant memory savings compared to the traditional Octree grouping for most discretizations, particularly when using higher-order basis functions. Results from both a uniformly and a nonuniformly meshed scatterer are presented, showing how the technique is worthwhile even for regular meshes, and demonstrating that there is no loss of accuracy in spite of the large reduction in memory requirements and the relatively low computational cost.
Special issue of Higher-Order and Symbolic Computation
DEFF Research Database (Denmark)
Danvy, Olivier; Sabry, Amr
This issue of HOSC is dedicated to the general topic of continuations. It grew out of the third ACM SIGPLAN Workshop on Continuations (CW'01), which took place in London, UK on January 16, 2001 [3]. The notion of continuation is ubiquitous in many different areas of computer science, including...... and streamline Filinski's earlier work in the previous special issue of HOSC (then LISP and Symbolic Computation) that grew out of the first ACM SIGPLAN Workshop on Continuations [1, 2]. Hasegawa and Kakutani's article is the journal version of an article presented at FOSSACS 2001 and that received the EATCS...
Li, Ying
2016-09-16
Fault-tolerant quantum computing in systems composed of both Majorana fermions and topologically unprotected quantum systems, e.g., superconducting circuits or quantum dots, is studied in this Letter. Errors caused by topologically unprotected quantum systems need to be corrected with error-correction schemes, for instance, the surface code. We find that the error-correction performance of such a hybrid topological quantum computer is not superior to a normal quantum computer unless the topological charge of Majorana fermions is insusceptible to noise. If errors changing the topological charge are rare, the fault-tolerance threshold is much higher than the threshold of a normal quantum computer and a surface-code logical qubit could be encoded in only tens of topological qubits instead of about 1,000 normal qubits.
Higher-Order and Symbolic Computation
DEFF Research Database (Denmark)
Danvy, Olivier; Mason, Ian
2008-01-01
a series of implementations that properly account for multiple invocations of the derivative-taking operator. In "Adapting Functional Programs to Higher-Order Logic," Scott Owens and Konrad Slind present a variety of examples of termination proofs of functional programs written in HOL proof systems. Since......-calculus programs, historically. The analysis determines the possible locations of ambients and mirrors the temporal sequencing of actions in the structure of types....
Zuckerman, Stephen; Skopec, Laura; Guterman, Stuart
2017-12-01
Medicare Advantage (MA), the program that allows people to receive their Medicare benefits through private health plans, uses a benchmark-and-bidding system to induce plans to provide benefits at lower costs. However, prior research suggests medical costs, profits, and other plan costs are not as low under this system as they might otherwise be. To examine how well the current system encourages MA plans to bid their lowest cost by examining the relationship between costs and bonuses (rebates) and the benchmarks Medicare uses in determining plan payments. Regression analysis using 2015 data for HMO and local PPO plans. Costs and rebates are higher for MA plans in areas with higher benchmarks, and plan costs vary less than benchmarks do. A one-dollar increase in benchmarks is associated with 32-cent-higher plan costs and a 52-cent-higher rebate, even when controlling for market and plan factors that can affect costs. This suggests the current benchmark-and-bidding system allows plans to bid higher than local input prices and other market conditions would seem to warrant. To incentivize MA plans to maximize efficiency and minimize costs, Medicare could change the way benchmarks are set or used.
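The benchmark-cost relationship reported above is a linear regression association; a minimal sketch of what those marginal effects mean (the 0.32 and 0.52 coefficients are taken from the abstract, the example increase is hypothetical):

```python
def predicted_changes(benchmark_increase):
    """Predicted changes in MA plan cost and rebate for a given benchmark
    increase, using the marginal effects reported in the study:
    32 cents of additional plan cost and 52 cents of additional rebate
    per extra benchmark dollar."""
    cost_change = 0.32 * benchmark_increase
    rebate_change = 0.52 * benchmark_increase
    return cost_change, rebate_change

# A $10 benchmark increase implies roughly $3.20 higher plan costs
# and a $5.20 higher rebate.
print(predicted_changes(10))
```

Because plan costs rise less than dollar-for-dollar with benchmarks, the remainder shows up as rebates rather than savings to Medicare, which is the inefficiency the authors highlight.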
Total life-cycle cost analysis of conventional and alternative fueled vehicles
International Nuclear Information System (INIS)
Cardullo, M.W.
1993-01-01
Total Life-Cycle Cost (TLCC) Analysis can indicate whether paying higher capital costs for advanced technology with low operating and/or environmental costs is advantageous over paying lower capital costs for conventional technology with higher operating and/or environmental costs. While minimizing total life-cycle cost is an important consideration, the consumer often identifies non-cost-related benefits or drawbacks that make more expensive options appear more attractive. The consumer is also likely to heavily weigh initial capital costs while giving limited consideration to operating and/or societal costs, whereas policy-makers considering external costs, such as those resulting from environmental impacts, may reach significantly different conclusions about which technologies are most advantageous to society. This paper summarizes a TLCC model which was developed to facilitate consideration of the various factors involved in both individual and societal policy decision making. The model was developed as part of a US Department of Energy contract and has been revised to reflect changes necessary to make the model more realistic. The model considers capital, operating, salvage, and environmental costs for cars, vans, and buses using conventional and alternative fuels. The model has been developed to operate on an IBM or compatible personal computer platform using the commercial spreadsheet program Microsoft Excel® Version 4 for Windows® and can be easily kept current because its modular structure allows straightforward access to embedded data sets for review and update.
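A total life-cycle cost of the kind the model computes is, at its core, a discounted sum of annual costs plus up-front capital, less discounted salvage value. A minimal sketch (all dollar figures and the discount rate are hypothetical, not taken from the model):

```python
def total_life_cycle_cost(capital, annual_operating, annual_environmental,
                          salvage, years, discount_rate):
    """Net present value of owning a vehicle: capital paid up front,
    operating and environmental costs paid each year, salvage value
    recovered at the end of the ownership period."""
    pv_annual = sum(
        (annual_operating + annual_environmental) / (1 + discount_rate) ** t
        for t in range(1, years + 1)
    )
    pv_salvage = salvage / (1 + discount_rate) ** years
    return capital + pv_annual - pv_salvage

# Hypothetical comparison: an alternative-fuel vehicle with higher capital
# cost but lower operating/environmental costs vs. a conventional one.
alt = total_life_cycle_cost(30000, 1500, 200, 5000, 10, 0.05)
conv = total_life_cycle_cost(22000, 2600, 600, 4000, 10, 0.05)
print(round(alt), round(conv))
```

With these illustrative inputs the alternative-fuel option comes out cheaper over the full life cycle despite its higher purchase price, which is exactly the kind of trade-off the abstract describes.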
The ABCs of Activity-Based Costing: A Cost Containment and Reallocation Tool.
Turk, Frederick J.
1992-01-01
This article describes activity-based costing (ABC) and how this tool may help management understand the costs of major activities and identify possible alternatives. Also discussed are the traditional costing systems used by higher education and ways of applying ABC to higher education. (GLR)
New Federal Cost Accounting Regulations
Wolff, George J.; Handzo, Joseph J.
1973-01-01
Discusses a new set of indirect cost accounting procedures which must be followed by school districts wishing to recover any indirect costs of administering federal grants and contracts. Also discusses the amount of indirect costs that may be recovered, computing indirect costs, classifying project costs, and restricted grants. (Author/DN)
Costs and role of ultrasound follow-up of polytrauma patients after initial computed tomography
International Nuclear Information System (INIS)
Maurer, M.H.; Winkler, A.; Powerski, M.J.; Elgeti, F.; Huppertz, A.; Roettgen, R.; Marnitz, T.; Wichlas, F.
2012-01-01
Purpose: To assess the costs and diagnostic gain of abdominal ultrasound follow-up of polytrauma patients initially examined by whole-body computed tomography (CT). Materials and Methods: A total of 176 patients with suspected multiple trauma (126 men, 50 women; age 43.5 ± 17.4 years) were retrospectively analyzed with regard to supplementary and new findings obtained by ultrasound follow-up compared with the results of exploratory FAST (focused assessment with sonography for trauma) at admission and the findings of whole-body CT. A process model was used to document the staff, materials, and total costs of the ultrasound follow-up examinations. Results: FAST yielded 26 abdominal findings (organ injury and/or free intra-abdominal fluid) in 19 patients, while the abdominal scan of whole-body CT revealed 32 findings in 25 patients. FAST had 81 % sensitivity and 100 % specificity. Follow-up ultrasound examinations revealed new findings in 2 of the 25 patients with abdominal injuries detected with initial CT. In the 151 patients without abdominal injuries in the initial CT scan, ultrasound follow-up did not yield any supplementary or new findings. The total costs of an ultrasound follow-up examination were EUR 28.93. The total costs of all follow-up ultrasound examinations performed in the study population were EUR 5658.23. Conclusion: Follow-up abdominal ultrasound yields only a low overall diagnostic gain in polytrauma patients in whom initial CT fails to detect any abdominal injuries but incurs high personnel expenses for radiological departments. (orig.)
Directory of Open Access Journals (Sweden)
hamid reza bazi
2017-12-01
Full Text Available Cloud computing is a new technology that considerably helps Higher Education Institutions (HEIs) to develop and create competitive advantage with inherent characteristics such as flexibility, scalability, accessibility, reliability, fault tolerance and economic efficiency. Due to the numerous advantages of cloud computing, and in order to take advantage of cloud computing infrastructure, services of universities and HEIs need to migrate to the cloud. However, this transition involves many challenges, one of which is the lack or shortage of an appropriate architecture for migration to the technology. Using a reliable architecture for migration enables managers to mitigate risks in cloud computing technology. Therefore, organizations always search for a suitable cloud computing architecture. In previous studies, these important features have received less attention and have not been achieved in a comprehensive way. The aim of this study is to use a meta-synthesis method for the first time to analyze the previously published studies and to suggest an appropriate hybrid cloud migration architecture (IUHEC). We reviewed many papers from relevant journals and conference proceedings. The concepts extracted from these papers are classified into related categories and sub-categories. Then, we developed our proposed hybrid architecture based on these concepts and categories. The proposed architecture was validated by a panel of experts and Lawshe’s model was used to determine the content validity. Due to its innovative yet user-friendly nature, comprehensiveness, and high security, this architecture can help HEIs have an effective migration to a cloud computing environment.
Hojjat, Houmehr; Svider, Peter F; Folbe, Adam J; Raza, Syed N; Carron, Michael A; Shkoukani, Mahdi A; Merati, Albert L; Mayerhoff, Ross M
2017-02-01
To evaluate the cost-effectiveness of routine computed tomography (CT) in individuals with unilateral vocal fold paralysis (UVFP). STUDY DESIGN: Health Economics Decision Tree Analysis. METHODS: A decision tree was constructed to determine the incremental cost-effectiveness ratio (ICER) of CT imaging in UVFP patients. Univariate sensitivity analysis was utilized to calculate what the probability of having an etiology of the paralysis discovered would have to be to make CT with contrast more cost-effective than no imaging. We used two studies examining findings in UVFP patients. The decision pathways were utilizing CT neck with intravenous contrast after diagnostic laryngoscopy versus laryngoscopy alone. The probability of detecting an etiology for UVFP and associated costs were extracted to construct the decision tree. The only incorrect diagnosis was missing a mass in the no-imaging decision branch, which rendered an effectiveness of 0. The ICER of using CT was $3,306, below most acceptable willingness-to-pay (WTP) thresholds. Additionally, univariate sensitivity analysis indicated that at the WTP threshold of $30,000, obtaining CT imaging was the most cost-effective choice when the probability of having a lesion was above 1.7%. Multivariate probabilistic sensitivity analysis with Monte Carlo simulations also showed that at the WTP of $30,000, CT scanning is more cost-effective, with 99.5% certainty. Particularly in the current healthcare environment, characterized by increasing consciousness of utilization and defensive medicine, economic evaluations represent evidence-based findings that can be employed to facilitate appropriate decision making and enhance physician-patient communication. This economic evaluation strongly supports obtaining CT imaging in patients with newly diagnosed UVFP. 2c. Laryngoscope, 127:440-444, 2017. © 2016 The American Laryngological, Rhinological and Otological Society, Inc.
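The central quantity in a decision-tree analysis like this is the ICER: extra cost divided by extra effectiveness of the new strategy. A minimal sketch with hypothetical costs and effectiveness values (the study's own $3,306 figure comes from its extracted inputs, not these):

```python
def icer(cost_new, eff_new, cost_old, eff_old):
    """Incremental cost-effectiveness ratio: additional dollars spent per
    additional unit of effectiveness gained by the new strategy."""
    return (cost_new - cost_old) / (eff_new - eff_old)

# Hypothetical example: CT adds $500 of cost and 0.15 units of
# effectiveness over laryngoscopy alone. CT is preferred when the ICER
# falls below the willingness-to-pay (WTP) threshold.
ratio = icer(cost_new=800.0, eff_new=0.95, cost_old=300.0, eff_old=0.80)
wtp = 30000.0
print(ratio, ratio < wtp)
```

The univariate sensitivity analysis in the study amounts to varying one input (the probability of finding a lesion) until this ratio crosses the WTP threshold.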
2010-04-01
Title 24 (Housing and Urban Development), Section 908.108 ..., RENTAL VOUCHER, AND MODERATE REHABILITATION PROGRAMS § 908.108 Cost. (a) General. The costs of the... computer hardware or software, or both, the cost of contracting for those services, or the cost of...
The Cost of Chaos in the Curriculum. Perspectives on Higher Education
Capaldi Phillips, Elizabeth D.; Poliakoff, Michael B.
2015-01-01
ACTA's report "The Cost of Chaos in the Curriculum" reveals that the vast array of course choices given to college students is a cause of exploding costs and poor academic outcomes. And a bloated undergraduate curriculum is particularly detrimental to the success of students from lower socioeconomic backgrounds. The report documents how…
International Nuclear Information System (INIS)
Reimus, P.W.; Sevigny, N.L.; Schutz, M.E.; Heller, R.A.
1986-12-01
The MONITOR model is a FORTRAN 77 based computer code that provides parametric life-cycle cost estimates for a monitored retrievable storage (MRS) facility. MONITOR is very flexible in that it can estimate the costs of an MRS facility operating under almost any conceivable nuclear waste logistics scenario. The model can also accommodate input data of varying degrees of complexity and detail (ranging from very simple to more complex), which makes it ideal for use in the MRS program, where new designs and new cost data are frequently offered for consideration. MONITOR can be run as an independent program, or it can be interfaced with the Waste System Transportation and Economic Simulation (WASTES) model, a program that simulates the movement of waste through a complete nuclear waste disposal system. The WASTES model drives the MONITOR model by providing it with the annual quantities of waste that are received, stored, and shipped at the MRS facility. Three runs of MONITOR are documented in this report. Two of the runs are for Version 1 of the MONITOR code: a simulation which uses the costs developed by the Ralph M. Parsons Company in the 2A (backup) version of the MRS cost estimate. In one of these runs MONITOR was run as an independent model, and in the other run MONITOR was run using an input file generated by the WASTES model. The two runs correspond to identical cases, and the fact that they gave identical results verified that the code performed the same calculations in both modes of operation. The third run was made for Version 2 of the MONITOR code: a simulation which uses the costs developed by the Ralph M. Parsons Company in the 2B (integral) version of the MRS cost estimate. This run was made with MONITOR being run as an independent model. The results of several cases have been verified by hand calculations.
A precise goniometer/tensiometer using a low cost single-board computer
Favier, Benoit; Chamakos, Nikolaos T.; Papathanasiou, Athanasios G.
2017-12-01
Measuring the surface tension and the Young contact angle of a droplet is extremely important for many industrial applications. Here, considering the booming interest for small and cheap but precise experimental instruments, we have constructed a low-cost contact angle goniometer/tensiometer, based on a single-board computer (Raspberry Pi). The device runs an axisymmetric drop shape analysis (ADSA) algorithm written in Python. The code, here named DropToolKit, was developed in-house. We initially present the mathematical framework of our algorithm and then we validate our software tool against other well-established ADSA packages, including the commercial ramé-hart DROPimage Advanced as well as the DropAnalysis plugin in ImageJ. After successfully testing for various combinations of liquids and solid surfaces, we concluded that our prototype device would be highly beneficial for industrial applications as well as for scientific research in wetting phenomena compared to the commercial solutions.
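Full ADSA, as implemented in DropToolKit, fits the Young-Laplace equation to the whole drop profile. For small drops where gravity can be neglected, a much simpler spherical-cap estimate of the contact angle from drop height and contact-line radius illustrates the underlying geometry (this simplification is not the DropToolKit algorithm):

```python
import math

def spherical_cap_contact_angle(height, contact_radius):
    """Contact angle in degrees of a sessile drop modeled as a spherical
    cap: theta = 2 * atan(h / a), where h is the apex height and a is
    the radius of the contact line."""
    return math.degrees(2.0 * math.atan(height / contact_radius))

# A drop whose apex height equals its contact radius sits at 90 degrees.
print(spherical_cap_contact_angle(1.0, 1.0))
```

Gravity flattens larger drops, which is why a full axisymmetric profile fit is needed for accurate tensiometry; the cap formula is only a sanity-check limit.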
A Framework for Collaborative and Convenient Learning on Cloud Computing Platforms
Sharma, Deepika; Kumar, Vikas
2017-01-01
The depth of learning resides in collaborative work with more engagement and fun. Technology can enhance collaboration with a higher level of convenience and cloud computing can facilitate this in a cost effective and scalable manner. However, to deploy a successful online learning environment, elementary components of learning pedagogy must be…
Institute of Scientific and Technical Information of China (English)
韩征
2012-01-01
With the increase of various education expenses at higher vocational colleges, education cost is gradually attracting attention. In order to calculate the education cost of higher vocational colleges scientifically, this paper analyzes why the current calculation method cannot objectively reflect education cost, and clarifies the basic accounting postulates and principles of education cost calculation. Based on the calculation of fund budget items of higher vocational colleges, and considering their financial situation and school-running characteristics, this paper proposes an account-table combined method to calculate their education cost. This method adopts a model combining item accounts with a cost allocation table, which can calculate the education cost of higher vocational colleges scientifically.
Harris, Catherine R.; Osterberg, E. Charles; Sanford, Thomas; Alwaal, Amjad; Gaither, Thomas W.; McAninch, Jack W.; McCulloch, Charles E.; Breyer, Benjamin N.
2016-01-01
To determine which factors are associated with higher costs of urethroplasty procedures and whether these factors have been increasing over time. Identification of the determinants of extreme costs may help reduce cost while maintaining quality. We conducted a retrospective analysis using the 2001-2010 Healthcare Cost and Utilization Project-Nationwide Inpatient Sample (HCUP-NIS). The HCUP-NIS captures hospital charges, which we converted to cost using the HCUP cost-to-charge ratio. Log cost linear ...
Personal computers in high energy physics
International Nuclear Information System (INIS)
Quarrie, D.R.
1987-01-01
The role of personal computers within HEP is expanding as their capabilities increase and their cost decreases. Already they offer greater flexibility than many low-cost graphics terminals for a comparable cost and in addition they can significantly increase the productivity of physicists and programmers. This talk will discuss existing uses for personal computers and explore possible future directions for their integration into the overall computing environment. (orig.)
Computational Platform About Amazon Web Services (Aws Distributed Rendering
Directory of Open Access Journals (Sweden)
Gabriel Rojas-Albarracín
2017-09-01
Full Text Available Today's media environment has created a dynamic in which people demand ever higher image quality in different formats (games, movies, animations). Higher definition usually requires processing larger images, which in turn brings the need for increased computing power. This paper presents a case study of the implementation of a low-cost platform on the Amazon cloud for the parallel processing of images and animation.
Perry, Alexander R.
2002-06-01
Low Frequency Eddy Current (EC) probes are capable of measurement from 5 MHz down to DC through the use of Magnetoresistive (MR) sensors. Choosing components with appropriate electrical specifications allows them to be matched to the power and impedance characteristics of standard computer connectors. This permits direct attachment of the probe to inexpensive computers, thereby eliminating external power supplies, amplifiers and modulators that have heretofore precluded very low system purchase prices. Such price reduction is key to increased market penetration in General Aviation maintenance and consequent reduction in recurring costs. This paper examines our computer software CANDETECT, which implements this approach and permits effective probe operation. Results are presented to show the intrinsic sensitivity of the software and demonstrate its practical performance when seeking cracks in the underside of a thick aluminum multilayer structure. The majority of the General Aviation light aircraft fleet uses rivets and screws to attach sheet aluminum skin to the airframe, resulting in similar multilayer lap joints.
[Cost analysis for navigation in knee endoprosthetics].
Cerha, O; Kirschner, S; Günther, K-P; Lützner, J
2009-12-01
Total knee arthroplasty (TKA) is one of the most frequent procedures in orthopaedic surgery. The outcome depends on a range of factors including alignment of the leg and the positioning of the implant, in addition to patient-associated factors. Computer-assisted navigation systems can improve the restoration of a neutral leg alignment. This procedure has been established especially in Europe and North America. The additional expenses are not reimbursed in the German DRG system (Diagnosis Related Groups). In the present study a cost analysis of computer-assisted TKA compared to the conventional technique was performed. The acquisition expenses of various navigation systems (5 and 10 year depreciation), annual costs for maintenance and software updates, as well as the accompanying costs per operation (consumables, additional operating time) were considered. The additional operating time was determined on the basis of a meta-analysis of the current literature. Situations with 25, 50, 100, 200 and 500 computer-assisted TKAs per year were simulated. The amount of the incremental costs of computer-assisted TKA depends mainly on the annual volume and the additional operating time. A relevant decrease of the incremental costs was detected between 50 and 100 procedures per year. In a model with 100 computer-assisted TKAs per year, an additional operating time of 14 mins and a 10 year depreciation of the investment costs, the incremental expenses amount to €300-395 depending on the navigation system. Computer-assisted TKA is associated with additional costs. From an economic point of view, a volume of more than 50 procedures per year appears to be favourable. The cost-effectiveness could be estimated if long-term results show a reduction of revisions or a better clinical outcome.
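The per-case arithmetic behind such an analysis can be sketched as fixed costs (amortized acquisition plus maintenance) spread over annual volume, plus per-case variable costs. All inputs below are hypothetical; the 300-395 range above comes from the study's own surveyed systems:

```python
def incremental_cost_per_tka(acquisition, depreciation_years,
                             annual_maintenance, consumables_per_op,
                             extra_or_minutes, or_cost_per_minute,
                             annual_volume):
    """Extra cost of one computer-assisted TKA: amortized acquisition and
    maintenance divided by annual case volume, plus per-case consumables
    and the cost of the additional operating-room time."""
    fixed_per_case = (acquisition / depreciation_years
                      + annual_maintenance) / annual_volume
    variable_per_case = consumables_per_op + extra_or_minutes * or_cost_per_minute
    return fixed_per_case + variable_per_case

# Hypothetical system: doubling volume from 50 to 100 cases/year halves
# the fixed share of the per-case cost, as in the study's finding.
low_volume = incremental_cost_per_tka(100000, 10, 5000, 100, 14, 10, 50)
high_volume = incremental_cost_per_tka(100000, 10, 5000, 100, 14, 10, 100)
print(low_volume, high_volume)
```

This structure makes clear why the incremental cost drops sharply between 50 and 100 procedures per year: only the fixed component scales with volume.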
Cloud computing for comparative genomics.
Wall, Dennis P; Kudtarkar, Parul; Fusaro, Vincent A; Pivovarov, Rimma; Patil, Prasad; Tonellato, Peter J
2010-05-18
Large comparative genomics studies and tools are becoming increasingly more compute-expensive as the number of available genome sequences continues to rise. The capacity and cost of local computing infrastructures are likely to become prohibitive with the increase, especially as the breadth of questions continues to rise. Alternative computing architectures, in particular cloud computing environments, may help alleviate this increasing pressure and enable fast, large-scale, and cost-effective comparative genomics strategies going forward. To test this, we redesigned a typical comparative genomics algorithm, the reciprocal smallest distance algorithm (RSD), to run within Amazon's Elastic Computing Cloud (EC2). We then employed the RSD-cloud for ortholog calculations across a wide selection of fully sequenced genomes. We ran more than 300,000 RSD-cloud processes within the EC2. These jobs were farmed simultaneously to 100 high capacity compute nodes using the Amazon Web Service Elastic Map Reduce and included a wide mix of large and small genomes. The total computation time took just under 70 hours and cost a total of $6,302 USD. The effort to transform existing comparative genomics algorithms from local compute infrastructures is not trivial. However, the speed and flexibility of cloud computing environments provides a substantial boost with manageable cost. The procedure designed to transform the RSD algorithm into a cloud-ready application is readily adaptable to similar comparative genomics problems.
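The headline figures above imply simple unit costs; a sketch of that arithmetic, using only the numbers reported in the abstract (the 70-hour runtime is stated as "just under 70 hours" and is treated as exactly 70 here):

```python
# Figures reported for the RSD-cloud run on EC2.
total_cost_usd = 6302.0
processes = 300_000
nodes = 100
wall_clock_hours = 70.0  # approximation of "just under 70 hours"

cost_per_process = total_cost_usd / processes
node_hours = nodes * wall_clock_hours
cost_per_node_hour = total_cost_usd / node_hours

print(f"~${cost_per_process:.3f} per RSD process, "
      f"~${cost_per_node_hour:.2f} per node-hour")
```

Roughly two cents per ortholog computation is the kind of unit economics that makes the cloud approach attractive compared with provisioning a local cluster for peak load.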
DEFF Research Database (Denmark)
Bogetoft, Peter; Hougaard, Jens Leth; Smilgins, Aleksandrs
2016-01-01
This paper deals with empirical computation of Aumann–Shapley cost shares for joint production. We show that if one uses a mathematical programming approach with its non-parametric estimation of the cost function, there may be observations in the data set for which we have multiple Aumann–Shapley p...
Robin H. Kay; Sharon Lauricella
2011-01-01
Because of decreased prices, increased convenience, and wireless access, an increasing number of college and university students are using laptop computers in their classrooms. This recent trend has forced instructors to address the educational consequences of using these mobile devices. The purpose of the current study was to analyze and assess beneficial and challenging laptop behaviours in higher education classrooms. Both quantitative and qualitative data were collected from 177 undergrad...
Tuberculosis screening of travelers to higher-incidence countries: A cost-effectiveness analysis
Directory of Open Access Journals (Sweden)
Menzies Dick
2008-06-01
Full Text Available Abstract Background Travelers to countries with high tuberculosis incidence can acquire infection during travel. We sought to compare four screening interventions for travelers from low-incidence countries who visit countries with varying tuberculosis incidence. Methods Decision analysis model: We considered hypothetical cohorts of 1,000 travelers, 21 years old, visiting Mexico, the Dominican Republic, or Haiti for three months. Travelers departed from and returned to the United States or Canada; they were born in the United States, Canada, or the destination countries. The time horizon was 20 years, with 3% annual discounting of future costs and outcomes. The analysis was conducted from the health care system perspective. Screening involved tuberculin skin testing post-travel (in three strategies, with baseline pre-travel tests in two) or chest radiography post-travel (one strategy). Returning travelers with tuberculin conversion (one strategy) or other evidence of latent tuberculosis (three strategies) were offered treatment. The main outcome was cost (in 2005 US dollars) per tuberculosis case prevented. Results For all travelers, a single post-trip tuberculin test was most cost-effective. The associated cost estimate per case prevented ranged from $21,406 for Haitian-born travelers to Haiti, to $161,196 for US-born travelers to Mexico. In all sensitivity analyses, the single post-trip tuberculin test remained most cost-effective. For US-born travelers to Haiti, this strategy was associated with cost savings for trips over 22 months. Screening was more cost-effective with increasing trip duration and infection risk, and less so with poorer treatment adherence. Conclusion A single post-trip tuberculin skin test was the most cost-effective strategy considered, for travelers from the United States or Canada. The analysis did not evaluate the use of interferon-gamma release assays, which would be most relevant for travelers who received BCG
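The 3% annual discounting of future costs and outcomes used in the model works as follows; a minimal sketch (the dollar amount and horizon here are hypothetical, the 3% rate and 20-year horizon match the model):

```python
def present_value(amount, years_in_future, rate=0.03):
    """Discount a future cost or health outcome back to today at a fixed
    annual rate (3% per year, as in the decision model above)."""
    return amount / (1 + rate) ** years_in_future

# A $10,000 treatment cost incurred 20 years from now counts for roughly
# $5,537 in today's dollars at 3% annual discounting.
print(round(present_value(10_000, 20)))
```

Discounting is why costs and cases prevented far in the future weigh less in the cost-per-case-prevented ratio than those occurring shortly after travel.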
DECOST: computer routine for decommissioning cost and funding analysis
International Nuclear Information System (INIS)
Mingst, B.C.
1979-12-01
One of the major controversies surrounding the decommissioning of nuclear facilities is the lack of financial information on just what the eventual costs will be. The Nuclear Regulatory Commission has studies underway to analyze the costs of decommissioning of nuclear fuel cycle facilities and some other similar studies have also been done by other groups. These studies all deal only with the final cost outlays needed to finance decommissioning in an unchangeable set of circumstances. Funding methods and planning to reduce the costs and financial risks are usually not attempted. The DECOST program package is intended to fill this void and allow wide-ranging study of the various options available when planning for the decommissioning of nuclear facilities
International Nuclear Information System (INIS)
Spencer, VN
2001-01-01
An investigation has been conducted regarding the ability of clustered personal computers to improve the performance of executing software simulations for solving engineering problems. The power and utility of personal computers continues to grow exponentially through advances in computing capabilities such as newer microprocessors, advances in microchip technologies, electronic packaging, and cost-effective gigabyte-size hard drive capacity. Many engineering problems require significant computing power. Therefore, the computation has to be done by high-performance computer systems that cost millions of dollars and need gigabytes of memory to complete the task. Alternatively, it is feasible to provide adequate computing in the form of clustered personal computers. This method cuts the cost and size by linking (clustering) personal computers together across a network. Clusters also have the advantage that they can be used as stand-alone computers when they are not operating as a parallel computer. Parallel computing software to exploit clusters is available for computer operating systems like Unix, Windows NT, or Linux. This project concentrates on the use of Windows NT and the Parallel Virtual Machine (PVM) system to solve an engineering dynamics problem in Fortran.
Gupta, Nishant; Langenderfer, Dale; McCormack, Francis X; Schauer, Daniel P; Eckman, Mark H
2017-01-01
Patients without a known history of lung disease presenting with a spontaneous pneumothorax are generally diagnosed as having primary spontaneous pneumothorax. However, occult diffuse cystic lung diseases such as Birt-Hogg-Dubé syndrome (BHD), lymphangioleiomyomatosis (LAM), and pulmonary Langerhans cell histiocytosis (PLCH) can also first present with a spontaneous pneumothorax, and their early identification by high-resolution computed tomographic (HRCT) chest imaging has implications for subsequent management. The objective of our study was to evaluate the cost-effectiveness of HRCT chest imaging to facilitate early diagnosis of LAM, BHD, and PLCH. We constructed a Markov state-transition model to assess the cost-effectiveness of screening HRCT to facilitate early diagnosis of diffuse cystic lung diseases in patients presenting with an apparent primary spontaneous pneumothorax. Baseline data for prevalence of BHD, LAM, and PLCH and rates of recurrent pneumothoraces in each of these diseases were derived from the literature. Costs were extracted from 2014 Medicare data. We compared a strategy of HRCT screening followed by pleurodesis in patients with LAM, BHD, or PLCH versus conventional management with no HRCT screening. In our base case analysis, screening for the presence of BHD, LAM, or PLCH in patients presenting with a spontaneous pneumothorax was cost effective, with a marginal cost-effectiveness ratio of $1,427 per quality-adjusted life-year gained. Sensitivity analysis showed that screening HRCT remained cost effective for diffuse cystic lung diseases prevalence as low as 0.01%. HRCT image screening for BHD, LAM, and PLCH in patients with apparent primary spontaneous pneumothorax is cost effective. Clinicians should consider performing a screening HRCT in patients presenting with apparent primary spontaneous pneumothorax.
International Nuclear Information System (INIS)
Thomson, J.M.Z.; Maling, T.M.J.; Glocer, J.; Mark, S.; Abbott, C.
2001-01-01
The equivalent sensitivity of non-contrast computed tomography (NCCT) and intravenous urography (IVU) in the diagnosis of suspected ureteric colic has been established. Approximately 50% of patients with suspected ureteric colic do not have a nephro-urological cause for pain. Because many such patients require further imaging studies, NCCT may obviate the need for these studies and, in so doing, be more cost effective and involve less overall radiation exposure. The present study compares the total imaging cost and radiation dose of NCCT versus IVU in the diagnosis of acute flank pain. Two hundred and twenty-four patients (157 men; mean age 45 years; age range 19-79 years) with suspected renal colic were randomized either to NCCT or IVU. The number of additional diagnostic imaging studies, cost (IVU A$ 136; CTU A$ 173), radiation exposure and imaging times were compared. Of 119(53%) patients with renal obstruction, 105 had no nephro-urological causes of pain. For 21 (20%) of these patients an alternative diagnosis was made at the initial imaging, 10 of which were significant. Of 118 IVU patients, 28 (24%) required 32 additional imaging tests to reach a diagnosis, whereas seven of 106 (6%) NCCT patients required seven additional imaging studies. The average total diagnostic imaging cost for the NCCT group was A$181.94 and A$175.46 for the IVU group (P< 0.43). Mean radiation dose to diagnosis was 5.00 mSv (NCCT) versus 3.50 mSv (IVU) (P < 0.001). Mean imaging time was 30 min (NCCT) versus 75 min (IVU) (P < 0.001). Diagnostic imaging costs were remarkably similar. Although NCCT involves a higher radiation dose than IVU, its advantages of faster diagnosis, the avoidance of additional diagnostic imaging tests and its ability to diagnose other causes makes it the study of choice for acute flank pain at Christchurch Hospital. Copyright (2001) Blackwell Science Pty Ltd
Directory of Open Access Journals (Sweden)
Guvenc Kockaya
2011-01-01
Full Text Available Introduction: The current study was designed to estimate the direct cost of noncompliance of diabetes patients to the US health system. Understanding these expenses can inform screening and education budget policy regarding expenditure levels that can be calculated to be cost-beneficial. Materials and Methods: The study was conducted in three parts. First, a computer search of National Institutes of Health websites and professional society websites for organizations with members that treat diabetes, and a PubMed search, were performed to obtain the numbers required for calculations. Second, formulas were developed to estimate the risk of non-compliance and undiagnosed diabetes. Third, risk calculations were performed using the information obtained in part one and the formulas developed in part two. Results: Direct risk reductions for diabetes-related kidney disease, stroke, heart disease, and amputation were estimated for 100% compliance with diabetes treatment. The risk, case and yearly cost reductions calculated for 100% compliance with diabetes treatment were 13.6%, 0.9 million and US$9.3 billion, respectively. Conclusion: Society, insurers, policy makers and other stakeholders could invest up to these amounts in screening, education and prevention efforts in an effort to reduce these costly and traumatic sequelae of noncompliant diabetes patients. Type: Original Research
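The abstract's headline figures imply an average direct cost per avoided case, which can be checked with one division. This is a back-of-the-envelope inference from the reported outputs, not a formula taken from the paper:

```python
# Reported outputs from the abstract (the paper's actual formulas are not given here).
risk_reduction = 0.136   # 13.6% direct risk reduction at 100% compliance
cases_avoided = 0.9e6    # 0.9 million cases per year
yearly_savings = 9.3e9   # US$ 9.3 billion per year

# Implied average direct cost per avoided case (an inference, not a stated figure):
cost_per_case = yearly_savings / cases_avoided
print(f"implied cost per avoided case: ${cost_per_case:,.0f}")
```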
DEFF Research Database (Denmark)
Nielsen, Lene H; Olsen, Jens; Markenvard, John
2013-01-01
AIMS: The aim of this study was to investigate in patients with stable angina the effects on costs of frontline diagnostics by exercise-stress testing (ex-test) vs. coronary computed tomography angiography (CTA). METHODS AND RESULTS: In two coronary units at Lillebaelt Hospital, Denmark, 498 patients were identified in whom either ex-test (n = 247) or CTA (n = 251) were applied as the frontline diagnostic strategy in symptomatic patients with a low-intermediate pre-test probability of coronary artery disease (CAD). During 12 months of follow-up, death, myocardial infarction and costs ... group. The mean (SD) total costs per patient at the end of the follow-up were 14% lower in the CTA group than in the ex-test group, €1510 (3474) vs. €1777 (3746) (P = 0.03). CONCLUSION: Diagnostic assessment of symptomatic patients with a low-intermediate probability of CAD by CTA incurred lower costs ...
Advanced Computational Methods for Thermal Radiative Heat Transfer
Energy Technology Data Exchange (ETDEWEB)
Tencer, John; Carlberg, Kevin Thomas; Larsen, Marvin E.; Hogan, Roy E.,
2016-10-01
Participating media radiation (PMR) in weapon safety calculations for abnormal thermal environments is too costly to compute routinely. This cost may be substantially reduced by applying reduced order modeling (ROM) techniques. The application of ROM to PMR is a new and unique approach for this class of problems. This approach was investigated by the authors and shown to provide significant reductions in the computational expense associated with typical PMR simulations. Once this technology is migrated into production heat transfer analysis codes, this capability will enable the routine use of PMR heat transfer in higher-fidelity simulations of weapon response in fire environments.
On the Clouds: A New Way of Computing
Directory of Open Access Journals (Sweden)
Yan Han
2010-06-01
Full Text Available This article introduces cloud computing and discusses the author’s experience “on the clouds.” The author reviews cloud computing services and providers, then presents his experience of running multiple systems (e.g., integrated library systems, content management systems, and repository software). He evaluates costs, discusses advantages, and addresses some issues about cloud computing. Cloud computing fundamentally changes the ways institutions and companies manage their computing needs. Libraries can take advantage of cloud computing to start an IT project with low cost, to manage computing resources cost-effectively, and to explore new computing possibilities.
Server Operation and Virtualization to Save Energy and Cost in Future Sustainable Computing
Directory of Open Access Journals (Sweden)
Jun-Ho Huh
2018-06-01
Full Text Available Since the introduction of the LTE (Long Term Evolution) service, we have lived in a time of expanding amounts of data. The amount of data produced has increased every year, with the growth of smartphone distribution in particular. Telecommunication service providers have to struggle to secure sufficient network capacity in order to maintain quick access to necessary data by consumers. Nonetheless, maintaining the maximum capacity and bandwidth at all times requires considerable cost and excessive equipment. Therefore, to solve such a problem, telecommunication service providers need to maintain an appropriate level of network capacity and to provide sustainable service to customers through quick network development in case of shortage. So far, telecommunication service providers have bought and used the network equipment produced by network equipment manufacturers such as Ericsson, Nokia, Cisco, and Samsung. Such equipment is specialized for networking and satisfies consumers with its excellent performance, but it is very costly because it is developed with advanced technologies. Moreover, procurement takes much time, since the telecommunication service provider places an order and the manufacturer then produces and delivers the equipment. In addition, the diversity of IoT devices creates cases that require signaling and two-way data traffic as well as capacity. For these purposes, the need for NFV (Network Function Virtualization) arises. Equipment virtualization is performed so that the function runs on an x86-based compatible server instead of on the network equipment manufacturer’s dedicated hardware. By operating on compatible servers, providers can reduce the wastage of hardware and cope with change thanks to quick hardware development. This study proposed an efficient system for reducing cost in network server operation using such NFV technology and found that the cost was reduced by 24
Li, Sean S; Copeland-Halperin, Libby R; Kaminsky, Alexander J; Li, Jihui; Lodhi, Fahad K; Miraliakbari, Reza
2018-06-01
Computer-aided surgical simulation (CASS) has redefined surgery, improved precision and reduced the reliance on intraoperative trial-and-error manipulations. CASS is provided by third-party services; however, it may be cost-effective for some hospitals to develop in-house programs. This study provides the first cost analysis comparison among traditional (no CASS), commercial CASS, and in-house CASS for head and neck reconstruction. The costs of three-dimensional (3D) pre-operative planning for mandibular and maxillary reconstructions were obtained from an in-house CASS program at our large tertiary care hospital in Northern Virginia, as well as a commercial provider (Synthes, Paoli, PA). A cost comparison was performed among these modalities and extrapolated in-house CASS costs were derived. The calculations were based on estimated CASS use with cost structures similar to our institution and sunk costs were amortized over 10 years. Average operating room time was estimated at 10 hours, with an average of 2 hours saved with CASS. The hourly cost to the hospital for the operating room (including anesthesia and other ancillary costs) was estimated at $4,614/hour. Per case, traditional cases were $46,140, commercial CASS cases were $40,951, and in-house CASS cases were $38,212. Annual in-house CASS costs were $39,590. CASS reduced operating room time, likely due to improved efficiency and accuracy. Our data demonstrate that hospitals with similar cost structure as ours, performing greater than 27 cases of 3D head and neck reconstructions per year can see a financial benefit from developing an in-house CASS program. Thieme Medical Publishers 333 Seventh Avenue, New York, NY 10001, USA.
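The operating-room arithmetic in the study can be reproduced directly from the stated figures; the per-case planning fees at the end are inferred by subtraction and are not numbers the paper states explicitly:

```python
# Figures as reported in the study.
or_rate = 4614                       # US$ per OR hour (incl. anesthesia/ancillary costs)
traditional_hours = 10               # average OR time without CASS
cass_hours = traditional_hours - 2   # CASS saves ~2 hours per case

traditional_case = or_rate * traditional_hours   # $46,140 per case (matches the paper)
or_cost_with_cass = or_rate * cass_hours         # $36,912 of OR time per CASS case

# Implied per-case planning costs, inferred by subtracting OR time from the
# reported per-case totals ($40,951 commercial, $38,212 in-house):
commercial_fee = 40951 - or_cost_with_cass       # implied commercial CASS fee
inhouse_variable = 38212 - or_cost_with_cass     # implied in-house per-case cost
print(traditional_case, commercial_fee, inhouse_variable)
```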
Cloud computing for comparative genomics
Directory of Open Access Journals (Sweden)
Pivovarov Rimma
2010-05-01
Full Text Available Abstract Background Large comparative genomics studies and tools are becoming increasingly more compute-expensive as the number of available genome sequences continues to rise. The capacity and cost of local computing infrastructures are likely to become prohibitive with the increase, especially as the breadth of questions continues to rise. Alternative computing architectures, in particular cloud computing environments, may help alleviate this increasing pressure and enable fast, large-scale, and cost-effective comparative genomics strategies going forward. To test this, we redesigned a typical comparative genomics algorithm, the reciprocal smallest distance algorithm (RSD), to run within Amazon's Elastic Computing Cloud (EC2). We then employed the RSD-cloud for ortholog calculations across a wide selection of fully sequenced genomes. Results We ran more than 300,000 RSD-cloud processes within the EC2. These jobs were farmed simultaneously to 100 high capacity compute nodes using the Amazon Web Service Elastic Map Reduce and included a wide mix of large and small genomes. The total computation time took just under 70 hours and cost a total of $6,302 USD. Conclusions The effort to transform existing comparative genomics algorithms from local compute infrastructures is not trivial. However, the speed and flexibility of cloud computing environments provides a substantial boost with manageable cost. The procedure designed to transform the RSD algorithm into a cloud-ready application is readily adaptable to similar comparative genomics problems.
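The reported totals translate into simple unit costs. The node-hour figure below is an upper bound that assumes all 100 nodes ran for the full wall-clock time, which the abstract does not state:

```python
# Totals reported in the abstract.
total_cost = 6302.0          # US$
processes = 300_000          # RSD-cloud jobs
nodes, wall_hours = 100, 70  # 100 EC2 nodes, just under 70 hours

cost_per_process = total_cost / processes    # ≈ $0.021 per ortholog job
node_hours = nodes * wall_hours              # 7,000 node-hours (upper bound)
cost_per_node_hour = total_cost / node_hours # ≈ $0.90 per node-hour
print(cost_per_process, node_hours, cost_per_node_hour)
```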
Computational Sensing Using Low-Cost and Mobile Plasmonic Readers Designed by Machine Learning
Ballard, Zachary S.
2017-01-27
Plasmonic sensors have been used for a wide range of biological and chemical sensing applications. Emerging nanofabrication techniques have enabled these sensors to be cost-effectively mass manufactured onto various types of substrates. To accompany these advances, major improvements in sensor read-out devices must also be achieved to fully realize the broad impact of plasmonic nanosensors. Here, we propose a machine learning framework which can be used to design low-cost and mobile multispectral plasmonic readers that do not use traditionally employed bulky and expensive stabilized light sources or high-resolution spectrometers. By training a feature selection model over a large set of fabricated plasmonic nanosensors, we select the optimal set of illumination light-emitting diodes needed to create a minimum-error refractive index prediction model, which statistically takes into account the varied spectral responses and fabrication-induced variability of a given sensor design. This computational sensing approach was experimentally validated using a modular mobile plasmonic reader. We tested different plasmonic sensors with hexagonal and square periodicity nanohole arrays and revealed that the optimal illumination bands differ from those that are “intuitively” selected based on the spectral features of the sensor, e.g., transmission peaks or valleys. This framework provides a universal tool for the plasmonics community to design low-cost and mobile multispectral readers, helping the translation of nanosensing technologies to various emerging applications such as wearable sensing, personalized medicine, and point-of-care diagnostics. Beyond plasmonics, other types of sensors that operate based on spectral changes can broadly benefit from this approach, including e.g., aptamer-enabled nanoparticle assays and graphene-based sensors, among others.
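The paper's feature-selection model and data are not reproduced in the abstract; the sketch below only illustrates the general pattern on synthetic data: greedily select the illumination bands that minimize least-squares refractive-index prediction error, under an assumed LED budget of three:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in data: per-band transmission readings for many fabricated
# sensors, and a refractive index that depends on two of the bands.
n_sensors, n_bands = 200, 12
X = rng.standard_normal((n_sensors, n_bands))
y = 1.33 + 0.05 * X[:, 2] - 0.03 * X[:, 7] + 0.01 * rng.standard_normal(n_sensors)

def fit_error(cols):
    """RMS error of a linear refractive-index model using only `cols` bands."""
    A = np.column_stack([X[:, list(cols)], np.ones(n_sensors)])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return np.sqrt(np.mean((A @ coef - y) ** 2))

# Greedy forward selection: at each step add the band that helps most.
selected, errors = [], []
for _ in range(3):  # assumed LED budget
    best = min((b for b in range(n_bands) if b not in selected),
               key=lambda b: fit_error(selected + [b]))
    selected.append(best)
    errors.append(fit_error(selected))
print(selected, errors)
```

Adding a column to a least-squares fit can never increase the training residual, so the error sequence is non-increasing by construction; the paper's point is that the bands chosen this way need not coincide with the visually obvious spectral peaks or valleys.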
Performance Assessment of a Custom, Portable, and Low-Cost Brain-Computer Interface Platform.
McCrimmon, Colin M; Fu, Jonathan Lee; Wang, Ming; Lopes, Lucas Silva; Wang, Po T; Karimi-Bidhendi, Alireza; Liu, Charles Y; Heydari, Payam; Nenadic, Zoran; Do, An Hong
2017-10-01
Conventional brain-computer interfaces (BCIs) are often expensive, complex to operate, and lack portability, which confines their use to laboratory settings. Portable, inexpensive BCIs can mitigate these problems, but it remains unclear whether their low-cost design compromises their performance. Therefore, we developed a portable, low-cost BCI and compared its performance to that of a conventional BCI. The BCI was assembled by integrating a custom electroencephalogram (EEG) amplifier with an open-source microcontroller and a touchscreen. The function of the amplifier was first validated against a commercial bioamplifier, followed by a head-to-head comparison between the custom BCI (using four EEG channels) and a conventional 32-channel BCI. Specifically, five able-bodied subjects were cued to alternate between hand opening/closing and remaining motionless while the BCI decoded their movement state in real time and provided visual feedback through a light emitting diode. Subjects repeated the above task for a total of 10 trials, and were unaware of which system was being used. The performance in each trial was defined as the temporal correlation between the cues and the decoded states. The EEG data simultaneously acquired with the custom and commercial amplifiers were visually similar and highly correlated ( ρ = 0.79). The decoding performances of the custom and conventional BCIs averaged across trials and subjects were 0.70 ± 0.12 and 0.68 ± 0.10, respectively, and were not significantly different. The performance of our portable, low-cost BCI is comparable to that of the conventional BCIs. Platforms, such as the one developed here, are suitable for BCI applications outside of a laboratory.
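The per-trial performance metric described above is just the temporal (Pearson) correlation between the cue sequence and the decoded movement state. The data below are illustrative, not from the study:

```python
import numpy as np

# 0 = rest cue / decoded rest, 1 = move cue / decoded movement.
cues    = np.array([0, 0, 1, 1, 1, 0, 0, 1, 1, 0])
decoded = np.array([0, 0, 1, 1, 0, 0, 0, 1, 1, 0])  # one decoding error

# Trial performance as defined in the study: correlation of the two sequences.
performance = np.corrcoef(cues, decoded)[0, 1]
print(f"trial performance: {performance:.2f}")
```

A perfect decode gives a performance of 1.0; values around 0.7, as reported for both BCIs, correspond to decoded states that track the cues with occasional errors.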
Computer technology and computer programming research and strategies
Antonakos, James L
2011-01-01
Covering a broad range of new topics in computer technology and programming, this volume discusses encryption techniques, SQL generation, Web 2.0 technologies, and visual sensor networks. It also examines reconfigurable computing, video streaming, animation techniques, and more. Readers will learn about an educational tool and game to help students learn computer programming. The book also explores a new medical technology paradigm centered on wireless technology and cloud computing designed to overcome the problems of increasing health technology costs.
Hobden, Breanne; Bryant, Jamie; Carey, Mariko; Sanson-Fisher, Rob; Oldmeadow, Christopher
2017-08-01
Both computerised and telephone surveys have potential advantages for research data collection. The current study aimed to determine the: (i) feasibility, (ii) acceptability, and (iii) cost per completed survey of computer tablet versus telephone data collection for clients attending an outpatient drug and alcohol treatment clinic. Two-arm randomised controlled trial. Clients attending a drug and alcohol outpatient clinic in New South Wales, Australia, were randomised to complete a baseline survey via computer tablet in the clinic or via telephone interview within two weeks of their appointment. All participants completed a three-month follow-up survey via telephone. Consent and completion rates for the baseline survey were significantly higher in the computer tablet condition. The time taken to complete the computer tablet survey was lower (11min) than the telephone condition (17min). There were no differences in the proportion of consenters or completed follow-up surveys between the two conditions at the 3-month follow-up. Acceptability was high across both modes of data collection. The cost of the computer tablet condition was $67.52 greater per completed survey than the telephone condition. There is a trade-off between computer tablet and telephone data collection. While both data collection methods were acceptable to participants, the computer tablet condition resulted in higher consent and completion rates at baseline, therefore yielding greater external validity, and was quicker for participants to complete. Telephone data collection was however, more cost-effective. Researchers should carefully consider the mode of data collection that suits individual study needs. Copyright © 2017 Elsevier Ltd. All rights reserved.
The cost-effectiveness of methanol for reducing motor vehicle emissions and urban ozone
International Nuclear Information System (INIS)
Krupnick, A.J.; Walls, M.A.
1992-01-01
This article analyzes the costs and emissions characteristics of methanol vehicles. The cost-effectiveness of methanol - the cost per ton of reactive hydrocarbon emissions reduced - is calculated and compared to the cost-effectiveness of other hydrocarbon reduction strategies. Methanol is found to cost from $33,000 to nearly $60,000 per ton, while several other options are available for under $10,000 per ton. The cost per part-per-million reduction in peak ambient ozone levels is also computed for two cities, Houston and Philadelphia. Despite the greater improvement in ozone in Philadelphia than Houston, methanol is found to be more cost-effective in Houston. This result occurs because Houston's distribution and marketing costs are lower than Philadelphia's. The costs in both cities, however, are far higher than estimates of the benefits from acute health improvements. Finally, the reduction in ozone exposure in Los Angeles is estimated and the costs of the reduction compared with an estimate of acute health benefits. Again, the benefits fall far short of the costs. 51 refs., 5 tabs
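The article's cost-effectiveness metric reduces to a single ratio. The inputs below are hypothetical; the article itself reports results of roughly $33,000-$60,000 per ton for methanol versus under $10,000 per ton for several alternatives:

```python
def cost_per_ton(incremental_annual_cost, tons_hc_reduced_per_year):
    """Cost-effectiveness: dollars per ton of reactive hydrocarbons reduced."""
    return incremental_annual_cost / tons_hc_reduced_per_year

# Hypothetical program: $4M incremental annual cost, 100 tons reduced per year.
print(cost_per_ton(4.0e6, 100))  # $40,000 per ton
```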
Ubiquitous Computing: The Universal Use of Computers on College Campuses.
Brown, David G., Ed.
This book is a collection of vignettes from 13 universities where everyone on campus has his or her own computer. These 13 institutions have instituted "ubiquitous computing" in very different ways at very different costs. The chapters are: (1) "Introduction: The Ubiquitous Computing Movement" (David G. Brown); (2) "Dartmouth College" (Malcolm…
How to Bill Your Computer Services.
Dooskin, Herbert P.
1981-01-01
A computer facility billing procedure should be designed so that the full costs of a computer center operation are equitably charged to the users. Design criteria, costing methods, and management's role are discussed. (Author/MLF)
Effect of migration based on strategy and cost on the evolution of cooperation
International Nuclear Information System (INIS)
Li, Yan; Ye, Hang
2015-01-01
Highlights: •Propose a migration based on strategy and cost in the Prisoner’s Dilemma Game. •The level of cooperation without mutation is higher than that with mutation. •Increased costs have no effect on the level of cooperation without mutation. •The level of cooperation decreases with the increase in cost with mutation. •An optimal density value ρ resulting in the maximum level of cooperation exists. -- Abstract: Humans consider not only their own ability but also the environment around them during the process of migration. Based on this fact, we introduce migration based on strategy and cost into the Spatial Prisoner’s Dilemma Game on a two-dimensional grid. The migration means that agents cannot move when all of the neighbors are cooperators; otherwise, agents move with a probability related to payoff and cost. The result obtained by the computer simulation shows that the moving mechanism based on strategy and cost improves the level of cooperation in a wide parameter space. This occurs because movement based on strategy effectively keeps the cooperative clusters and because movement based on cost effectively regulates the rate of movement. Both types of movement provide a favorable guarantee for the evolution of stable cooperation under the mutation rate q = 0.0. In addition, we discuss the effectiveness of the migration mechanism in the evolution of cooperation under the mutation rate q = 0.001. The result indicates that a higher level of cooperation is obtained at a lower migration cost, whereas cooperation is suppressed at a higher migration cost. Our work may provide an effective method for understanding the emergence of cooperation in our society
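The migration rule described above can be sketched as a minimal simulation: on a grid with empty sites, an agent surrounded entirely by cooperators stays put; otherwise it moves to a random empty neighbouring site with a probability tied to its payoff and the migration cost. The exact functional form of that probability, and all parameter values, are assumptions for illustration, not the paper's:

```python
import random

random.seed(0)
SIZE, DENSITY, B, COST = 20, 0.6, 1.3, 0.2  # B = defector's temptation payoff

# Occupied cells only; missing keys are empty sites.
grid = {}
for x in range(SIZE):
    for y in range(SIZE):
        if random.random() < DENSITY:
            grid[(x, y)] = random.choice("CD")

def neighbours(x, y):
    return [((x + dx) % SIZE, (y + dy) % SIZE)
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))]

def payoff(pos):
    """Weak Prisoner's Dilemma payoff against occupied neighbours."""
    s, total = grid[pos], 0.0
    for n in neighbours(*pos):
        if n in grid and grid[n] == "C":
            total += 1.0 if s == "C" else B
    return total

def migration_step():
    for pos in list(grid):
        nbrs = [n for n in neighbours(*pos) if n in grid]
        if nbrs and all(grid[n] == "C" for n in nbrs):
            continue  # all neighbours cooperate: the agent cannot move
        empties = [n for n in neighbours(*pos) if n not in grid]
        if not empties:
            continue
        # Assumed form: lower payoff and higher cost make moving more likely.
        p_move = max(0.0, 1.0 - (payoff(pos) - COST) / (4 * B))
        if random.random() < min(1.0, p_move):
            grid[random.choice(empties)] = grid.pop(pos)

n_agents = len(grid)
for _ in range(10):
    migration_step()
assert len(grid) == n_agents  # migration only relocates agents, never creates or removes them
```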
Cloud Computing for radiologists.
Kharat, Amit T; Safvi, Amjad; Thind, Ss; Singh, Amarjit
2012-07-01
Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources, such as computer software and hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It frees radiology from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues, which need to be addressed to ensure the success of Cloud computing in the future.
DEFF Research Database (Denmark)
Bogetoft, Peter; Hougaard, Jens Leth; Smilgins, Aleksandrs
2016-01-01
This paper deals with empirical computation of Aumann–Shapley cost shares for joint production. We show that if one uses a mathematical programming approach with its non-parametric estimation of the cost function there may be observations in the data set for which we have multiple Aumann–Shapley prices ... of assumptions concerning firm behavior. These assumptions enable us to connect inefficient with efficient production and thereby provide consistent ways of allocating the costs arising from inefficiency.
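For a smooth cost function, the Aumann–Shapley rule prices each good at its marginal cost averaged along the diagonal, p_i = ∫₀¹ ∂C/∂x_i(t·x) dt, and charges good i the amount x_i·p_i. The paper's non-parametric (efficiency-frontier) setting is not reproduced here; the cost function below is purely illustrative:

```python
import numpy as np

def aumann_shapley_shares(C, x, steps=10_000, eps=1e-6):
    """Aumann-Shapley cost shares for a smooth cost function C at demand x."""
    x = np.asarray(x, dtype=float)
    t = (np.arange(steps) + 0.5) / steps  # midpoint rule on (0, 1)
    shares = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = eps
        # dC/dx_i along the diagonal t*x, by central differences, then averaged over t.
        grad = np.array([(C(ti * x + e) - C(ti * x - e)) / (2 * eps) for ti in t])
        shares[i] = x[i] * grad.mean()
    return shares

C = lambda v: v[0] ** 2 + v[1] ** 2 + v[0] * v[1]  # simple joint-cost function
x = np.array([2.0, 3.0])
shares = aumann_shapley_shares(C, x)
print(shares, shares.sum(), C(x))
```

Budget balance is the key sanity check: the shares sum to the total cost C(x) (here 7 + 12 = 19).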
The role of dedicated data computing centers in the age of cloud computing
Caramarcu, Costin; Hollowell, Christopher; Strecker-Kellogg, William; Wong, Antonio; Zaytsev, Alexandr
2017-10-01
Brookhaven National Laboratory (BNL) anticipates significant growth in scientific programs with large computing and data storage needs in the near future and has recently reorganized support for scientific computing to meet these needs. A key component is the enhanced role of the RHIC-ATLAS Computing Facility (RACF) in support of high-throughput and high-performance computing (HTC and HPC) at BNL. This presentation discusses the evolving role of the RACF at BNL, in light of its growing portfolio of responsibilities and its increasing integration with cloud (academic and for-profit) computing activities. We also discuss BNL’s plan to build a new computing center to support the new responsibilities of the RACF and present a summary of the cost benefit analysis done, including the types of computing activities that benefit most from a local data center vs. cloud computing. This analysis is partly based on an updated cost comparison of Amazon EC2 computing services and the RACF, which was originally conducted in 2012.
Chiappa, Pierangelo
Bandwidth-hungry services, such as higher speed Internet, voice over IP (VoIP), and IPTV, allow people to exchange and store huge amounts of data among worldwide locations. In the age of global communications, domestic users, companies, and organizations around the world generate new content, making bandwidth needs grow exponentially, along with the need for new services. These bandwidth and connectivity demands represent a concern for operators, who require innovative technologies to be ready for scaling. To respond efficiently to these demands, Alcatel-Lucent is moving fast toward photonic integrated circuit technologies as the key to delivering the best performance at the lowest cost per bit per second. This article describes Alcatel-Lucent's strategic directions and achievements, as well as possible new developments.
Patrizi, Alfredo; Pennestrì, Ettore; Valentini, Pier Paolo
2016-01-01
The paper deals with the comparison between a high-end marker-based acquisition system and a low-cost marker-less methodology for the assessment of the human posture during working tasks. The low-cost methodology is based on the use of a single Microsoft Kinect V1 device. The high-end acquisition system is the BTS SMART that requires the use of reflective markers to be placed on the subject's body. Three practical working activities involving object lifting and displacement have been investigated. The operational risk has been evaluated according to the lifting equation proposed by the US National Institute for Occupational Safety and Health. The results of the study show that the risk multipliers computed from the two acquisition methodologies are very close for all the analysed activities. In agreement with this outcome, the marker-less methodology based on the Microsoft Kinect V1 device seems very promising to promote the dissemination of computer-aided assessment of ergonomics while maintaining good accuracy and affordable costs. PRACTITIONER’S SUMMARY: The study is motivated by the increasing interest for on-site working ergonomics assessment. We compared a low-cost marker-less methodology with a high-end marker-based system. We tested them on three different working tasks, assessing the working risk of lifting loads. The two methodologies showed comparable precision in all the investigations.
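The NIOSH revised lifting equation used for the risk assessment multiplies a load constant by posture-dependent multipliers: RWL = LC·HM·VM·DM·AM·FM·CM. The sketch below uses the metric form of the multiplier formulas; the frequency (FM) and coupling (CM) multipliers come from NIOSH lookup tables and are passed in directly:

```python
LC = 23.0  # load constant, kg (metric form of the equation)

def recommended_weight_limit(H, V, D, A, FM=1.0, CM=1.0):
    """H: horizontal distance (cm), V: vertical height (cm),
    D: vertical travel distance (cm), A: asymmetry angle (degrees)."""
    HM = min(1.0, 25.0 / H)           # horizontal multiplier
    VM = 1.0 - 0.003 * abs(V - 75.0)  # vertical multiplier
    DM = min(1.0, 0.82 + 4.5 / D)     # distance multiplier
    AM = 1.0 - 0.0032 * A             # asymmetry multiplier
    return LC * HM * VM * DM * AM * FM * CM

def lifting_index(load_kg, rwl):
    """LI > 1 indicates elevated risk for the lifting task."""
    return load_kg / rwl

# Ideal posture: every multiplier is 1, so RWL equals the 23 kg load constant.
rwl = recommended_weight_limit(H=25, V=75, D=25, A=0)
print(rwl, lifting_index(10.0, rwl))
```

In the study, the relevant comparison is that H, V, D and A estimated from the Kinect skeleton and from the marker-based system yield nearly identical multipliers, hence nearly identical lifting indices.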
User's guide to SERICPAC: A computer program for calculating electric-utility avoided costs rates
Energy Technology Data Exchange (ETDEWEB)
Wirtshafter, R.; Abrash, M.; Koved, M.; Feldman, S.
1982-05-01
SERICPAC is a computer program developed to calculate average avoided cost rates for decentralized power producers and cogenerators that sell electricity to electric utilities. SERICPAC works in tandem with SERICOST, a program to calculate avoided costs, and determines the appropriate rates for the buying and selling of electricity between electric utilities and qualifying facilities (QFs) as stipulated under Section 210 of PURPA. SERICPAC contains simulation models for eight technologies including wind, hydro, biogas, and cogeneration. The simulations are converted into a diversified utility production profile, which can be either gross production or net production; the latter accounts for internal electricity usage by the QF. The program allows adjustments to the production to be made for scheduled and forced outages. The final output of the model is a technology-specific average annual rate. The report contains a description of the technologies and the simulations as well as a complete user's guide to SERICPAC.
Gaydos, Leonard
1978-01-01
Computer-aided techniques for interpreting multispectral data acquired by Landsat offer economies in the mapping of land cover. Even so, the actual establishment of the statistical classes, or "signatures," is one of the relatively more costly operations involved. Analysts have therefore been seeking cost-saving signature extension techniques that would accept training data acquired for one time or place and apply them to another. Opportunities to extend signatures occur in preprocessing steps and in the classification steps that follow. In the present example, land cover classes were derived by the simplest and most direct form of signature extension: Classes statistically derived from a Landsat scene for the Puget Sound area, Wash., were applied to the Portland area, Oreg., using data for the next Landsat scene acquired less than 25 seconds down orbit. Many features can be recognized on the reduced-scale version of the Portland land cover map shown in this report, although no statistical assessment of its accuracy is available.
Adaptive Cost-Based Task Scheduling in Cloud Environment
Directory of Open Access Journals (Sweden)
Mohammed A. S. Mosleh
2016-01-01
Full Text Available Task execution in cloud computing requires obtaining stored data from remote data centers. Though this storage process reduces the memory constraints of the user’s computer, the time deadline is a serious concern. In this paper, Adaptive Cost-based Task Scheduling (ACTS) is proposed to provide data access to the virtual machines (VMs) within the deadline without increasing the cost. ACTS considers the data access completion time for selecting the cost-effective path to access the data. To allocate data access paths, the data access completion time is computed by considering the mean and variance of the network service time and the arrival rate of network input/output requests. Task priorities are then assigned based on data access time. Finally, the costs of the data paths are analyzed and paths are allocated based on task priority. Minimum-cost paths are allocated to low-priority tasks and fast access paths are allocated to high-priority tasks so as to meet the time deadline. Thus efficient task scheduling can be achieved by using ACTS. The experimental results, conducted in terms of execution time, computation cost, communication cost, bandwidth, and CPU utilization, prove that the proposed algorithm provides better performance than the state-of-the-art methods.
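The abstract says access completion time is estimated from the mean and variance of the network service time plus the request arrival rate. A standard estimator with exactly those inputs is the Pollaczek–Khinchine formula for an M/G/1 queue; whether ACTS uses this precise formula is an assumption here, as is the toy task/priority example:

```python
def expected_access_time(arrival_rate, mean_service, var_service):
    """M/G/1 mean response time via the Pollaczek-Khinchine formula."""
    rho = arrival_rate * mean_service              # utilisation, must be < 1
    assert rho < 1, "queue is unstable"
    second_moment = var_service + mean_service ** 2  # E[S^2]
    wait = arrival_rate * second_moment / (2 * (1 - rho))
    return wait + mean_service                     # waiting time + service time

# Hypothetical tasks ranked by estimated access time; the slowest-access task
# gets the highest priority (and hence the fastest, most expensive path).
tasks = {"t1": expected_access_time(0.5, 1.0, 0.0),
         "t2": expected_access_time(0.8, 1.0, 0.5)}
by_priority = sorted(tasks, key=tasks.get, reverse=True)
print(tasks, by_priority)
```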
SOME NOTES ON COST ALLOCATION IN MULTICASTING
Directory of Open Access Journals (Sweden)
Darko Skorin-Kapov
2012-12-01
Full Text Available We analyze cost allocation strategies associated with the problem of broadcasting information from some source to a number of communication network users. A multicast routing chooses a minimum cost tree network that spans the source and all the receivers. The cost of such a network is distributed among its receivers, who may be individuals or organizations with possibly conflicting interests. Providing network developers, users and owners with practical computable 'fair' cost allocation solution procedures is of great importance for network management. Consequently, this multidisciplinary problem was extensively studied by Operational Researchers, Economists, Mathematicians and Computer Scientists. The fairness of various proposed solutions was even argued in US courts. This presentation overviews some previously published, as well as some recent, results in the development of algorithmic mechanisms to efficiently compute 'attractive' cost allocation solutions for multicast networks. Specifically, we will analyze cooperative game theory based cost allocation models that avoid cross subsidies and/or are distance and population monotonic. We will also present some related open cost allocation problems and the potential contribution that such models might make to this problem in the future.
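One classic allocation rule of the kind surveyed above is Bird's rule for minimum cost spanning tree games: run Prim's algorithm from the source and charge each receiver the cost of the edge through which it joins the tree. This is a sketch of that one rule, not of any particular mechanism the presentation proposes:

```python
import heapq

def bird_allocation(edges, source):
    """edges: {(u, v): cost} on an undirected graph; returns {receiver: charge}."""
    graph = {}
    for (u, v), c in edges.items():
        graph.setdefault(u, []).append((c, v))
        graph.setdefault(v, []).append((c, u))
    charge, seen = {}, {source}
    heap = list(graph[source])  # (cost, node) frontier edges from the source
    heapq.heapify(heap)
    while heap:
        c, v = heapq.heappop(heap)
        if v in seen:
            continue
        seen.add(v)
        charge[v] = c  # v pays the cost of its connecting edge in the MST
        for c2, w in graph[v]:
            if w not in seen:
                heapq.heappush(heap, (c2, w))
    return charge

edges = {("s", "a"): 4, ("s", "b"): 3, ("a", "b"): 1, ("b", "c"): 2}
alloc = bird_allocation(edges, "s")
print(alloc, sum(alloc.values()))
```

By construction the charges sum to the cost of the minimum spanning tree (here 3 + 1 + 2 = 6), so the rule is budget-balanced and free of cross subsidies in the core sense.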
Reflections on Costing, Pricing and Income Measurement at UK Higher Education Institutions
Oduoza, Chike F.
2009-01-01
In these days of radical contraction of funding and expansion in student numbers, universities are under pressure to prioritise their resources, as well as to achieve effective costing and pricing to support judgement and decision making for funding and any external work undertaken. This study reviews costing, pricing and income measurement in…
El-Galaly, Tarec Christoffer; Mylam, Karen Juul; Brown, Peter; Specht, Lena; Christiansen, Ilse; Munksgaard, Lars; Johnsen, Hans Erik; Loft, Annika; Bukh, Anne; Iyer, Victor; Nielsen, Anne Lerberg; Hutchings, Martin
2012-06-01
The value of performing post-therapy routine surveillance imaging in patients with Hodgkin lymphoma is controversial. This study evaluates the utility of positron emission tomography/computed tomography using 2-[18F]fluoro-2-deoxyglucose for this purpose and in situations with suspected lymphoma relapse. We conducted a multicenter retrospective study. Patients with newly diagnosed Hodgkin lymphoma achieving at least a partial remission on first-line therapy were eligible if they received positron emission tomography/computed tomography surveillance during follow-up. Two types of imaging surveillance were analyzed: "routine" when patients showed no signs of relapse at referral to positron emission tomography/computed tomography, and "clinically indicated" when recurrence was suspected. A total of 211 routine and 88 clinically indicated positron emission tomography/computed tomography studies were performed in 161 patients. In ten of 22 patients with recurrence of Hodgkin lymphoma, routine imaging surveillance was the primary tool for the diagnosis of the relapse. Extranodal disease, interim positron emission tomography-positive lesions and positron emission tomography activity at response evaluation were all associated with a positron emission tomography/computed tomography-diagnosed preclinical relapse. The true positive rates of routine and clinically indicated imaging were 5% and 13%, respectively (P = 0.02). The overall positive predictive value and negative predictive value of positron emission tomography/computed tomography were 28% and 100%, respectively. The estimated cost per routine imaging diagnosed relapse was US$ 50,778. Negative positron emission tomography/computed tomography reliably rules out a relapse. The high false positive rate is, however, an important limitation and a confirmatory biopsy is mandatory for the diagnosis of a relapse. With no proven survival benefit for patients with a pre-clinically diagnosed relapse, the high costs and low
Trenouth, Lani; Colbourn, Timothy; Fenn, Bridget; Pietzsch, Silke; Myatt, Mark; Puett, Chloe
2018-07-01
Cash-based interventions (CBIs) increasingly are being used to deliver humanitarian assistance and there is growing interest in the cost-effectiveness of cash transfers for preventing undernutrition in emergency contexts. The objectives of this study were to assess the costs, cost-efficiency and cost-effectiveness in achieving nutrition outcomes of three CBIs in southern Pakistan: a 'double cash' (DC) transfer, a 'standard cash' (SC) transfer and a 'fresh food voucher' (FFV) transfer. Cash and FFVs were provided to poor households with children aged 6-48 months for 6 months in 2015. The SC and FFV interventions provided $14 monthly and the DC provided $28 monthly. Cost data were collected via institutional accounting records, interviews, programme observation, document review and household survey. Cost-effectiveness was assessed as cost per case of wasting, stunting and disability-adjusted life year (DALY) averted. Beneficiary costs were higher for the cash groups than the voucher group. Net total cost transfer ratios (TCTRs) were estimated as 1.82 for DC, 2.82 for SC and 2.73 for FFV. Yet, despite the higher operational costs, the FFV TCTR was lower than the SC TCTR when incorporating the participation cost to households, demonstrating the relevance of including beneficiary costs in cost-efficiency estimations. The DC intervention achieved a reduction in wasting, at $4865 per case averted; neither the SC nor the FFV interventions reduced wasting. The cost per case of stunting averted was $1290 for DC, $882 for SC and $883 for FFV. The cost per DALY averted was $641 for DC, $434 for SC and $563 for FFV without discounting or age weighting. These interventions are highly cost-effective by international thresholds. While it is debatable whether these resource requirements represent a feasible or sustainable investment given low health expenditures in Pakistan, these findings may provide justification for continuing Pakistan's investment in national social safety
Young, David W
2015-11-01
Historically, hospital departments have computed the costs of individual tests or procedures using the ratio of cost to charges (RCC) method, which can produce inaccurate results. To determine a more accurate cost of a test or procedure, the activity-based costing (ABC) method must be used. Accurate cost calculations will ensure reliable information about the profitability of a hospital's DRGs.
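The gap between the two methods can be seen in a toy computation. All figures below are invented for illustration; they are not from the article. RCC scales a department-wide cost ratio by each test's charge, while ABC traces the activities each test actually consumes.

```python
# Toy comparison of ratio-of-cost-to-charges (RCC) versus activity-based
# costing (ABC) for two hypothetical lab tests. All numbers are made up.
dept_cost, dept_charges = 100_000.0, 250_000.0
rcc = dept_cost / dept_charges  # department-wide ratio, 0.4

charges = {"basic_panel": 50.0, "complex_assay": 500.0}
rcc_cost = {t: ch * rcc for t, ch in charges.items()}

# ABC: cost = sum over activities of (driver rate * quantity consumed).
driver_rates = {"tech_minute": 1.2, "machine_minute": 0.8}  # $ per unit
activity_use = {
    "basic_panel": {"tech_minute": 5, "machine_minute": 10},
    "complex_assay": {"tech_minute": 90, "machine_minute": 120},
}
abc_cost = {
    t: sum(driver_rates[d] * q for d, q in use.items())
    for t, use in activity_use.items()
}
print(rcc_cost)  # RCC: ~20 and ~200
print(abc_cost)  # ABC: 14 and 204
```

In this example RCC overstates the cost of the simple test and understates the complex one, which is exactly the kind of distortion that can misrepresent DRG profitability.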
Computing networks from cluster to cloud computing
Vicat-Blanc, Pascale; Guillier, Romaric; Soudan, Sebastien
2013-01-01
"Computing Networks" explores the core of the new distributed computing infrastructures we are using today: the networking systems of clusters, grids and clouds. It helps network designers and distributed-application developers and users to better understand the technologies, specificities, constraints and benefits of these different infrastructures' communication systems. Cloud Computing will give the possibility for millions of users to process data anytime, anywhere, while being eco-friendly. In order to deliver this emerging traffic in a timely, cost-efficient, energy-efficient, and
Higher energy: is it necessary, is it worth the cost for radiation oncology?
Das, I J; Kase, K R
1992-01-01
The physical characteristics of the interactions of megavoltage photons and electrons with matter provide distinct advantages, relative to low-energy (orthovoltage) x rays, that lead to better radiation dose distributions in patients. Use of these high-energy radiations has resulted in better patient care, which has been reflected in improved radiation treatment outcome in recent years. But, as the desire for higher energy radiation beams increases, it becomes important to determine whether the physical characteristics that make megavoltage beams beneficial continue to provide a net advantage. It is demonstrated that, in fact, there is an energy range from 4 to 15 MV for photons and 4 to 20 MeV for electrons that is optimally suited for the treatment of cancer in humans. Radiation beams that exceed these maximum energies were found to add no advantage. This is because the costs (price of the unit, installation, maintenance, shielding for neutrons and photons) are not justified by either improved physical characteristics of the radiation (penetration, skin sparing, dose distribution) or treatment outcome. In fact, for photon beams some physical characteristics result in less desirable dose distributions, less accurate dosimetry, and increased safety problems as the energy increases: for example, increasingly diffuse beam edges, loss of electron equilibrium, uncertainty in dose perturbations at interfaces, increased neutron contamination, and potential for higher personnel dose. The special features that make electron beams useful at lower energies, for example, skin sparing and small penetration, are lost at high energies. These physical factors are analyzed together with the economic factors related to radiation therapy patient care using megavoltage beams.
The Protein Cost of Metabolic Fluxes: Prediction from Enzymatic Rate Laws and Cost Minimization.
Directory of Open Access Journals (Sweden)
Elad Noor
2016-11-01
Full Text Available Bacterial growth depends crucially on metabolic fluxes, which are limited by the cell's capacity to maintain metabolic enzymes. The necessary enzyme amount per unit flux is a major determinant of metabolic strategies both in evolution and bioengineering. It depends on enzyme parameters (such as kcat and KM constants), but also on metabolite concentrations. Moreover, similar amounts of different enzymes might incur different costs for the cell, depending on enzyme-specific properties such as protein size and half-life. Here, we developed enzyme cost minimization (ECM), a scalable method for computing enzyme amounts that support a given metabolic flux at a minimal protein cost. The complex interplay of enzyme and metabolite concentrations, e.g. through thermodynamic driving forces and enzyme saturation, would make it hard to solve this optimization problem directly. By treating enzyme cost as a function of metabolite levels, we formulated ECM as a numerically tractable, convex optimization problem. Its tiered approach allows for building models at different levels of detail, depending on the amount of available data. Validating our method with measured metabolite and protein levels in E. coli central metabolism, we found typical prediction fold errors of 4.1 and 2.6, respectively, for the two kinds of data. This result from the cost-optimized metabolic state is significantly better than randomly sampled metabolite profiles, supporting the hypothesis that enzyme cost is important for the fitness of E. coli. ECM can be used to predict enzyme levels and protein cost in natural and engineered pathways, and could be a valuable computational tool to assist metabolic engineering projects. Furthermore, it establishes a direct connection between protein cost and thermodynamics, and provides a physically plausible and computationally tractable way to include enzyme kinetics into constraint-based metabolic models, where kinetics have usually been ignored or
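The core relation behind this kind of model is that the enzyme amount needed to carry a flux grows as the enzyme's effective rate drops with weak thermodynamic driving force or low substrate saturation. The sketch below is a minimal illustration of that decomposition, with all numeric values assumed rather than taken from the paper.

```python
import math

# Minimal sketch of an enzyme-demand relation of the kind ECM builds on:
# enzyme needed = flux / (kcat * eta_thermo * eta_sat), where the two
# efficiency factors depend on metabolite levels. All values are assumed.

def enzyme_demand(v, kcat, dG, s, K_M, RT=2.49):
    """v: flux; kcat: turnover number (1/s); dG: reaction Gibbs energy in
    kJ/mol (negative = forward-driven); s: substrate concentration;
    K_M: Michaelis constant; RT ~ 2.49 kJ/mol at 300 K."""
    eta_thermo = 1.0 - math.exp(dG / RT)  # flux-force (reversibility) factor
    eta_sat = s / (K_M + s)               # Michaelis-Menten saturation factor
    return v / (kcat * eta_thermo * eta_sat)

# A strongly driven, well-saturated enzyme needs little protein per unit flux:
print(enzyme_demand(v=1.0, kcat=100.0, dG=-10.0, s=0.5, K_M=0.1))
```

As dG approaches zero (a near-equilibrium reaction), eta_thermo vanishes and the demand diverges, which is the thermodynamics-to-protein-cost connection the abstract highlights.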
Volunteer Computing for Science Gateways
Anderson, David
2017-01-01
This poster offers information about volunteer computing for science gateways that offer high-throughput computing services. Volunteer computing can be used to obtain computing power; it increases the visibility of the gateway to the general public while adding computing capacity at little cost.
Exploring Issues about Computational Thinking in Higher Education
Czerkawski, Betul C.; Lyman, Eugene W., III
2015-01-01
The term computational thinking (CT) has been in academic discourse for decades, but gained new currency in 2006, when Jeanette Wing used it to describe a set of thinking skills that students in all fields may require in order to succeed. Wing's initial article and subsequent writings on CT have been broadly influential; experts in…
Low-Cost Spectral Sensor Development Description.
Energy Technology Data Exchange (ETDEWEB)
Armijo, Kenneth Miguel; Yellowhair, Julius
2014-11-01
Solar spectral data for all parts of the US are limited, due in part to the high cost of commercial spectrometers. Solar spectral information is necessary for accurate photovoltaic (PV) performance forecasting, especially for large utility-scale PV installations. A low-cost solar spectral sensor would address these obstacles and needs. In this report, a novel low-cost, discrete-band sensor device, comprised of five narrow-band sensors, is described. The hardware is built from commercial off-the-shelf components to keep the cost low. Data processing algorithms were developed and are being refined for robustness. PV module short-circuit current (Isc) prediction methods were developed based on an interaction-terms regression methodology and a spectrum reconstruction methodology for computing Isc. The results suggest the spectrum computed using the reconstruction method agreed well with the measured spectrum from the wide-band spectrometer (RMS error of 38.2 W/m²-nm). Further analysis of the computed Isc found a close correspondence, with 0.05 A RMS error. The goal is ubiquitous adoption of the low-cost spectral sensor in solar PV and other applications such as weather forecasting.
Parallel computing in experimental mechanics and optical measurement: A review (II)
Wang, Tianyi; Kemao, Qian
2018-05-01
With advantages such as non-destructiveness, high sensitivity and high accuracy, optical techniques have been successfully applied to the measurement of various important physical quantities in experimental mechanics (EM) and optical measurement (OM). However, in the pursuit of higher image resolutions for higher accuracy, the computational burden of optical techniques has become much heavier. Therefore, in recent years, heterogeneous platforms composed of hardware such as CPUs and GPUs have been widely employed to accelerate these techniques due to their cost-effectiveness, short development cycle, easy portability, and high scalability. In this paper, we analyze various works by first illustrating their different architectures, followed by introducing their various parallel patterns for high speed computation. Next, we review the effects of CPU and GPU parallel computing specifically in EM & OM applications in a broad scope, including digital image/volume correlation, fringe pattern analysis, tomography, hyperspectral imaging, computer-generated holograms, and integral imaging. In our survey, we have found that high parallelism can always be exploited in such applications for the development of high-performance systems.
Directory of Open Access Journals (Sweden)
Wichai Chattinnawat
2015-06-01
Full Text Available This research aims to apply the Lean technique in conjunction with Material Flow Cost Accounting (MFCA) analysis to the production process of canned sweet corn in order to increase process efficiency, eliminate waste and reduce production cost. This research develops and presents a new type of MFCA analysis by incorporating value-added and non-value-added activities into the MFCA cost allocation process. According to the simulation-based measurement of process efficiency, integrated cost allocation based on activity types results in a higher proportion of negative product cost than that computed from conventional MFCA cost allocation. Thus, the types of activities and process efficiency have a great impact on the cost structure, especially for the negative product cost. The research leads to solutions that improve work procedures, eliminate waste and reduce production cost. The overall cost per unit decreases with a higher proportion of positive product cost.
48 CFR 42.709-4 - Computing interest.
2010-10-01
... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Computing interest. 42.709... MANAGEMENT CONTRACT ADMINISTRATION AND AUDIT SERVICES Indirect Cost Rates 42.709-4 Computing interest. For 42.709-1(a)(1)(ii), compute interest on any paid portion of the disallowed cost as follows: (a) Consider...
Spacelab experiment computer study. Volume 1: Executive summary (presentation)
Lewis, J. L.; Hodges, B. C.; Christy, J. O.
1976-01-01
A quantitative cost comparison for various Spacelab flight hardware configurations is provided, along with varied software development options. A cost analysis of Spacelab computer hardware and software is presented. The cost study is based on utilization of a central experiment computer with optional auxiliary equipment. Groundrules and assumptions used in deriving the costing methods for all options in the Spacelab experiment study are presented and analyzed, and the options, along with their cost considerations, are discussed. It is concluded that Spacelab program cost for software development and maintenance is independent of experimental hardware and software options, that the distributed standard computer concept simplifies software integration without a significant increase in cost, and that decisions on flight computer hardware configurations should not be made until payload selection for a given mission and a detailed analysis of the mission requirements are completed.
Cost benefit analysis vs. referenda
Martin J. Osborne; Matthew A. Turner
2007-01-01
We consider a planner who chooses between two possible public policies and ask whether a referendum or a cost benefit analysis leads to higher welfare. We find that a referendum leads to higher welfare than a cost benefit analysis in "common value" environments. Cost benefit analysis is better in "private value" environments.
Virtualization and cloud computing in dentistry.
Chow, Frank; Muftu, Ali; Shorter, Richard
2014-01-01
The use of virtualization and cloud computing has changed the way we use computers. Virtualization is a method of placing software called a hypervisor on the hardware of a computer or a host operating system. It allows a guest operating system to run on top of the physical computer with a virtual machine (i.e., virtual computer). Virtualization allows multiple virtual computers to run on top of one physical computer and to share its hardware resources, such as printers, scanners, and modems. This increases the efficient use of the computer by decreasing costs (e.g., hardware, electricity, administration, and management), since only one physical computer needs to be running. This virtualization platform is the basis for cloud computing, and it has expanded into areas of server and storage virtualization. One of the commonly used dental storage systems is cloud storage. Patient information is encrypted as required by the Health Insurance Portability and Accountability Act (HIPAA) and stored on off-site private cloud services for a monthly service fee. As computing needs continue to grow, so too will the need for more storage and processing power. Virtual and cloud computing will be a method for dentists to minimize costs and maximize computer efficiency in the near future. This article will provide some useful information on current uses of cloud computing.
Bertoldi, Eduardo G; Stella, Steffan F; Rohde, Luis E; Polanczyk, Carisi A
2016-05-01
Several tests exist for diagnosing coronary artery disease, with varying accuracy and cost. We sought to provide cost-effectiveness information to aid physicians and decision-makers in selecting the most appropriate testing strategy. We used the state-transitions (Markov) model from the Brazilian public health system perspective with a lifetime horizon. Diagnostic strategies were based on exercise electrocardiography (Ex-ECG), stress echocardiography (ECHO), single-photon emission computed tomography (SPECT), computed tomography coronary angiography (CTA), or stress cardiac magnetic resonance imaging (C-MRI) as the initial test. Systematic review provided input data for test accuracy and long-term prognosis. Cost data were derived from the Brazilian public health system. Diagnostic test strategy had a small but measurable impact in quality-adjusted life-years gained. Switching from Ex-ECG to CTA-based strategies improved outcomes at an incremental cost-effectiveness ratio of 3100 international dollars per quality-adjusted life-year. ECHO-based strategies resulted in cost and effectiveness almost identical to CTA, and SPECT-based strategies were dominated because of their much higher cost. Strategies based on stress C-MRI were most effective, but the incremental cost-effectiveness ratio vs CTA was higher than the proposed willingness-to-pay threshold. Invasive strategies were dominant in the high pretest probability setting. Sensitivity analysis showed that results were sensitive to costs of CTA, ECHO, and C-MRI. Coronary CT is cost-effective for the diagnosis of coronary artery disease and should be included in the Brazilian public health system. Stress ECHO has a similar performance and is an acceptable alternative for most patients, but invasive strategies should be reserved for patients at high risk. © 2016 Wiley Periodicals, Inc.
DEFF Research Database (Denmark)
Wøhlk, Sanne; Laporte, Gilbert
2017-01-01
The aim of this paper is to computationally compare several algorithms for the Minimum Cost Perfect Matching Problem on an undirected complete graph. Our work is motivated by the need to solve large instances of the Capacitated Arc Routing Problem (CARP) arising in the optimization of garbage collection in Denmark. Common heuristics for the CARP involve the optimal matching of the odd-degree nodes of a graph. The algorithms used in the comparison include the CPLEX solution of an exact formulation, the LEDA matching algorithm, a recent implementation of the Blossom algorithm, as well as six
Computerized cost estimation spreadsheet and cost data base for fusion devices
International Nuclear Information System (INIS)
Hamilton, W.R.; Rothe, K.E.
1985-01-01
An automated approach to performing and cataloging cost estimates has been developed at the Fusion Engineering Design Center (FEDC), wherein the cost estimate record is stored in a LOTUS 1-2-3 spreadsheet on an IBM personal computer. The cost estimation spreadsheet is based on the cost coefficient/cost algorithm approach and incorporates a detailed generic code of cost accounts for both tokamak and tandem mirror devices. Component design parameters (weight, surface area, etc.) and cost factors are input, and direct and indirect costs are calculated. The cost data base file, derived from actual cost experience within the fusion community and refined to be compatible with the spreadsheet costing approach, is a catalog of cost coefficients, algorithms, and component costs arranged into data modules corresponding to specific components and/or subsystems. Each data module contains engineering, equipment, and installation labor cost data for different configurations and types of the specific component or subsystem. This paper describes the assumptions, definitions, methodology, and architecture incorporated in the development of the cost estimation spreadsheet and cost data base, along with the type of input required and the output format.
Faust, N.; Jordon, L.
1981-01-01
Since the implementation of the GRID and IMGRID computer programs for multivariate spatial analysis in the early 1970s, geographic data analysis has moved from large computers to minicomputers and now to microcomputers, with a radical reduction in the costs associated with planning analyses. Once NIMGRID (new IMGRID), a raster-oriented geographic information system, was implemented on the microcomputer, programs designed to process LANDSAT data for use as one element in a geographic data base were employed. Programs for training field selection, supervised and unsupervised classification, and image enhancement were added. Enhancements to the color graphics capabilities of the microsystem allow display of three channels of LANDSAT data in color-infrared format. The basic microcomputer hardware needed to run NIMGRID and most LANDSAT analyses is listed, as well as the software available for LANDSAT processing.
Pedrycz, Witold; Chen, Shyi-Ming
2011-01-01
Information granules are conceptual entities that aid the perception of complex phenomena. This book looks at granular computing techniques such as algorithmic pursuits and includes diverse applications and case studies from fields such as power engineering.
A recursive algorithm for computing the inverse of the Vandermonde matrix
Directory of Open Access Journals (Sweden)
Youness Aliyari Ghassabeh
2016-12-01
Full Text Available The inverse of a Vandermonde matrix has been used in signal processing, polynomial interpolation, curve fitting, wireless communication, and system identification. In this paper, we propose a novel fast recursive algorithm to compute the inverse of a Vandermonde matrix. The algorithm computes the inverse of a higher order Vandermonde matrix from the available lower order inverse matrix at a computational cost of $O(n^2)$. The proposed algorithm is given in matrix form, which makes it appropriate for hardware implementation. The running time of the proposed algorithm to find the inverse of a Vandermonde matrix using a lower order Vandermonde matrix is compared with the running time of the matrix inversion function implemented in MATLAB.
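The structure the recursion exploits can be seen directly: column k of the inverse of the Vandermonde matrix V[i][j] = x_i^j holds the coefficients of the Lagrange basis polynomial L_k, since L_k(x_i) = delta_ik. The sketch below computes the inverse exactly by that Lagrange construction (this is a direct method for checking the result, not the paper's recursive algorithm).

```python
from fractions import Fraction

# Exact Vandermonde inverse via Lagrange basis polynomials: column k of the
# inverse lists the coefficients of L_k(x) = prod_{i!=k} (x - x_i)/(x_k - x_i).

def poly_mul_linear(coeffs, root):
    """Multiply the polynomial `coeffs` (ascending powers) by (x - root)."""
    out = [Fraction(0)] * (len(coeffs) + 1)
    for j, a in enumerate(coeffs):
        out[j + 1] += a        # the x * coeffs part
        out[j] -= root * a     # the -root * coeffs part
    return out

def vandermonde_inverse(nodes):
    """Exact inverse of V[i][j] = nodes[i]**j using rational arithmetic."""
    n = len(nodes)
    nodes = [Fraction(x) for x in nodes]
    inv = [[Fraction(0)] * n for _ in range(n)]
    for k in range(n):
        c, denom = [Fraction(1)], Fraction(1)
        for i in range(n):
            if i != k:
                c = poly_mul_linear(c, nodes[i])
                denom *= nodes[k] - nodes[i]
        for j in range(n):
            inv[j][k] = c[j] / denom
    return inv

inv = vandermonde_inverse([1, 2, 3])
print(inv[0][0])  # prints 3: the constant term of L_0(x) = (x-2)(x-3)/2
```

This direct construction costs $O(n^2)$ per column, i.e. $O(n^3)$ overall, which is exactly the cost the paper's recursive order-to-order update avoids.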
Using Amazon's Elastic Compute Cloud to scale CMS' compute hardware dynamically.
Melo, Andrew Malone
2011-01-01
Large international scientific collaborations such as the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider have traditionally addressed their data reduction and analysis needs by building and maintaining dedicated computational infrastructure. Emerging cloud-computing services such as Amazon's Elastic Compute Cloud (EC2) offer short-term CPU and storage resources with costs based on usage. These services allow experiments to purchase computing resources as needed, without significant prior planning and without long term investments in facilities and their management. We have demonstrated that services such as EC2 can successfully be integrated into the production-computing model of CMS, and find that they work very well as worker nodes. The cost-structure and transient nature of EC2 services makes them inappropriate for some CMS production services and functions. We also found that the resources are not truly on-demand, as limits and caps on usage are imposed. Our trial workflows allow us t...
Influence of studying in higher educational establishment on students’ harmful computer habits
Directory of Open Access Journals (Sweden)
M.D. Kudryavtsev
2016-10-01
Full Text Available Purpose: to determine the influence of the educational process on the prevalence of students' harmful computer habits. Material: 1st-3rd year students (803 boys and 596 girls) participated in the research. All took the discipline Physical Culture, and none had health disorders. Results: it was found that, on average, each student has two computer habits. The habits most likely to lead to addiction are internet use and computer games; a student with these habits spends more than 4 hours a day on them. 33% of 1st year boys and 16% of 1st year girls spend more than 2 hours a day on computer games. 15-20% of boys and 25-30% of girls spend more than 4 hours a day on the internet. 10-15% of boys spend more than 4 hours a day on computer games. It is very probable that these students already have a computer game addiction. Conclusions: recently, a dangerous tendency toward watching anime has appeared. Physical culture faculties and departments should take additional measures to reduce students' computer addictions. Teachers of all disciplines should organize the educational process with electronic resources in a way that does not promote the progression of students' computer habits.
The thermodynamic cost of quantum operations
International Nuclear Information System (INIS)
Bedingham, D J; Maroney, O J E
2016-01-01
The amount of heat generated by computers is rapidly becoming one of the main problems for developing new generations of information technology. The thermodynamics of computation sets the ultimate physical bounds on heat generation. A lower bound is set by the Landauer limit, at which computation becomes thermodynamically reversible. For classical computation there is no physical principle which prevents this limit being reached, and approaches to it are already being experimentally tested. In this paper we show that for quantum computation with a set of signal states satisfying given conditions, there is an unavoidable excess heat generation that renders it inherently thermodynamically irreversible. The Landauer limit cannot, in general, be reached by quantum computers. We show the existence of a lower bound to the heat generated by quantum computing that exceeds that given by the Landauer limit, give the special conditions where this excess cost may be avoided, and provide a protocol for achieving the limiting heat cost when these conditions are met. We also show how classical computing falls within the special conditions. (paper)
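The Landauer limit invoked above can be made concrete with a standard textbook computation (the values below are the usual physical constants, not figures from the paper): erasing one bit at temperature $T$ dissipates at least

```latex
Q_{\min} = k_B T \ln 2
         \approx \left(1.38 \times 10^{-23}\,\mathrm{J/K}\right)
                 \times \left(300\,\mathrm{K}\right) \times 0.693
         \approx 2.9 \times 10^{-21}\,\mathrm{J}
```

per bit at room temperature. The paper's claim is that quantum computation in general carries an unavoidable excess heat cost above this classical bound.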
Global warming and urban smog: The cost effectiveness of CAFE standards and alternative fuels
International Nuclear Information System (INIS)
Krupnick, A.J.; Walls, M.A.; Collins, C.T.
1992-01-01
This paper evaluates alternative transportation policies for reducing greenhouse gas emissions and ozone precursors. The net cost-effectiveness (i.e., the cost per ton of greenhouse gas reduced, adjusted for ozone reduction benefits) of substituting methanol, compressed natural gas (CNG), and reformulated gasoline for conventional gasoline is assessed and compared with the cost-effectiveness of raising the corporate average fuel economy (CAFE) standard to 38 miles per gallon. Computing this 'net' cost-effectiveness is one way of measuring the joint environmental benefits that these alternatives provide. Greenhouse gas emissions are assessed over the entire fuel cycle and include not only carbon dioxide emissions, but also methane, carbon monoxide, and nitrous oxide emissions. In computing cost-effectiveness, we account for the so-called 'rebound effect': the impact on vehicle-miles traveled of higher or lower fuel costs. CNG is found to be the most cost-effective of these alternatives, followed by increasing the CAFE standard, substituting methanol for gasoline, and substituting reformulated for conventional gasoline. Including the ozone reduction benefits does not change the rankings of the alternatives, but does make the alternative fuels look better relative to increasing the CAFE standard. Incorporating the rebound effect greatly changes the magnitude of the estimates but does not change the rankings of the alternatives. None of the alternatives looks cost-effective if a carbon tax of $35 per ton is passed (the proposal in the Stark bill, H.R. 1086), and only CNG, under optimistic assumptions, looks cost-effective if a tax of $100 per ton of carbon is passed.
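The 'net' adjustment described above amounts to crediting the monetized ozone benefit against a policy's extra cost before dividing by tons of greenhouse gas reduced. A one-line sketch with made-up numbers (none of these figures are from the paper):

```python
# "Net" cost-effectiveness: cost per ton of greenhouse gas reduced, after
# crediting monetized ozone-reduction benefits. All numbers are illustrative.
def net_cost_effectiveness(extra_cost, ozone_benefit, ghg_tons_reduced):
    return (extra_cost - ozone_benefit) / ghg_tons_reduced

# A fuel switch costing $1M that also yields $250k of ozone benefits while
# cutting 10,000 tons of greenhouse gases nets out to $75 per ton.
print(net_cost_effectiveness(1_000_000, 250_000, 10_000))  # 75.0
```

A policy whose net figure falls below a proposed carbon tax (e.g. $100 per ton) would look cost-effective under that tax, which is the comparison the abstract makes.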
28 CFR 100.16 - Cost estimate submission.
2010-07-01
..., quantity, and cost. (ii) Direct labor. Provide a time-phased (e.g., monthly, quarterly) breakdown of labor... estimates. (iii) Allocable direct costs. Indicate how allocable costs are computed and applied, including... 28 Judicial Administration 2 2010-07-01 2010-07-01 false Cost estimate submission. 100.16 Section...
Suiter, Martha, Ed.
This set of proceedings assembles papers presented at the 1995 Council for Higher Education Computing Services (CHECS) conference, held at the New Mexico Military Institute in Roswell, New Mexico. CHECS members are higher education computing services organizations within the state of New Mexico. The main focus of the conference was the Internet…
International Nuclear Information System (INIS)
Larcos, G.; Chi, K.K.G.; Berry, G.; Westmead Hospital, Sydney, NSW; Shiell, A.
2000-01-01
There is a controversy regarding the investigation of patients with suspected acute pulmonary embolism (PE). To compare the cost-effectiveness of alternative methods of diagnosing acute PE, chest helical computed tomography (CT), alone and in combination with venous ultrasound (US) of the legs and pulmonary angiography (PA), was compared with a conventional algorithm using ventilation-perfusion (V/Q) scintigraphy supplemented in selected cases by US and PA. A decision-analytical model was constructed to model the costs and effects of the three diagnostic strategies in a hypothetical cohort of 1000 patients each. Transition probabilities were based on published data. Life years gained by each strategy were estimated from published mortality rates. Schedule fees were used to estimate costs. The V/Q protocol is both more expensive and more effective than CT alone, resulting in 20.1 additional lives saved at a (discounted) cost of $940 per life year gained. An additional 2.5 lives can be saved if CT replaces V/Q scintigraphy in the diagnostic algorithm, but at a cost of $23,905 per life year saved. The analysis showed that the more effective diagnostic strategies are also more expensive. In patients with suspected PE, the incremental cost-effectiveness of the V/Q based strategy over CT alone is reasonable in comparison with other health interventions. The cost-effectiveness of the supplemented CT strategy is more questionable. Copyright (2000) The Australasian College of Physicians
Aviat, Félix; Lagardère, Louis; Piquemal, Jean-Philip
2017-10-01
In a recent paper [F. Aviat et al., J. Chem. Theory Comput. 13, 180-190 (2017)], we proposed the Truncated Conjugate Gradient (TCG) approach to compute the polarization energy and forces in polarizable molecular simulations. The method consists of truncating the conjugate gradient algorithm at a fixed, predetermined order, leading to a fixed computational cost; it can thus be considered "non-iterative." This makes it possible to derive analytical forces, avoiding the energy conservation issues (i.e., drifts) that occur with iterative approaches. A key point concerns the evaluation of the analytical gradients, which is more complex than with a usual solver. In this paper, after reviewing the present state of the art of polarization solvers, we detail a viable strategy for the efficient implementation of the TCG calculation. The complete cost of the approach is then measured as it is tested using a multi-time-step scheme, and compared to timings using usual iterative approaches. We show that the TCG methods are more efficient than traditional techniques, making them a method of choice for future long molecular dynamics simulations using polarizable force fields where energy conservation matters. We detail the various steps required for the implementation of the complete method by software developers.
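The core idea, truncating conjugate gradient iterations at a fixed order so the solver has a fixed cost and a closed-form dependence on its input, can be sketched as follows. This is a generic illustration on a toy symmetric positive definite system, not the authors' implementation (which works on the polarization equations of a force field):

```python
import numpy as np

def truncated_cg(A, b, order):
    """Conjugate gradient truncated at a fixed, predetermined number of
    iterations: the cost per call is fixed ("non-iterative"), and the
    output is an analytical function of b, which is what permits exact
    analytical gradients in the TCG approach."""
    x = np.zeros_like(b)
    r = b.copy()            # residual
    p = r.copy()            # search direction
    for _ in range(order):
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)
        x = x + alpha * p
        r_new = r - alpha * Ap
        beta = (r_new @ r_new) / (r @ r)
        p = r_new + beta * p
        r = r_new
    return x

# Toy symmetric positive definite system standing in for the
# polarization matrix equation of a polarizable force field.
rng = np.random.default_rng(0)
M = rng.standard_normal((6, 6))
A = M @ M.T + 6 * np.eye(6)
b = rng.standard_normal(6)
x2 = truncated_cg(A, b, order=2)    # fixed-cost, TCG2-like approximation
```

Because `order` is fixed in advance, every call performs exactly the same arithmetic, which is what makes the per-step cost predictable in a multi-time-step scheme.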
25 CFR 700.81 - Monthly housing cost.
2010-04-01
... 25 Indians 2 2010-04-01 2010-04-01 false Monthly housing cost. 700.81 Section 700.81 Indians THE... Policies and Instructions Definitions § 700.81 Monthly housing cost. (a) General. The term monthly housing...) Computation of monthly housing cost for replacement dwelling. A person's monthly housing cost for a...
Local matching indicators for concave transport costs
Delon , Julie; Salomon , Julien; Sobolevskii , A.
2010-01-01
In this note, we introduce a class of indicators that make it possible to efficiently compute optimal transport plans associated with arbitrary distributions of $N$ demands and $N$ supplies in $\mathbf{R}$ in the case where the cost function is concave. The computational cost of these indicators is small and independent of $N$. Using them hierarchically yields an efficient algorithm.
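To see why concave costs need dedicated machinery, a brute-force illustration helps: with a concave cost, the optimal one-dimensional matching can pair a far supply with a near demand, unlike the convex case. The indicator construction from the note is not reproduced here; this is only a naive exponential-time baseline:

```python
from itertools import permutations

def optimal_plan(demands, supplies, cost):
    """Brute-force optimal assignment of N demands to N supplies on the
    real line. Exponential in N -- the indicators in the note exist
    precisely to avoid this search."""
    best_perm, best_cost = None, float("inf")
    for perm in permutations(range(len(supplies))):
        total = sum(cost(d, supplies[j]) for d, j in zip(demands, perm))
        if total < best_cost:
            best_perm, best_cost = perm, total
    return best_perm, best_cost

concave = lambda x, y: abs(x - y) ** 0.5   # a concave cost on R
demands = [0.0, 1.0]
supplies = [0.9, 2.0]
plan, _ = optimal_plan(demands, supplies, concave)
# With this concave cost the optimal plan "crosses": demand 0.0 is sent
# to supply 2.0, because one long trip plus one very short trip is
# cheaper than two medium trips; a linear cost would match neighbors.
```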
The provider cost of treating tuberculosis in Bauchi State, Nigeria
Directory of Open Access Journals (Sweden)
Nisser Ali Umar
2011-09-01
Full Text Available The study was aimed at assessing the economic cost shouldered by government, as provider, of free tuberculosis (TB) diagnosis and treatment services in Bauchi State, northern Nigeria. A cost analysis study was designed and questionnaires were administered by the principal investigators to the officers in charge of 27 randomly sampled government TB service providers across Bauchi State. Seventeen of these centers were primary care centers, nine were secondary care providers, and one was a tertiary care provider. Data were also collected from personnel and project records in the State Ministries of Health and of Works, as well as the Ministry of Budget and Planning. The costs of buildings, staff and equipment replacement, laboratory, radiology, and drugs in facilities were assessed, and the costs attributable to tuberculosis inpatient, outpatient, and directly observed therapy (DOT) services were estimated from the total cost based on the proportion of TB cases in the total patient pool accessing those services. The average proportion of TB patients in facilities was 3.4% overall, 3.3% among inpatients, and 3.1% in the outpatient population. The average cost of treating a patient with TB was estimated at US $227.14. The cost of inpatient care averaged $16.95 per patient; DOT and outpatient services cost $133.34 per patient, while the overhead cost per patient was $30.89. The overall cost and all computed cost elements, except for DOT services, were highest in the tertiary center and lowest in the infectious diseases hospital. This is partly due to higher administrative and other overhead recurrent spending in the tertiary health facility, while the lower overhead cost observed in the infectious diseases hospital could reflect economies of scale, since it sees a relatively higher number of TB cases while operating with roughly the same level of resources as other facilities in the state.
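The cost-attribution rule described, apportioning a facility's total service cost by the share of TB cases in its patient pool, amounts to a one-line computation. A sketch with hypothetical facility figures (only the roughly 3% TB share reflects the study):

```python
def tb_attributable_cost(total_cost, tb_patients, all_patients):
    """Apportion a facility's total service cost to TB care by the share
    of TB cases in the patient pool (multiply before dividing so the
    arithmetic stays exact for integer inputs)."""
    return total_cost * tb_patients / all_patients

# Hypothetical facility: $150,000 annual outpatient cost, 31 TB cases
# out of 1,000 outpatients (roughly the 3.1% outpatient share above).
print(tb_attributable_cost(150_000, 31, 1_000))   # prints 4650.0
```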
Scripting intercultural computer-supported collaborative learning in higher education
Popov, V.
2013-01-01
Introduction of computer-supported collaborative learning (CSCL), specifically in an intercultural learning environment, creates both challenges and benefits. Among the challenges are the coordination of different attitudes, styles of communication, and patterns of behaving. Among the benefits are
Workflow Scheduling Using Hybrid GA-PSO Algorithm in Cloud Computing
Directory of Open Access Journals (Sweden)
Ahmad M. Manasrah
2018-01-01
Full Text Available The cloud computing environment provides several on-demand services and resource sharing for clients. Business processes are managed using workflow technology over the cloud; using the resources efficiently is challenging because of the dependencies between tasks. In this paper, a hybrid GA-PSO algorithm is proposed to allocate tasks to resources efficiently. The hybrid GA-PSO algorithm aims to reduce the makespan and the cost, and to balance the load of the dependent tasks over heterogeneous resources in cloud computing environments. The experimental results show that the GA-PSO algorithm decreases the total execution time of the workflow tasks in comparison with the GA, PSO, HSGA, WSGA, and MTCT algorithms. Furthermore, it reduces the execution cost. In addition, it improves the load balancing of the workflow application over the available resources. Finally, the results also show that the proposed algorithm converges to optimal solutions faster and with higher quality than the other algorithms.
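A full hybrid GA-PSO scheduler is beyond a short sketch, but the genetic-algorithm half, evolving task-to-resource assignments toward a lower makespan, can be illustrated as below. Task lengths, VM speeds, and GA parameters are hypothetical, and the paper's PSO update, cost term, and load-balancing term are omitted:

```python
import random

def makespan(assign, task_len, vm_speed):
    """Finish time of the busiest VM under a task -> VM assignment."""
    load = [0.0] * len(vm_speed)
    for t, v in enumerate(assign):
        load[v] += task_len[t] / vm_speed[v]
    return max(load)

def ga_schedule(task_len, vm_speed, pop=30, gens=60, seed=1):
    """Elitist genetic algorithm over task -> VM assignments (makespan
    only; the paper additionally hybridizes with PSO and optimizes
    execution cost and load balance)."""
    rng = random.Random(seed)
    n, m = len(task_len), len(vm_speed)
    population = [[rng.randrange(m) for _ in range(n)] for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=lambda a: makespan(a, task_len, vm_speed))
        elite = population[: pop // 2]           # keep the better half
        children = []
        while len(elite) + len(children) < pop:
            p1, p2 = rng.sample(elite, 2)
            cut = rng.randrange(1, n)
            child = p1[:cut] + p2[cut:]          # one-point crossover
            if rng.random() < 0.2:               # point mutation
                child[rng.randrange(n)] = rng.randrange(m)
            children.append(child)
        population = elite + children
    best = min(population, key=lambda a: makespan(a, task_len, vm_speed))
    return best, makespan(best, task_len, vm_speed)

tasks = [10, 20, 30, 40, 15, 25]   # hypothetical task lengths
vms = [1.0, 2.0]                   # the second VM is twice as fast
best, span = ga_schedule(tasks, vms)
```

In the paper's hybrid, a PSO-style velocity update would further perturb the population between generations; here plain crossover and mutation carry the search.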
Cost and cost-effectiveness of conventional and liquid-based ...
African Journals Online (AJOL)
Methods. The unit of effectiveness was defined as the number of cervical intraepithelial neoplasm (CIN) II or higher lesions detected. Costs were assessed retrospectively for the financial year (2010/11) from a laboratory service provider perspective. A cost-effectiveness analysis was performed by combining secondary data ...
Reshaping Computer Literacy Teaching in Higher Education: Identification of Critical Success Factors
Taylor, Estelle; Goede, Roelien; Steyn, Tjaart
2011-01-01
Purpose: Acquiring computer skills is more important today than ever before, especially in a developing country. Teaching of computer skills, however, has to adapt to new technology. This paper aims to model factors influencing the success of the learning of computer literacy by means of an e-learning environment. The research question for this…
Jones, Nicholas Rv; Tong, Tammy Yn; Monsivais, Pablo
2018-04-01
To test whether diets achieving recommendations from the UK's Scientific Advisory Committee on Nutrition (SACN) were associated with higher monetary costs in a nationally representative sample of UK adults. A cross-sectional study linking 4 d diet diaries in the National Diet and Nutrition Survey (NDNS) to contemporaneous food price data from a market research firm. The monetary cost of diets was assessed in relation to whether or not they met eight food- and nutrient-based recommendations from SACN. Regression models adjusted for potential confounding factors. The primary outcome measure was individual dietary cost per day and per 2000 kcal (8368 kJ). UK. Adults (n 2045) sampled between 2008 and 2012 in the NDNS. On an isoenergetic basis, diets that met the recommendations for fruit and vegetables, oily fish, non-milk extrinsic sugars, fat, saturated fat and salt were estimated to be between 3 and 17 % more expensive. Diets meeting the recommendation for red and processed meats were 4 % less expensive, while meeting the recommendation for fibre was cost-neutral. Meeting multiple targets was also associated with higher costs; on average, diets meeting six or more SACN recommendations were estimated to be 29 % more costly than isoenergetic diets that met no recommendations. Food costs may be a population-level barrier limiting the adoption of dietary recommendations in the UK. Future research should focus on identifying systems- and individual-level strategies to enable consumers to achieve dietary recommendations without increasing food costs. Such strategies may improve the uptake of healthy eating in the population.
Can Online Learning Bend the Higher Education Cost Curve?
David J. Deming; Claudia Goldin; Lawrence F. Katz; Noam Yuchtman
2015-01-01
We examine whether online learning technologies have led to lower prices in higher education. Using data from the Integrated Postsecondary Education Data System, we show that online education is concentrated in large for-profit chains and less-selective public institutions. We find that colleges with a higher share of online students charge lower tuition prices. We present evidence of declining real and relative prices for full-time undergraduate online education from 2006 to 2013. Although t...
Dessoff, Alan
2009-01-01
This article examines issues on health care costs and describes measures taken by public districts to reduce spending. As in most companies in America, health plan designs in public districts are being changed to reflect higher out-of-pocket costs, such as higher deductibles on visits to providers, hospital stays, and prescription drugs. District…
Chen, Y-F; Madan, J; Welton, N; Yahaya, I; Aveyard, P; Bauld, L; Wang, D; Fry-Smith, A; Munafò, M R
2012-01-01
Smoking is harmful to health. On average, lifelong smokers lose 10 years of life, and about half of all lifelong smokers have their lives shortened by smoking. Stopping smoking reverses or prevents many of these harms. However, cessation services in the NHS achieve variable success rates with smokers who want to quit. Approaches to behaviour change can be supplemented with electronic aids, and this may significantly increase quit rates and prevent a proportion of cases that relapse. The primary research question we sought to answer was: What is the effectiveness and cost-effectiveness of internet, PC and other electronic aids to help people stop smoking? We addressed the following three questions: (1) What is the effectiveness of internet sites, computer programs, mobile telephone text messages and other electronic aids for smoking cessation and/or reducing relapse? (2) What is the cost-effectiveness of incorporating internet sites, computer programs, mobile telephone text messages and other electronic aids into current NHS smoking cessation programmes? and (3) What are the current gaps in research into the effectiveness of internet sites, computer programs, mobile telephone text messages and other electronic aids to help people stop smoking? For the effectiveness review, relevant primary studies were sought from The Cochrane Library [Cochrane Central Register of Controlled Trials (CENTRAL)] 2009, Issue 4, and MEDLINE (Ovid), EMBASE (Ovid), PsycINFO (Ovid), Health Management Information Consortium (HMIC) (Ovid) and Cumulative Index to Nursing and Allied Health Literature (CINAHL) (EBSCOhost) from 1980 to December 2009. In addition, NHS Economic Evaluation Database (NHS EED) and Database of Abstracts of Reviews of Effects (DARE) were searched for information on cost-effectiveness and modelling for the same period. Reference lists of included studies and of relevant systematic reviews were examined to identify further potentially relevant studies. Research registries
Opinions on Computing Education in Korean K-12 System: Higher Education Perspective
Kim, Dae-Kyoo; Jeong, Dongwon; Lu, Lunjin; Debnath, Debatosh; Ming, Hua
2015-01-01
The need for computing education in the K-12 curriculum has grown globally. The Republic of Korea is not an exception. In response to the need, the Korean Ministry of Education has announced an outline for software-centric computing education in the K-12 system, which aims at enhancing the current computing education with software emphasis. In…
Development of a small-scale computer cluster
Wilhelm, Jay; Smith, Justin T.; Smith, James E.
2008-04-01
An increase in demand for computing power in academia has created a need for high-performance machines. The computing power of a single processor has been steadily increasing, but lags behind the demand for fast simulations. Since a single processor has hard limits on its performance, a cluster of computers, with the proper software, can multiply the performance of a single computer. Cluster computing has therefore become a much sought-after technology. Typical desktop computers could be used for cluster computing, but they are not intended for constant full-speed operation and take up more space than rack-mount servers. Specialty computers that are designed to be used in clusters meet high-availability and space requirements, but can be costly. A market segment exists where custom-built desktop computers can be arranged in a rack-mount configuration, gaining the space savings of traditional rack-mount computers while remaining cost effective. To explore these possibilities, an experiment was performed to develop a computing cluster using desktop components for the purpose of decreasing the computation time of advanced simulations. This study indicates that a small-scale cluster can be built from off-the-shelf components, multiplying the performance of a single desktop machine while minimizing occupied space and remaining cost effective.
Possible Computer Vision Systems and Automated or Computer-Aided Edging and Trimming
Philip A. Araman
1990-01-01
This paper discusses research which is underway to help our industry reduce costs, increase product volume and value recovery, and market more accurately graded and described products. The research is part of a team effort to help the hardwood sawmill industry automate with computer vision systems, and computer-aided or computer controlled processing. This paper...
Consumer Dispersion and Logistics Costs in Various Distribution Systems
DEFF Research Database (Denmark)
Turkensteen, Marcel; Klose, Andreas
We address the relationship between the geographical dispersion of a set of demand points and the expected logistics costs. This is relevant to the strategic marketing decision of which groups of consumers to target. We devise quickly computable measures of the logistics costs. In our experiments, dispersed sets of demand points are created. For various types of distribution systems, expected logistics costs are computed using continuous approximation, location, and routing methodologies. We find that the average distance between locations is an effective estimate of the logistics costs.
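The dispersion measure the experiments single out, the average distance between demand locations, is straightforward to compute. A sketch (the points and the planar Euclidean metric are illustrative assumptions, not the study's data):

```python
from itertools import combinations
from math import hypot

def average_pairwise_distance(points):
    """Mean Euclidean distance over all pairs of demand points -- the
    dispersion measure found to track expected logistics costs well."""
    pairs = list(combinations(points, 2))
    return sum(hypot(a[0] - b[0], a[1] - b[1]) for a, b in pairs) / len(pairs)

cluster = [(0, 0), (1, 0), (0, 1)]      # tightly grouped consumers
spread = [(0, 0), (10, 0), (0, 10)]     # dispersed consumers
# The dispersed set has a much larger average pairwise distance and,
# per the study's finding, correspondingly higher expected logistics costs.
print(average_pairwise_distance(cluster) < average_pairwise_distance(spread))  # prints True
```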
20 CFR 404.270 - Cost-of-living increases.
2010-04-01
... INSURANCE (1950- ) Computing Primary Insurance Amounts Cost-Of-Living Increases § 404.270 Cost-of-living... rises in the cost of living. These automatic increases also apply to other benefit amounts, as described...
Drug development costs when financial risk is measured using the Fama-French three-factor model.
Vernon, John A; Golec, Joseph H; Dimasi, Joseph A
2010-08-01
In a widely cited article, DiMasi, Hansen, and Grabowski (2003) estimate the average pre-tax cost of bringing a new molecular entity to market. Their base case estimate, excluding post-marketing studies, was $802 million (in $US 2000). Strikingly, almost half of this cost (or $399 million) is the cost of capital (COC) used to fund clinical development expenses to the point of FDA marketing approval. The authors used an 11% real COC computed using the capital asset pricing model (CAPM). But the CAPM is a single-factor risk model, and multi-factor risk models are the current state of the art in finance. Using the Fama-French three-factor model, we find the cost of drug development to be higher than the earlier estimate. Copyright (c) 2009 John Wiley & Sons, Ltd.
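The contrast between the two risk models is easy to state: the CAPM prices a single market factor, while the Fama-French model adds size (SMB) and value (HML) factors, so a firm's estimated cost of capital changes with its loadings on those factors. The loadings and premia below are hypothetical, not the paper's estimates; only the 11% CAPM figure is taken from the text:

```python
def capm_cost_of_capital(rf, beta, mkt_premium):
    """CAPM: risk-free rate plus a single market-risk premium."""
    return rf + beta * mkt_premium

def ff3_cost_of_capital(rf, b, s, h, mkt_premium, smb_premium, hml_premium):
    """Fama-French three-factor model: market factor plus size (SMB)
    and value (HML) factor premia, weighted by the firm's loadings."""
    return rf + b * mkt_premium + s * smb_premium + h * hml_premium

# Hypothetical loadings/premia for a small, growth-oriented drug firm:
# a positive SMB loading raises the estimate relative to the CAPM.
capm = capm_cost_of_capital(0.03, 1.0, 0.08)                       # 11%
ff3 = ff3_cost_of_capital(0.03, 1.0, 0.6, -0.3, 0.08, 0.03, 0.04)  # 11.6%
```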
Direct Synthesis of Microwave Waveforms for Quantum Computing
Raftery, James; Vrajitoarea, Andrei; Zhang, Gengyan; Leng, Zhaoqi; Srinivasan, Srikanth; Houck, Andrew
Current state-of-the-art quantum computing experiments in the microwave regime use control pulses generated by modulating microwave tones with baseband signals generated by an arbitrary waveform generator (AWG). Recent advances in digital-to-analog conversion technology have made it possible to directly synthesize arbitrary microwave pulses with sampling rates of 65 gigasamples per second (GSa/s) or higher. These new ultra-wide-bandwidth AWGs could dramatically simplify the classical control chain for quantum computing experiments, presenting potential cost savings and reducing the number of components that need to be carefully calibrated. Here we use a Keysight M8195A AWG to study the viability of such a simplified scheme, demonstrating randomized benchmarking of a superconducting qubit with high fidelity.
Replacement power costs due to nuclear-plant outages: a higher standard of care
International Nuclear Information System (INIS)
Gransee, M.F.
1982-01-01
This article examines recent state public utility commission cases that deal with the high costs of replacement power that utilities must purchase after a nuclear power plant outage. Although most commissions have approved such expenses, there may be a trend toward splitting the costs of such expenses between ratepayer and stockholder. Commissions are demanding a management prudence test to determine the cause of the outage and whether it meets the reasonable man standard before allowing these costs to be passed along to ratepayers. Unless the standard is applied with flexibility, however, utility companies could invoke the defenses covering traditional common law negligence.
Schutzer, Matthew E; Arthur, Douglas W; Anscher, Mitchell S
2016-05-01
Value in health care is defined as outcomes achieved per dollar spent, and understanding cost is critical to delivering high-value care. Traditional costing methods reflect charges rather than fundamental costs to provide a service. The more rigorous method of time-driven activity-based costing was used to compare cost between whole-breast radiotherapy (WBRT) and accelerated partial-breast irradiation (APBI) using balloon-based brachytherapy. For WBRT (25 fractions with five-fraction boost) and APBI (10 fractions twice daily), process maps were created outlining each activity from consultation to post-treatment follow up. Through staff interviews, time estimates were obtained for each activity. The capacity cost rates (CCR), defined as cost per minute, were calculated for personnel, equipment, and physical space. Total cost was calculated by multiplying the time required of each resource by its CCR. This was then summed and combined with cost of consumable materials. The total cost for WBRT was $5,333 and comprised 56% personnel costs and 44% space/equipment costs. For APBI, the total cost was $6,941 (30% higher than WBRT) and comprised 51% personnel costs, 6% space/equipment costs, and 43% consumable materials costs. The attending physician had the highest CCR of all personnel ($4.28/min), and APBI required 24% more attending time than WBRT. The most expensive activity for APBI was balloon placement and for WBRT was computed tomography simulation. APBI cost more than WBRT when using the dose/fractionation schemes analyzed. Future research should use time-driven activity-based costing to better understand cost with the aim of reducing expenditure and defining bundled payments. Copyright © 2016 by American Society of Clinical Oncology.
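Time-driven activity-based costing boils down to multiplying the minutes each resource spends on an activity by that resource's capacity cost rate (CCR, $/min) and summing, then adding consumables. A sketch with hypothetical CCRs and times; only the attending physician's $4.28/min rate comes from the study:

```python
def activity_cost(minutes_by_resource, ccr_by_resource):
    """Time-driven ABC: activity cost = sum over resources of
    (minutes the resource is used) x (its capacity cost rate, $/min)."""
    return sum(minutes_by_resource[r] * ccr_by_resource[r]
               for r in minutes_by_resource)

# Hypothetical CCRs and per-activity times; only the attending's
# $4.28/min capacity cost rate is taken from the study above.
ccr = {"attending": 4.28, "therapist": 1.10, "vault": 2.00}
consult = {"attending": 45, "therapist": 0, "vault": 0}
fraction = {"attending": 2, "therapist": 20, "vault": 15}

# Consultation plus a 25-fraction WBRT-like course (consumables omitted).
total = activity_cost(consult, ccr) + 25 * activity_cost(fraction, ccr)
```

Summing such activity costs along the full process map, consultation through follow-up, and adding consumables yields the per-course totals the study reports.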
Development of computer software for pavement life cycle cost analysis.
1988-01-01
The life cycle cost analysis program (LCCA) is designed to automate and standardize life cycle costing in Virginia. It allows the user to input information necessary for the analysis, and it then completes the calculations and produces a printed copy...
Jajodia, Sushil; Samarati, Pierangela; Singhal, Anoop; Swarup, Vipin; Wang, Cliff
2014-01-01
This book presents a range of cloud computing security challenges and promising solution paths. The first two chapters focus on practical considerations of cloud computing. In Chapter 1, Chandramouli, Iorga, and Chokani describe the evolution of cloud computing and the current state of practice, followed by the challenges of cryptographic key management in the cloud. In Chapter 2, Chen and Sion present a dollar cost model of cloud computing and explore the economic viability of cloud computing with and without security mechanisms involving cryptographic mechanisms. The next two chapters addres
Higher dimensional time-energy entanglement
International Nuclear Information System (INIS)
Richart, Daniel Lampert
2014-01-01
Judging by the compelling number of innovations based on taming quantum mechanical effects, such as the development of transistors and lasers, further research in this field promises to tackle further technological challenges in the years to come. This statement gains even more importance in the information processing scenario. Here, the growing data generation and the correspondingly higher need for more efficient computational resources and secure high-bandwidth networks are central problems which need to be tackled. In this sense, the required CPU miniaturization makes the design of structures at atomic levels inevitable, as foreseen by Moore's law. From this perspective, it is necessary to concentrate further research efforts on controlling and manipulating quantum mechanical systems. This makes it possible, for example, to encode quantum superposition states to tackle problems which are computationally NP-hard and which therefore cannot be solved efficiently by classical computers. The only limitation affecting these solutions is the low scalability of existing quantum systems. Similarly, quantum communication schemes are devised to certify the secure transmission of quantum information, but are still limited by a low transmission bandwidth. This thesis follows the guideline defined by these research projects and aims to further increase the scalability of the quantum mechanical systems required to perform these tasks. The method used here is to encode quantum states into photons generated by spontaneous parametric down-conversion (SPDC). An intrinsic limitation of photons is that the scalability of quantum information schemes employing them is limited by the low detection efficiency of commercial single-photon detectors. This is addressed by encoding higher-dimensional quantum states into two photons, increasing the scalability of the scheme in comparison to multi-photon states. Furthermore, the encoding of quantum information into the emission-time degree of
Costa Ferrer, Raquel; Serrano Rosa, Miguel Ángel; Zornoza Abad, Ana; Salvador Fernández-Montejo, Alicia
2010-11-01
The cardiovascular (CV) response to social challenge and stress is associated with the etiology of cardiovascular diseases. New ways of communication, time pressure and different types of information are common in our society. In this study, the cardiovascular response to two different tasks (open vs. closed information) was examined employing different communication channels (computer-mediated vs. face-to-face) and different pace control (self vs. external). Our results indicate that there was a higher CV response in the computer-mediated condition, on the closed information task, and in the externally paced condition. The role of these factors should be considered when studying the consequences of social stress and their underlying mechanisms.
Leger, Guy
Computers may change teachers' lifestyles, teaching styles, and perhaps even their personal values. A brief survey of the history of computers demonstrates the incredible pace at which computer technology is moving ahead. The cost and size of microchips will continue to decline dramatically over the next 20 years, while the capability and variety…
Engineering computations at the national magnetic fusion energy computer center
International Nuclear Information System (INIS)
Murty, S.
1983-01-01
The National Magnetic Fusion Energy Computer Center (NMFECC) was established by the U.S. Department of Energy's Division of Magnetic Fusion Energy (MFE). The NMFECC headquarters is located at Lawrence Livermore National Laboratory. Its purpose is to apply large-scale computational technology and computing techniques to the problems of controlled thermonuclear research. In addition to providing cost effective computing services, the NMFECC also maintains a large collection of computer codes in mathematics, physics, and engineering that is shared by the entire MFE research community. This review provides a broad perspective of the NMFECC, and a list of available codes at the NMFECC for engineering computations is given
Nigatu, Yeshambel T.; Bultmann, Ute; Schoevers, Robert A.; Penninx, Brenda W. J. H.; Reijneveld, Sijmen A.
2017-01-01
Background: Evidence lacks on whether obesity along with major depression (MD)/anxiety leads to higher health care use (HCU) and health care-related costs (HCC) compared with either condition alone. The objective of the study was to examine the longitudinal associations of obesity, MD/anxiety, and
Rai, Ansaar T; Evans, Kim; Riggs, Jack E; Hobbs, Gerald R
2016-04-01
Owing to their severity, large vessel occlusion (LVO) strokes may be associated with higher costs that are not reflected in current coding systems. This study aimed to determine whether intravenous thrombolysis costs are related to the presence or absence of LVO. Patients who had undergone intravenous thrombolysis over a 9-year period were divided into LVO and no LVO (nLVO) groups based on admission CT angiography. The primary outcome was hospital cost per admission. Secondary outcomes included admission duration, 90-day clinical outcome, and discharge destination. 119 patients (53%) had LVO and 104 (47%) had nLVO. Total mean±SD cost per LVO patient was $18,815±14,262 compared with $15,174±11,769 per nLVO patient (p=0.04). Hospital payments per admission were $17,338±13,947 and $15,594±16,437 for LVO and nLVO patients, respectively (p=0.4). A good outcome was seen in 33 LVO patients (27.7%) and in 69 nLVO patients (66.4%) (OR 0.2, 95% CI 0.1 to 0.3). Regression analysis after controlling for comorbidities showed the presence of LVO to be an independent predictor of higher total hospital costs. The presence or absence of LVO is associated with significant differences in hospital costs, outcomes, admission duration, and home discharge. These differences can be important when developing systems of care models for acute ischemic stroke. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
MODERN ADVANCES IMPLEMENTATION FOR A PASTROL VENTURE MODELS OF NOVEL CLOUD COMPUTING
Sandeep Kumar* Ankur Goel
2018-01-01
In this paper, innovations are expected to affect progress in the environment. A majority of enterprises are seeking to cut back their computing costs through virtualization. This need for lowering computing costs has led to the innovation of cloud computing. Cloud computing offers better computing through improved utilization and reduced administration and infrastructure costs. Cloud computing is distributed around the world in distinct forms. This is the schema to emerge h...
Cloud Computing with iPlant Atmosphere.
McKay, Sheldon J; Skidmore, Edwin J; LaRose, Christopher J; Mercer, Andre W; Noutsos, Christos
2013-10-15
Cloud Computing refers to distributed computing platforms that use virtualization software to provide easy access to physical computing infrastructure and data storage, typically administered through a Web interface. Cloud-based computing provides access to powerful servers, with specific software and virtual hardware configurations, while eliminating the initial capital cost of expensive computers and reducing the ongoing operating costs of system administration, maintenance contracts, power consumption, and cooling. This eliminates a significant barrier to entry into bioinformatics and high-performance computing for many researchers. This is especially true of free or modestly priced cloud computing services. The iPlant Collaborative offers a free cloud computing service, Atmosphere, which allows users to easily create and use instances on virtual servers preconfigured for their analytical needs. Atmosphere is a self-service, on-demand platform for scientific computing. This unit demonstrates how to set up, access and use cloud computing in Atmosphere. Copyright © 2013 John Wiley & Sons, Inc.
CECP, Decommissioning Costs for PWR and BWR
International Nuclear Information System (INIS)
Bierschbach, M.C.
1997-01-01
1 - Description of program or function: The Cost Estimating Computer Program CECP, designed for use on an IBM personal computer or equivalent, was developed for estimating the cost of decommissioning boiling water reactor (BWR) and pressurized water reactor (PWR) power stations to the point of license termination. 2 - Method of solution: Cost estimates include component, piping, and equipment removal costs; packaging costs; decontamination costs; transportation costs; burial volume and costs; and manpower staffing costs. Using equipment and consumables costs and inventory data supplied by the user, CECP calculates unit cost factors and then combines these factors with transportation and burial cost algorithms to produce a complete report of decommissioning costs. In addition to costs, CECP also calculates person-hours, crew-hours, and exposure person-hours associated with decommissioning. 3 - Restrictions on the complexity of the problem: The program is designed for a specific waste charge structure; the waste cost data structure cannot handle intermediate waste handlers or changes in the charge rate structures. The decommissioning of a reactor can be divided into 5 periods. Up to 200 different items for special equipment costs are possible, and the maximum amount for each special equipment item is $99,999,999. Data can be supplied for 10 buildings, with 100 components each. ESTS1071/01: There are 65 components for 28 systems available to specify the contaminated systems costs (BWR). ESTS1071/02: There are 75 components for 25 systems available to specify the contaminated systems costs (PWR)
Gould, Michael K; Sanders, Gillian D; Barnett, Paul G; Rydzak, Chara E; Maclean, Courtney C; McClellan, Mark B; Owens, Douglas K
2003-05-06
Positron emission tomography (PET) with 18-fluorodeoxyglucose (FDG) is a potentially useful but expensive test to diagnose solitary pulmonary nodules. To evaluate the cost-effectiveness of strategies for pulmonary nodule diagnosis and to specifically compare strategies that did and did not include FDG-PET. Decision model. Accuracy and complications of diagnostic tests were estimated by using meta-analysis and literature review. Modeled survival was based on data from a large tumor registry. Cost estimates were derived from Medicare reimbursement and other sources. All adult patients with a new, noncalcified pulmonary nodule seen on chest radiograph. Patient lifetime. Societal. 40 clinically plausible combinations of 5 diagnostic interventions, including computed tomography, FDG-PET, transthoracic needle biopsy, surgery, and watchful waiting. Costs, quality-adjusted life-years (QALYs), and incremental cost-effectiveness ratios. The cost-effectiveness of strategies depended critically on the pretest probability of malignancy. For patients with low pretest probability (26%), strategies that used FDG-PET selectively when computed tomography results were possibly malignant cost as little as 20 000 dollars per QALY gained. For patients with high pretest probability (79%), strategies that used FDG-PET selectively when computed tomography results were benign cost as little as 16 000 dollars per QALY gained. For patients with intermediate pretest probability (55%), FDG-PET strategies cost more than 220 000 dollars per QALY gained because they were more costly but only marginally more effective than computed tomography-based strategies. The choice of strategy also depended on the risk for surgical complications, the probability of nondiagnostic needle biopsy, the sensitivity of computed tomography, and patient preferences for time spent in watchful waiting. In probabilistic sensitivity analysis, FDG-PET strategies were cost saving or cost less than 100 000 dollars per QALY
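The dollars-per-QALY figures in this abstract are incremental cost-effectiveness ratios (ICERs). A minimal sketch of the computation, using hypothetical numbers rather than values from the study:

```python
# Incremental cost-effectiveness ratio (ICER): the extra cost per additional
# quality-adjusted life-year (QALY) gained when one strategy replaces another.
def icer(cost_a, cost_b, qaly_a, qaly_b):
    """Cost per QALY gained when strategy A replaces strategy B."""
    return (cost_a - cost_b) / (qaly_a - qaly_b)

# A strategy costing $2,000 more that yields 0.1 extra QALY works out to
# roughly $20,000 per QALY gained.
example = icer(12000.0, 10000.0, 10.5, 10.4)
```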
Execution spaces for simple higher dimensional automata
DEFF Research Database (Denmark)
Raussen, Martin
2012-01-01
Higher dimensional automata (HDA) are highly expressive models for concurrency in Computer Science, cf. van Glabbeek (Theor Comput Sci 368(1–2): 168–194, 2006). For a topologist, they are attractive since they can be modeled as cubical complexes—with an inbuilt restriction for directions of allowable…
Thoracoabdominal computed tomography in trauma patients: a cost-consequences analysis
Vugt, R. van; Kool, D.R.; Brink, M.; Dekker, H.M.; Deunk, J.; Edwards, M.J.R.
2014-01-01
BACKGROUND: CT is increasingly used during the initial evaluation of blunt trauma patients. In this era of increasing cost-awareness, the pros and cons of CT have to be assessed. OBJECTIVES: This study was performed to evaluate cost-consequences of different diagnostic algorithms that use
Energy Technology Data Exchange (ETDEWEB)
Huppertz, Alexander, E-mail: Alexander.Huppertz@charite.de [Imaging Science Institute Charite Berlin, Robert-Koch-Platz 7, D-10115 Berlin (Germany); Department of Radiology, Medical Physics, Charite-University Hospitals of Berlin, Chariteplatz 1, D-10117 Berlin (Germany); Radmer, Sebastian, E-mail: s.radmer@immanuel.de [Department of Orthopedic Surgery and Rheumatology, Immanuel-Krankenhaus, Koenigstr. 63, D-14109, Berlin (Germany); Asbach, Patrick, E-mail: Patrick.Asbach@charite.de [Department of Radiology, Medical Physics, Charite-University Hospitals of Berlin, Chariteplatz 1, D-10117 Berlin (Germany); Juran, Ralf, E-mail: ralf.juran@charite.de [Department of Radiology, Medical Physics, Charite-University Hospitals of Berlin, Chariteplatz 1, D-10117 Berlin (Germany); Schwenke, Carsten, E-mail: carsten.schwenke@scossis.de [Biostatistician, Scossis Statistical Consulting, Zeltinger Str. 58G, D-13465 Berlin (Germany); Diederichs, Gerd, E-mail: gerd.diederichs@charite.de [Department of Radiology, Medical Physics, Charite-University Hospitals of Berlin, Chariteplatz 1, D-10117 Berlin (Germany); Hamm, Bernd, E-mail: Bernd.Hamm@charite.de [Department of Radiology, Medical Physics, Charite-University Hospitals of Berlin, Chariteplatz 1, D-10117 Berlin (Germany); Sparmann, Martin, E-mail: m.sparmann@immanuel.de [Department of Orthopedic Surgery and Rheumatology, Immanuel-Krankenhaus, Koenigstr. 63, D-14109, Berlin (Germany)
2011-06-15
Computed tomography (CT) was used for preoperative planning of minimally invasive total hip arthroplasty (THA). 92 patients (50 males, 42 females, mean age 59.5 years) with a mean body mass index (BMI) of 26.5 kg/m² underwent 64-slice CT to depict the pelvis, the knee and the ankle in three independent acquisitions using combined x-, y-, and z-axis tube current modulation. Arthroplasty planning was performed using 3D-Hip Plan (Symbios, Switzerland) and patient radiation dose exposure was determined. The effects of BMI, gender, and contralateral THA on the effective dose were evaluated by an analysis of variance. A process cost analysis from the hospital perspective was performed. All CT examinations were of sufficient image quality for 3D THA planning. A mean effective dose of 4.0 mSv (SD 0.9 mSv), modeled by the BMI (p < 0.0001), was calculated. The effect of a contralateral THA (9/92 patients; p = 0.15) and the difference between males and females (p = 0.08) were not significant. Personnel involved were the radiologist (4 min), the surgeon (16 min), the radiographer (12 min), and administrative personnel (4 min). A CT operation time of 11 min and direct per-patient costs of EUR 52.80 were recorded. Preoperative CT for THA was associated with a slight and justifiable increase in radiation exposure compared to conventional radiographs, and with low per-patient costs.
Internationalization of Higher Education: Potential Benefits and Costs
Jibeen, Tahira; Khan, Masha Asad
2015-01-01
Internationalization of higher education is the top stage of international relations among universities and is no longer regarded as a goal in itself, but as a means to improve the quality of education. The translation and acquisition of knowledge, the mobilization of talent in support of global research, and the enhancement of the curriculum with…
Gedanken Experiments in Educational Cost Effectiveness
Brudner, Harvey J.
1978-01-01
Discusses the effectiveness of cost determining techniques in education. The areas discussed are: education and management; cost-effectiveness models; figures of merit determination; and the implications as they relate to the areas of audio-visual and computer educational technology. (Author/GA)
Selectively Fortifying Reconfigurable Computing Device to Achieve Higher Error Resilience
Directory of Open Access Journals (Sweden)
Mingjie Lin
2012-01-01
Full Text Available With the advent of 10 nm CMOS devices and “exotic” nanodevices, the location and occurrence time of hardware defects and design faults become increasingly unpredictable, posing severe challenges to existing techniques for error-resilient computing, because most of them statically assign hardware redundancy and do not account for the error tolerance inherently existing in many mission-critical applications. This work proposes a novel approach to selectively fortifying a target reconfigurable computing device in order to achieve hardware-efficient error resilience for a specific target application. We intend to demonstrate that such error resilience can be significantly improved with effective hardware support. The major contributions of this work include (1) the development of a complete methodology to perform sensitivity and criticality analysis of hardware redundancy, (2) a novel problem formulation and an efficient heuristic methodology to selectively allocate hardware redundancy among a target design’s key components in order to maximize its overall error resilience, and (3) an academic prototype of a selectively fortified computing (SFC) device that illustrates a 4-fold improvement of error resilience for an H.264 encoder implemented with an FPGA device.
Computed radiography in NDT application
International Nuclear Information System (INIS)
Deprins, Eric
2004-01-01
Computed Radiography (CR), or digital radiography using reusable storage phosphor screens, offers a convenient and reliable way to replace film. In addition to the reduced cost of consumables, the return on investment of CR systems is strongly determined by savings in exposure, processing and archival times. Intangible costs such as plant shutdown, environmental safety and longer usability of isotopes are also increasingly important when considering replacing film with storage phosphor systems. But more than in traditional radiography, the use of digital images is a trade-off between speed and the required quality. Better image quality is obtained with longer exposure times, slower phosphor screens and higher scan resolutions. Therefore, different kinds of storage phosphor screens are needed in order to cover every application. Most operations have the data associated with the tests to be performed centrally stored in a database. Using a digital radiography system provides not only the advantages of manipulating digital images, but also those of the digital data associated with them. Smart methods to associate cassettes and storage screens with exposed images enhance the workflow of NDT processes and avoid human error. Automated measurement tools increase throughput in different kinds of operations. This paper gives an overview of the way certain operations have decided to replace film by Computed Radiography, and what the major benefits for them have been.
Chidburee, P.; Mills, J. P.; Miller, P. E.; Fieber, K. D.
2016-06-01
Close-range photogrammetric techniques offer a potentially low-cost approach in terms of implementation and operation for initial assessment and monitoring of landslide processes over small areas. In particular, the Structure-from-Motion (SfM) pipeline is now extensively used to help overcome many constraints of traditional digital photogrammetry, offering increased user-friendliness to nonexperts, as well as lower costs. However, a landslide monitoring approach based on the SfM technique also presents some potential drawbacks due to the difficulty in managing and processing a large volume of data in real-time. This research addresses the aforementioned issues by attempting to combine a mobile device with cloud computing technology to develop a photogrammetric measurement solution as part of a monitoring system for landslide hazard analysis. The research presented here focusses on (i) the development of an Android mobile application; (ii) the implementation of SfM-based open-source software in the Amazon cloud computing web service, and (iii) performance assessment through a simulated environment using data collected at a recognized landslide test site in North Yorkshire, UK. Whilst the landslide monitoring mobile application is under development, this paper describes experiments carried out to ensure effective performance of the system in the future. Investigations presented here describe the initial assessment of a cloud-implemented approach, which is developed around the well-known VisualSFM algorithm. Results are compared to point clouds obtained from alternative SfM 3D reconstruction approaches considering a commercial software solution (Agisoft PhotoScan) and a web-based system (Autodesk 123D Catch). 
Investigations demonstrate that the cloud-based photogrammetric measurement system is capable of providing results of centimeter-level accuracy, evidencing its potential to provide an effective approach for quantifying and analyzing landslide hazard at a local-scale.
5 CFR 838.241 - Cost-of-living adjustments.
2010-01-01
... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false Cost-of-living adjustments. 838.241... Affecting Employee Annuities Procedures for Computing the Amount Payable § 838.241 Cost-of-living... provide for cost-of-living adjustments on the former spouse's payment from employee annuity, the cost-of...
Cloud computing for radiologists
Amit T Kharat; Amjad Safvi; S S Thind; Amarjit Singh
2012-01-01
Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources, such as software and hardware, on a pay-per-use model. Using cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as...
Handheld computers for self-administered sensitive data collection: A comparative study in Peru
Directory of Open Access Journals (Sweden)
Hughes James P
2008-03-01
Full Text Available Abstract Background Low-cost handheld computers (PDAs) potentially represent an efficient tool for collecting sensitive data in surveys. The goal of this study is to evaluate the quality of sexual behavior data collected with handheld computers in comparison with paper-based questionnaires. Methods A PDA-based program for data collection was developed using open-source tools. In two cross-sectional studies, we compared data concerning sexual behavior collected with paper forms to data collected with PDA-based forms in Ancon (Lima). Results The first study enrolled 200 participants (18–29 years). General agreement between data collected with paper format and handheld computers was 86%. Agreement for categorical variables was between 70.5% and 98.5% (Kappa: 0.43–0.86), while agreement for numeric variables was between 57.1% and 79.8% (Spearman: 0.76–0.95). Agreement and correlation were higher in those who had completed at least high school than in those with less education. The second study enrolled 198 participants. Rates of responses to sensitive questions were similar between both kinds of questionnaires. However, the numbers of inconsistencies (p = 0.0001) and missing values (p = 0.001) were significantly higher in paper questionnaires. Conclusion This study showed the value of the use of handheld computers for collecting sensitive data, since a high level of agreement between paper and PDA responses was reached. In addition, lower numbers of inconsistencies and missing values were found with the PDA-based system. This study has demonstrated that it is feasible to develop a low-cost application for handheld computers, and that PDAs are a feasible alternative for collecting field data in a developing country.
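The Kappa values reported above are Cohen's kappa, a chance-corrected measure of agreement between the two data-collection methods. A minimal sketch of the computation; the 2x2 agreement table below is an invented example, not the study's data:

```python
# Cohen's kappa: (observed agreement - chance agreement) / (1 - chance agreement).
# table[i][j] counts items classified as category i by method A and j by method B.
def cohens_kappa(table):
    n = sum(sum(row) for row in table)
    p_observed = sum(table[i][i] for i in range(len(table))) / n
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    p_expected = sum(r * c for r, c in zip(row_totals, col_totals)) / n ** 2
    return (p_observed - p_expected) / (1 - p_expected)

# 90% raw agreement with balanced marginals yields kappa = 0.8.
kappa = cohens_kappa([[45, 5], [5, 45]])
```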
A Lightweight Distributed Framework for Computational Offloading in Mobile Cloud Computing
Shiraz, Muhammad; Gani, Abdullah; Ahmad, Raja Wasim; Adeel Ali Shah, Syed; Karim, Ahmad; Rahman, Zulkanain Abdul
2014-01-01
The latest developments in mobile computing technology have enabled intensive applications on modern Smartphones. However, such applications are still constrained by limitations in the processing potential, storage capacity and battery lifetime of Smart Mobile Devices (SMDs). Therefore, Mobile Cloud Computing (MCC) leverages the application processing services of computational clouds for mitigating resource limitations in SMDs. Currently, a number of computational offloading frameworks have been proposed for MCC wherein the intensive components of the application are outsourced to computational clouds. Nevertheless, such frameworks focus on runtime partitioning of the application for computational offloading, which is time consuming and resource intensive. The resource-constrained nature of SMDs requires lightweight procedures for leveraging computational clouds. Therefore, this paper presents a lightweight framework which focuses on minimizing additional resource utilization in computational offloading for MCC. The framework employs the centralized monitoring, high availability and on-demand access services of computational clouds for computational offloading. As a result, the turnaround time and execution cost of the application are reduced. The framework is evaluated by testing a prototype application in a real MCC environment. The lightweight nature of the proposed framework is validated by employing computational offloading for the proposed framework and the latest existing frameworks. Analysis shows that by employing the proposed framework for computational offloading, the size of data transmission is reduced by 91%, energy consumption cost is minimized by 81% and turnaround time of the application is decreased by 83.5% as compared to the existing offloading frameworks. Hence, the proposed framework minimizes additional resource utilization and therefore offers a lightweight solution for computational offloading in MCC. PMID:25127245
Palmer, Rebecca; Cooper, Cindy; Enderby, Pam; Brady, Marian; Julious, Steven; Bowen, Audrey; Latimer, Nicholas
2015-01-27
Aphasia affects the ability to speak, comprehend spoken language, read and write. One third of stroke survivors experience aphasia. Evidence suggests that aphasia can continue to improve after the first few months with intensive speech and language therapy, which is frequently beyond what resources allow. The development of computer software for language practice provides an opportunity for self-managed therapy. This pragmatic randomised controlled trial will investigate the clinical and cost effectiveness of a computerised approach to long-term aphasia therapy post stroke. A total of 285 adults with aphasia at least four months post stroke will be randomly allocated to either usual care, computerised intervention in addition to usual care or attention and activity control in addition to usual care. Those in the intervention group will receive six months of self-managed word finding practice on their home computer with monthly face-to-face support from a volunteer/assistant. Those in the attention control group will receive puzzle activities, supplemented by monthly telephone calls. Study delivery will be coordinated by 20 speech and language therapy departments across the United Kingdom. Outcome measures will be made at baseline, six, nine and 12 months after randomisation by blinded speech and language therapist assessors. Primary outcomes are the change in number of words (of personal relevance) named correctly at six months and improvement in functional conversation. Primary outcomes will be analysed using a Hochberg testing procedure. Significance will be declared if differences in both word retrieval and functional conversation at six months are significant at the 5% level, or if either comparison is significant at 2.5%. A cost utility analysis will be undertaken from the NHS and personal social service perspective. Differences between costs and quality-adjusted life years in the three groups will be described and the incremental cost effectiveness ratio
Cost estimating relationships for nuclear power plant operation and maintenance
International Nuclear Information System (INIS)
Bowers, H.I.; Fuller, L.C.; Myers, M.L.
1987-11-01
Revised cost estimating relationships for 1987 are presented for estimating annual nonfuel operation and maintenance (O&M) costs for light-water reactor (LWR) nuclear power plants, updating guidelines published in 1982. These cost estimating relationships are intended for use in long-range planning and in evaluations of the economics of nuclear energy for electric power generation. A listing of a computer program, LWROM, implementing the cost estimating relationships and written in advanced BASIC for IBM personal computers, is included
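A cost estimating relationship of the kind LWROM implements typically expresses annual O&M cost as a function of plant parameters. The functional form and coefficients below are purely illustrative assumptions, not the published relationships:

```python
# Hypothetical cost estimating relationship (CER): annual nonfuel O&M cost
# as a base amount plus a term scaling with plant capacity.
# All coefficients are illustrative, not LWROM's actual values.
def annual_om_cost(capacity_mwe, base, coeff, exponent):
    """Annual O&M cost (e.g. in millions of dollars) for a plant of given capacity."""
    return base + coeff * capacity_mwe ** exponent

# A 1000 MWe plant under these invented coefficients:
cost = annual_om_cost(1000.0, base=20.0, coeff=0.5, exponent=0.8)
```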
Cloud Computing Adoption Model for Universities to Increase ICT Proficiency
Directory of Open Access Journals (Sweden)
Safiya Okai
2014-08-01
Full Text Available Universities around the world, especially those in developing countries, are faced with the problem of delivering the level of information and communications technology (ICT) needed to facilitate the teaching, learning, research, and development activities ideal in a typical university, in line with advancement in technology and the growing dependence on IT. This is mainly due to the high cost involved in providing and maintaining the needed hardware and software. A technology such as cloud computing, which delivers on-demand provisioning of IT resources on a pay-per-use basis, can be used to address this problem. Cloud computing promises better delivery of IT services, as well as availability whenever and wherever needed, at reduced costs, with users paying only for as much as they consume through the services of cloud service providers. The cloud technology reduces complexity while increasing the speed and quality of IT services provided; however, despite these benefits, the challenges that come with its adoption have left many sectors, especially higher education, skeptical about committing to this technology. This article identifies the reasons for the slow rate of adoption of cloud computing at university level, discusses the challenges faced, and proposes a cloud computing adoption model that contains strategic guidelines to overcome the major challenges identified and a roadmap for the successful adoption of cloud computing by universities. The model was tested in one of the universities and found to be both useful and appropriate for adopting cloud computing at university level.
Directory of Open Access Journals (Sweden)
David Nicol
2003-12-01
Full Text Available Significant investments are being made in the application of new information and communications technologies (ICT) to teaching and learning in higher education. However, until recently, there has been little progress in devising an integrated cost-benefit model that decision-makers can use to appraise ICT investment options from the wider institutional perspective. This paper describes and illustrates a model that has been developed to enable evaluations of the costs and benefits of the use of ICT. The strengths and limitations of the model are highlighted and discussed
[Process-oriented cost calculation in interventional radiology. A case study].
Mahnken, A H; Bruners, P; Günther, R W; Rasche, C
2012-01-01
Currently used costing methods such as cost centre accounting do not sufficiently reflect the process-based resource utilization in medicine. The goal of this study was to establish a process-oriented cost assessment of percutaneous radiofrequency (RF) ablation of liver and lung metastases. In each of 15 patients a detailed task analysis of the primary process of hepatic and pulmonary RF ablation was performed. Based on these data a dedicated cost calculation model was developed for each primary process. The costs of each process were computed and compared with the revenue for in-patients according to the German diagnosis-related groups (DRG) system 2010. The RF ablation of liver metastases in patients without relevant comorbidities and a low patient complexity level results in a loss of EUR 588.44, whereas the treatment of patients with a higher complexity level yields an acceptable profit. The treatment of pulmonary metastases is profitable even in cases of additional expenses due to complications. Process-oriented costing provides relevant information that is needed for understanding the economic impact of treatment decisions. It is well suited as a starting point for economically driven process optimization and reengineering. Under the terms of the German DRG 2010 system percutaneous RF ablation of lung metastases is economically reasonable, while RF ablation of liver metastases in cases of low patient complexity levels does not cover the costs.
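The core of a process-oriented calculation like the one described is valuing the staff minutes recorded in a task analysis at per-minute personnel rates, then adding direct material costs. A minimal sketch; the roles, minutes, and rates below are illustrative assumptions, not the study's data:

```python
# Process-oriented costing sketch: labor cost is staff minutes per role
# multiplied by per-minute personnel rates, plus direct material costs.
def process_cost(staff_minutes, rate_per_minute, materials=0.0):
    labor = sum(minutes * rate_per_minute[role]
                for role, minutes in staff_minutes.items())
    return labor + materials

# Invented example: four roles with per-minute rates in euros.
cost = process_cost(
    {"radiologist": 4, "surgeon": 16, "radiographer": 12, "admin": 4},
    {"radiologist": 2.0, "surgeon": 2.5, "radiographer": 1.0, "admin": 0.5},
)
```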
Marketing Policy and Its Cost in a College of Higher Education.
Riley, Eric
1984-01-01
Discusses the development of advertising and publicity strategies and policy for student recruitment purposes at a college of education in the United Kingdom between 1972 and 1982. Covers changes in staff attitudes, selection of media, organization of administration, and cost factors. (PGD)
Capabilities and Advantages of Cloud Computing in the Implementation of Electronic Health Record.
Ahmadi, Maryam; Aslani, Nasim
2018-01-01
With regard to the high cost of the Electronic Health Record (EHR), in recent years the use of new technologies, in particular cloud computing, has increased. The purpose of this study was to systematically review the studies conducted in the field of cloud computing. The present study was a systematic review conducted in 2017. Searches were performed in the Scopus, Web of Science, IEEE, PubMed and Google Scholar databases using combinations of keywords. From the 431 articles initially retrieved, 27 articles were selected for review after applying the inclusion and exclusion criteria. Data gathering was done with a self-designed checklist and analyzed by the content analysis method. The findings of this study showed that cloud computing is a very widespread technology. It includes domains such as cost, security and privacy, scalability, mutual performance and interoperability, implementation platform and independence of cloud computing, ability to search and explore, reduction of errors and improvement of quality, structure, flexibility and sharing ability. It will be effective for the electronic health record. According to the findings of the present study, the capabilities of cloud computing are useful in implementing EHRs in a variety of contexts. It also provides wide opportunities for managers, analysts and providers of health information systems. Considering the advantages and domains of cloud computing in the establishment of EHRs, it is recommended to use this technology.
International Nuclear Information System (INIS)
Didsbury, R.; Bains, N.; Cho, U.Y.
1998-01-01
The use of three-dimensional (3D) computer-aided design and drafting (CADD) models, and the associated information technology and databases, in the engineering and construction phases of large projects is well established and yielding significant improvements in project cost, schedule and quality. The information contained in these models can also be extremely valuable to operating plants, particularly when the visual and spatial information contained in the 3D models is interfaced to other plant information databases. Indeed, many plant owners and operators in the process and power industries are already using this technology to assist with such activities as plant configuration management, staff training, work planning and radiation protection. This paper will explore the application of 3D models and the associated databases in an operating plant environment and describe the resulting operational and cost-reduction benefits. Several industrial experience case studies will be presented along with suggestions for further future applications. (author). 4 refs., 1 tab., 8 figs
Research on cloud computing solutions
Liudvikas Kaklauskas; Vaida Zdanytė
2015-01-01
Cloud computing can be defined as a new style of computing in which dynamically scalable and often virtualized resources are provided as services over the Internet. Advantages of the cloud computing technology include cost savings, high availability, and easy scalability. Voas and Zhang adapted six phases of computing paradigms, from dummy terminals/mainframes, to PCs, to network computing, to grid and cloud computing. There are four types of cloud computing: public cloud, private cloud, ...
Min, James K; Shaw, Leslee J; Berman, Daniel S; Gilmore, Amanda; Kang, Ning
2008-09-15
Multidetector coronary computed tomographic angiography (CCTA) demonstrates high accuracy for the detection and exclusion of coronary artery disease (CAD) and predicts adverse prognosis. To date, opportunity costs relating the clinical and economic outcomes of CCTA compared with other methods of diagnosing CAD, such as myocardial perfusion single-photon emission computed tomography (SPECT), remain unknown. An observational, multicenter, patient-level analysis of patients without known CAD who underwent CCTA or SPECT was performed. Patients who underwent CCTA (n = 1,938) were matched to those who underwent SPECT (n = 7,752) on 8 demographic and clinical characteristics and 2 summary measures of cardiac medications and co-morbidities and were evaluated for 9-month expenditures and clinical outcomes. Adjusted total health care and CAD expenditures were 27% (p cost-efficient alternative to SPECT for the initial coronary evaluation of patients without known CAD.
COMPUTER EXPERIMENTS WITH FINITE ELEMENTS OF HIGHER ORDER
Directory of Open Access Journals (Sweden)
Khomchenko A.
2017-12-01
Full Text Available The paper deals with the problem of constructing the basis functions of a quadrilateral finite element of the fifth order by means of the computer algebra system Maple. The Lagrangian approximation of such a finite element contains 36 nodes: 20 perimeter nodes and 16 internal nodes. Alternative models with a reduced number of internal nodes are considered. Graphs of the basis functions and cognitive portraits of zero-level lines are presented. The work is aimed at studying the possibilities of using modern information technologies in the teaching of individual mathematical disciplines.
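For context, the 36 nodes of the fifth-order Lagrangian element arise as a 6×6 tensor product of 1D Lagrange bases with six nodes per direction. Below is a minimal sketch of such a 1D basis; it is a generic construction for illustration, not the Maple code from the paper:

```python
import numpy as np

def lagrange_basis(nodes, i, x):
    """Evaluate the i-th 1D Lagrange basis polynomial on `nodes` at point(s) x."""
    terms = [(x - xj) / (nodes[i] - xj) for j, xj in enumerate(nodes) if j != i]
    return np.prod(terms, axis=0)

# Six nodes per direction give fifth-order interpolation; the 36-node quadrilateral
# element is the tensor product of two such 1D bases.
nodes = np.linspace(-1.0, 1.0, 6)
for i in range(6):
    # Kronecker-delta property: basis i equals 1 at node i and 0 at the other nodes.
    assert np.allclose(lagrange_basis(nodes, i, nodes), np.eye(6)[i])
```

A 2D basis function for node (i, j) is then simply `lagrange_basis(nodes, i, x) * lagrange_basis(nodes, j, y)`.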
Directory of Open Access Journals (Sweden)
Carlos S. Garcia
2016-08-01
Full Text Available Firm lifecycle theory predicts that the Weighted Average Cost of Capital (WACC) will tend to fall over the lifecycle of the firm (Mueller, 2003, p. 80-81). However, given that previous research finds that corporate governance deteriorates as firms get older (Mueller and Yun, 1998; Saravia, 2014), there is good reason to suspect that the opposite could be the case, that is, that the WACC is higher for older firms. Since our literature review indicates that no direct tests to clarify this question have been carried out until now, this paper aims to fill the gap by testing this prediction empirically. Our findings support the proposition that the WACC of younger firms is higher than that of mature firms. Thus, we find that the mature-firm overinvestment problem is not intensified by a higher cost of capital; on the contrary, our results suggest that mature firms manage to invest in negative net present value projects even though they have access to cheaper capital. This finding sheds new light on the magnitude of the corporate governance problems found in mature firms.
International Nuclear Information System (INIS)
Niederer, J.
1983-01-01
This note outlines several ways in which large scale simulation computing and programming support may be provided to the SSC design community. One aspect of the problem is getting supercomputer power without the high cost and long lead times of large scale institutional computing. Another aspect is the blending of modern programming practices with more conventional accelerator design programs in ways that do not also swamp designers with the details of complicated computer technology
Hassan, Cesare; Pickhardt, Perry J; Pickhardt, Perry; Laghi, Andrea; Kim, Daniel H; Kim, Daniel; Zullo, Angelo; Iafrate, Franco; Di Giulio, Lorenzo; Morini, Sergio
2008-04-14
In addition to detecting colorectal neoplasia, abdominal computed tomography (CT) with colonography technique (CTC) can also detect unsuspected extracolonic cancers and abdominal aortic aneurysms (AAA). The efficacy and cost-effectiveness of this combined abdominal CT screening strategy are unknown. A computerized Markov model was constructed to simulate the occurrence of colorectal neoplasia, extracolonic malignant neoplasm, and AAA in a hypothetical cohort of 100,000 subjects from the United States who were 50 years of age. Simulated screening with CTC, using a 6-mm polyp size threshold for reporting, was compared with a competing model of optical colonoscopy (OC), both without and with abdominal ultrasonography for AAA detection (OC-US strategy). In the simulated population, CTC was the dominant screening strategy, gaining an additional 1458 and 462 life-years compared with the OC and OC-US strategies and being less costly, with a savings of $266 and $449 per person, respectively. The additional gains for CTC were largely due to a decrease in AAA-related deaths, whereas the modeled benefit from extracolonic cancer downstaging was a relatively minor factor. At sensitivity analysis, OC-US became more cost-effective only when the CTC sensitivity for large polyps dropped to 61% or when broad variations of costs were simulated, such as an increase in CTC cost from $814 to $1300 or a decrease in OC cost from $1100 to $500. With the OC-US approach, suboptimal compliance had a strong negative influence on efficacy and cost-effectiveness. The estimated mortality from CT-induced cancer was less than estimated colonoscopy-related mortality (8 vs 22 deaths), both of which were minor compared with the positive benefit from screening. When detection of extracolonic findings such as AAA and extracolonic cancer are considered in addition to colorectal neoplasia in our model simulation, CT colonography is a dominant screening strategy (ie, more clinically effective and more cost
Method and computer program product for maintenance and modernization backlogging
Mattimore, Bernard G; Reynolds, Paul E; Farrell, Jill M
2013-02-19
According to one embodiment, a computer program product for determining future facility conditions includes a computer readable medium having computer readable program code stored therein. The computer readable program code includes computer readable program code for calculating a time period specific maintenance cost, for calculating a time period specific modernization factor, and for calculating a time period specific backlog factor. Future facility conditions equal the time period specific maintenance cost plus the time period specific modernization factor plus the time period specific backlog factor. In another embodiment, a computer-implemented method for calculating future facility conditions includes calculating a time period specific maintenance cost, calculating a time period specific modernization factor, and calculating a time period specific backlog factor. Future facility conditions equal the time period specific maintenance cost plus the time period specific modernization factor plus the time period specific backlog factor. Other embodiments are also presented.
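The summation described in the embodiments is simple enough to state directly. A minimal sketch, with a hypothetical function name and dollar figures (nothing below is taken from the patent itself):

```python
def future_facility_conditions(maintenance_cost, modernization_factor, backlog_factor):
    # Per the described embodiment: the three time-period-specific terms are summed.
    return maintenance_cost + modernization_factor + backlog_factor

# Hypothetical figures for one time period
assert future_facility_conditions(120_000.0, 45_000.0, 30_000.0) == 195_000.0
```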
Running Neuroimaging Applications on Amazon Web Services: How, When, and at What Cost?
Directory of Open Access Journals (Sweden)
Tara M. Madhyastha
2017-11-01
Full Text Available The contribution of this paper is to identify and describe current best practices for using Amazon Web Services (AWS to execute neuroimaging workflows “in the cloud.” Neuroimaging offers a vast set of techniques by which to interrogate the structure and function of the living brain. However, many of the scientists for whom neuroimaging is an extremely important tool have limited training in parallel computation. At the same time, the field is experiencing a surge in computational demands, driven by a combination of data-sharing efforts, improvements in scanner technology that allow acquisition of images with higher image resolution, and by the desire to use statistical techniques that stress processing requirements. Most neuroimaging workflows can be executed as independent parallel jobs and are therefore excellent candidates for running on AWS, but the overhead of learning to do so and determining whether it is worth the cost can be prohibitive. In this paper we describe how to identify neuroimaging workloads that are appropriate for running on AWS, how to benchmark execution time, and how to estimate cost of running on AWS. By benchmarking common neuroimaging applications, we show that cloud computing can be a viable alternative to on-premises hardware. We present guidelines that neuroimaging labs can use to provide a cluster-on-demand type of service that should be familiar to users, and scripts to estimate cost and create such a cluster.
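As a rough illustration of the kind of cost estimate the paper recommends making before committing to the cloud, the sketch below multiplies a benchmarked per-job runtime by an hourly instance rate and adds a month of storage. The rates and figures are illustrative assumptions, not current AWS prices or numbers from the paper:

```python
def estimate_batch_cost(n_jobs, hours_per_job, hourly_rate, storage_gb=0.0,
                        gb_month_rate=0.023):
    """Rough cloud cost: independent parallel jobs plus one month of object storage."""
    return n_jobs * hours_per_job * hourly_rate + storage_gb * gb_month_rate

# e.g. preprocessing 100 subjects at ~2 h each on a hypothetical $0.34/h instance,
# keeping 500 GB of images in storage for a month
cost = estimate_batch_cost(100, 2.0, 0.34, storage_gb=500)
```

Because the jobs are independent, running them on many instances at once changes the wall-clock time but not this cost, which is the property that makes neuroimaging workloads attractive for the cloud.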
International Nuclear Information System (INIS)
Ko, Soon Heum; Kim, Na Yong; Nikitopoulos, Dimitris E.; Moldovan, Dorel; Jha, Shantenu
2014-01-01
Numerical approaches are presented to minimize the statistical errors inherently present due to finite sampling and the presence of thermal fluctuations in the molecular region of a hybrid computational fluid dynamics (CFD) - molecular dynamics (MD) flow solution. Near the fluid-solid interface the hybrid CFD-MD simulation approach provides a more accurate solution, especially in the presence of significant molecular-level phenomena, than the traditional continuum-based simulation techniques. It also involves less computational cost than the pure particle-based MD. Despite these advantages, the hybrid CFD-MD methodology has been applied mostly in flow studies at high velocities, mainly because of the higher statistical errors associated with low velocities. As an alternative to the costly increase of the size of the MD region to decrease statistical errors, we investigate a few numerical approaches that reduce the sampling noise of the solution at moderate velocities. These methods are based on sampling of multiple simulation replicas and linear regression of multiple spatial/temporal samples. We discuss the advantages and disadvantages of each technique from the perspective of solution accuracy and computational cost.
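The replica-averaging idea can be illustrated with a toy model: a small mean velocity buried in large thermal noise, where averaging N independent replicas shrinks the standard error roughly as 1/√N. This is a schematic sketch, not the authors' CFD-MD solver:

```python
import numpy as np

rng = np.random.default_rng(0)
true_velocity = 0.01   # small mean velocity buried in thermal fluctuations
noise_sigma = 1.0

def sampled_velocity(n_replicas):
    # Average one noisy sample from each of n independent simulation replicas.
    return rng.normal(true_velocity, noise_sigma, size=n_replicas).mean()

# Spread of the estimator over repeated trials: noise falls roughly as 1/sqrt(N),
# so many cheap replicas can substitute for a costly enlargement of the MD region.
std_1 = np.std([sampled_velocity(1) for _ in range(2000)])
std_100 = np.std([sampled_velocity(100) for _ in range(2000)])
```

With 100 replicas the estimator's spread drops by about a factor of ten relative to a single sample.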
The Real University Cost in a "Free" Higher Education Country
Psacharopoulos, G.; Papakonstantinou, G.
2005-01-01
Using a sample of over 3000 first year university entrants in Greece, we investigate the time and expense incurred in preparation for the highly competitive higher education entry examinations, as well as what students spend privately while attending university. It is shown that in a constitutionally "free for all" higher education country,…
Financial Resource Allocation in Higher Education
Ušpuriene, Ana; Sakalauskas, Leonidas; Dumskis, Valerijonas
2017-01-01
The paper considers a problem of financial resource allocation in a higher education institution. The basic financial management instruments and the multi-stage cost minimization model created are described involving financial instruments to constraints. Both societal and institutional factors that determine the costs of educating students are…
[Operating cost analysis of anaesthesia: activity based costing (ABC analysis)].
Majstorović, Branislava M; Kastratović, Dragana A; Vučović, Dragan S; Milaković, Branko D; Miličić, Biljana R
2011-01-01
Costs of anaesthesiology represent defined measures to determine a precise profile of expenditure estimation for surgical treatment, which is important for planning healthcare activities, prices and budgets. In order to determine the actual value of anaesthesiological services, we performed an activity based costing (ABC) analysis. Retrospectively, for 2005 and 2006, we estimated the direct costs of anaesthesiological services (salaries, drugs, supply materials and other costs: analyses and equipment) of the Institute of Anaesthesia and Resuscitation of the Clinical Centre of Serbia. The group included all anaesthetized patients of both sexes and all ages. We compared direct costs with the direct expenditure, per "cost object (service or unit)", of the Republican Healthcare Insurance. Summary data of the Departments of Anaesthesia were documented in the database of the Clinical Centre of Serbia. The numerical data were processed and analyzed with Microsoft Office Excel 2003 and SPSS for Windows. Using a linear model, we compared direct costs with the unit costs of anaesthesiological services from the Costs List of the Republican Healthcare Insurance. Direct costs showed that 40% was spent on salaries, 32% on drugs and supplies, and 28% on other costs, such as analyses and equipment. The direct costs of anaesthesiological services showed a linear correlation with the unit costs of the Republican Healthcare Insurance. During surgery, anaesthesia costs increase the cost of a patient's surgical treatment by 10%. Regarding the actual costs of drugs and supplies, we do not see any possibility of cost reduction. Fixed elements of direct costs provide the possibility of rationalizing the use of resources in anaesthesia.
Developing Activities for Teaching Cloud Computing and Virtualization
Directory of Open Access Journals (Sweden)
E. Erturk
2014-10-01
Full Text Available Cloud computing and virtualization are new but indispensable components of computer engineering and information systems curricula for universities and higher education institutions. Learning about these topics is important for students preparing to work in the IT industry. In many companies, information technology operates under tight financial constraints. Virtualization (for example, storage, desktop, and server virtualization) reduces overall IT costs through the consolidation of systems. It also results in reduced loads and energy savings in terms of the power and cooling infrastructure. Therefore it is important to investigate the practical aspects of this topic both for industry practice and for teaching purposes. This paper demonstrates some activities undertaken recently by students at the Eastern Institute of Technology New Zealand and concludes with general recommendations for IT educators, software developers, and other IT professionals.
International Nuclear Information System (INIS)
Cha, Jeong Hun; Choi, Heui Joo; Lee, Jong Youl; Choi, Jong Won
2009-01-01
It is expected that a substantial amount of spent fuel will be transported from the four nuclear power plant (NPP) sites in Korea to a hypothetical centralized interim storage facility or a final repository in the near future. The cost of transportation is proportional to the amount of spent fuel. In this paper, a cost estimation program is developed based on the conceptual design of a transportation system and a logistics analysis. Using the developed computer program, named CASK, the minimum capacity of a centralized interim storage facility (CISF) and the transportation cost for PWR spent fuels are calculated. The PWR spent fuels are transported from the 4 NPP sites to a final repository (FR) via the CISF. Since the NPP sites and the CISF are located along the coast, sea transportation is considered between them, and road transportation is considered between the CISF and the FR. The result shows that the minimum capacity of the interim storage facility is 15,000 MTU.
Yilmaz, Ferkan
2012-12-01
The higher-order statistics (HOS) of the channel capacity μ_n = E[log^n(1 + γ_end)], where n ∈ N denotes the order of the statistics, have received relatively little attention in the literature, due in part to the intractability of their analysis. In this letter, we propose a novel and unified analysis, based on the moment generating function (MGF) technique, to compute the HOS of the channel capacity exactly. More precisely, our mathematical formalism can be readily applied to maximal-ratio-combining (MRC) receivers operating in generalized fading environments. The mathematical formalism is illustrated by some numerical examples focusing on correlated generalized fading environments. © 2012 IEEE.
Yilmaz, Ferkan; Alouini, Mohamed-Slim
2012-01-01
The higher-order statistics (HOS) of the channel capacity μ_n = E[log^n(1 + γ_end)], where n ∈ N denotes the order of the statistics, have received relatively little attention in the literature, due in part to the intractability of their analysis. In this letter, we propose a novel and unified analysis, based on the moment generating function (MGF) technique, to compute the HOS of the channel capacity exactly. More precisely, our mathematical formalism can be readily applied to maximal-ratio-combining (MRC) receivers operating in generalized fading environments. The mathematical formalism is illustrated by some numerical examples focusing on correlated generalized fading environments. © 2012 IEEE.
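The moments in question can be cross-checked by plain Monte Carlo for a simple case. The sketch below estimates μ_n = E[log^n(1 + γ)] for Rayleigh fading, where the instantaneous SNR γ is exponentially distributed; it is a numerical sanity check, not the MGF-based formalism of the letter:

```python
import numpy as np

def capacity_hos(n, snr_samples):
    """Monte Carlo estimate of mu_n = E[log^n(1 + gamma)] (capacity in nats)."""
    return np.mean(np.log1p(snr_samples) ** n)

rng = np.random.default_rng(1)
# Rayleigh fading: the instantaneous SNR gamma is exponential with mean gamma_bar
gamma = rng.exponential(scale=10.0, size=200_000)
mu1 = capacity_hos(1, gamma)       # first-order statistic: ergodic capacity
mu2 = capacity_hos(2, gamma)       # second-order statistic
variance = mu2 - mu1 ** 2          # capacity variance from the first two moments
```

For a mean SNR of 10, `mu1` should land near the known closed form e^(1/γ̄) E₁(1/γ̄) ≈ 2.01 nats.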
Data mining in Cloud Computing
Directory of Open Access Journals (Sweden)
Ruxandra-Ştefania PETRE
2012-10-01
Full Text Available This paper describes how data mining is used in cloud computing. Data mining is used for extracting potentially useful information from raw data. The integration of data mining techniques into normal day-to-day activities has become commonplace. Every day people are confronted with targeted advertising, and data mining techniques help businesses to become more efficient by reducing costs. Data mining techniques and applications are very much needed in the cloud computing paradigm. The implementation of data mining techniques through cloud computing allows users to retrieve meaningful information from a virtually integrated data warehouse that reduces the costs of infrastructure and storage.
International Nuclear Information System (INIS)
Cattaneo, M.
1996-01-01
War is costly, but the cost of peace is even higher. The destruction of weapons (mines, nuclear weapons, chemical weapons) is much more expensive than their manufacture. The cost of demobilizing soldiers is enormous: in Angola, Mozambique, Nicaragua and Zimbabwe, for instance, the demobilization of 270,000 soldiers cost 2.5×10⁹ francs. The measures intended to reduce the risk of war are also expensive; that is why the arsenal of the former USSR is still intact. Today no international agency is entirely dedicated to peace building. The question is: how much would such an agency cost? (O.L.). 5 refs., 2 figs
Paiva, Joana S.; Dias, Duarte
2017-01-01
In recent years, safer and more reliable biometric methods have been developed. Apart from the need for enhanced security, the media and entertainment sectors have also been applying biometrics in the emerging market of user-adaptable objects/systems to make these systems more user-friendly. However, the complexity of some state-of-the-art biometric systems (e.g., iris recognition) or their high false rejection rate (e.g., fingerprint recognition) is compatible neither with the simple hardware architecture required by reduced-size devices nor with the new trend of implementing smart objects within the dynamic market of the Internet of Things (IoT). It was recently shown that an individual can be recognized by extracting features from their electrocardiogram (ECG). However, most current ECG-based biometric algorithms are computationally demanding and/or rely on relatively large (several seconds) ECG samples, which are incompatible with the aforementioned application fields. Here, we present a computationally low-cost method (patent pending), including simple mathematical operations, for identifying a person using only three ECG morphology-based characteristics from a single heartbeat. The algorithm was trained/tested using ECG signals of different duration from the Physionet database on more than 60 different training/test datasets. The proposed method achieved a maximal averaged accuracy of 97.450% in distinguishing each subject from a ten-subject set and false acceptance and rejection rates (FAR and FRR) of 5.710±1.900% and 3.440±1.980%, respectively, placing Beat-ID in a very competitive position in terms of the FRR/FAR among state-of-the-art methods. Furthermore, the proposed method can identify a person using an average of 1.020 heartbeats. It therefore has FRR/FAR behavior similar to obtaining a fingerprint, yet it is simpler and requires less expensive hardware. This method targets low-computational/energy-cost scenarios, such as tiny wearable devices (e.g., a
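For readers unfamiliar with the FAR/FRR metrics quoted above, the sketch below computes both from lists of match scores at a decision threshold; the scores and threshold are made-up illustrations, not data from the study:

```python
import numpy as np

def far_frr(genuine_scores, impostor_scores, threshold):
    """FRR: genuine scores rejected (below threshold); FAR: impostors accepted (at/above)."""
    frr = np.mean(np.asarray(genuine_scores) < threshold)
    far = np.mean(np.asarray(impostor_scores) >= threshold)
    return far, frr

# Illustrative similarity scores, not data from the paper
genuine = [0.91, 0.85, 0.78, 0.95, 0.60]
impostor = [0.30, 0.55, 0.72, 0.40, 0.20]
far, frr = far_frr(genuine, impostor, threshold=0.70)
assert far == 0.2 and frr == 0.2   # one impostor accepted, one genuine rejected
```

Raising the threshold trades FAR for FRR, which is why both rates must be reported together, as in the abstract above.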
Counting loop diagrams: computational complexity of higher-order amplitude evaluation
International Nuclear Information System (INIS)
Eijk, E. van; Kleiss, R.; Lazopoulos, A.
2004-01-01
We discuss the computational complexity of the perturbative evaluation of scattering amplitudes, both by the Caravaglios-Moretti algorithm and by direct evaluation of the individual diagrams. For a self-interacting scalar theory, we determine the complexity as a function of the number of external legs. We describe a method for obtaining the number of topologically inequivalent Feynman graphs containing closed loops, and apply this to 1- and 2-loop amplitudes. We also compute the number of graphs weighted by their symmetry factors, thus arriving at exact and asymptotic estimates for the average symmetry factor of diagrams. We present results for the asymptotic number of diagrams up to 10 loops, and prove that the average symmetry factor approaches unity as the number of external legs becomes large. (orig.)
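One source of the combinatorial growth discussed here is easy to reproduce: the number of Wick contractions (perfect pairings) of 2n fields is (2n-1)!!. The sketch below computes this count; note that it counts pairings, not the topologically inequivalent graphs the paper enumerates, so it only illustrates the factorial-type growth:

```python
from math import prod

def wick_pairings(n):
    """(2n-1)!!: the number of perfect pairings (Wick contractions) of 2n fields."""
    return prod(range(1, 2 * n, 2))

counts = [wick_pairings(n) for n in range(1, 6)]
assert counts == [1, 3, 15, 105, 945]   # factorial-type growth with n
```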
GPU-accelerated micromagnetic simulations using cloud computing
Energy Technology Data Exchange (ETDEWEB)
Jermain, C.L., E-mail: clj72@cornell.edu [Cornell University, Ithaca, NY 14853 (United States); Rowlands, G.E.; Buhrman, R.A. [Cornell University, Ithaca, NY 14853 (United States); Ralph, D.C. [Cornell University, Ithaca, NY 14853 (United States); Kavli Institute at Cornell, Ithaca, NY 14853 (United States)
2016-03-01
Highly parallel graphics processing units (GPUs) can improve the speed of micromagnetic simulations significantly as compared to conventional computing using central processing units (CPUs). We present a strategy for performing GPU-accelerated micromagnetic simulations by utilizing cost-effective GPU access offered by cloud computing services with an open-source Python-based program for running the MuMax3 micromagnetics code remotely. We analyze the scaling and cost benefits of using cloud computing for micromagnetics. - Highlights: • The benefits of cloud computing for GPU-accelerated micromagnetics are examined. • We present the MuCloud software for running simulations on cloud computing. • Simulation run times are measured to benchmark cloud computing performance. • Comparison benchmarks are analyzed between CPU and GPU based solvers.
GPU-accelerated micromagnetic simulations using cloud computing
International Nuclear Information System (INIS)
Jermain, C.L.; Rowlands, G.E.; Buhrman, R.A.; Ralph, D.C.
2016-01-01
Highly parallel graphics processing units (GPUs) can improve the speed of micromagnetic simulations significantly as compared to conventional computing using central processing units (CPUs). We present a strategy for performing GPU-accelerated micromagnetic simulations by utilizing cost-effective GPU access offered by cloud computing services with an open-source Python-based program for running the MuMax3 micromagnetics code remotely. We analyze the scaling and cost benefits of using cloud computing for micromagnetics. - Highlights: • The benefits of cloud computing for GPU-accelerated micromagnetics are examined. • We present the MuCloud software for running simulations on cloud computing. • Simulation run times are measured to benchmark cloud computing performance. • Comparison benchmarks are analyzed between CPU and GPU based solvers.
Natural Computing in Computational Finance Volume 4
O’Neill, Michael; Maringer, Dietmar
2012-01-01
This book follows on from Natural Computing in Computational Finance Volumes I, II and III. As in the previous volumes of this series, the book consists of a series of chapters each of which was selected following a rigorous, peer-reviewed, selection process. The chapters illustrate the application of a range of cutting-edge natural computing and agent-based methodologies in computational finance and economics. The applications explored include option model calibration, financial trend reversal detection, enhanced indexation, algorithmic trading, corporate payout determination and agent-based modeling of liquidity costs, and trade strategy adaptation. While describing cutting edge applications, the chapters are written so that they are accessible to a wide audience. Hence, they should be of interest to academics, students and practitioners in the fields of computational finance and economics.
Freeman, William T.; Ilcewicz, L. B.; Swanson, G. D.; Gutowski, T.
1992-01-01
A conceptual and preliminary designers' cost prediction model has been initiated. The model will provide a technically sound method for evaluating the relative cost of different composite structural designs, fabrication processes, and assembly methods that can be compared to equivalent metallic parts or assemblies. The feasibility of developing cost prediction software in a modular form for interfacing with state-of-the-art preliminary design tools and computer-aided design programs is being evaluated. The goal of this task is to establish theoretical cost functions that relate geometric design features to summed material cost and labor content in terms of process mechanics and physics. The output of the designers' present analytical tools will be input for the designers' cost prediction model to provide the designer with a database and deterministic cost methodology that allows one to trade and synthesize designs with both cost and weight as objective functions for optimization. The approach, goals, plans, and progress are presented for the development of COSTADE (Cost Optimization Software for Transport Aircraft Design Evaluation).
34 CFR 74.27 - Allowable costs.
2010-07-01
... Procedures or uniform cost accounting standards that comply with cost principles acceptable to ED. (b) The... OF HIGHER EDUCATION, HOSPITALS, AND OTHER NON-PROFIT ORGANIZATIONS Post-Award Requirements Financial... principles for determining allowable costs. Allowability of costs are determined in accordance with the cost...
An assessment of mass burn incineration costs
International Nuclear Information System (INIS)
Fox, M.R.; Scutter, J.N.; Sutton, A.M.
1993-01-01
This study comprises the third and final part of a cost assessment exercise of waste-to-energy options. The specific objectives of this particular study were: to determine the capital and operating costs of three generic types of mass burn waste-to-energy systems, for waste inputs of 200,000 and 400,000 t/y of municipal solid waste (MSW); to verify the mass and energy balances of the systems; to develop a computer cost model to manipulate the data as required; to carry out sensitivity checks on the computer model of changes to key parameters; and to conduct the study in a manner approximating as closely as possible to a real commercial situation. (author)
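A sensitivity check of the kind described can be sketched as a one-at-a-time parameter sweep over a toy levelized-cost model; the cost function and all figures below are illustrative assumptions, not values from the study:

```python
def annual_cost(capital, crf, operating_per_tonne, tonnes):
    """Toy levelized cost: annualized capital (via a capital recovery factor) plus operating cost."""
    return capital * crf + operating_per_tonne * tonnes

# One-at-a-time sensitivity: vary each parameter +/-20% around an illustrative base case
base = dict(capital=120e6, crf=0.09, operating_per_tonne=35.0, tonnes=200_000)
for name in base:
    lo, hi = (annual_cost(**{**base, name: base[name] * f}) for f in (0.8, 1.2))
    print(f"{name}: {lo/1e6:.2f} to {hi/1e6:.2f} M/y")
```

The spread of each printed range shows which parameter dominates the annual cost, which is the point of running sensitivity checks on such a model.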
Laugesen, Miriam J; Glied, Sherry A
2011-09-01
Higher health care prices in the United States are a key reason that the nation's health spending is so much higher than that of other countries. Our study compared physicians' fees paid by public and private payers for primary care office visits and hip replacements in Australia, Canada, France, Germany, the United Kingdom, and the United States. We also compared physicians' incomes net of practice expenses, differences in financing the cost of medical education, and the relative contribution of payments per physician and of physician supply in the countries' national spending on physician services. Public and private payers paid somewhat higher fees to US primary care physicians for office visits (27 percent more for public, 70 percent more for private) and much higher fees to orthopedic physicians for hip replacements (70 percent more for public, 120 percent more for private) than public and private payers paid these physicians' counterparts in other countries. US primary care and orthopedic physicians also earned higher incomes ($186,582 and $442,450, respectively) than their foreign counterparts. We conclude that the higher fees, rather than factors such as higher practice costs, volume of services, or tuition expenses, were the main drivers of higher US spending, particularly in orthopedics.
International Nuclear Information System (INIS)
Coy, Peter; Schaafsma, Joseph; Schofield, John A.
2000-01-01
Purpose: To compute cost-effectiveness/cost-utility (CE/CU) ratios, from the treatment clinic and societal perspectives, for high-dose palliative radiotherapy treatment (RT) for advanced non-small-cell lung cancer (NSCLC) against best supportive care (BSC) as comparator, and thereby demonstrate a method for computing CE/CU ratios when randomized clinical trial (RCT) data cannot be generated. Methods and Materials: Unit cost estimates based on an earlier reported 1989-90 analysis of treatment costs at the Vancouver Island Cancer Centre, Victoria, British Columbia, Canada, are updated to 1997-1998 and then used to compute the incremental cost of an average dose of high-dose palliative RT. The incremental number of life days and quality-adjusted life days (QALDs) attributable to treatment are from earlier reported regression analyses of the survival and quality-of-life data from patients who enrolled prospectively in a lung cancer management cost-effectiveness study at the clinic over a 2-year period from 1990 to 1992. Results: The baseline CE and CU ratios are $9245 Cdn per life year (LY) and $12,836 per quality-adjusted life year (QALY), respectively, from the clinic perspective; and $12,253/LY and $17,012/QALY, respectively, from the societal perspective. Multivariate sensitivity analysis for the CE ratio produces a range of $5513-28,270/LY from the clinic perspective, and $7307-37,465/LY from the societal perspective. Similar calculations for the CU ratio produce a range of $7205-37,134/QALY from the clinic perspective, and $9550-49,213/QALY from the societal perspective. Conclusion: The cost effectiveness and cost utility of high-dose palliative RT for advanced NSCLC compares favorably with the cost effectiveness of other forms of treatment for NSCLC, of treatments of other forms of cancer, and of many other commonly used medical interventions; and lies within the US $50,000/QALY benchmark often cited for cost-effective care
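The CE ratio itself is just incremental cost divided by incremental effect. A minimal sketch with illustrative inputs, not the study's data:

```python
def ce_ratio(incremental_cost, incremental_effect):
    """Cost-effectiveness ratio: extra cost per unit of extra effect (e.g. per life-year)."""
    return incremental_cost / incremental_effect

# Hypothetical inputs: treatment costs $3,700 more and yields 0.4 extra life-years
assert round(ce_ratio(3700.0, 0.4)) == 9250   # dollars per life-year
```

The same quotient with quality-adjusted life years in the denominator gives the CU ratio; the sensitivity ranges in the abstract come from varying the numerator and denominator over their plausible bounds.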
28 CFR 70.27 - Allowable costs.
2010-07-01
... AND AGREEMENTS (INCLUDING SUBAWARDS) WITH INSTITUTIONS OF HIGHER EDUCATION, HOSPITALS AND OTHER NON-PROFIT ORGANIZATIONS Post-Award Requirements Financial and Program Management § 70.27 Allowable costs. (a... Organizations.” The allowability of costs incurred by institutions of higher education is determined in...
Directory of Open Access Journals (Sweden)
Robin H. Kay
2011-04-01
Full Text Available Because of decreased prices, increased convenience, and wireless access, an increasing number of college and university students are using laptop computers in their classrooms. This recent trend has forced instructors to address the educational consequences of using these mobile devices. The purpose of the current study was to analyze and assess beneficial and challenging laptop behaviours in higher education classrooms. Both quantitative and qualitative data were collected from 177 undergraduate university students (89 males, 88 females). Key benefits observed include note-taking activities, in-class laptop-based academic tasks, collaboration, increased focus, improved organization and efficiency, and addressing special needs. Key challenges noted include other students' distracting laptop behaviours, instant messaging, surfing the web, playing games, watching movies, and decreased focus. Nearly three-quarters of the students claimed that laptops were useful in supporting their academic experience. Twice as many benefits were reported compared to challenges. It is speculated that the integration of meaningful laptop activities is a critical determinant of the benefits and challenges experienced in higher education classrooms.
29 CFR 95.27 - Allowable costs.
2010-07-01
... cost principles applicable to the entity incurring the costs. Thus, allowability of costs incurred by... Governments.” The allowability of costs incurred by non-profit organizations is determined in accordance with... Organizations.” The allowability of costs incurred by institutions of higher education is determined in...
24 CFR 84.27 - Allowable costs.
2010-04-01
... to the entity incurring the costs. Thus, allowability of costs incurred by State, local or federally..., “Cost Principles for State and Local Governments.” The allowability of costs incurred by non-profit...-Profit Organizations.” The allowability of costs incurred by institutions of higher education is...
International Nuclear Information System (INIS)
Wattson, Daniel A.; Hunink, M.G. Myriam; DiPiro, Pamela J.; Das, Prajnan; Hodgson, David C.; Mauch, Peter M.; Ng, Andrea K.
2014-01-01
Purpose: Hodgkin lymphoma (HL) survivors face an increased risk of treatment-related lung cancer. Screening with low-dose computed tomography (LDCT) may allow detection of early stage, resectable cancers. We developed a Markov decision-analytic and cost-effectiveness model to estimate the merits of annual LDCT screening among HL survivors. Methods and Materials: Population databases and HL-specific literature informed key model parameters, including lung cancer rates and stage distribution, cause-specific survival estimates, and utilities. Relative risks accounted for radiation therapy (RT) technique, smoking status (>10 pack-years or current smokers vs not), age at HL diagnosis, time from HL treatment, and excess radiation from LDCTs. LDCT assumptions, including expected stage-shift, false-positive rates, and likely additional workup were derived from the National Lung Screening Trial and preliminary results from an internal phase 2 protocol that performed annual LDCTs in 53 HL survivors. We assumed a 3% discount rate and a willingness-to-pay (WTP) threshold of $50,000 per quality-adjusted life year (QALY). Results: Annual LDCT screening was cost effective for all smokers. A male smoker treated with mantle RT at age 25 achieved maximum QALYs by initiating screening 12 years post-HL, with a life expectancy benefit of 2.1 months and an incremental cost of $34,841/QALY. Among nonsmokers, annual screening produced a QALY benefit in some cases, but the incremental cost was not below the WTP threshold for any patient subsets. As age at HL diagnosis increased, earlier initiation of screening improved outcomes. Sensitivity analyses revealed that the model was most sensitive to the lung cancer incidence and mortality rates and expected stage-shift from screening. Conclusions: HL survivors are an important high-risk population that may benefit from screening, especially those treated in the past with large radiation fields including mantle or involved-field RT. Screening
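The Markov decision-analytic approach the authors describe can be sketched as a cohort model that tracks a distribution over health states across annual cycles while accumulating discounted costs and QALYs. The states, transition probabilities, utilities, and costs below are invented for illustration; the actual model calibrates its parameters to population databases and HL-specific literature.

```python
states = ["well", "lung_cancer", "dead"]
# Illustrative annual transition probabilities (each row sums to 1).
P = {
    "well":        {"well": 0.985, "lung_cancer": 0.010, "dead": 0.005},
    "lung_cancer": {"well": 0.0,   "lung_cancer": 0.700, "dead": 0.300},
    "dead":        {"well": 0.0,   "lung_cancer": 0.0,   "dead": 1.0},
}
utility = {"well": 0.95, "lung_cancer": 0.65, "dead": 0.0}   # QALY weight per year
cost = {"well": 200.0, "lung_cancer": 40000.0, "dead": 0.0}  # annual cost in $

def run_cohort(years, discount=0.03):
    """Accumulate discounted QALYs and costs for a cohort starting well."""
    dist = {"well": 1.0, "lung_cancer": 0.0, "dead": 0.0}
    total_qalys = total_cost = 0.0
    for t in range(years):
        d = 1.0 / (1.0 + discount) ** t   # 3% discount rate, as in the study
        total_qalys += d * sum(dist[s] * utility[s] for s in states)
        total_cost += d * sum(dist[s] * cost[s] for s in states)
        # Advance the cohort one cycle through the transition matrix.
        dist = {s2: sum(dist[s1] * P[s1][s2] for s1 in states) for s2 in states}
    return total_qalys, total_cost

qalys, costs = run_cohort(20)
```

Comparing two such runs (with and without screening, where screening shifts cancers to earlier, cheaper, higher-utility states) yields the incremental cost per QALY that is tested against the $50,000/QALY willingness-to-pay threshold.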
Energy Technology Data Exchange (ETDEWEB)
Wattson, Daniel A., E-mail: dwattson@partners.org [Harvard Radiation Oncology Program, Boston, Massachusetts (United States); Hunink, M.G. Myriam [Departments of Radiology and Epidemiology, Erasmus Medical Center, Rotterdam, the Netherlands and Center for Health Decision Science, Harvard School of Public Health, Boston, Massachusetts (United States); DiPiro, Pamela J. [Department of Imaging, Dana-Farber Cancer Institute, Boston, Massachusetts (United States); Das, Prajnan [Department of Radiation Oncology, University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Hodgson, David C. [Department of Radiation Oncology, University of Toronto, Toronto, Ontario (Canada); Mauch, Peter M.; Ng, Andrea K. [Department of Radiation Oncology, Brigham and Women's Hospital and Dana-Farber Cancer Institute, Boston, Massachusetts (United States)
2014-10-01
Wattson, Daniel A; Hunink, M G Myriam; DiPiro, Pamela J; Das, Prajnan; Hodgson, David C; Mauch, Peter M; Ng, Andrea K
2014-10-01
Hodgkin lymphoma (HL) survivors face an increased risk of treatment-related lung cancer. Screening with low-dose computed tomography (LDCT) may allow detection of early stage, resectable cancers. We developed a Markov decision-analytic and cost-effectiveness model to estimate the merits of annual LDCT screening among HL survivors. Population databases and HL-specific literature informed key model parameters, including lung cancer rates and stage distribution, cause-specific survival estimates, and utilities. Relative risks accounted for radiation therapy (RT) technique, smoking status (>10 pack-years or current smokers vs not), age at HL diagnosis, time from HL treatment, and excess radiation from LDCTs. LDCT assumptions, including expected stage-shift, false-positive rates, and likely additional workup were derived from the National Lung Screening Trial and preliminary results from an internal phase 2 protocol that performed annual LDCTs in 53 HL survivors. We assumed a 3% discount rate and a willingness-to-pay (WTP) threshold of $50,000 per quality-adjusted life year (QALY). Annual LDCT screening was cost effective for all smokers. A male smoker treated with mantle RT at age 25 achieved maximum QALYs by initiating screening 12 years post-HL, with a life expectancy benefit of 2.1 months and an incremental cost of $34,841/QALY. Among nonsmokers, annual screening produced a QALY benefit in some cases, but the incremental cost was not below the WTP threshold for any patient subsets. As age at HL diagnosis increased, earlier initiation of screening improved outcomes. Sensitivity analyses revealed that the model was most sensitive to the lung cancer incidence and mortality rates and expected stage-shift from screening. HL survivors are an important high-risk population that may benefit from screening, especially those treated in the past with large radiation fields including mantle or involved-field RT. Screening may be cost effective for all smokers but possibly not
Costs of hospital malnutrition.
Curtis, Lori Jane; Bernier, Paule; Jeejeebhoy, Khursheed; Allard, Johane; Duerksen, Donald; Gramlich, Leah; Laporte, Manon; Keller, Heather H
2017-10-01
Hospital malnutrition has been established as a critical, prevalent, and costly problem in many countries. Many cost studies are limited by the study population or cost data used. The aims of this study were to determine: the relationship between malnutrition and hospital costs; the influence of confounders on, and the drivers (medical or surgical patients, or degree of malnutrition) of, the relationship; and whether hospital-reported cost data provide similar information to administrative data. To our knowledge, the last two goals have not been studied elsewhere. Univariate and multivariate analyses were performed on data from the Canadian Malnutrition Task Force prospective cohort study combined with administrative data from the Canadian Institute for Health Information. Subjective Global Assessment was used to assess the relationship between nutritional status and length of stay and hospital costs, controlling for health and demographic characteristics, for 956 patients admitted to medical and surgical wards in 18 hospitals across Canada. After controlling for patient and hospital characteristics, the hospital stays of moderately malnourished patients (34% of surveyed patients) were 18% (p = 0.014) longer on average than those of well-nourished patients. Medical stays increased by 23% (p = 0.014), and surgical stays by 32% (p = 0.015). Costs were, on average, between 31% and 34% (p-values < 0.05) higher than for well-nourished patients with similar characteristics. Severely malnourished patients (11% of surveyed patients) stayed 34% (p = 0.000) longer and had 38% (p = 0.003) higher total costs than well-nourished patients. They stayed 53% (p = 0.001) longer in medical beds and had 55% (p = 0.003) higher medical costs, on average. Trends were similar no matter the type of costing data used. Over 40% of patients were found to be malnourished (1/3 moderately and 1/10 severely). Malnourished patients had longer hospital stays and as a result cost more than well
Cook, Perry R.
This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).
Cloud Computing. Technology Briefing. Number 1
Alberta Education, 2013
2013-01-01
Cloud computing is Internet-based computing in which shared resources, software and information are delivered as a service that computers or mobile devices can access on demand. Cloud computing is already used extensively in education. Free or low-cost cloud-based services are used daily by learners and educators to support learning, social…
Advanced Fuel Cycle Cost Basis
Energy Technology Data Exchange (ETDEWEB)
D. E. Shropshire; K. A. Williams; W. B. Boore; J. D. Smith; B. W. Dixon; M. Dunzik-Gougar; R. D. Adams; D. Gombert; E. Schneider
2009-12-01
This report, commissioned by the U.S. Department of Energy (DOE), provides a comprehensive set of cost data supporting a cost analysis for the relative economic comparison of options for use in the Advanced Fuel Cycle Initiative (AFCI) Program. The report describes the AFCI cost basis development process, reference information on AFCI cost modules, a procedure for estimating fuel cycle costs, economic evaluation guidelines, and a discussion on the integration of cost data into economic computer models. This report contains reference cost data for 25 cost modules—23 fuel cycle cost modules and 2 reactor modules. The cost modules were developed in the areas of natural uranium mining and milling, conversion, enrichment, depleted uranium disposition, fuel fabrication, interim spent fuel storage, reprocessing, waste conditioning, spent nuclear fuel (SNF) packaging, long-term monitored retrievable storage, near surface disposal of low-level waste (LLW), geologic repository and other disposal concepts, and transportation processes for nuclear fuel, LLW, SNF, transuranic, and high-level waste.
Advanced Fuel Cycle Cost Basis
Energy Technology Data Exchange (ETDEWEB)
D. E. Shropshire; K. A. Williams; W. B. Boore; J. D. Smith; B. W. Dixon; M. Dunzik-Gougar; R. D. Adams; D. Gombert
2007-04-01
This report, commissioned by the U.S. Department of Energy (DOE), provides a comprehensive set of cost data supporting a cost analysis for the relative economic comparison of options for use in the Advanced Fuel Cycle Initiative (AFCI) Program. The report describes the AFCI cost basis development process, reference information on AFCI cost modules, a procedure for estimating fuel cycle costs, economic evaluation guidelines, and a discussion on the integration of cost data into economic computer models. This report contains reference cost data for 26 cost modules—24 fuel cycle cost modules and 2 reactor modules. The cost modules were developed in the areas of natural uranium mining and milling, conversion, enrichment, depleted uranium disposition, fuel fabrication, interim spent fuel storage, reprocessing, waste conditioning, spent nuclear fuel (SNF) packaging, long-term monitored retrievable storage, near surface disposal of low-level waste (LLW), geologic repository and other disposal concepts, and transportation processes for nuclear fuel, LLW, SNF, and high-level waste.
Advanced Fuel Cycle Cost Basis
Energy Technology Data Exchange (ETDEWEB)
D. E. Shropshire; K. A. Williams; W. B. Boore; J. D. Smith; B. W. Dixon; M. Dunzik-Gougar; R. D. Adams; D. Gombert; E. Schneider
2008-03-01
This report, commissioned by the U.S. Department of Energy (DOE), provides a comprehensive set of cost data supporting a cost analysis for the relative economic comparison of options for use in the Advanced Fuel Cycle Initiative (AFCI) Program. The report describes the AFCI cost basis development process, reference information on AFCI cost modules, a procedure for estimating fuel cycle costs, economic evaluation guidelines, and a discussion on the integration of cost data into economic computer models. This report contains reference cost data for 25 cost modules—23 fuel cycle cost modules and 2 reactor modules. The cost modules were developed in the areas of natural uranium mining and milling, conversion, enrichment, depleted uranium disposition, fuel fabrication, interim spent fuel storage, reprocessing, waste conditioning, spent nuclear fuel (SNF) packaging, long-term monitored retrievable storage, near surface disposal of low-level waste (LLW), geologic repository and other disposal concepts, and transportation processes for nuclear fuel, LLW, SNF, transuranic, and high-level waste.
20 CFR 404.278 - Additional cost-of-living increase.
2010-04-01
... 20 Employees' Benefits 2 2010-04-01 2010-04-01 false Additional cost-of-living increase. 404.278... DISABILITY INSURANCE (1950- ) Computing Primary Insurance Amounts Cost-Of-Living Increases § 404.278 Additional cost-of-living increase. (a) General. In addition to the cost-of-living increase explained in...
Compact, open-architecture computed radiography system
International Nuclear Information System (INIS)
Huang, H.K.; Lim, A.; Kangarloo, H.; Eldredge, S.; Loloyan, M.; Chuang, K.S.
1990-01-01
Computed radiography (CR) was introduced in 1982, and its basic system design has not changed. Current CR systems have certain limitations: spatial resolution and signal-to-noise ratios are lower than those of screen-film systems, they are complicated and expensive to build, and they have a closed architecture. The authors of this paper designed and implemented a simpler, lower-cost, compact, open-architecture CR system to overcome some of these limitations. The open-architecture system is a manual-load, single-plate reader that can fit on a desktop. Phosphor images are stored on a local disk and can be sent to any other computer through standard interfaces. Any manufacturer's plate can be read with a scanning time of 90 seconds for a 35 x 43-cm plate. The standard pixel size is 174 μm and can be adjusted for higher spatial resolution. The data resolution is 12 bits/pixel over an x-ray exposure range of 0.01-100 mR
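The quoted plate geometry can be sanity-checked with simple arithmetic: a 35 x 43 cm plate at a 174 μm pixel pitch yields roughly a 2011 x 2471 matrix. The 2-bytes-per-pixel storage figure (12-bit samples padded to 16-bit words) is our assumption, not stated in the abstract.

```python
plate_mm = (350, 430)   # 35 x 43 cm plate, in millimetres
pixel_um = 174          # standard pixel size quoted above

# Matrix dimensions: plate extent divided by pixel pitch.
cols = round(plate_mm[0] * 1000 / pixel_um)
rows = round(plate_mm[1] * 1000 / pixel_um)
pixels = cols * rows

# Assumed storage: 12-bit data packed into 16-bit (2-byte) words.
image_mb = pixels * 2 / 1e6
```

This puts a single uncompressed plate image at roughly 10 MB, which is consistent with the emphasis on local disk storage and standard network interfaces.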
Can value-based insurance impose societal costs?
Koenig, Lane; Dall, Timothy M; Ruiz, David; Saavoss, Josh; Tongue, John
2014-09-01
Among policy alternatives considered to reduce health care costs and improve outcomes, value-based insurance design (VBID) has emerged as a promising option. Most applications of VBID, however, have not used higher cost sharing to discourage specific services. In April 2011, the state of Oregon introduced a policy for public employees that required additional cost sharing for high-cost procedures such as total knee arthroplasty (TKA). Our objectives were to estimate the societal impact of higher co-pays for TKA using Oregon as a case study and building on recent work demonstrating the effects of knee osteoarthritis and surgical treatment on employment and disability outcomes. We used a Markov model to estimate the societal impact in terms of quality of life, direct costs, and indirect costs of higher co-pays for TKA using Oregon as a case study. We found that TKA for a working population can generate societal benefits that offset the direct medical costs of the procedure. Delay in receiving surgical care, because of higher co-payment or other reasons, reduced the societal savings from TKA. We conclude that payers moving toward value-based cost sharing should consider consequences beyond direct medical expenses. Copyright © 2014 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Higher-order Spatial Accuracy in Diffeomorphic Image Registration
DEFF Research Database (Denmark)
Jacobs, Henry O.; Sommer, Stefan
We discretize a cost functional for image registration problems by deriving Taylor expansions for the matching term. Minima of the discretized cost functionals can be computed with no spatial discretization error, and the optimal solutions are equivalent to minimal energy curves in the space of k-jets. We show that the solutions converge to optimal solutions of the original cost functional as the number of particles increases, with a convergence rate of O(h^(d+k)), where h is a resolution parameter. The effect of this approach over traditional particle methods is illustrated on synthetic examples…
45 CFR 2543.27 - Allowable costs.
2010-10-01
... GRANTS AND AGREEMENTS WITH INSTITUTIONS OF HIGHER EDUCATION, HOSPITALS, AND OTHER NON-PROFIT ORGANIZATIONS Post-Award Requirements Financial and Program Management § 2543.27 Allowable costs. For each kind... Organizations.” The allowability of costs incurred by institutions of higher education is determined in...
20 CFR 435.27 - Allowable costs.
2010-04-01
... AGREEMENTS WITH INSTITUTIONS OF HIGHER EDUCATION, HOSPITALS, OTHER NON-PROFIT ORGANIZATIONS, AND COMMERCIAL ORGANIZATIONS Post-Award Requirements Financial and Program Management § 435.27 Allowable costs. For each kind... Organizations.” (c) Allowability of costs incurred by institutions of higher education is determined in...
Brief: Managing computing technology
International Nuclear Information System (INIS)
Startzman, R.A.
1994-01-01
While computing is applied widely in the production segment of the petroleum industry, its effective application is the primary goal of computing management. Computing technology has changed significantly since the 1950's, when computers first began to influence petroleum technology. The ability to accomplish traditional tasks faster and more economically probably is the most important effect that computing has had on the industry. While speed and lower cost are important, are they enough? Can computing change the basic functions of the industry? When new computing technology is introduced improperly, it can clash with traditional petroleum technology. This paper examines the role of management in merging these technologies
7 CFR 550.25 - Allowable costs.
2010-01-01
... cost principles applicable to the entity incurring the costs. Thus, allowability of costs incurred by... at 2 CFR part 225. The allowability of costs incurred by non-profit organizations is determined in... at 2 CFR part 230. The allowability of costs incurred by institutions of higher education is...
5 CFR 847.705 - Cost-of-living adjustments.
2010-01-01
... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false Cost-of-living adjustments. 847.705... FUND INSTRUMENTALITIES Computation of Benefits Under the Retroactive Provisions § 847.705 Cost-of-living adjustments. Cost-of-living adjustments are applied to the rate payable to the retiree or survivor...
Cost Estimation and Comparison of Carbon Capture and Storage Technology with Wind Energy
Directory of Open Access Journals (Sweden)
ABDULLAH MENGAL
2017-04-01
CCS (Carbon Capture and Storage) is one of the significant solutions for reducing CO2 emissions from fossil-fuelled electricity generation plants and minimizing the effect of global warming. Economic analysis of CCS technology is therefore essential for appraising its feasibility for CO2 reduction. In this paper, the LCOE (Levelized Cost of Electricity Generation) has been estimated with and without CCS technology for fossil-fuel-based power plants in Pakistan, and further compared with the computed LCOE of WE (Wind Energy) based power plants in Pakistan. The results of this study suggest that the electricity generation costs of the fossil fuel power plants increase by more than 44% with CCS technology compared to without it. The generation costs are a further 10% higher when the efficiency penalty owing to installation of CCS technology is considered. In addition, the CO2 avoided costs for the natural gas plant are found to be 40% and 10% higher than for the local coal and imported coal plants, respectively. The electricity generation cost of 5.09 Rs/kWh from WE plants is thus found to be competitive even against fossil-fuel plants without CCS technology, the cheapest of which is the CCNG (Combined Cycle Natural Gas) plant at 5.9 Rs/kWh. Based on the analysis of these results and the anticipated future development of efficient and cheap WE technologies, it is concluded that WE-based electricity generation would be the most appropriate option for CO2 reduction in Pakistan.
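The LCOE comparison described above discounts lifetime costs and lifetime generation to a common base and divides one by the other. A generic sketch follows; all numeric inputs are illustrative, not the study's Pakistani plant data.

```python
def lcoe(capex, annual_opex, annual_mwh, years, discount=0.10):
    """Levelized cost: discounted lifetime cost / discounted lifetime output."""
    disc_cost, disc_energy = capex, 0.0
    for t in range(1, years + 1):
        d = (1.0 + discount) ** -t
        disc_cost += annual_opex * d
        disc_energy += annual_mwh * d
    return disc_cost / disc_energy  # currency units per MWh

# CCS raises capital and operating costs, and the efficiency penalty cuts
# net output, so its LCOE necessarily comes out higher -- the qualitative
# effect the study reports. The multipliers here are assumptions.
base = lcoe(capex=1.2e9, annual_opex=6.0e7, annual_mwh=3.5e6, years=30)
with_ccs = lcoe(capex=1.2e9 * 1.6, annual_opex=6.0e7 * 1.3,
                annual_mwh=3.5e6 * 0.85, years=30)
```

The same function applied to a wind plant (higher capex per MWh, near-zero fuel cost) supports the wind-versus-fossil comparison the abstract draws.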
Ibnteesam Pondor; Wan Ying Gan; Geeta Appannah
2017-01-01
Food price is a determining factor of food choices; however, its relationship with diet quality is unclear in Malaysia. This study aimed to examine socio-economic characteristics and daily dietary cost (DDC) in relation to diet quality in the state of Selangor, Malaysia. Dietary intake was assessed using a Food Frequency Questionnaire (FFQ), and diet quality was estimated using a Malaysian Healthy Eating Index (M-HEI). DDC in Malaysian Ringgit (RM) was calculated from dietary intake and nationa...
15 CFR 14.27 - Allowable costs.
2010-01-01
... GRANTS AND AGREEMENTS WITH INSTITUTIONS OF HIGHER EDUCATION, HOSPITALS, OTHER NON-PROFIT, AND COMMERCIAL ORGANIZATIONS Post-Award Requirements Financial and Program Management § 14.27 Allowable costs. For each kind of... Organizations.” The allowability of costs incurred by institutions of higher education is determined in...
48 CFR 49.303-4 - Adjustment of indirect costs.
2010-10-01
... costs. 49.303-4 Section 49.303-4 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION CONTRACT MANAGEMENT TERMINATION OF CONTRACTS Additional Principles for Cost-Reimbursement Contracts... compute indirect costs for other contracts performed during the applicable accounting period. [48 FR 42447...
Framework for Computation Offloading in Mobile Cloud Computing
Directory of Open Access Journals (Sweden)
Dejan Kovachev
2012-12-01
The inherently limited processing power and battery lifetime of mobile phones hinder the execution of computationally intensive applications like content-based video analysis or 3D modeling. Offloading computationally intensive application parts from the mobile platform into a remote cloud infrastructure or nearby idle computers addresses this problem. This paper presents our Mobile Augmentation Cloud Services (MACS) middleware, which enables adaptive extension of Android application execution from a mobile client into the cloud. Applications are developed using the standard Android development pattern. The middleware does the heavy lifting of adaptive application partitioning, resource monitoring, and computation offloading. These elastic mobile applications run as usual mobile applications, but they can also use remote computing resources transparently. Two prototype applications using the MACS middleware demonstrate the benefits of the approach. The evaluation shows that applications involving costly computations can benefit from offloading, with around 95% energy savings and significant performance gains compared to local execution only.
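The energy trade-off behind such savings can be sketched with a simple linear model in the spirit of classic offloading analyses: offload when the energy to ship the data and idle through remote execution is below the energy of computing locally. The function, parameter names, and numbers below are illustrative assumptions, not MACS internals.

```python
def offload_saves_energy(instructions, data_bytes, local_ips, cloud_ips,
                         p_compute, p_idle, p_transfer, bandwidth_bps):
    """True if remote execution costs the handset less energy than local.

    Powers are in watts; local_ips/cloud_ips are instructions per second.
    """
    e_local = (instructions / local_ips) * p_compute          # joules, local run
    transfer_s = data_bytes * 8 / bandwidth_bps               # radio time
    wait_s = instructions / cloud_ips                         # idle while cloud runs
    e_remote = transfer_s * p_transfer + wait_s * p_idle      # joules, offloaded
    return e_remote < e_local

# Heavy computation with a small payload favours offloading; a light task
# with a large payload does not.
heavy_wins = offload_saves_energy(5e10, 2e5, 1e9, 2e10, 0.9, 0.3, 1.3, 5e6)
light_wins = offload_saves_energy(1e8, 5e8, 1e9, 2e10, 0.9, 0.3, 1.3, 5e6)
```

A partitioning middleware effectively evaluates a decision of this shape per application part, using measured rather than assumed parameters.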
49 CFR 19.27 - Allowable costs.
2010-10-01
... applicable to the entity incurring the costs. Thus, allowability of costs incurred by State, local or... Circular A-87, “Cost Principles for State and Local Governments.” The allowability of costs incurred by non... Principles for Non-Profit Organizations.” The allowability of costs incurred by institutions of higher...
36 CFR 1210.27 - Allowable costs.
2010-07-01
... applicable to the entity incurring the costs. Thus, allowability of costs incurred by State, local or... Circular A-87, “Cost Principles for State and Local Governments.” The allowability of costs incurred by non... Principles for Non-Profit Organizations.” The allowability of costs incurred by institutions of higher...
7 CFR 3019.27 - Allowable costs.
2010-01-01
... applicable to the entity incurring the costs. Thus, allowability of costs incurred by State, local or... Circular A-87, “Cost Principles for State and Local Governments.” The allowability of costs incurred by non... Principles for Non-Profit Organizations.” The allowability of costs incurred by institutions of higher...
Shared Leadership Transforms Higher Education IT
Duin, Ann Hill; Cawley, Steve; Gulachek, Bernard; O'Sullivan, Douglas M.; Wollner, Diane
2011-01-01
Globalization, immersive research and learning environments, unlimited access to information and analytics, and fiscal realities continue to impact higher education--and higher education IT. Although IT organizations face immense pressure to meet significantly greater expectations at significantly less cost, with such pressure comes the…
The Determinants of Costs and Length of Stay for Hip Fracture Patients
Castelli, Adriana; Daidone, Silvio; Jacobs, Rowena; Kasteridis, Panagiotis; Street, Andrew David
2015-01-01
Background and Purpose An ageing population at greater risk of proximal femoral fracture places an additional clinical and financial burden on hospital and community medical services. We analyse the variation in i) length of stay (LoS) in hospital and ii) costs across the acute care pathway for hip fracture, from emergency admission to hospital stay and follow-up outpatient appointments. Patients and Methods We analyse patient-level data from England for 2009/10 for around 60,000 hip fracture cases in 152 hospitals using a random effects generalized linear multi-level model where the dependent variable is given by the patient's cost or length of stay (LoS). We control for socio-economic characteristics, type of fracture and intervention, co-morbidities, discharge destination of patients, and quality indicators. We also control for provider and social care characteristics. Results Older patients and those from more deprived areas have higher costs and LoS, as do those with specific co-morbidities or that develop pressure ulcers, and those transferred between hospitals or readmitted within 28 days. Costs are also higher for those having a computed tomography (CT) scan or cemented arthroplasty. Costs and LoS are lower for those admitted via a 24h emergency department, receiving surgery on the same day of admission, and discharged to their own homes. Interpretation Patient and treatment characteristics are more important as determinants of cost and LoS than provider or social care factors. A better understanding of the impact of these characteristics can support providers to develop treatment strategies and pathways to better manage this patient population. PMID:26204450
The cost of nuclear electricity: France after Fukushima
International Nuclear Information System (INIS)
Boccard, Nicolas
2014-01-01
The Fukushima disaster has led the French government to release novel cost information on its nuclear electricity program, allowing us to compute a levelized cost. We identify a modest escalation of capital cost and a larger than expected operational cost. Under the best scenario, the cost of French nuclear power over the last four decades is 59€/MWh (at 2010 prices), while in the worst case it is 83€/MWh. On the basis of these findings, we estimate the future cost of nuclear power in France to be at least 76€/MWh and possibly 117€/MWh. A comparison with the US confirms that French nuclear electricity nevertheless remains cheaper. Comparisons with coal, natural gas, and wind power are carried out to assess their relative advantages. - Highlights: • We compute the levelized cost of French nuclear power over 40 years using a novel Court of Audit report. • We include R and D, technology development, fissile fuel, financing cost, decommissioning, and the back-end cycle. • We find a mild capital cost escalation and a high operation cost driven by low fleet availability. • The levelized cost ranges between 59 and 83€/MWh (at 2010 prices) and compares favorably to the US. • A tentative cost for future nuclear power ranges between 76 and 117€/MWh and compares unfavorably against alternative fuels
Nelson, Winnie W; Wang, Li; Baser, Onur; Damaraju, C V; Schein, Jeffrey R
2015-05-01
Patients with out-of-range international normalized ratio (INR) values (below 2.0 or above 3.0) have been associated with increased risk of thromboembolic and bleeding events. INR monitoring is costly, because of associated physician and nurse time, laboratory resource use, and dose adjustments. This study assessed the healthcare cost burden associated with out-of-range INR among warfarin initiator patients diagnosed with non-valvular atrial fibrillation (NVAF) in the US Veterans Health Administration (VHA) population. Adult NVAF patients (≥18 years) initiating warfarin were selected from the VHA dataset for the study period October 1, 2007-September 30, 2012. Only valid INR measurements (0.5 ≤ INR ≤ 20) were examined for the follow-up period, from the index date (warfarin initiation date) until the end of warfarin exposure or death. All-cause healthcare costs within 30 days were measured starting from the second month (31 days post-index date) to the end of the study period. Costs for inpatient stays, emergency room, outpatient facility, physician office visits, and other services were computed separately. Multiple regression was performed using the generalized linear model for overall cost analysis. In total, 29,463 patients were included in the study sample. Mean costs for out-of-range INR ranged from $3419 to $5126. Inpatient, outpatient, outpatient pharmacy, and total costs were significantly higher after patients experienced out-of-range results (INR < 2 or INR > 3), compared with in-range INR (2 ≤ INR ≤ 3). When exposed to out-of-range INR, patients also incurred higher mean total costs within 2-6 months ($3840-$5820) than after the first 6 months ($2789-$3503) of warfarin therapy. In the VHA population, INR measures outside of the 2-3 range were associated with significantly higher healthcare costs. Increased costs were especially apparent when INR values were below 2, although INR measures above 3 were also associated with higher costs relative to in
Above-Campus Services: Shaping the Promise of Cloud Computing for Higher Education
Wheeler, Brad; Waggener, Shelton
2009-01-01
The concept of today's cloud computing may date back to 1961, when John McCarthy, retired Stanford professor and Turing Award winner, delivered a speech at MIT's Centennial. In that speech, he predicted that in the future, computing would become a "public utility." Yet for colleges and universities, the recent growth of pervasive, very high speed…
Barth, Michael M.; Karagiannidis, Iordanis
2016-01-01
Many universities have implemented tuition differentials for certain undergraduate degree programs, citing higher degree costs or higher demand. However, most college accounting systems are unsuited for measuring cost differentials by degree program. This research outlines a method that can convert commonly available financial data to a more…
Costs comparison of electric energy in Brazil
International Nuclear Information System (INIS)
Goncalves, D.; Menegassi, J.
1981-01-01
A cost comparison study of various sources of electric energy generation was performed using uniform analysis criteria. The results indicate higher costs for coal, followed by nuclear and hydro. It was verified that, at present, large hydropower plants can only be located far from the load centers, with increasing costs of hydropower energy in Brazil. These costs become higher than those of a nuclear plant if the hydro plant is located at distances exceeding 1000 km. (Author) [pt
Effects of housing system on the costs of commercial egg production.
Matthews, W A; Sumner, D A
2015-03-01
This article reports the first publicly available egg production costs compared across 3 hen-housing systems. We collected detailed data from 2 flock cycles from a commercial egg farm operating a conventional barn, an aviary, and an enriched colony system at the same location. The farm employed the same operational and accounting procedures for each housing system. Results provide clear evidence that egg production costs are much higher for the aviary system than the other 2 housing systems. Feed costs per dozen eggs are somewhat higher for the aviary and lower for the enriched house compared with the conventional house. Labor costs are much lower for the conventional house than the other 2, and pullet costs are much higher for the aviary. Energy and miscellaneous costs are a minimal part of total operating costs and do not differ by housing system. Total capital investments per hen-capacity are much higher for the aviary and the enriched house. Capital costs per dozen eggs depend on assumptions about appropriate interest and depreciation rates. Using the same 10% rate for each housing system shows capital costs per dozen for the aviary and the enriched housing system are much higher than capital costs per dozen for the conventional house. The aviary has average operating costs (feed, labor, pullet, energy, and miscellaneous costs that recur for each flock and vary with egg production) about 23% higher and average total costs about 36% higher compared with the conventional house. The enriched housing system has average operating costs only about 4% higher compared with the conventional house, but average total costs are 13% higher than for the conventional house. © The Author 2015. Published by Oxford University Press on behalf of Poultry Science Association.
Computation of piecewise affine terminal cost functions for model predictive control
Brunner, F.D.; Lazar, M.; Allgöwer, F.; Fränzle, Martin; Lygeros, John
2014-01-01
This paper proposes a method for the construction of piecewise affine terminal cost functions for model predictive control (MPC). The terminal cost function is constructed on a predefined partition by solving a linear program for a given piecewise affine system, a stabilizing piecewise affine
Long-term cost-effectiveness of disease management in systolic heart failure.
Miller, George; Randolph, Stephen; Forkner, Emma; Smith, Brad; Galbreath, Autumn Dawn
2009-01-01
Although congestive heart failure (CHF) is a primary target for disease management programs, previous studies have generated mixed results regarding the effectiveness and cost savings of disease management when applied to CHF. We estimated the long-term impact of systolic heart failure disease management from the results of an 18-month clinical trial. We used data generated from the trial (starting population distributions, resource utilization, mortality rates, and transition probabilities) in a Markov model to project results of continuing the disease management program for the patients' lifetimes. Outputs included distribution of illness severity, mortality, resource consumption, and the cost of resources consumed. Both cost and effectiveness were discounted at a rate of 3% per year. Cost-effectiveness was computed as cost per quality-adjusted life year (QALY) gained. Model results were validated against trial data and indicated that, over their lifetimes, patients experienced a lifespan extension of 51 days. Combined discounted lifetime program and medical costs were $4850 higher in the disease management group than the control group, but the program had a favorable long-term discounted cost-effectiveness of $43,650/QALY. These results are robust to assumptions regarding mortality rates, the impact of aging on the cost of care, the discount rate, utility values, and the targeted population. Estimation of the clinical benefits and financial burden of disease management can be enhanced by model-based analyses to project costs and effectiveness. Our results suggest that disease management of heart failure patients can be cost-effective over the long term.
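The discounting and cost-per-QALY arithmetic described in the abstract can be sketched in a few lines; the yearly incremental cost and QALY figures below are invented placeholders, not trial results.

```python
# Minimal sketch of discounted cost-effectiveness, assuming a 3% annual
# discount rate as in the study; all input numbers are hypothetical.
def discounted(values, rate=0.03):
    """Sum a stream of yearly values discounted at the given annual rate."""
    return sum(v / (1.0 + rate) ** t for t, v in enumerate(values))

# illustrative incremental yearly costs and QALY gains of disease
# management versus control over a 10-year horizon
extra_costs = [600.0] * 10     # program plus medical cost difference per year
extra_qalys = [0.013] * 10     # quality-adjusted life-year gain per year

# incremental cost-effectiveness ratio: discounted cost per QALY gained
icer = discounted(extra_costs) / discounted(extra_qalys)
```

With identical horizons and discount rates in numerator and denominator, the ratio reduces to the undiscounted cost/QALY ratio; the discounting matters once costs and benefits accrue on different schedules.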
Freeman, W.; Ilcewicz, L.; Swanson, G.; Gutowski, T.
1992-01-01
The Structures Technology Program Office (STPO) at NASA LaRC has initiated development of a conceptual and preliminary designers' cost prediction model. The model will provide a technically sound method for evaluating the relative cost of different composite structural designs, fabrication processes, and assembly methods that can be compared to equivalent metallic parts or assemblies. The feasibility of developing cost prediction software in a modular form for interfacing with state-of-the-art preliminary design tools and computer aided design programs is being evaluated. The goal of this task is to establish theoretical cost functions that relate geometric design features to summed material cost and labor content in terms of process mechanics and physics. The output of the designers' present analytical tools will be input for the designers' cost prediction model to provide the designer with a database and deterministic cost methodology that allows one to trade and synthesize designs with both cost and weight as objective functions for optimization. This paper presents the team members, approach, goals, plans, and progress to date for development of COSTADE (Cost Optimization Software for Transport Aircraft Design Evaluation).
Computational identification of candidate nucleotide cyclases in higher plants
Wong, Aloysius Tze
2013-09-03
In higher plants guanylyl cyclases (GCs) and adenylyl cyclases (ACs) cannot be identified using BLAST homology searches based on annotated cyclic nucleotide cyclases (CNCs) of prokaryotes, lower eukaryotes, or animals. The reason is that CNCs are often part of complex multifunctional proteins with different domain organizations and biological functions that are not conserved in higher plants. For this reason, we have developed CNC search strategies based on functionally conserved amino acids in the catalytic center of annotated and/or experimentally confirmed CNCs. Here we detail this method which has led to the identification of >25 novel candidate CNCs in Arabidopsis thaliana, several of which have been experimentally confirmed in vitro and in vivo. We foresee that the application of this method can be used to identify many more members of the growing family of CNCs in higher plants. © Springer Science+Business Media New York 2013.
Cost reduction in abdominal CT by weight-adjusted dose.
Arana, Estanislao; Martí-Bonmatí, Luis; Tobarra, Eva; Sierra, Consuelo
2009-06-01
To analyze the influence of contrast dose adjusted by weight vs. a fixed contrast dose on the attenuation and cost of abdominal computed tomography (CT). A randomised, consecutive, parallel-group study was conducted in 151 patients (74 men and 77 women, age range 22-67 years), studied with the same helical CT protocol. A dose of 1.75 ml/kg was administered in 101 patients, while 50 patients had a fixed dose of 120 ml of the same non-ionic contrast material (320 mg/ml). Mean enhancements were measured at the right hepatic lobe, superior abdominal aorta and inferior vena cava. Statistical analysis was stratified by weight (cut-offs at 61 and 81 kg). Aortic attenuation was significantly superior in the dose-adjusted group. Patients >61 kg in the dose-adjusted group presented higher hepatic attenuation, being statistically significant in those >81 kg. In patients >80 kg, there was an extra cost of EUR 10.7 per patient. An injection volume of 1.75 ml/kg offers optimal diagnostic quality with a global savings of EUR 1.34 per patient.
Hardware for soft computing and soft computing for hardware
Nedjah, Nadia
2014-01-01
Single and Multi-Objective Evolutionary Computation (MOEA), Genetic Algorithms (GAs), Artificial Neural Networks (ANNs), Fuzzy Controllers (FCs), Particle Swarm Optimization (PSO) and Ant Colony Optimization (ACO) are becoming omnipresent in almost every intelligent system design. Unfortunately, applying the majority of these techniques is complex and requires a huge computational effort to yield useful and practical results. Therefore, dedicated hardware for evolutionary, neural and fuzzy computation is a key issue for designers. With the spread of reconfigurable hardware such as FPGAs, digital as well as analog hardware implementations of such computation become cost-effective. The idea behind this book is to offer a variety of hardware designs for soft computing techniques that can be embedded in any final product. It also introduces successful applications of soft computing techniques to many hard problems encountered in the design of embedded hardware. Reconfigurable em...
Directory of Open Access Journals (Sweden)
Julia K Ostermann
Full Text Available The aim of this study was to compare the health care costs for patients using additional homeopathic treatment (homeopathy group) with the costs for those receiving usual care (control group). Cost data provided by a large German statutory health insurance company were retrospectively analysed from the societal perspective (primary outcome) and from the statutory health insurance perspective. Patients in both groups were matched using a propensity score matching procedure based on socio-demographic variables as well as costs, number of hospital stays and sick leave days in the previous 12 months. Total cumulative costs over 18 months were compared between the groups with an analysis of covariance (adjusted for baseline costs) across diagnoses and for six specific diagnoses (depression, migraine, allergic rhinitis, asthma, atopic dermatitis, and headache). Data from 44,550 patients (67.3% females) were available for analysis. From the societal perspective, total costs after 18 months were higher in the homeopathy group (adj. mean: EUR 7,207.72 [95% CI 7,001.14-7,414.29]) than in the control group (EUR 5,857.56 [5,650.98-6,064.13]; p<0.0001), with the largest differences between groups for productivity loss (homeopathy EUR 3,698.00 [3,586.48-3,809.53] vs. control EUR 3,092.84 [2,981.31-3,204.37]) and outpatient care costs (homeopathy EUR 1,088.25 [1,073.90-1,102.59] vs. control EUR 867.87 [853.52-882.21]). Group differences decreased over time. For all diagnoses, costs were higher in the homeopathy group than in the control group, although this difference was not always statistically significant. Compared with usual care, additional homeopathic treatment was associated with significantly higher costs. These analyses did not confirm previously observed cost savings resulting from the use of homeopathy in the health care system.
Ostermann, Julia K.; Reinhold, Thomas; Witt, Claudia M.
2015-01-01
Objectives The aim of this study was to compare the health care costs for patients using additional homeopathic treatment (homeopathy group) with the costs for those receiving usual care (control group). Methods Cost data provided by a large German statutory health insurance company were retrospectively analysed from the societal perspective (primary outcome) and from the statutory health insurance perspective. Patients in both groups were matched using a propensity score matching procedure based on socio-demographic variables as well as costs, number of hospital stays and sick leave days in the previous 12 months. Total cumulative costs over 18 months were compared between the groups with an analysis of covariance (adjusted for baseline costs) across diagnoses and for six specific diagnoses (depression, migraine, allergic rhinitis, asthma, atopic dermatitis, and headache). Results Data from 44,550 patients (67.3% females) were available for analysis. From the societal perspective, total costs after 18 months were higher in the homeopathy group (adj. mean: EUR 7,207.72 [95% CI 7,001.14–7,414.29]) than in the control group (EUR 5,857.56 [5,650.98–6,064.13]; p<0.0001), with the largest differences between groups for productivity loss (homeopathy EUR 3,698.00 [3,586.48–3,809.53] vs. control EUR 3,092.84 [2,981.31–3,204.37]) and outpatient care costs (homeopathy EUR 1,088.25 [1,073.90–1,102.59] vs. control EUR 867.87 [853.52–882.21]). Group differences decreased over time. For all diagnoses, costs were higher in the homeopathy group than in the control group, although this difference was not always statistically significant. Conclusion Compared with usual care, additional homeopathic treatment was associated with significantly higher costs. These analyses did not confirm previously observed cost savings resulting from the use of homeopathy in the health care system. PMID:26230412
Babbitt, Callie W; Kahhat, Ramzy; Williams, Eric; Babbitt, Gregory A
2009-07-01
Product lifespan is a fundamental variable in understanding the environmental impacts associated with the life cycle of products. Existing life cycle and materials flow studies of products, almost without exception, consider lifespan to be constant over time. To determine the validity of this assumption, this study provides an empirical documentation of the long-term evolution of personal computer lifespan, using a major U.S. university as a case study. Results indicate that over the period 1985-2000, computer lifespan (purchase to "disposal") decreased steadily from a mean of 10.7 years in 1985 to 5.5 years in 2000. The distribution of lifespan also evolved, becoming narrower over time. Overall, however, lifespan distribution was broader than normally considered in life cycle assessments or materials flow forecasts of electronic waste management for policy. We argue that these results suggest that at least for computers, the assumption of constant lifespan is problematic and that it is important to work toward understanding the dynamics of use patterns. We modify an age-structured model of population dynamics from biology as a modeling approach to describe product life cycles. Lastly, the purchase share and generation of obsolete computers from the higher education sector is estimated using different scenarios for the dynamics of product lifespan.
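The age-structured modeling idea above can be illustrated with a minimal cohort sketch: a purchase cohort is carried forward year by year with an age-specific retention probability, and the retirements per year form the lifespan distribution. The retention probabilities here are hypothetical, chosen only to produce a plausibly shaped distribution.

```python
# Hedged cohort sketch of an age-structured product lifespan model.
def lifespan_distribution(cohort, retention):
    """Fraction of the cohort retired in each year of service.

    retention[t] is the (assumed) probability that a unit still in use
    after t years survives one more year.
    """
    retired = []
    in_use = cohort
    for p in retention:
        retired.append(in_use * (1.0 - p))   # units retired this year
        in_use *= p                          # units carried forward
    retired.append(in_use)                   # remainder retires at the horizon
    return [r / cohort for r in retired]

dist = lifespan_distribution(1000.0, [0.95, 0.9, 0.8, 0.6, 0.3, 0.1])
mean_lifespan = sum((year + 1) * share for year, share in enumerate(dist))
```

Letting the retention probabilities vary by purchase year is what captures the observed drift from a 10.7-year to a 5.5-year mean lifespan.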
Computer graphics in engineering education
Rogers, David F
2013-01-01
Computer Graphics in Engineering Education discusses the use of Computer Aided Design (CAD) and Computer Aided Manufacturing (CAM) as an instructional material in engineering education. Each of the nine chapters of this book covers topics and cites examples that are relevant to the relationship of CAD-CAM with engineering education. The first chapter discusses the use of computer graphics in the U.S. Naval Academy, while Chapter 2 covers key issues in instructional computer graphics. This book then discusses low-cost computer graphics in engineering education. Chapter 4 discusses the uniform b
Optimization and large scale computation of an entropy-based moment closure
Garrett, C. Kristopher; Hauck, Cory; Hill, Judith
2015-12-01
We present computational advances and results in the implementation of an entropy-based moment closure, MN, in the context of linear kinetic equations, with an emphasis on heterogeneous and large-scale computing platforms. Entropy-based closures are known in several cases to yield more accurate results than closures based on standard spectral approximations, such as PN, but the computational cost is generally much higher and often prohibitive. Several optimizations are introduced to improve the performance of entropy-based algorithms over previous implementations. These optimizations include the use of GPU acceleration and the exploitation of the mathematical properties of spherical harmonics, which are used as test functions in the moment formulation. To test the emerging high-performance computing paradigm of communication bound simulations, we present timing results at the largest computational scales currently available. These results show, in particular, load balancing issues in scaling the MN algorithm that do not appear for the PN algorithm. We also observe that in weak scaling tests, the ratio in time to solution of MN to PN decreases.
Maragakis, Antonios; van den Dobbelsteen, Andy; Maragakis, Alexandros
2016-01-01
As students continue to review the sustainability of higher education institutions, there is a growing need to understand the economic returns of degrees as a function of a sustainable institution. This paper reviews a range of international research to summarize the economic drivers of higher education attainment. Although the cost inputs to…
Strategies for compensating for higher costs of geothermal electricity with environmental benefits
International Nuclear Information System (INIS)
Murphy, H.; Niitsuma, Hiroaki
1999-01-01
After very high growth in the 1980s, geothermal electricity production has slowed in the mid- and late-1990s. While Japanese, Indonesian and Philippine geothermal growth has remained high as a consequence of supportive government policies, geothermal electricity production has been flat or reduced in much of Europe and North America. Low prices for coal and natural gas, combined with deregulation, means that in much of the world electricity from new fuel-burning electricity plants can be provided at half the cost of new geothermal electricity. Cost-cutting must be pursued, but is unlikely to close the price gap by itself. Geothermal production is widely perceived as being environmentally clean, but this is not unambiguously true, and requires reinjection to be fully realized. Strategies for monetizing the environmental advantages of geothermal, including the carbon tax, are discussed. (author)
Awofala, Adeneye O. A.; Akinoso, Sabainah O.; Fatade, Alfred O.
2017-01-01
The study investigated attitudes towards computer and computer self-efficacy as predictors of computer anxiety among 310 preservice mathematics teachers from five higher institutions of learning in Lagos and Ogun States of Nigeria using the quantitative research method within the blueprint of the descriptive survey design. Data collected were…
Higher-Order Integral Equation Methods in Computational Electromagnetics
DEFF Research Database (Denmark)
Jørgensen, Erik; Meincke, Peter
Higher-order integral equation methods have been investigated. The study has focused on improving the accuracy and efficiency of the Method of Moments (MoM) applied to electromagnetic problems. A new set of hierarchical Legendre basis functions of arbitrary order is developed. The new basis...
Security Architecture of Cloud Computing
V.KRISHNA REDDY; Dr. L.S.S.REDDY
2011-01-01
Cloud Computing offers services over the internet with dynamically scalable resources. Cloud Computing services provide benefits to users in terms of cost and ease of use. Cloud Computing services need to address security during the transmission of sensitive data and critical applications to shared and public cloud environments. Cloud environments are scaling large for data processing and storage needs. Cloud computing environments have various advantages as well as disadvantages o...
Cloud Computing and Security Issues
Rohan Jathanna; Dhanamma Jagli
2017-01-01
Cloud computing has become one of the most interesting topics in the IT world today. The cloud model of computing as a resource has changed the landscape of computing, as its promises of greater reliability, massive scalability, and decreased costs have attracted businesses and individuals alike. It adds capabilities to information technology. Over the last few years, cloud computing has grown considerably in Information Technology. As more and more information of individuals and compan...
Cost benefit analysis of instrumentation, supervision and control systems for nuclear power plants
International Nuclear Information System (INIS)
Hagen, P.
1973-08-01
A cost benefit analysis is carried out on a BWR type reactor power plant in which an on-line computer performs plant supervision, reporting, logging, calibration and control functions, using display devices and plotters, while an off-line computer is available for bigger jobs such as fuel management calculations. All on-line functions are briefly described and specified. Three types of computer system are considered, a simplex system, a dual computer system and a multi-processor system. These systems are analysed with respect to reliability, back-up instrumentation requirements and costs. While the multiprocessor system gave in all cases the lowest annual failure costs, the margin to the duplex system was so small that hardware, maintenance and software costs would play an important role in making a decision. (JIW)
Factors cost effectively improved using computer simulations of ...
African Journals Online (AJOL)
LPhidza
effectively managed using computer simulations in semi-arid conditions pertinent to much of sub-Saharan Africa. ... small scale farmers to obtain optimal crop yields thus ensuring their food security and livelihood is ... those that simultaneously incorporate and simulate processes involved throughout the course of crop ...
Technology and the Broken Higher Education Cost Model: Insights from the Delta Cost Project
Kirshstein, Rita; Wellman, Jane
2012-01-01
Although U.S. higher education has faced numerous crises and dilemmas in its history, the situation in which colleges and universities find themselves at the moment is indeed different. Shrinking public subsidies coupled with historic rises in tuitions come at the same time that colleges and universities have been tasked to dramatically increase…
20 CFR 228.60 - Cost-of-living increase.
2010-04-01
... 20 Employees' Benefits 1 2010-04-01 2010-04-01 false Cost-of-living increase. 228.60 Section 228... COMPUTATION OF SURVIVOR ANNUITIES The Tier II Annuity Component § 228.60 Cost-of-living increase. The tier II... tier II component at the time the survivor annuity begins, all cost-of-living increases that were...
International Nuclear Information System (INIS)
Sahni, D.C.; Sharma, A.
2000-01-01
The integral form of the one-speed, spherically symmetric neutron transport equation with isotropic scattering is considered. Two standard problems are solved using the normal-mode expansion technique. The expansion coefficients are obtained by solving their singular integral equations. It is shown that these expansion coefficients provide a representation of all spherical harmonics moments of the angular flux as a superposition of Bessel functions. It is seen that large errors occur in the computation of higher moments unless certain precautions are taken. The reasons for this phenomenon are explained; they throw some light on the failure of the spherical harmonics method in treating spherical geometry problems, as observed by Aronsson.
Stereo Disparity through Cost Aggregation with Guided Filter
Directory of Open Access Journals (Sweden)
Pauline Tan
2014-10-01
Full Text Available Estimating the depth, or equivalently the disparity, of a stereo scene is a challenging problem in computer vision. The method proposed by Rhemann et al. in 2011 is based on a filtering of the cost volume, which gives for each pixel and for each hypothesized disparity a cost derived from pixel-by-pixel comparison. The filtering is performed by the guided filter proposed by He et al. in 2010. It computes a weighted local average of the costs. The weights are such that similar pixels tend to have similar costs. Eventually, a winner-take-all strategy selects the disparity with the minimal cost for each pixel. Non-consistent labels according to left-right consistency are rejected; a densification step can then be launched to fill the disparity map. The method can be used to solve other labeling problems (optical flow, segmentation), but this article focuses on the stereo matching problem.
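The cost-volume pipeline above can be sketched in a few lines. As a simplifying assumption, a plain box filter stands in for the guided filter of He et al., and the left-right consistency check and densification steps are omitted; only absolute-difference costs, local aggregation, and the winner-take-all selection are shown.

```python
import numpy as np

def disparity_wta(left, right, max_disp, radius=1):
    """Winner-take-all disparity from box-filtered absolute-difference costs.

    left, right: 2-D grayscale images (same shape); max_disp: largest
    hypothesized disparity; radius: half-size of the aggregation window.
    """
    h, w = left.shape
    # cost volume: one cost slice per hypothesized disparity
    volume = np.full((max_disp + 1, h, w), np.inf)
    for d in range(max_disp + 1):
        # pixel-by-pixel matching cost for disparity d
        cost = np.abs(left[:, d:] - right[:, :w - d])
        # naive box-filter aggregation over a (2r+1)x(2r+1) window
        k = 2 * radius + 1
        padded = np.pad(cost, radius, mode="edge")
        agg = sum(padded[i:i + cost.shape[0], j:j + cost.shape[1]]
                  for i in range(k) for j in range(k)) / k ** 2
        volume[d, :, d:] = agg
    # winner-take-all: pick the disparity with minimal aggregated cost
    return volume.argmin(axis=0)
```

Replacing the box filter with the guided filter changes only the aggregation step; the volume construction and the winner-take-all selection stay the same.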
Directory of Open Access Journals (Sweden)
Kostiuk Mariia
2016-04-01
Full Text Available The volume of demand and supply for educational services grows constantly, and education is becoming a promising sector of the Ukrainian economy. Under conditions of permanently increasing competition between educational establishments, it is impossible to do without marketing, namely the marketing of educational services. The article substantiates the necessity of using computer-integrated marketing communications in the promotion of a higher educational establishment. It considers the promotion of higher educational establishments and educational services on the Internet, and analyses indicators of the promotion of a higher educational establishment in the «VKontakte» social network. Recommendations for the promotion of universities in social networks were formulated on the basis of the study results.
Hanly, Paul A; Sharp, Linda
2014-03-26
Most measures of the cancer burden take a public health perspective. Cancer also has a significant economic impact on society. To assess this economic burden, we estimated years of potential productive life lost (YPPLL) and costs of lost productivity due to premature cancer-related mortality in Ireland. All cancers combined and the 10 sites accounting for most deaths in men and in women were considered. To compute YPPLL, deaths in 5-year age-bands between 15 and 64 years were multiplied by average working-life expectancy. Valuation of costs, using the human capital approach, involved multiplying YPPLL by age- and gender-specific gross wages, and adjusting for unemployment and workforce participation. Sensitivity analyses were conducted around retirement age and wage growth, labour force participation, employment and discount rates, and to explore the impact of including household production and caring costs. Costs were expressed in €2009. Total YPPLL was lower in men than women (men = 10,873; women = 12,119). Premature cancer-related mortality costs were higher in men (men: total cost = €332 million, cost/death = €290,172, cost/YPPLL = €30,558; women: total cost = €177 million, cost/death = €159,959, cost/YPPLL = €14,628). Lung cancer had the highest premature mortality cost (€84.0 million; 16.5% of total costs), followed by cancers of the colorectum (€49.6 million; 9.7%), breast (€49.4 million; 9.7%) and brain & CNS (€42.4 million: 8.3%). The total economic cost of premature cancer-related mortality in Ireland amounted to €509.5 million or 0.3% of gross domestic product. An increase of one year in the retirement age increased the total all-cancer premature mortality cost by 9.9% for men and 5.9% for women. The inclusion of household production and caring costs increased the total cost to €945.7 million. Lost productivity costs due to cancer-related premature mortality are significant. The higher premature mortality cost in males than females.
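The human-capital computation described above (deaths per age band × remaining working life, valued at gross wages and adjusted for employment and participation) can be sketched as follows. Every number in the example is invented for illustration, and the function name is ours, not the authors'.

```python
# Hedged sketch of the human-capital approach to premature mortality costs.
RETIREMENT_AGE = 65

def premature_mortality_cost(bands, discount=0.03):
    """Return (total YPPLL, total discounted productivity cost).

    bands: (midpoint age, deaths, gross wage, employment rate,
            participation rate) per 5-year age band.
    """
    total_ypll, total_cost = 0.0, 0.0
    for midpoint_age, deaths, wage, employment, participation in bands:
        years_lost = max(RETIREMENT_AGE - midpoint_age, 0)
        total_ypll += deaths * years_lost
        # value each lost year at the adjusted wage, discounted to present
        for t in range(years_lost):
            total_cost += (deaths * wage * employment * participation
                           / (1.0 + discount) ** t)
    return total_ypll, total_cost

# illustrative bands: (midpoint age, deaths, wage, employment, participation)
ypll, cost = premature_mortality_cost([(42, 120, 38000, 0.87, 0.72),
                                       (52, 310, 41000, 0.87, 0.65),
                                       (62, 480, 36000, 0.87, 0.55)])
```

The sensitivity analyses in the abstract correspond to varying `RETIREMENT_AGE`, the discount rate, and the adjustment rates in this sketch.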
Computer software to estimate timber harvesting system production, cost, and revenue
Dr. John E. Baumgras; Dr. Chris B. LeDoux
1992-01-01
Large variations in timber harvesting cost and revenue can result from the differences between harvesting systems, the variable attributes of harvesting sites and timber stands, or changing product markets. Consequently, system and site specific estimates of production rates and costs are required to improve estimates of harvesting revenue. This paper describes...
Computers for Your Classroom: CAI and CMI.
Thomas, David B.; Bozeman, William C.
1981-01-01
The availability of compact, low-cost computer systems provides a means of assisting classroom teachers in the performance of their duties. Computer-assisted instruction (CAI) and computer-managed instruction (CMI) are two applications of computer technology with which school administrators should become familiar. CAI is a teaching medium in which…
High-cost users of medical care
Garfinkel, Steven A.; Riley, Gerald F.; Iannacchione, Vincent G.
1988-01-01
Based on data from the National Medical Care Utilization and Expenditure Survey, the 10 percent of the noninstitutionalized U.S. population that incurred the highest medical care charges was responsible for 75 percent of all incurred charges. Health status was the strongest predictor of high-cost use, followed by economic factors. Persons 65 years of age or over incurred far higher costs than younger persons and had higher out-of-pocket costs, absolutely and as a percentage of income, althoug...
Computer systems: What the future holds
Stone, H. S.
1976-01-01
Development of computer architecture is discussed in terms of the proliferation of the microprocessor, the utility of the medium-scale computer, and the sheer computational power of the large-scale machine. Changes in applications brought about by ever-lowering costs, smaller sizes, and faster switching times are included.
DEFF Research Database (Denmark)
Zou, Yihuan
Quality in higher education was not invented in recent decades – universities have always possessed mechanisms for assuring the quality of their work. The rising concern over quality is closely related to the changes in higher education and its social context. Among others, the most conspicuous changes are the massive expansion, diversification and increased cost in higher education, and new mechanisms of accountability initiated by the state. With these changes the traditional internally enacted academic quality-keeping has been given an important external dimension – quality assurance. This work is about constructing a more inclusive understanding of quality in higher education through combining the macro, meso and micro levels, i.e. from the perspectives of national policy, higher education institutions as organizations in society, and individual teaching staff and students.
The cost of electrocoagulation
Energy Technology Data Exchange (ETDEWEB)
Donini, J.C.; Kan, J.; Szynkarczuk, J.; Hassan, T.A.; Kar, K.L.
1993-01-01
Electrocoagulation could be an attractive and suitable method for separating solids from waste water. The electrocoagulation of kaolinite and bentonite suspensions was studied in a pilot electrocoagulation unit to assess the cost and efficiency of the process. Factors affecting cost, such as the formation of passivation layers on electrode plates and the recirculation and concentration of sodium chloride, were examined. Colorimetry was used to analyze aluminium content in the suspension. The results were used to calculate the cost due to consumption of electrode material (aluminium) during the process. Total cost was assumed to comprise the energy cost and the cost of electrode material. Comparison was based on the settling properties of the treated product: turbidity, settling rate, and cake height. In most cases, aluminium efficiency averaged around 200% and material cost accounted for 80% of total cost. Although higher concentrations of sodium chloride could only slightly increase aluminium efficiency and electrode efficiency, the higher concentrations resulted in much greater total cost, due to the greater current generated by the increased suspension conductivity, which in turn dissolved a larger amount of aluminium. The recirculation loop increased the flow rate by 3-10 times, enhancing the mass transport between the electrodes and resulting in lower cost and better settling properties. Over the course of two months the electrode coatings became thicker while efficiency decreased. The electrode efficiency was found to be as high as 94% for virgin electrodes and as low as 10% after two months. 8 refs., 25 figs., 9 tabs.
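The cost split described above (electrical energy plus sacrificial aluminium electrode material) can be sketched from Faraday's law, with the dissolved mass scaled by the apparent current efficiency (around 200% in the study). All prices and operating values below are assumptions, not figures from the study.

```python
# Hedged sketch: operating cost of electrocoagulation as energy plus
# sacrificed aluminium, with dissolution estimated via Faraday's law.
F = 96485.0      # Faraday constant, C/mol
M_AL = 26.98     # molar mass of aluminium, g/mol
Z = 3            # electrons transferred per Al(3+) ion

def electrocoagulation_cost(current_a, hours, volts,
                            efficiency=2.0,      # ~200% apparent efficiency
                            power_price=0.08,    # $/kWh, assumed
                            al_price=2.5):       # $/kg aluminium, assumed
    """Return (energy cost, electrode material cost) for one run."""
    charge = current_a * hours * 3600.0                 # coulombs passed
    al_kg = efficiency * charge * M_AL / (Z * F) / 1000.0
    energy_cost = current_a * volts * hours / 1000.0 * power_price
    material_cost = al_kg * al_price
    return energy_cost, material_cost

energy, material = electrocoagulation_cost(50.0, 1.0, 12.0)
```

With these assumed prices the material term dominates at low cell voltage, consistent with the study's finding that electrode consumption accounted for most of the total cost.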
Battlefield awareness computers: the engine of battlefield digitization
Ho, Jackson; Chamseddine, Ahmad
1997-06-01
To modernize the army for the 21st century, the U.S. Army Digitization Office (ADO) initiated in 1995 the Force XXI Battle Command Brigade-and-Below (FBCB2) Applique program, which became a centerpiece of the U.S. Army's master plan to win future information wars. The Applique team, led by TRW, fielded a 'tactical Internet' for brigade-and-below command to demonstrate the advantages of 'shared situation awareness' and battlefield digitization in advanced war-fighting experiments (AWE) conducted in March 1997 at the Army's National Training Center in California. Computing Devices was designated the primary hardware developer for the militarized version of the battlefield awareness computers. The first generation of militarized battlefield awareness computer, designated the V3 computer, was an integration of off-the-shelf components developed to meet the aggressive delivery requirements of the Task Force XXI AWE. The design efficiency and cost effectiveness of the computer hardware were secondary in importance to the delivery deadlines imposed by the March 1997 AWE. However, declining defense budgets will impose cost constraints on the Force XXI production hardware that can only be met by rigorous value engineering to further improve design optimization for battlefield awareness without compromising the level of reliability the military has come to expect in modern military hardened vetronics. To answer the Army's need for a more cost-effective computing solution, Computing Devices developed a second-generation 'combat ready' battlefield awareness computer, designated the V3+, which is designed specifically to meet the upcoming demands of Force XXI (FBCB2) and beyond. The primary design objective is to achieve a technologically superior design, value engineered to strike an optimal balance between reliability, life-cycle cost, and procurement cost. Recognizing that the diverse digitization demands of Force XXI cannot be adequately met by any one computer hardware
Cloud computing for comparative genomics with windows azure platform.
Kim, Insik; Jung, Jae-Yoon; Deluca, Todd F; Nelson, Tristan H; Wall, Dennis P
2012-01-01
Cloud computing services have emerged as a cost-effective alternative for cluster systems as the number of genomes and required computation power to analyze them increased in recent years. Here we introduce the Microsoft Azure platform with detailed execution steps and a cost comparison with Amazon Web Services.
Lowering Business Education Cost with a Custom Professor-Written Online Text
Baker-Eveleth, Lori Jo; Miller, Jon Robert; Tucker, Laura
2011-01-01
Inflation-adjusted tuition and fees in education have risen for decades. College textbook costs have risen as well. The authors discuss reasons for higher textbook costs. The development and use of encyclopedic introductory textbooks creates higher monetary cost for students and higher nonmonetary cost for students and teachers, from increased…
Time Domain Partitioning of Electricity Production Cost Simulations
Energy Technology Data Exchange (ETDEWEB)
Barrows, C. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Hummon, M. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Jones, W. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Hale, E. [National Renewable Energy Lab. (NREL), Golden, CO (United States)
2014-01-01
Production cost models are often used for planning by simulating power system operations over long time horizons. The simulation of a day-ahead energy market can take several weeks to compute. Tractability improvements are often made through model simplifications, such as: reductions in transmission modeling detail, relaxation of commitment variable integrality, reductions in cost modeling detail, etc. One common simplification is to partition the simulation horizon so that weekly or monthly horizons can be simulated in parallel. However, horizon partitions are often executed with overlap periods of arbitrary and sometimes zero length. We calculate the time domain persistence of historical unit commitment decisions to inform time domain partitioning of production cost models. The results are implemented using PLEXOS production cost modeling software in an HPC environment to improve the computation time of simulations while maintaining solution integrity.
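The horizon-partitioning idea in this abstract (weekly or monthly chunks run in parallel, each with an overlap period so unit-commitment decisions can settle before the chunk boundary) can be sketched as below. The chunk and overlap lengths are illustrative parameters, not values from the study.

```python
# Minimal sketch of partitioning a production-cost simulation horizon
# into parallel chunks with a look-ahead overlap, as described above.
def partition_horizon(n_days, chunk_days, overlap_days):
    """Return (start, end) day ranges covering [0, n_days).

    Each chunk extends overlap_days into the next chunk's territory;
    the overlapped days are discarded when results are stitched back
    together, keeping commitment decisions consistent at boundaries.
    """
    parts = []
    start = 0
    while start < n_days:
        end = min(start + chunk_days + overlap_days, n_days)
        parts.append((start, end))
        start += chunk_days
    return parts
```

With zero overlap this degenerates to the arbitrary partitioning the abstract warns about; the study's contribution is informing `overlap_days` from the measured persistence of historical commitment decisions.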
Cost-effectiveness of clinical decision support system in improving maternal health care in Ghana.
Directory of Open Access Journals (Sweden)
Maxwell Ayindenaba Dalaba
Full Text Available This paper investigated the cost-effectiveness of a computer-assisted Clinical Decision Support System (CDSS) in the identification of maternal complications in Ghana. A cost-effectiveness analysis was performed in a before- and after-intervention study, conducted from the provider's perspective. The intervention area was the Kassena-Nankana district, where the computer-assisted CDSS was used by midwives in maternal care in six selected health centres. Six selected health centres in the Builsa district served as the non-intervention group, where the normal Ghana Health Service activities were carried out. Computer-assisted CDSS increased the detection of pregnancy complications during antenatal care (ANC) in the intervention health centres (before-intervention = 9/1,000 ANC attendances; after-intervention = 12/1,000 ANC attendances; P-value = 0.010). In the intervention health centres, there was a decrease in the number of complications during labour by 1.1%, though the difference was not statistically significant (before-intervention = 107/1,000 labour clients; after-intervention = 96/1,000 labour clients; P-value = 0.305). Also, at the intervention health centres, the average cost per pregnancy complication detected during ANC (cost-effectiveness ratio) decreased from US$17,017.58 (before-intervention) to US$15,207.5 (after-intervention). The incremental cost-effectiveness ratio (ICER) was estimated at US$1,142. Considering only additional costs (cost of computer-assisted CDSS), the cost per pregnancy complication detected was US$285. Computer-assisted CDSS has the potential to improve the identification of complications during pregnancy and marginally reduce labour complications. Implementing computer-assisted CDSS is more costly but more effective in the detection of pregnancy complications than routine maternal care, making the decision to implement CDSS complex. Policy makers should however be guided by whether the additional benefit is worth
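The incremental cost-effectiveness ratio (ICER) quoted in the abstract follows the standard definition: incremental cost divided by incremental effect. A minimal sketch, with illustrative numbers rather than the study's actual cost and detection data:

```python
# Sketch of the standard ICER computation used in analyses like the one
# above: extra cost per extra unit of effect (e.g. per additional
# pregnancy complication detected).
def icer(cost_new, cost_old, effect_new, effect_old):
    """Incremental cost-effectiveness ratio."""
    if effect_new == effect_old:
        raise ValueError("no incremental effect; ICER is undefined")
    return (cost_new - cost_old) / (effect_new - effect_old)
```

If an intervention costs 120 units against a comparator's 100 while detecting 12 instead of 9 complications per 1,000 clients, the ICER is 20/3 units per extra detection; decision makers then judge whether that price per extra detection is worth paying.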
Effects of housing system on the costs of commercial egg production
Matthews, W. A.; Sumner, D. A.
2014-01-01
This article reports the first publicly available egg production costs compared across 3 hen-housing systems. We collected detailed data from 2 flock cycles from a commercial egg farm operating a conventional barn, an aviary, and an enriched colony system at the same location. The farm employed the same operational and accounting procedures for each housing system. Results provide clear evidence that egg production costs are much higher for the aviary system than the other 2 housing systems. Feed costs per dozen eggs are somewhat higher for the aviary and lower for the enriched house compared with the conventional house. Labor costs are much lower for the conventional house than the other 2, and pullet costs are much higher for the aviary. Energy and miscellaneous costs are a minimal part of total operating costs and do not differ by housing system. Total capital investments per hen-capacity are much higher for the aviary and the enriched house. Capital costs per dozen eggs depend on assumptions about appropriate interest and depreciation rates. Using the same 10% rate for each housing system shows capital costs per dozen for the aviary and the enriched housing system are much higher than capital costs per dozen for the conventional house. The aviary has average operating costs (feed, labor, pullet, energy, and miscellaneous costs that recur for each flock and vary with egg production) about 23% higher and average total costs about 36% higher compared with the conventional house. The enriched housing system has average operating costs only about 4% higher compared with the conventional house, but average total costs are 13% higher than for the conventional house. PMID:25480736
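The abstract notes that capital cost per dozen eggs depends on the assumed interest and depreciation rates, with a common 10% rate used for comparison. One standard way to express this is a capital recovery factor annualizing the per-hen investment; the sketch below uses that approach with made-up investment and productivity figures (the article's own accounting may differ).

```python
# Sketch of annualized capital cost per dozen eggs via a capital
# recovery factor (CRF). The 10% rate mirrors the article; the capital
# and productivity inputs are hypothetical.
def capital_recovery_factor(rate, years):
    """Fraction of capital charged per year over the asset's life."""
    return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

def capital_cost_per_dozen(capital_per_hen, rate, years,
                           dozens_per_hen_year):
    """Annualized capital cost allocated to each dozen eggs."""
    return (capital_per_hen * capital_recovery_factor(rate, years)
            / dozens_per_hen_year)
```

Because aviary and enriched systems require more capital per hen-capacity, this per-dozen figure rises for them even at the same rate, which is the pattern the study reports.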
Goudey, Benjamin; Abedini, Mani; Hopper, John L; Inouye, Michael; Makalic, Enes; Schmidt, Daniel F; Wagner, John; Zhou, Zeyu; Zobel, Justin; Reumann, Matthias
2015-01-01
Genome-wide association studies (GWAS) are a common approach for systematic discovery of single nucleotide polymorphisms (SNPs) associated with a given disease. The univariate analysis approaches commonly employed may miss important SNP associations that only appear through multivariate analysis in complex diseases. However, multivariate SNP analysis is currently limited by its inherent computational complexity. In this work, we present a computational framework that harnesses supercomputers. Based on our results, we estimate that a three-way interaction analysis on 1.1 million SNP GWAS data would require over 5.8 years on the full "Avoca" IBM Blue Gene/Q installation at the Victorian Life Sciences Computation Initiative. This is hundreds of times faster than estimates for other CPU-based methods and four times faster than runtimes estimated for GPU methods, indicating how improvements in the hardware applied to interaction analysis may alter the types of analysis that can be performed. Furthermore, the same analysis would take under 3 months on the currently largest IBM Blue Gene/Q supercomputer, "Sequoia" at the Lawrence Livermore National Laboratory, assuming linear scaling is maintained, as our results suggest. Given that the implementation used in this study can be further optimised, this runtime means it is becoming feasible to carry out exhaustive analysis of higher-order interaction studies on large modern GWAS.
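A back-of-envelope sketch of why exhaustive three-way analysis is so demanding, and of the linear-scaling extrapolation the abstract relies on. The 4-vs-96 machine-size ratio below is an assumption chosen only because it reproduces the abstract's 5.8-years-to-under-3-months scaling; the actual rack counts of Avoca and Sequoia are not stated in the source.

```python
# Combinatorial load of exhaustive k-way SNP interaction analysis, plus
# the ideal (linear) strong-scaling extrapolation across machine sizes.
import math

def n_interactions(n_snps, order=3):
    """Number of SNP tuples an exhaustive order-k analysis must test."""
    return math.comb(n_snps, order)

def scaled_runtime(base_runtime, base_size, target_size):
    """Runtime under assumed linear strong scaling (same units in/out)."""
    return base_runtime * base_size / target_size

# 1.1 million SNPs => ~2.2e17 triples to evaluate exhaustively.
triples = n_interactions(1_100_000)
```

With 5.8 years on a machine of relative size 4, a machine of relative size 96 would need about 2.9 months under the same linear-scaling assumption, consistent with the abstract's "under 3 months" figure.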
Directory of Open Access Journals (Sweden)
A Saravanakumar
2015-01-01
Full Text Available In the present work, a pediatric head and body phantom was fabricated using polymethyl methacrylate (PMMA) at low cost compared with commercially available phantoms, for the purpose of computed tomography (CT) dosimetry. The dimensions of the head and body phantoms were 10 cm diameter, 15 cm length and 16 cm diameter, 15 cm length, respectively. The dose received by the head and body phantoms from a 128-slice CT machine at the center and periphery was measured using a 100 mm pencil ion chamber and a 150 mm CT dose profiler (CTDP). Using these values, the weighted computed tomography dose index (CTDIw) and in turn the volumetric CTDI (CTDIv) were calculated for various combinations of tube voltage and current-time product. A similar study was carried out using a standard calibrated phantom, and the results were compared with those of the fabricated phantoms to ascertain that the performance of the latter is equivalent to that of the former. Finally, CTDIv measured using the fabricated and standard phantoms was compared with the respective values displayed on the console. The difference between the values was well within the limits specified by the Atomic Energy Regulatory Board (AERB), India. These results indicate that the cost-effective pediatric phantom can be employed for CT dosimetry.
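The weighted and volumetric CTDI quantities computed from the center and periphery chamber readings follow the standard definitions (one-third center plus two-thirds periphery; division by pitch for helical scans). A minimal sketch, with illustrative readings:

```python
# Standard CTDI formulas underlying the phantom measurements above.
def ctdi_w(ctdi_center, ctdi_periphery):
    """Weighted CTDI (mGy) from 100 mm chamber readings at the
    phantom center and periphery: 1/3 center + 2/3 periphery."""
    return ctdi_center / 3.0 + 2.0 * ctdi_periphery / 3.0

def ctdi_vol(ctdi_w_value, pitch):
    """Volumetric CTDI (mGy) for a helical scan of the given pitch."""
    return ctdi_w_value / pitch
```

Comparing `ctdi_vol` computed from a fabricated phantom against the scanner-console value, for each tube voltage and mAs setting, is exactly the validation the study performs.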
Melvin, Richard G; Ballard, J William O
2011-07-01
Males and females age at different rates in a variety of species, but the mechanisms underlying the difference are not understood. In this study, we investigated sex-specific costs of a naturally occurring, mildly deleterious deletion (ΔTrp85, ΔVal86) in cytochrome c oxidase subunit 7A (cox7A) in Drosophila simulans. We observed that females and males homozygous for the mutation had 30% and 26% reduced Cox activity, respectively, compared with wild type. Furthermore, 4-day-old females had 34%-42% greater physical activity than males. Greater physical activity in mutant females was correlated with a 19% lower 50% survival compared with wild-type females, whereas mutant and wild-type males had equal survival. These data suggest that females paid a higher cost of the mutation than did males, and they demonstrate the value of linking population genetics and structural modeling to experimental manipulations that lead to functional predictions of mitochondrial bioenergetics and organism aging.
Computer-Mediated Assessment of Higher-Order Thinking Development
Tilchin, Oleg; Raiyn, Jamal
2015-01-01
Solving complicated problems in a contemporary knowledge-based society requires higher-order thinking (HOT). The most productive way to encourage development of HOT in students is through use of the Problem-based Learning (PBL) model. This model organizes learning by solving corresponding problems relative to study courses. Students are directed…
ResourceGate: A New Solution for Cloud Computing Resource Allocation
Abdullah A. Sheikh
2012-01-01
Cloud computing has become a focus of both educational and business communities, whose concerns include improving the Quality of Service (QoS) provided, as well as reliability, performance, and cost reduction. Cloud computing offers many benefits in terms of low cost and accessibility of data, and ensuring these benefits is a major factor in the cloud computing environment. This paper surveys recent research related to cloud computing resource al...
ANL statement of site strategy for computing workstations
Energy Technology Data Exchange (ETDEWEB)
Fenske, K.R. (ed.); Boxberger, L.M.; Amiot, L.W.; Bretscher, M.E.; Engert, D.E.; Moszur, F.M.; Mueller, C.J.; O' Brien, D.E.; Schlesselman, C.G.; Troyer, L.J.
1991-11-01
This Statement of Site Strategy describes the procedure at Argonne National Laboratory for defining, acquiring, using, and evaluating scientific and office workstations and related equipment and software in accord with DOE Order 1360.1A (5-30-85) and Laboratory policy. It is Laboratory policy to promote the installation and use of computing workstations to improve productivity and communications for both programmatic and support personnel, to ensure that computing workstation acquisitions meet the expressed need in a cost-effective manner, and to ensure that acquisitions of computing workstations are in accord with Laboratory and DOE policies. The overall computing site strategy at ANL is to develop a hierarchy of integrated computing system resources to address the current and future computing needs of the laboratory. The major system components of this hierarchical strategy are: supercomputers, parallel computers, centralized general-purpose computers, distributed multipurpose minicomputers, and computing workstations and office automation support systems. Computing workstations include personal computers, scientific and engineering workstations, computer terminals, microcomputers, word processing and office automation electronic workstations, and associated software and peripheral devices costing less than $25,000 per item.
19 CFR 152.106 - Computed value.
2010-04-01
19 Customs Duties, Part 2 (2010-04-01 edition). § 152.106 Computed value; Classification and Appraisement of Merchandise (Continued), Valuation of Merchandise. (a) Elements. The computed value of imported merchandise is the sum of: (1) The cost or value of the...
Cost optimization for buildings with hybrid ventilation systems
Ji, Kun; Lu, Yan
2018-02-13
A method including: computing a total cost for a first zone in a building, wherein the total cost is equal to an actual energy cost of the first zone plus a thermal discomfort cost of the first zone; and heuristically optimizing the total cost to identify temperature setpoints for a mechanical heating/cooling system and a start time and an end time of the mechanical heating/cooling system, based on external weather data and occupancy data of the first zone.
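The patent abstract's objective (actual energy cost plus a thermal-discomfort cost, heuristically optimized over setpoints) can be sketched with a brute-force search. Both cost terms below are toy placeholder models, not the patented formulation, and the weights and temperatures are made up.

```python
# Sketch of the total-cost objective described above: energy cost plus
# a thermal-discomfort penalty, minimized over candidate setpoints.
def total_cost(setpoint, outdoor_temp, energy_price=0.15,
               comfort_temp=22.0, discomfort_weight=0.5):
    """Toy zone cost: energy grows with the indoor/outdoor gap,
    discomfort grows quadratically away from the comfort point."""
    energy = energy_price * abs(setpoint - outdoor_temp)
    discomfort = discomfort_weight * (setpoint - comfort_temp) ** 2
    return energy + discomfort

def best_setpoint(outdoor_temp, candidates):
    """Exhaustive (heuristic-free) search over candidate setpoints."""
    return min(candidates, key=lambda s: total_cost(s, outdoor_temp))
```

The patented method additionally optimizes the start and end times of the mechanical system against weather and occupancy data; this sketch covers only the setpoint dimension of that search space.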
Harris, Catherine R; Osterberg, E Charles; Sanford, Thomas; Alwaal, Amjad; Gaither, Thomas W; McAninch, Jack W; McCulloch, Charles E; Breyer, Benjamin N
2016-08-01
To determine which factors are associated with higher costs of the urethroplasty procedure and whether these factors have been increasing over time. Identification of the determinants of extreme costs may help reduce cost while maintaining quality. We conducted a retrospective analysis using the 2001-2010 Healthcare Cost and Utilization Project-Nationwide Inpatient Sample (HCUP-NIS). The HCUP-NIS captures hospital charges, which we converted to cost using the HCUP cost-to-charge ratio. Log-cost linear regression with sensitivity analysis was used to determine variables associated with increased costs. Extreme cost was defined as the top 20th percentile of expenditure, analyzed with logistic regression, and expressed as odds ratios (OR). A total of 2298 urethroplasties were recorded in the NIS over the study period. The median (interquartile range) calculated cost was $7321 ($5677-$10,000). Patients with multiple comorbid conditions had higher odds of extreme costs [OR 1.56, 95% confidence interval (CI) 1.19-2.04, P = .02] compared with patients with no comorbid disease. Inpatient complications raised the odds of extreme costs (OR 3.2, CI 2.14-4.75), as did graft use (OR 1.78, 95% CI 1.2-2.64, P = .005). Variations in patient age, race, hospital region, bed size, teaching status, payor type, and volume of urethroplasty cases were not associated with extremes of cost. Cost variation for perioperative inpatient urethroplasty procedures depends on preoperative patient comorbidities, postoperative complications, and surgical complexity related to graft usage. Procedural cost and cost variation are critical for understanding which aspects of care have the greatest impact on cost. Copyright © 2016 Elsevier Inc. All rights reserved.
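Two preprocessing steps from this abstract are easy to make concrete: converting hospital charges to costs with a cost-to-charge ratio, and flagging "extreme cost" cases as the top 20th percentile of expenditure. The data and the 0.6 ratio below are invented for illustration; HCUP publishes hospital-specific ratios.

```python
# Sketch of the charge-to-cost conversion and the "extreme cost"
# (top 20th percentile) definition used in the study above.
def charges_to_costs(charges, cost_to_charge_ratio):
    """Apply a (here hospital-wide) cost-to-charge ratio."""
    return [c * cost_to_charge_ratio for c in charges]

def flag_extreme(costs, top_fraction=0.20):
    """True for observations at or above the (1 - top_fraction)
    empirical percentile of cost."""
    cutoff = sorted(costs)[int(len(costs) * (1 - top_fraction))]
    return [c >= cutoff for c in costs]
```

The binary flags produced by `flag_extreme` are what the study then models with logistic regression against comorbidity, complication, and graft-use covariates.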
Directory of Open Access Journals (Sweden)
Maxwell Ayindenaba Dalaba
Full Text Available This study analyzed the cost of implementing a computer-assisted Clinical Decision Support System (CDSS) in selected health care centres in Ghana. A descriptive cross-sectional study was conducted in the Kassena-Nankana district (KND). CDSS was deployed in selected health centres in KND as an intervention to manage patients attending antenatal clinics and the labour ward. The CDSS users were mainly nurses who were trained. Activities and associated costs involved in the implementation of CDSS (pre-intervention and intervention) were collected for the period between 2009-2013 from the provider perspective. The ingredients approach was used for the cost analysis. Costs were grouped into personnel, trainings, overheads (recurrent costs) and equipment costs (capital cost). We calculated cost without annualizing capital cost to represent financial cost and cost with annualizing capital costs to represent economic cost. Twenty-two trained CDSS users (at least 2 users per health centre) participated in the study. Between April 2012 and March 2013, users managed 5,595 antenatal clients and 872 labour clients using the CDSS. We observed a decrease in the proportion of complications during delivery (pre-intervention 10.74% versus post-intervention 9.64%) and a reduction in the number of maternal deaths (pre-intervention 4 deaths versus post-intervention 1 death). The overall financial cost of CDSS implementation was US$23,316, approximately US$1,060 per CDSS user trained. Of the total cost of implementation, 48% (US$11,272) was pre-intervention cost and intervention cost was 52% (US$12,044). Equipment costs accounted for the largest proportion of financial cost: 34% (US$7,917). When economic cost was considered, the total cost of implementation was US$17,128, lower than the financial cost by 26.5%. The study provides useful information on the implementation of CDSS at health facilities to enhance health workers' adherence to practice guidelines and taking accurate decisions to
Dalaba, Maxwell Ayindenaba; Akweongo, Patricia; Williams, John; Saronga, Happiness Pius; Tonchev, Pencho; Sauerborn, Rainer; Mensah, Nathan; Blank, Antje; Kaltschmidt, Jens; Loukanova, Svetla
2014-01-01
This study analyzed cost of implementing computer-assisted Clinical Decision Support System (CDSS) in selected health care centres in Ghana. A descriptive cross sectional study was conducted in the Kassena-Nankana district (KND). CDSS was deployed in selected health centres in KND as an intervention to manage patients attending antenatal clinics and the labour ward. The CDSS users were mainly nurses who were trained. Activities and associated costs involved in the implementation of CDSS (pre-intervention and intervention) were collected for the period between 2009-2013 from the provider perspective. The ingredients approach was used for the cost analysis. Costs were grouped into personnel, trainings, overheads (recurrent costs) and equipment costs (capital cost). We calculated cost without annualizing capital cost to represent financial cost and cost with annualizing capital costs to represent economic cost. Twenty-two trained CDSS users (at least 2 users per health centre) participated in the study. Between April 2012 and March 2013, users managed 5,595 antenatal clients and 872 labour clients using the CDSS. We observed a decrease in the proportion of complications during delivery (pre-intervention 10.74% versus post-intervention 9.64%) and a reduction in the number of maternal deaths (pre-intervention 4 deaths versus post-intervention 1 death). The overall financial cost of CDSS implementation was US$23,316, approximately US$1,060 per CDSS user trained. Of the total cost of implementation, 48% (US$11,272) was pre-intervention cost and intervention cost was 52% (US$12,044). Equipment costs accounted for the largest proportion of financial cost: 34% (US$7,917). When economic cost was considered, total cost of implementation was US$17,128-lower than the financial cost by 26.5%. The study provides useful information in the implementation of CDSS at health facilities to enhance health workers' adherence to practice guidelines and taking accurate decisions to improve
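The financial-versus-economic cost distinction in this record (capital taken at face value versus annualized over its useful life) can be sketched with the standard annuity factor. The figures, rate, and useful life below are illustrative assumptions, not the study's inputs.

```python
# Sketch of the financial vs. economic cost calculation described
# above: financial cost counts capital outlays at face value; economic
# cost spreads them over the asset's useful life with a discount rate.
def annualized_capital(capital, rate, useful_life_years):
    """Equivalent annual cost of a capital item (annuity factor)."""
    annuity_factor = (1 - (1 + rate) ** -useful_life_years) / rate
    return capital / annuity_factor

def financial_cost(recurrent, capital):
    return recurrent + capital

def economic_cost(recurrent, capital, rate, life, years_of_use=1):
    return recurrent + annualized_capital(capital, rate, life) * years_of_use
```

For a single year of use, the economic cost is lower than the financial cost whenever the capital item outlives that year, which matches the 26.5% gap the study reports between its two totals.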
Computer-based interventions for drug use disorders: A systematic review
Moore, Brent A.; Fazzino, Tera; Garnet, Brian; Cutter, Christopher J.; Barry, Declan T.
2011-01-01
A range of innovative computer-based interventions for psychiatric disorders have been developed, and are promising for drug use disorders, due to reduced cost and greater availability compared to traditional treatment. Electronic searches were conducted from 1966 to November 19, 2009 using MEDLINE, Psychlit, and EMBASE. 468 non-duplicate records were identified. Two reviewers classified abstracts for study inclusion, resulting in 12 studies of moderate quality. Eleven studies were pilot or full-scale trials compared to a control condition. Interventions showed high acceptability despite substantial variation in type and amount of treatment. Compared to treatment-as-usual, computer-based interventions led to less substance use as well as higher motivation to change, better retention, and greater knowledge of presented information. Computer-based interventions for drug use disorders have the potential to dramatically expand and alter the landscape of treatment. Evaluation of internet and phone-based delivery that allow for treatment-on-demand in patients’ own environment is needed. PMID:21185683
Higher Franz-Reidemeister torsion
Igusa, Kiyoshi
2002-01-01
The book is devoted to the theory of topological higher Franz-Reidemeister torsion in K-theory. The author defines the higher Franz-Reidemeister torsion based on Volodin's K-theory and Borel's regulator map. He describes its properties and generalizations and studies the relation between the higher Franz-Reidemeister torsion and other torsions used in K-theory: Whitehead torsion and Ray-Singer torsion. He also presents methods of computing higher Franz-Reidemeister torsion, illustrates them with numerous examples, and describes various applications of higher Franz-Reidemeister torsion, particularly for the study of homology of mapping class groups. Packed with up-to-date information, the book provides a unique research and reference tool for specialists working in algebraic topology and K-theory.
Benchmarking undedicated cloud computing providers for analysis of genomic datasets.
Yazar, Seyhan; Gooden, George E C; Mackey, David A; Hewitt, Alex W
2014-01-01
A major bottleneck in biological discovery is now emerging at the computational level. Cloud computing offers a dynamic means whereby small and medium-sized laboratories can rapidly adjust their computational capacity. We benchmarked two established cloud computing services, Amazon Web Services Elastic MapReduce (EMR) on Amazon EC2 instances and Google Compute Engine (GCE), using publicly available genomic datasets (E.coli CC102 strain and a Han Chinese male genome) and a standard bioinformatic pipeline on a Hadoop-based platform. Wall-clock time for complete assembly differed by 52.9% (95% CI: 27.5-78.2) for E.coli and 53.5% (95% CI: 34.4-72.6) for human genome, with GCE being more efficient than EMR. The cost of running this experiment on EMR and GCE differed significantly, with the costs on EMR being 257.3% (95% CI: 211.5-303.1) and 173.9% (95% CI: 134.6-213.1) more expensive for E.coli and human assemblies respectively. Thus, GCE was found to outperform EMR both in terms of cost and wall-clock time. Our findings confirm that cloud computing is an efficient and potentially cost-effective alternative for analysis of large genomic datasets. In addition to releasing our cost-effectiveness comparison, we present available ready-to-use scripts for establishing Hadoop instances with Ganglia monitoring on EC2 or GCE.
Benchmarking undedicated cloud computing providers for analysis of genomic datasets.
Directory of Open Access Journals (Sweden)
Seyhan Yazar
Full Text Available A major bottleneck in biological discovery is now emerging at the computational level. Cloud computing offers a dynamic means whereby small and medium-sized laboratories can rapidly adjust their computational capacity. We benchmarked two established cloud computing services, Amazon Web Services Elastic MapReduce (EMR) on Amazon EC2 instances and Google Compute Engine (GCE), using publicly available genomic datasets (E.coli CC102 strain and a Han Chinese male genome) and a standard bioinformatic pipeline on a Hadoop-based platform. Wall-clock time for complete assembly differed by 52.9% (95% CI: 27.5-78.2) for E.coli and 53.5% (95% CI: 34.4-72.6) for the human genome, with GCE being more efficient than EMR. The cost of running this experiment on EMR and GCE differed significantly, with the costs on EMR being 257.3% (95% CI: 211.5-303.1) and 173.9% (95% CI: 134.6-213.1) more expensive for the E.coli and human assemblies respectively. Thus, GCE was found to outperform EMR both in terms of cost and wall-clock time. Our findings confirm that cloud computing is an efficient and potentially cost-effective alternative for analysis of large genomic datasets. In addition to releasing our cost-effectiveness comparison, we present available ready-to-use scripts for establishing Hadoop instances with Ganglia monitoring on EC2 or GCE.
A model for calculating the optimal replacement interval of computer systems
International Nuclear Information System (INIS)
Fujii, Minoru; Asai, Kiyoshi
1981-08-01
A mathematical model for calculating the optimal replacement interval of computer systems is described. The model estimates the most economical replacement interval when the computing demand and the cost and performance of the computers are known. Computing demand is assumed to increase monotonically every year. Four variants of the model are described. In model 1, a computer system is represented by only a central processing unit (CPU), and all computing demand must be processed on the present computer until the next replacement. In model 2, excess demand is admitted and may be transferred to another computing center and processed there at a cost. In model 3, the computer system is represented by a CPU, memories (MEM) and input/output devices (I/O), and it must process all the demand. Model 4 is the same as model 3, but excess demand may be processed at another center. (1) Computing demand at the JAERI, (2) the conformity of Grosch's law for recent computers, and (3) the replacement cost of computer systems are also described. (author)
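A toy version of the replacement-interval question can make the trade-off concrete: replacing often re-incurs the purchase cost, while keeping a machine too long accumulates operating cost as demand grows. The cost functions and figures below are assumptions for illustration, not the JAERI models, which also account for performance and component structure.

```python
# Toy replacement-interval model in the spirit of the abstract: choose
# the interval minimizing average annual cost under growing demand.
def average_annual_cost(interval_years, purchase_cost, base_run_cost,
                        demand_growth):
    """Purchase cost plus demand-driven operating cost, averaged over
    one replacement cycle of length interval_years."""
    running = sum(base_run_cost * (1 + demand_growth) ** y
                  for y in range(interval_years))
    return (purchase_cost + running) / interval_years

def optimal_interval(purchase_cost, base_run_cost, demand_growth,
                     max_years=15):
    """Brute-force search for the cheapest replacement interval."""
    return min(range(1, max_years + 1),
               key=lambda n: average_annual_cost(
                   n, purchase_cost, base_run_cost, demand_growth))
```

With a purchase cost of 100, a base operating cost of 10, and 20% annual demand growth, the average annual cost falls until year 7 and rises thereafter, so the optimal interval is 7 years under these assumptions.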
Computer networks and advanced communications
International Nuclear Information System (INIS)
Koederitz, W.L.; Macon, B.S.
1992-01-01
One of the major methods for getting the most productivity and benefits from computer usage is networking. However, for those who are contemplating a change from stand-alone computers to a network system, the investigation of actual networks in use presents a paradox: network systems can be highly productive and beneficial; at the same time, these networks can create many complex, frustrating problems. The issue becomes a question of whether the benefits of networking are worth the extra effort and cost. In response to this issue, the authors review in this paper the implementation and management of an actual network in the LSU Petroleum Engineering Department. The network, which has been in operation for four years, is large and diverse (50 computers, 2 sites, PC's, UNIX RISC workstations, etc.). The benefits, costs, and method of operation of this network will be described, and an effort will be made to objectively weigh these elements from the point of view of the average computer user
Franken, Margreet G; Leeneman, Brenda; Jochems, Anouk; Schouwenburg, Maartje G; Aarts, Maureen J B; van Akkooi, Alexander C J; van den Berkmortel, Franchette W P J; van den Eertwegh, Alfonsus J M; de Groot, Jan Willem B; van der Hoeven, Koos J M; Hospers, Geke A P; Kapiteijn, Ellen; Koornstra, Rutger; Kruit, Wim H J; Louwman, Marieke W J; Piersma, Djura; van Rijn, Rozemarijn S; Suijkerbuijk, Karijn P M; Ten Tije, Albert J; Vreugdenhil, Gerard; Wouters, Michel W J M; van Zeijl, Michiel; Haanen, John B A G; Uyl-de Groot, Carin A
2018-07-01
There is limited evidence on the costs associated with ipilimumab. We investigated the healthcare costs of all Dutch patients with advanced cutaneous melanoma who were treated with ipilimumab. Data were retrieved from the nation-wide Dutch Melanoma Treatment Registry. Costs were determined by applying unit costs to individual patient resource use. A total of 807 patients who were diagnosed between July 2012 and July 2015 received ipilimumab in Dutch practice. The mean (median) episode duration was 6.27 (4.61) months (computed from the start of ipilimumab until the start of a next treatment, death, or the last date of follow-up). The average total healthcare costs amounted to €81 484, but varied widely (range: €18 131-€160 002). Ipilimumab was by far the most important cost driver (€73 739). Other costs were related to hospital admissions (€3323), hospital visits (€1791), diagnostics and imaging (€1505), radiotherapy (€828), and surgery (€297). Monthly costs for resource use other than ipilimumab were €1997 (SD: €2629). Treatment-naive patients (n=344) had higher total costs compared with previously-treated patients (n=463; €85 081 vs. €78 811). Although patients with colitis (n=106) had higher costs for resource use other than ipilimumab (€11 426) compared with patients with other types of immune-related adverse events (n=90; €9850) and patients with no immune-related adverse event (n=611; €6796), they had lower total costs (€76 075 vs. €87 882 and €81 480, respectively). In conclusion, this nation-wide study provides valuable insights into the healthcare costs of advanced cutaneous melanoma patients who were treated with ipilimumab in clinical practice. Most of the costs were attributable to ipilimumab, but the costs and their distribution varied considerably across subgroups.
Attrition Cost Model Instruction Manual
Yanagiura, Takeshi
2012-01-01
This instruction manual explains in detail how to use the Attrition Cost Model program, which estimates the cost of student attrition for a state's higher education system. Programmed with SAS, this model allows users to instantly calculate the cost of attrition and the cumulative attrition rate that is based on the most recent retention and…
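The manual's model is written in SAS and driven by state retention data. As a rough illustration of the underlying idea only, the sketch below (in Python, with invented cohort sizes, retention rates, and per-student costs, none of which come from the manual) estimates the enrollment years, and hence dollars, consumed by students who leave before completing:

```python
# Hypothetical sketch of an attrition-cost calculation in the spirit of the
# manual's SAS model. Cohort size, retention rates, and per-student cost
# below are illustrative assumptions, not figures from the manual.

def attrition_cost(cohort, retention_rates, cost_per_year):
    """Estimate spending on students who leave without completing.

    cohort          -- entering cohort size
    retention_rates -- fraction retained into year 2, year 3, ...
    cost_per_year   -- average educational cost per enrolled student per year
    """
    enrolled = [float(cohort)]
    for r in retention_rates:
        enrolled.append(enrolled[-1] * r)
    completers = enrolled[-1]
    # Enrollment years consumed by students who did not persist to the end:
    leaver_years = sum(enrolled) - completers * len(enrolled)
    return leaver_years * cost_per_year

cost = attrition_cost(cohort=1000,
                      retention_rates=[0.75, 0.85],  # into year 2, year 3
                      cost_per_year=9000.0)
```

A state-level run would repeat this per institution sector and sum the results.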
Curreri, Peter A.; Hoffman, Eric; Domack, Marcia; Brewster, Jeb; Russell, Carolyn
2013-01-01
With the goal of lower cost (simplified manufacturing and lower part count) and higher performance (higher strength-to-weight alloys), the NASA Technical Maturation Program in 2006 funded a proposal to investigate spin forming of space launch vehicle cryogenic tank domes. The project funding continued under the NASA Exploration Technology Development Program through completion in FY12. The first phase of the project involved spin forming of eight, 1 meter diameter "path finder" domes. Half of these were processed using a concave spin form process (MT Aerospace, Augsburg, Germany) and the other half using a convex process (Spincraft, Boston, MA). The convex process has been used to produce the Ares Common Bulkhead, and the concave process has been used to produce dome caps for the Space Shuttle lightweight external tank and domes for the NASDA H2. Aluminum-lithium material was chosen because of its higher strength-to-weight ratio than the Aluminum 2219 baseline. Aluminum-lithium, in order to obtain the desired temper (T8), requires a cold stretch after the solution heat treatment and quench. This requirement favors the concave spin form process, which was selected for scale-up. This paper describes the results of processing four, 5.5 meter diameter (upper stage scale) net-shaped spin formed aluminum-lithium domes. In order to allow scalability beyond the limits of foundry and rolling mills (about 12 foot width), the circular blank contained one friction stir weld (heavy lifter scales require a flat blank containing two welds). Mechanical properties data (tensile, fracture toughness, stress corrosion, and simulated service testing) for the parent metal and weld will also be discussed.
47 CFR 32.2124 - General purpose computers.
2010-10-01
... 47 Telecommunication 2 2010-10-01 2010-10-01 false General purpose computers. 32.2124 Section 32... General purpose computers. (a) This account shall include the original cost of computers and peripheral... financial, statistical, or other business analytical reports; preparation of payroll, customer bills, and...
DEFF Research Database (Denmark)
Chongtay, Rocio; Robering, Klaus
2016-01-01
In recent years, there has been a growing interest in and recognition of the importance of Computational Literacy, a skill generally considered to be necessary for success in the 21st century. While much research has concentrated on requirements, tools, and teaching methodologies for the acquisition of Computational Literacy at basic educational levels, focus on higher levels of education has been much less prominent. The present paper considers the case of courses for higher education programs within the Humanities. A model is proposed which conceives of Computational Literacy as a layered…
Low cost monocrystalline silicon sheet fabrication for solar cells by advanced ingot technology
Fiegl, G. F.; Bonora, A. C.
1980-01-01
The continuous liquid feed (CLF) Czochralski furnace and the enhanced I.D. slicing technology for the low-cost production of monocrystalline silicon sheets for solar cells are discussed. The incorporation of the CLF system is shown to improve ingot production rate significantly. As demonstrated in actual runs, higher-than-average solidification rates (75 to 100 mm/hr for 150 mm 1-0-0 crystals) can be achieved when the system approaches steady-state conditions. The design characteristics of the CLF furnace are detailed, noting that it is capable of precise control of dopant impurity incorporation in the axial direction of the crystal. The crystal add-on cost is computed to be $11.88/sq m, considering a projected 1986 25-slice per cm conversion factor with an 86% crystal growth yield.
Hanly, Paul; Skally, Mairead; Fenlon, Helen; Sharp, Linda
2012-10-01
The European Code Against Cancer recommends individuals aged ≥ 50 should participate in colorectal cancer screening. CT-colonography (CTC) is one of several screening tests available. We systematically reviewed evidence on, and identified key factors influencing, cost-effectiveness of CTC screening. PubMed, Medline, and the Cochrane library were searched for cost-effectiveness or cost-utility analyses of CTC-based screening, published in English, January 1999 to July 2010. Data were abstracted on setting, model type and horizon, screening scenario(s), comparator(s), participants, uptake, CTC performance and cost, effectiveness, ICERs, and whether extra-colonic findings and medical complications were considered. Sixteen studies were identified from the United States (n = 11), Canada (n = 2), and France, Italy, and the United Kingdom (1 each). Markov state-transition (n = 14) or microsimulation (n = 2) models were used. Eleven considered direct medical costs only; five included indirect costs. Fourteen compared CTC with no screening; fourteen compared CTC with colonoscopy-based screening; fewer compared CTC with sigmoidoscopy (8) or fecal tests (4). Outcomes assessed were life-years gained/saved (13), QALYs (2), or both (1). Three considered extra-colonic findings; seven considered complications. CTC appeared cost-effective versus no screening and, in general, flexible sigmoidoscopy and fecal occult blood testing. Results were mixed comparing CTC to colonoscopy. Parameters most influencing cost-effectiveness included: CTC costs, screening uptake, threshold for polyp referral, and extra-colonic findings. Evidence on cost-effectiveness of CTC screening is heterogeneous, due largely to between-study differences in comparators and parameter values. Future studies should: compare CTC with currently favored tests, especially fecal immunochemical tests; consider extra-colonic findings; and conduct comprehensive sensitivity analyses.
Who Should Pay for Higher Education?
Bou-Habib, Paul
2010-01-01
Policies that shift the costs of higher education from the taxpayer to the university student or graduate are increasingly popular, yet they have not been subjected to a thorough normative analysis. This paper provides a critical survey of the standard arguments that have been used in the public debate on higher education funding. These arguments…
Cost-based droop scheme with lower generation costs for microgrids
DEFF Research Database (Denmark)
Nutkani, Inam Ullah; Loh, Poh Chiang; Blaabjerg, Frede
2014-01-01
Traditional droop schemes share power among distributed generators (DGs) based on the DG kilovolts ampere (kVA) ratings. Other factors like generation costs, efficiencies and emission penalties at different load demands have not been considered. This omission might not be appropriate if different types of DGs are present in the microgrids. As an alternative, this study proposes a cost-based droop scheme, whose objective is to reduce a generation cost function realised with various DG operating characteristics taken into consideration. Where desired, proportional power sharing based on the DG kVA ratings can also be included, whose disadvantage is a slightly higher generation cost, which is still lower than that produced by the traditional droop schemes. The proposed droop scheme therefore retains all advantages of the traditional droop schemes while keeping its generation cost low. These findings have been validated in experiments.
Computer Based Modelling and Simulation
Indian Academy of Sciences (India)
Computer Based … personal computers (PC) and low-cost software packages and tools. They can serve as a useful learning experience through student projects. Models are … Let us consider a numerical example: to calculate the velocity of a trainer aircraft …
Rourke, Martha; Rourke, Patrick
1974-01-01
The school district business manager can make sound, cost-conscious decisions in the purchase of computer equipment by developing a list of cost-justified applications for automation, considering the software, writing performance specifications for bidding or negotiating a contract, and choosing the vendor wisely prior to the purchase; and by…
Directory of Open Access Journals (Sweden)
Anamaroa SIclovan
2011-12-01
Full Text Available Cloud computing is, and will continue to be, a new way of providing Internet services and computing. This approach builds on many existing services, such as the Internet, grid computing, and Web services. Cloud computing as a system aims to provide on-demand services that are more acceptable in terms of price and infrastructure. It is exactly the transition from the computer to a service offered to consumers as a product delivered online. This represents an advantage for the organization both regarding cost and the opportunity for new business. This paper presents future perspectives in cloud computing and discusses some issues of the cloud computing paradigm. It is a theoretical paper. Keywords: Cloud Computing, Pay-per-use
Ant colony optimisation for economic dispatch problem with non-smooth cost functions
Energy Technology Data Exchange (ETDEWEB)
Pothiya, Saravuth; Kongprawechnon, Waree [School of Communication, Instrumentation and Control, Sirindhorn International Institute of Technology, Thammasat University, P.O. Box 22, Pathumthani (Thailand); Ngamroo, Issarachai [Center of Excellence for Innovative Energy Systems, Faculty of Engineering, King Mongkut' s Institute of Technology Ladkrabang, Bangkok 10520 (Thailand)
2010-06-15
This paper presents a novel and efficient optimisation approach based on ant colony optimisation (ACO) for solving the economic dispatch (ED) problem with non-smooth cost functions. In order to improve the performance of the ACO algorithm, three additional techniques, i.e. a priority list, variable reduction, and a zoom feature, are presented. To show its efficiency and effectiveness, the proposed ACO is applied to two types of ED problems with non-smooth cost functions. The first, the ED problem with valve-point loading effects, involves systems of 13 and 40 generating units. The second, the ED problem considering multiple fuels, involves a 10-unit system. Additionally, the results of the proposed ACO are compared with those of conventional heuristic approaches. The experimental results show that the proposed ACO approach obtains higher-quality solutions in less computational time. (author)
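The abstract does not give the cost model, but valve-point loading is commonly represented by adding a rectified sine term to the quadratic fuel cost, which is precisely what makes the function non-smooth. A minimal sketch under that assumption, with coefficients invented for illustration:

```python
import math

# Valve-point-loading fuel cost commonly used in non-smooth ED studies
# (assumed form; the abstract does not state it): a quadratic cost plus a
# rectified sine term, which introduces the non-smoothness.

def generation_cost(p, a, b, c, e, f, p_min):
    """Fuel cost ($/h) of one unit generating p MW."""
    return a + b * p + c * p ** 2 + abs(e * math.sin(f * (p_min - p)))

def total_cost(outputs, coeffs):
    """Total cost of a candidate dispatch; coeffs holds per-unit tuples."""
    return sum(generation_cost(p, *k) for p, k in zip(outputs, coeffs))

# Two illustrative units (all coefficients invented for the example):
coeffs = [(550.0, 8.10, 0.00028, 300.0, 0.035, 0.0),
          (309.0, 8.10, 0.00056, 200.0, 0.042, 0.0)]
cost = total_cost([300.0, 200.0], coeffs)  # cost of one candidate dispatch
```

An ACO (or any metaheuristic) would then search over `outputs` subject to the demand balance, using `total_cost` as the objective.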
Herd-Level Mastitis-Associated Costs on Canadian Dairy Farms
Aghamohammadi, Mahjoob; Haine, Denis; Kelton, David F.; Barkema, Herman W.; Hogeveen, Henk; Keefe, Gregory P.; Dufour, Simon
2018-01-01
Mastitis imposes considerable and recurring economic losses on the dairy industry worldwide. The main objective of this study was to estimate herd-level costs incurred by expenditures and production losses associated with mastitis on Canadian dairy farms in 2015, based on producer reports. Previously, published mastitis economic frameworks were used to develop an economic model with the most important cost components. Components investigated were divided between clinical mastitis (CM), subclinical mastitis (SCM), and other costs components (i.e., preventive measures and product quality). A questionnaire was mailed to 374 dairy producers randomly selected from the (Canadian National Dairy Study 2015) to collect data on these costs components, and 145 dairy producers returned a completed questionnaire. For each herd, costs due to the different mastitis-related components were computed by applying the values reported by the dairy producer to the developed economic model. Then, for each herd, a proportion of the costs attributable to a specific component was computed by dividing absolute costs for this component by total herd mastitis-related costs. Median self-reported CM incidence was 19 cases/100 cow-year and mean self-reported bulk milk somatic cell count was 184,000 cells/mL. Most producers reported using post-milking teat disinfection (97%) and dry cow therapy (93%), and a substantial proportion of producers reported using pre-milking teat disinfection (79%) and wearing gloves during milking (77%). Mastitis costs were substantial (662 CAD per milking cow per year for a typical Canadian dairy farm), with a large portion of the costs (48%) being attributed to SCM, and 34 and 15% due to CM and implementation of preventive measures, respectively. For SCM, the two most important cost components were the subsequent milk yield reduction and culling (72 and 25% of SCM costs, respectively). For CM, first, second, and third most important cost components were culling (48
Prototyping low-cost and flexible vehicle diagnostic systems
Directory of Open Access Journals (Sweden)
Marisol GARCÍA-VALLS
2016-12-01
Full Text Available Diagnostic systems are software- and hardware-based equipment that interoperate with an external monitored system. Traditionally, they have been expensive equipment running test algorithms to monitor physical properties of, e.g., vehicles or civil infrastructure equipment, among others. As computer hardware becomes increasingly powerful (while its cost and size decrease) and communication software becomes easier to program and more run-time efficient, new scenarios are enabled that yield lower-cost monitoring solutions. This paper presents a low-cost approach to the development of a diagnostic system relying on a modular component-based approach and running on a resource-limited embedded computer. Results on a prototype implementation are shown that validate the presented design, its flexibility, performance, and communication latency.
Advanced computer-based training
Energy Technology Data Exchange (ETDEWEB)
Fischer, H D; Martin, H D
1987-05-01
The paper presents new techniques of computer-based training for personnel of nuclear power plants. Training on full-scope simulators is further increased by use of dedicated computer-based equipment. An interactive communication system runs on a personal computer linked to a video disc; a part-task simulator runs on 32 bit process computers and shows two versions: as functional trainer or as on-line predictor with an interactive learning system (OPAL), which may be well-tailored to a specific nuclear power plant. The common goal of both developments is the optimization of the cost-benefit ratio for training and equipment.
LPGC, Levelized Steam Electric Power Generator Cost
International Nuclear Information System (INIS)
Coen, J.J.; Delene, J.G.
1994-01-01
1 - Description of program or function: LPGC is a set of nine microcomputer programs for estimating power generation costs for large steam-electric power plants. These programs permit rapid evaluation using various sets of economic and technical ground rules. The levelized power generation costs calculated may be used to compare the relative economics of nuclear and coal-fired plants based on life-cycle costs. Cost calculations include capital investment cost, operation and maintenance cost, fuel cycle cost, decommissioning cost, and total levelized power generation cost. These programs can be used for quick analyses of power generation costs using alternative economic parameters, such as interest rate, escalation rate, inflation rate, plant lead times, capacity factor, fuel prices, etc. The two major types of electric generating plants considered are pressurized-water reactor (PWR) and pulverized coal-fired plants. Data are also provided for the Large Scale Prototype Breeder (LSPB) type liquid metal reactor. Costs for plants having either one or two units may be obtained. 2 - Method of solution: LPGC consists of nine individual menu-driven programs controlled by a driver program, MAINPWR. The individual programs are PLANTCAP, for calculating capital investment costs; NUCLOM, for determining operation and maintenance (O and M) costs for nuclear plants; COALOM, for computing O and M costs for coal-fired plants; NFUEL, for calculating levelized fuel costs for nuclear plants; COALCOST, for determining levelized fuel costs for coal-fired plants; FCRATE, for computing the fixed charge rate on the capital investment; LEVEL, for calculating levelized power generation costs; CAPITAL, for determining capitalized cost from overnight cost; and MASSGEN, for generating, deleting, or changing fuel cycle mass balance data for use with NFUEL. LPGC has three modes of operation. In the first, each individual code can be executed independently to determine one aspect of the total
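As a rough sketch of what a program like LEVEL computes, a levelized generation cost divides the annualized capital charge (capitalized investment times a fixed charge rate of the kind FCRATE produces) plus O&M, fuel, and decommissioning costs by annual generation. All figures below are illustrative assumptions, not LPGC data, and the real programs additionally model escalation, inflation, and lead-time effects:

```python
# Minimal levelized power generation cost in the spirit of LPGC's LEVEL
# program. Inputs are illustrative assumptions only.

def levelized_cost(capital_cost, fixed_charge_rate, om_cost, fuel_cost,
                   decom_cost, capacity_mw, capacity_factor):
    """Levelized generation cost in mills/kWh (numerically equal to $/MWh)."""
    annual_kwh = capacity_mw * 1000.0 * 8760.0 * capacity_factor
    annual_cost = (capital_cost * fixed_charge_rate
                   + om_cost + fuel_cost + decom_cost)
    return annual_cost / annual_kwh * 1000.0  # $/kWh -> mills/kWh

lcoe = levelized_cost(capital_cost=2.0e9,      # $ (capitalized investment)
                      fixed_charge_rate=0.10,  # from an FCRATE-style calc
                      om_cost=8.0e7,           # $/yr
                      fuel_cost=6.0e7,         # $/yr (levelized)
                      decom_cost=1.0e7,        # $/yr (sinking-fund style)
                      capacity_mw=1000.0,
                      capacity_factor=0.75)
```

Varying the fixed charge rate or capacity factor here mirrors the "quick analyses with alternative economic parameters" the description mentions.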
48 CFR 1602.170-5 - Cost or pricing data.
2010-10-01
... 48 Federal Acquisition Regulations System 6 2010-10-01 2010-10-01 true Cost or pricing data. 1602... Terms 1602.170-5 Cost or pricing data. (a) Experience-rated carriers. Cost or pricing data for... pricing data for community rated carriers is the specialized rating data used by carriers in computing a...
PET-CT in oncological patients: analysis of informal care costs in cost-benefit assessment.
Orlacchio, Antonio; Ciarrapico, Anna Micaela; Schillaci, Orazio; Chegai, Fabrizio; Tosti, Daniela; D'Alba, Fabrizio; Guazzaroni, Manlio; Simonetti, Giovanni
2014-04-01
The authors analysed the impact of nonmedical costs (travel, loss of productivity) in an economic analysis of PET-CT (positron-emission tomography-computed tomography) performed with standard contrast-enhanced CT protocols (CECT). From October to November 2009, a total of 100 patients referred to our institute were administered a questionnaire to evaluate the nonmedical costs of PET-CT. In addition, the medical costs (equipment maintenance and depreciation, consumables and staff) related to PET-CT performed with CECT and PET-CT with low-dose nonenhanced CT and separate CECT were also estimated. The medical costs were 919.3 euro for PET-CT with separate CECT, and 801.3 euro for PET-CT with CECT. Therefore, savings of approximately 13% are possible. Moreover, savings in nonmedical costs can be achieved by reducing the number of hospital visits required by patients undergoing diagnostic imaging. Nonmedical costs heavily affect patients' finances as well as having an indirect impact on national health expenditure. Our results show that PET-CT performed with standard dose CECT in a single session provides benefits in terms of both medical and nonmedical costs.
The cost of tuberculosis in Denmark
DEFF Research Database (Denmark)
Fløe, Andreas; Hilberg, Ole; Wejse, Christian
Hypothesis: Tuberculosis (TB) patients carry higher direct health-related and indirect costs than the general population. Objective: To calculate the economic burden of TB in Denmark, including the health-related costs of treatment and the indirect costs for society in a national retrospective case...
International Nuclear Information System (INIS)
Zachariadou, K; Yiasemides, K; Trougkakos, N
2012-01-01
We present a low-cost, fully computer-controlled, Arduino-based, educational laboratory (SolarInsight) to be used in undergraduate university courses concerned with electrical engineering and physics. The major goal of the system is to provide students with the necessary instrumentation, software tools and methodology in order to learn fundamental concepts of semiconductor physics by exploring the process of an experimental physics inquiry. The system runs under the Windows operating system and is composed of a data acquisition/control board, a power supply and processing boards, sensing elements, a graphical user interface and data analysis software. The data acquisition/control board is based on the Arduino open source electronics prototyping platform. The graphical user interface and communication with the Arduino are developed in the C# and C++ programming languages respectively, by using IDE Microsoft Visual Studio 2010 Professional, which is freely available to students. Finally, the data analysis is performed by using the open source, object-oriented framework ROOT. Currently the system supports five teaching activities, each one corresponding to an independent tab in the user interface. SolarInsight has been partially developed in the context of a diploma thesis conducted within the Technological Educational Institute of Piraeus under the co-supervision of the Physics and Electronic Computer Systems departments’ academic staff. (paper)
Yankee links computing needs, increases productivity
International Nuclear Information System (INIS)
Anon.
1994-01-01
Yankee Atomic Electric Company provides design and consultation services to electric utility companies that operate nuclear power plants. This means bringing together the skills and talents of more than 500 people in many disciplines, including computer-aided design, human resources, financial services, and nuclear engineering. The company was facing a problem familiar to many companies in the nuclear industry. Key corporate data and applications resided on UNIX or other types of computer systems, but most users at Yankee had personal computers on their desks. How could Yankee enable the PC users to share the data, applications, and resources of the larger computing environment such as UNIX, while ensuring they could still use their favorite PC applications? The solution was PC-NFS from SunSoft, of Chelmsford, Mass., which links PCs to UNIX and other systems. The Yankee computing story is an example of computer downsizing, the trend of moving away from mainframe computers in favor of lower-cost, more flexible client/server computing. Today, Yankee Atomic has more than 350 PCs on desktops throughout the company, using PC-NFS, which enables them to use the data, applications, disks, and printers of the UNIX server systems. This new client/server environment has reduced Yankee's computing costs while increasing its computing power and its ability to respond to customers.
An efficient and cost effective nuclear medicine image network
International Nuclear Information System (INIS)
Sampathkumaran, K.S.; Miller, T.R.
1987-01-01
An image network that is in use in a large nuclear medicine department is described. This network was designed to efficiently handle a large volume of clinical data at reasonable cost. Small, limited function computers are attached to each scintillation camera for data acquisition. The images are transferred by cable network or floppy disc to a large, powerful central computer for processing and display. Cost is minimized by use of small acquisition computers not equipped with expensive video display systems or elaborate analysis software. Thus, financial expenditure can be concentrated in a powerful central computer providing a centralized data base, rapid processing, and an efficient environment for program development. Clinical work is greatly facilitated because the physicians can process and display all studies without leaving the main reading area. (orig.)
1979-12-01
because of the use of complex computational algorithms (Ref 25). Another important factor affecting the cost of software is the size of the development… involved the alignment and navigational algorithm portions of the software. The second avionics system application was the development of an inertial…
The Future Train Wreck: Paying for Medical Costs for Higher Education's Retirees
Biggs, John H.
2006-01-01
Trustees and administrators today confront one of two problems with post-retirement medical care. First, if institutions provide no support for their retirees' medical care, they implicitly offer a powerful incentive for senior faculty to stay on. The compensation and opportunity costs of this effect are obviously very high. But, second, if they…
Cloud computing: An innovative tool for library services
Sahu, R.
2015-01-01
Cloud computing is a new technique of information and communication technology because of its potential benefits, such as reduced cost, accessibility anywhere at any time, and its elasticity and flexibility. This paper defines cloud computing; describes its essential characteristics, models, and components; weighs its advantages and drawbacks; and discusses the role of cloud computing in libraries.
Activity-based costing methodology as tool for costing in hematopathology laboratory.
Gujral, Sumeet; Dongre, Kanchan; Bhindare, Sonal; Subramanian, P G; Narayan, Hkv; Mahajan, Asim; Batura, Rekha; Hingnekar, Chitra; Chabbria, Meenu; Nair, C N
2010-01-01
Cost analysis in laboratories represents a necessary phase in their scientific progression. The aim was to calculate the indirect cost, and thus the total cost per sample, of various tests at a hematopathology laboratory (HPL). The activity-based costing (ABC) method was used to calculate the cost per test of the hematopathology laboratory. Information was collected from registers, purchase orders, annual maintenance contracts (AMCs), payrolls, account books, and hospital bills, along with informal interviews with hospital staff. Cost per test decreases as the total number of samples increases. The maximum annual expense at the HPL is on reagents and consumables, followed by manpower. Cost per test is higher for specialized tests, which interpret morphological or flow data and are done by a pathologist. Despite several limitations and assumptions, this was an attempt to understand how resources are consumed in a large government-run laboratory. The rate structure needs to be revised for most of the tests, mainly for complete blood counts (CBC), bone marrow examination, coagulation tests, and immunophenotyping. This costing exercise is laboratory specific, and each laboratory needs to do its own costing. Such an exercise may help a laboratory redesign its costing structure or at least understand the economics involved in laboratory management.
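As an illustration of the ABC approach described above (the activity pools, driver shares, and volumes are invented for the sketch, not data from the study), indirect activity pools are allocated to a test through driver shares, so the cost per test falls as sample volume rises:

```python
# Hypothetical activity-based-costing allocation for one laboratory test.
# All figures and activity shares are assumptions for illustration.

def abc_cost_per_test(direct_cost_per_test, indirect_pools, n_tests):
    """Direct cost plus this test's share of each indirect activity pool.

    indirect_pools -- list of (annual_pool_cost, fraction_driven_by_this_test)
    """
    allocated = sum(pool * share for pool, share in indirect_pools)
    return direct_cost_per_test + allocated / n_tests

# Reagents/consumables treated as direct; manpower and AMC/equipment pools
# allocated through assumed activity shares:
pools = [(300_000, 0.40),   # manpower pool, 40% driven by this test
         (150_000, 0.20)]   # AMC/equipment pool, 20% driven by this test
cost_10k = abc_cost_per_test(2.0, pools, 10_000)
cost_20k = abc_cost_per_test(2.0, pools, 20_000)
# Doubling the volume roughly halves the allocated indirect cost per test.
```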
Precise fixpoint computation through strategy iteration
DEFF Research Database (Denmark)
Gawlitza, Thomas; Seidl, Helmut
2007-01-01
We present a practical algorithm for computing least solutions of systems of equations over the integers with addition, multiplication with positive constants, maximum and minimum. The algorithm is based on strategy iteration. Its run-time (w.r.t. the uniform cost measure) is independent of the s…
Model reduction by weighted Component Cost Analysis
Kim, Jae H.; Skelton, Robert E.
1990-01-01
Component Cost Analysis considers any given system driven by a white noise process as an interconnection of different components, and assigns a metric called 'component cost' to each component. These component costs measure the contribution of each component to a predefined quadratic cost function. A reduced-order model of the given system may be obtained by deleting those components that have the smallest component costs. The theory of Component Cost Analysis is extended to include finite-bandwidth colored noises. The results also apply when actuators have dynamics of their own. Closed-form analytical expressions of component costs are also derived for a mechanical system described by its modal data. This is very useful for computing the modal costs of very high-order systems. A numerical example for the MINIMAST system is presented.
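A minimal numerical sketch of the idea, assuming the standard setup for a white-noise-driven LTI system (the matrices below are invented for illustration): the state covariance solves a Lyapunov equation, each state's component cost is a diagonal entry of X·CᵀC, and the costs sum to the total output cost:

```python
import numpy as np

# Sketch of component costs for a stable LTI system
#   x' = A x + B w  (unit-intensity white noise),   y = C x,
# with quadratic cost V = E[y^T y]. The state covariance X solves the
# Lyapunov equation A X + X A^T + B B^T = 0; the "component cost" of state i
# is [X C^T C]_ii, and these costs sum to V. Matrices are illustrative.

def component_costs(A, B, C):
    n = A.shape[0]
    I = np.eye(n)
    # Solve A X + X A^T = -B B^T via the Kronecker identity
    # vec(A X + X A^T) = (I (x) A + A (x) I) vec(X)  (column-major vec).
    K = np.kron(I, A) + np.kron(A, I)
    q = -(B @ B.T).reshape(-1, order="F")
    X = np.linalg.solve(K, q).reshape(n, n, order="F")
    return np.diag(X @ C.T @ C)

A = np.array([[-1.0, 0.0], [0.0, -5.0]])
B = np.array([[1.0], [1.0]])
C = np.array([[1.0, 1.0]])
costs = component_costs(A, B, C)
# States with the smallest component costs are the candidates for deletion
# when forming a reduced-order model.
```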
Learning Together; part 2: training costs and health gain - a cost analysis.
Cullen, Katherine; Riches, Wendy; Macaulay, Chloe; Spicer, John
2017-01-01
Learning Together is a complex educational intervention aimed at improving health outcomes for children and young people. There is an additional cost because two doctors see patients together in a longer appointment than a standard general practice (GP) appointment. Our approach combines the impact of the training clinics on activity in South London in 2014-15 with health gain, using NICE guidance and standards to allow comparison of training options. Activity data were collected from Training Practices hosting Learning Together. A computer-based model was developed to analyse the costs of the Learning Together intervention compared with usual training in a partial economic evaluation. The results of the model were used to value the health gain required to make the intervention cost effective. Data were returned for 363 patients booked into 61 clinics across 16 Training Practices. Learning Together clinics resulted in an increase in costs of £37 per clinic. Threshold analysis illustrated that one child with a common illness like constipation needs to be well for two weeks, in one Practice hosting four training clinics, for the clinics to be considered cost effective. Learning Together has a minimal training cost. Our threshold analysis produced a rubric that can be used locally to test cost effectiveness at a Practice or Programme level.
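The threshold logic can be sketched as simple arithmetic. The £37 per clinic and the four-clinic practice come from the abstract; the willingness-to-pay threshold is an assumed NICE-style figure for illustration only:

```python
# Threshold analysis sketch: how much health gain (in QALYs) must the extra
# clinic cost buy before the intervention counts as cost effective?
extra_cost_per_clinic = 37.0      # £, from the abstract
clinics_per_practice = 4          # from the abstract's worked example
wtp_per_qaly = 20000.0            # assumed threshold, £/QALY (illustrative)

total_extra_cost = extra_cost_per_clinic * clinics_per_practice
qalys_needed = total_extra_cost / wtp_per_qaly
print(f"£{total_extra_cost:.0f} extra; {qalys_needed:.4f} QALYs needed")
```

Even a very small health gain per practice clears a threshold this low, which is the intuition behind the "one child well for two weeks" rubric.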
Computation of spot prices and congestion costs in large interconnected power systems
International Nuclear Information System (INIS)
Mukerji, R.; Jordan, G.A.; Clayton, R.; Haringa, G.E.
1995-01-01
Foremost among the new paradigms for the US utility industry is the 'poolco' concept proposed by Prof. William W. Hogan of Harvard University. This concept uses a central pool or power exchange in which physical power is traded based on spot prices or market clearing prices. The rapid and accurate calculation of these 'spot' prices and associated congestion costs for large interconnected power systems is the central tenet upon which the poolco concept rests. The market clearing price would be the same throughout the system if there were no system losses and no transmission limitations. System losses cause small differences in market clearing prices, since the cost of supplying a MW at various load buses includes the cost of losses. Transmission limits may cause large differences in market clearing prices between regions, as low-cost generation is blocked by the transmission constraints from serving certain loads. In models currently in use in the electric power industry, spot price calculations range from 'bubble diagram' contract path models to full electrical representations such as GE-MAPS. The modeling aspects of the full electrical representation are included in the Appendix. The problem with the bubble diagram representation is that these models are liable to produce unacceptably large errors in the calculation of spot prices and congestion costs. The subtleties of the calculation of spot prices and congestion costs are illustrated in this paper.
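The price-separation mechanism can be shown on a toy two-bus system. All numbers and the merit-order dispatch rule are assumptions for illustration; a production model such as GE-MAPS solves a full electrical network, not this simplification:

```python
# Two buses: a cheap generator at bus 1, an expensive one at bus 2, and the
# load at bus 2. A transmission limit blocks part of the cheap generation,
# so the two buses clear at different spot prices.
def dispatch(load, cheap_cap, line_limit, cheap_cost, dear_cost):
    """Serve `load` at bus 2: cheap remote unit first, limited by the line."""
    from_cheap = min(load, cheap_cap, line_limit)
    from_dear = load - from_cheap
    # Spot price at bus 2 = marginal cost of the last unit dispatched there.
    lmp_bus2 = dear_cost if from_dear > 0 else cheap_cost
    return from_cheap, from_dear, lmp_bus2

g1, g2, lmp2 = dispatch(load=150, cheap_cap=200, line_limit=100,
                        cheap_cost=10.0, dear_cost=30.0)
lmp1 = 10.0                            # bus 1 still clears at the cheap cost
congestion_cost = (lmp2 - lmp1) * g1   # price spread times constrained flow
print(g1, g2, lmp2, congestion_cost)   # 100 50 30.0 2000.0
```

With the line limit removed, both buses would clear at 10 and the congestion cost would vanish, matching the abstract's lossless, unconstrained case.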
International Nuclear Information System (INIS)
Soma, Tsutomu; Takaki, Akihiro; Teraoka, Satomi; Ishikawa, Yasushi; Murase, Kenya; Koizumi, Kiyoshi
2008-01-01
We studied the behaviors of cost functions in the registration of thallium-201 (²⁰¹Tl) brain tumor single-photon emission computed tomography (SPECT) and magnetic resonance (MR) images, as the similarity index of image positioning. A marker for image registration [a technetium-99m (⁹⁹ᵐTc) point source] was attached at three sites on the heads of 13 patients with brain tumor, from whom 42 sets of ⁹⁹ᵐTc-²⁰¹Tl SPECT (dual-isotope acquisition) and MR images were obtained. The ²⁰¹Tl SPECT and MR images were manually registered according to the markers. From the positions where the two images were registered, the ²⁰¹Tl SPECT image was moved to examine the behaviors of three cost functions: ratio image uniformity (RIU), mutual information (MI), and normalized MI (NMI). The cost functions MI and NMI reached their maximum at positions adjacent to those where the SPECT and MR images were manually registered. As for the accuracy of image registration in terms of the cost functions MI and NMI, on average the images were accurately registered within 3 deg of rotation around the X-, Y-, and Z-axes, and within 1.5 mm (within 2 pixels), 3 mm (within 3 pixels), and 4 mm (within 1 slice) of translation along the X-, Y-, and Z-axes, respectively. In terms of rotation around the Z-axis, the cost function RIU reached its minimum at positions where the manual registration of the two images was substantially inadequate. MI and NMI were suitable cost functions in the registration of ²⁰¹Tl SPECT and MR images. The behavior of the RIU, in contrast, was unstable, making it unsuitable as an index of image registration. (author)
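A minimal histogram-based sketch (not the authors' implementation) of MI and NMI as registration cost functions: both peak when the two images are best aligned, which is why the study moves one image and tracks where MI/NMI reach their maximum.

```python
import numpy as np

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def mi_nmi(img_a, img_b, bins=32):
    """MI and Studholme-style NMI from the joint intensity histogram."""
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    hx = entropy(pxy.sum(axis=1))    # marginal entropy of image A
    hy = entropy(pxy.sum(axis=0))    # marginal entropy of image B
    hxy = entropy(pxy.ravel())       # joint entropy
    return hx + hy - hxy, (hx + hy) / hxy

rng = np.random.default_rng(0)
a = rng.random((64, 64))
b = a + 0.05 * rng.random((64, 64))        # roughly aligned "second modality"
mi_aligned, _ = mi_nmi(a, b)
mi_shifted, _ = mi_nmi(a, np.roll(b, 8, axis=0))   # misregistered copy
print(mi_aligned > mi_shifted)             # MI drops when alignment is lost
```

Registration algorithms search over rotations and translations for the pose that maximizes such a similarity index.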
Cost effective distributed computing for Monte Carlo radiation dosimetry
International Nuclear Information System (INIS)
Wise, K.N.; Webb, D.V.
2000-01-01
Full text: An inexpensive computing facility has been established for performing repetitive Monte Carlo simulations with the BEAM and EGS4/EGSnrc codes of linear accelerator beams, for calculating effective dose from diagnostic imaging procedures and of ion chambers and phantoms used for the Australian high energy absorbed dose standards. The facility currently consists of 3 dual-processor 450 MHz processor PCs linked by a high speed LAN. The 3 PCs can be accessed either locally from a single keyboard/monitor/mouse combination using a SwitchView controller or remotely via a computer network from PCs with suitable communications software (e.g. Telnet, Kermit etc). All 3 PCs are identically configured to have the Red Hat Linux 6.0 operating system. A Fortran compiler and the BEAM and EGS4/EGSnrc codes are available on the 3 PCs. The preparation of sequences of jobs utilising the Monte Carlo codes is simplified using load-distributing software (enFuzion 6.0 marketed by TurboLinux Inc, formerly Cluster from Active Tools) which efficiently distributes the computing load amongst all 6 processors. We describe 3 applications of the system - (a) energy spectra from radiotherapy sources, (b) mean mass-energy absorption coefficients and stopping powers for absolute absorbed dose standards and (c) dosimetry for diagnostic procedures; (a) and (b) are based on the transport codes BEAM and FLURZnrc while (c) is a Fortran/EGS code developed at ARPANSA. Efficiency gains ranged from 3 for (c) to close to the theoretical maximum of 6 for (a) and (b), with the gain depending on the amount of 'bookkeeping' to begin each task and the time taken to complete a single task. We have found the use of a load-balancing batch processing system with many PCs to be an economical way of achieving greater productivity for Monte Carlo calculations or of any computer intensive task requiring many runs with different parameters. Copyright (2000) Australasian College of Physical Scientists and
High performance computing in Windows Azure cloud
Ambruš, Dejan
2013-01-01
High performance, security, availability, scalability, flexibility and lower maintenance costs have all contributed to the growing popularity of cloud computing in all spheres of life, especially in business. In fact, cloud computing offers even more than this. With the use of virtual computing clusters, a runtime environment for high performance computing can also be efficiently implemented in a cloud. There are many advantages but also some disadvantages of cloud computing, some ...
Model implementation for dynamic computation of system cost for advanced life support
Levri, J. A.; Vaccari, D. A.
2004-01-01
Life support system designs for long-duration space missions have a multitude of requirements drivers, such as mission objectives, political considerations, cost, crew wellness, inherent mission attributes, as well as many other influences. Evaluation of requirements satisfaction can be difficult, particularly at an early stage of mission design. Because launch cost is a critical factor and relatively easy to quantify, it is a point of focus in early mission design. The method used to determine launch cost influences the accuracy of the estimate. This paper discusses the appropriateness of dynamic mission simulation in estimating the launch cost of a life support system. This paper also provides an abbreviated example of a dynamic simulation life support model and possible ways in which such a model might be utilized for design improvement. c2004 COSPAR. Published by Elsevier Ltd. All rights reserved.
Computer tomography: a cost-saving examination?
International Nuclear Information System (INIS)
Barneveld Binkhuysen, F.H.; Puijlaert, C.B.A.J.
1987-01-01
The research concerns the influence of the body computer tomograph (BCT) on efficiency in radiology and in the hospital as a whole in The Netherlands. Hospitals with CT are compared with hospitals without CT. In radiology, the substitution effect is investigated, using the number of radiological performances per clinical patient as a parameter. This parameter proves to decrease in hospitals with a CT, in contrast to hospitals without a CT. The often-expressed opinion that the CT should specifically perform complementary examinations appears incorrect. Hospital-wide efficiency is related to the average hospital in-patient stay, which proves to be shorter in hospitals with a CT than in those without. The CT has turned out to be a very effective expedient which, however, is being used inefficiently in The Netherlands owing to limited installation. 17 refs.; 6 figs.; 5 tabs
Energy Technology Data Exchange (ETDEWEB)
1979-08-01
A computer analysis of domestic economic incentive is presented. Included are the sample computer data set for ten combinations of reprocessing and reactor assumptions; basic data set and computer output; higher uranium availability computer output; 50 percent higher GCR fabrication cost computer output; 50 percent higher GCR reprocessing cost computer output; year 1990 and year 2000 GCR introduction scenario computer outputs; 75 percent perceived capacity factor for PBR computer output; and capital cost of GCRs 1.2 times that of LWRs.
Batteries: Lower cost than gasoline?
International Nuclear Information System (INIS)
Werber, Mathew; Fischer, Michael; Schwartz, Peter V.
2009-01-01
We compare the lifecycle costs of an electric car to a similar gasoline-powered vehicle under different scenarios of required driving range and cost of gasoline. An electric car is cost competitive for a significant portion of the scenarios: for cars of lower range and for higher gasoline prices. Electric cars with ∼150 km range are a technologically viable, cost competitive, high performance, high efficiency alternative that can presently suit the vast majority of consumers' needs.
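The lifecycle comparison reduces to purchase price plus cumulative fuel cost. Every number below is an assumed illustration, not a figure from the paper:

```python
# Lifecycle cost = purchase price + (distance driven x energy use x fuel price).
def lifecycle_cost_ev(purchase, km_per_year, years, kwh_per_km, price_kwh):
    return purchase + km_per_year * years * kwh_per_km * price_kwh

def lifecycle_cost_gas(purchase, km_per_year, years, l_per_km, price_l):
    return purchase + km_per_year * years * l_per_km * price_l

ev = lifecycle_cost_ev(purchase=30000, km_per_year=15000, years=10,
                       kwh_per_km=0.18, price_kwh=0.15)
gas = lifecycle_cost_gas(purchase=22000, km_per_year=15000, years=10,
                         l_per_km=0.08, price_l=1.50)
print(ev, gas)   # at this assumed fuel price, the EV's savings win out
```

Lowering the assumed gasoline price shrinks the gas car's fuel bill, which is exactly the scenario dependence the abstract reports: the electric car wins for higher gasoline prices and loses for lower ones.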
Challenges and opportunities of cloud computing for atmospheric sciences
Pérez Montes, Diego A.; Añel, Juan A.; Pena, Tomás F.; Wallom, David C. H.
2016-04-01
Cloud computing is an emerging technological solution widely used in many fields. Initially developed as a flexible way of managing peak demand, it has begun to make its way into scientific research. One of the greatest advantages of cloud computing for scientific research is independence from access to a large cyberinfrastructure to fund or perform a research project. Cloud computing can avoid maintenance expenses for large supercomputers and has the potential to 'democratize' access to high-performance computing, giving funding bodies flexibility in allocating budgets for the computational costs associated with a project. Two of the most challenging problems in atmospheric sciences are computational cost and uncertainty in meteorological forecasting and climate projections. The two problems are closely related: uncertainty can usually be reduced when computational resources are available to better reproduce a phenomenon or to perform a larger number of experiments. Here we present results of the application of cloud computing resources to climate modeling, using cloud computing infrastructures of three major vendors and two climate models. We show how the cloud infrastructure compares in performance to traditional supercomputers and how it provides the capability to complete experiments in shorter periods of time. The associated monetary cost is also analyzed. Finally, we discuss the future potential of this technology for meteorological and climatological applications, from the point of view of both operational use and research.
A Comparative Cost Analysis of Picture Archiving and ...
African Journals Online (AJOL)
Method: An incremental cost analysis for chest radiographs, computed tomography and magnetic resonance imaging brain scans, with and without contrast, was performed. The overall incremental cost for PACS in comparison with a conventional radiology site was determined. The net present value was also determined to ...
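Net present value, the decision metric named in the abstract, discounts future cash flows back to today. The cash flows and discount rate below are made-up illustration values:

```python
# NPV of a stream of cash flows; cashflows[0] is at time 0 (e.g. the
# investment as a negative amount), later entries are discounted.
def npv(rate, cashflows):
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Assumed example: up-front PACS investment of 500 (in some currency unit)
# followed by yearly savings of 150 over five years, discounted at 8%.
value = npv(0.08, [-500, 150, 150, 150, 150, 150])
print(round(value, 2))   # positive NPV favours the PACS investment
```

A positive NPV means the discounted savings exceed the investment; comparing NPVs is how a PACS site is judged against a conventional one.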
An opportunity cost model of subjective effort and task performance
Kurzban, Robert; Duckworth, Angela; Kable, Joseph W.; Myers, Justus
2013-01-01
Why does performing certain tasks cause the aversive experience of mental effort and concomitant deterioration in task performance? One explanation posits a physical resource that is depleted over time. We propose an alternate explanation that centers on mental representations of the costs and benefits associated with task performance. Specifically, certain computational mechanisms, especially those associated with executive function, can be deployed for only a limited number of simultaneous tasks at any given moment. Consequently, the deployment of these computational mechanisms carries an opportunity cost – that is, the next-best use to which these systems might be put. We argue that the phenomenology of effort can be understood as the felt output of these cost/benefit computations. In turn, the subjective experience of effort motivates reduced deployment of these computational mechanisms in the service of the present task. These opportunity cost representations, then, together with other cost/benefit calculations, determine effort expended and, everything else equal, result in performance reductions. In making our case for this position, we review alternate explanations both for the phenomenology of effort associated with these tasks and for performance reductions over time. Likewise, we review the broad range of relevant empirical results from across subdisciplines, especially psychology and neuroscience. We hope that our proposal will help to build links among the diverse fields that have been addressing similar questions from different perspectives, and we emphasize ways in which alternate models might be empirically distinguished. PMID:24304775