WorldWideScience

Sample records for higher computational cost

  1. Higher-order techniques in computational electromagnetics

    CERN Document Server

    Graglia, Roberto D

    2016-01-01

    Higher-Order Techniques in Computational Electromagnetics explains 'high-order' techniques that can significantly improve the accuracy and reliability, and reduce the computational cost, of computational techniques for high-frequency electromagnetics applications such as antennas, microwave devices and radar scattering.

  2. Cost Efficiency in Public Higher Education.

    Science.gov (United States)

    Robst, John

    This study used the frontier cost function framework to examine cost efficiency in public higher education. The frontier cost function estimates the minimum predicted cost for producing a given amount of output. Data from the annual Almanac issues of the "Chronicle of Higher Education" were used to calculate state level enrollments at two-year and…

  3. Collaborating to Cut Costs in Higher Education

    Science.gov (United States)

    Hassett, Tracy

    2017-01-01

    Tuition prices at colleges and universities are high. It is also true that salaries and benefits are the single biggest chunk of every higher education institution's (HEI) budget. And one of the largest and most difficult costs to contain is group employee health insurance. The situation is particularly difficult for smaller New England HEIs…

  4. Implementation of cloud computing in higher education

    Science.gov (United States)

    Asniar; Budiawan, R.

    2016-04-01

    Cloud computing research is a new trend in distributed computing, where people have developed service- and SOA (Service Oriented Architecture)-based applications. This technology is very useful to implement, especially for higher education. This research studies the need for and feasibility of cloud computing in higher education, and then proposes a model of cloud computing services for higher education in Indonesia that can be implemented in order to support academic activities. A literature study is used as the research methodology to arrive at the proposed model. Finally, SaaS and IaaS are the cloud computing services proposed for implementation in higher education in Indonesia, and a hybrid cloud is the recommended service model.

  5. Is higher nursing home quality more costly?

    Science.gov (United States)

    Giorgio, L Di; Filippini, M; Masiero, G

    2016-11-01

    Widespread issues regarding quality in nursing homes call for an improved understanding of the relationship with costs. This relationship may differ in European countries, where care is mainly delivered by nonprofit providers. In accordance with the economic theory of production, we estimate a total cost function for nursing home services using data from 45 nursing homes in Switzerland between 2006 and 2010. Quality is measured by means of clinical indicators regarding process and outcome derived from the minimum data set. We consider both composite and single quality indicators. Contrary to most previous studies, we use panel data and control for omitted variables bias. This allows us to capture features specific to nursing homes that may explain differences in structural quality or cost levels. Additional analysis is provided to address simultaneity bias using an instrumental variable approach. We find evidence that poor levels of quality regarding outcome, as measured by the prevalence of severe pain and weight loss, lead to higher costs. This may have important implications for the design of payment schemes for nursing homes.
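
    A minimal sketch of the within (fixed-effects) panel estimator that underlies this kind of total-cost regression, on synthetic data; the variable names, magnitudes, and the use of numpy are illustrative assumptions, not the study's actual specification:

        # Hypothetical fixed-effects (within) total-cost regression, in the
        # spirit of the panel approach described above. All data are synthetic.
        import numpy as np

        rng = np.random.default_rng(0)
        homes, years = 45, 5                      # 45 nursing homes, 2006-2010
        n = homes * years
        home_id = np.repeat(np.arange(homes), years)

        output = rng.normal(100, 20, n)           # e.g. resident-days (arbitrary units)
        quality = rng.uniform(0, 1, n)            # outcome indicator (1 = worst quality)
        home_effect = rng.normal(0, 5, homes)     # unobserved time-invariant heterogeneity
        cost = 50 + 2.0 * output + 8.0 * quality + home_effect[home_id] + rng.normal(0, 1, n)

        def demean_by(group, x):
            """Subtract each group's mean (the 'within' transformation)."""
            sums = np.zeros(group.max() + 1)
            np.add.at(sums, group, x)
            counts = np.bincount(group).astype(float)
            return x - (sums / counts)[group]

        # Demeaning sweeps out the home-specific effects, addressing the
        # omitted-variable bias the abstract mentions.
        X = np.column_stack([demean_by(home_id, output), demean_by(home_id, quality)])
        y = demean_by(home_id, cost)
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        print(f"output coefficient ~ {beta[0]:.2f}, quality coefficient ~ {beta[1]:.2f}")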

  6. Activity-Based Costing Systems for Higher Education.

    Science.gov (United States)

    Day, Dennis H.

    1993-01-01

    Examines traditional costing models utilized in higher education and pinpoints shortcomings related to proper identification of costs. Describes activity-based costing systems as a superior alternative for cost identification, measurement, and allocation. (MLF)

  7. Higher costs confirmed for US supercollider

    CERN Multimedia

    Vaughan, C

    1990-01-01

    US Secretary of Energy James Watkins told Congress that the SSC will cost at least one to two billion dollars more than previously estimated. He admitted that the final cost may be so high that the collider is not worth building (3 paragraphs).

  8. Cost and Price Increases in Higher Education: Evidence of a Cost Disease on Higher Education Costs and Tuition Prices and the Implications for Higher Education Policy

    Science.gov (United States)

    Trombella, Jerry

    2011-01-01

    As concern over rapidly rising college costs and tuition sticker prices have increased, a variety of research has been conducted to determine potential causes. Most of this research has focused on factors unique to higher education. In contrast, cost disease theory attempts to create a comparative context to explain cost increases in higher…

  9. A cost modelling system for cloud computing

    OpenAIRE

    Ajeh, Daniel; Ellman, Jeremy; Keogh, Shelagh

    2014-01-01

    An advance in technology unlocks new opportunities for organizations to increase their productivity, efficiency and process automation while reducing the cost of doing business as well. The emergence of cloud computing addresses these prospects through the provision of agile systems that are scalable, flexible and reliable as well as cost effective. Cloud computing has made hosting and deployment of computing resources cheaper and easier with no up-front charges but pay per-use flexible payme...

  10. Higher order correlations in computed particle distributions

    International Nuclear Information System (INIS)

    Hanerfeld, H.; Herrmannsfeldt, W.; Miller, R.H.

    1989-03-01

    The rms emittances calculated for beam distributions using computer simulations are frequently dominated by higher order aberrations. Thus there are substantial open areas in the phase space plots. It has long been observed that the rms emittance is not an invariant to beam manipulations. The usual emittance calculation removes the correlation between transverse displacement and transverse momentum. In this paper, we explore the possibility of defining higher order correlations that can be removed from the distribution to result in a lower limit to the realizable emittance. The intent is that by inserting the correct combinations of linear lenses at the proper position, the beam may recombine in a way that cancels the effects of some higher order forces. An example might be the non-linear transverse space charge forces which cause a beam to spread. If the beam is then refocused so that the same non-linear forces reverse the inward velocities, the resulting phase space distribution may reasonably approximate the original distribution. The approach to finding the location and strength of the proper lens to optimize the transported beam is based on work by Bruce Carlsten of Los Alamos National Laboratory. 11 refs., 4 figs

  11. Costing Principles in Higher Education and Their Application (First Revision).

    Science.gov (United States)

    Sterns, A. A.

    This document provides a rationale for applying known cost-accounting methodology within the realm of higher education and attempts to make the known techniques viable for sets of objectives within the university environment. The plan developed here is applied to a department, the lowest level in the university hierarchy, and demonstrates costs in…

  12. Incremental ALARA cost/benefit computer analysis

    International Nuclear Information System (INIS)

    Hamby, P.

    1987-01-01

    Commonwealth Edison Company has developed and is testing an enhanced Fortran computer program to be used for cost/benefit analysis of radiation reduction projects at its six nuclear power facilities and corporate technical support groups. This paper describes a macro-driven IBM mainframe program comprising two types of analyses: an abbreviated version with fixed costs and base values, and an extended engineering version for a detailed, more thorough and time-consuming approach. The extended engineering version breaks radiation exposure costs down into two components: health-related costs and replacement labor costs. According to user input, the program automatically adjusts these two cost components and applies the derivation to company economic analyses such as replacement power costs, carrying charges, debt interest, and capital investment cost. The results from one or more program runs using different parameters may be compared in order to determine the most appropriate ALARA dose reduction technique. Benefits of this particular cost/benefit analysis technique include flexibility to accommodate a wide range of user data and pre-job preparation, as well as the use of proven and standardized company economic equations
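
    The structure of such an incremental comparison can be sketched as follows; the dollar figures and the $/person-rem valuation are placeholder assumptions, not Commonwealth Edison's actual equations:

        # Hedged sketch of an incremental ALARA cost/benefit comparison with the two
        # cost components named above; all numbers are invented placeholders.
        from dataclasses import dataclass

        DOLLARS_PER_PERSON_REM = 1_000.0   # assumed monetary value of avoided dose

        @dataclass
        class DoseReductionOption:
            name: str
            implementation_cost: float      # $ to implement the technique
            dose_averted: float             # person-rem avoided over project life
            replacement_labor_cost: float   # $ of extra labor the technique requires

            def net_benefit(self) -> float:
                health_benefit = self.dose_averted * DOLLARS_PER_PERSON_REM
                return health_benefit - self.implementation_cost - self.replacement_labor_cost

        options = [
            DoseReductionOption("temporary shielding", 20_000, 35.0, 4_000),
            DoseReductionOption("remote tooling", 60_000, 50.0, 1_000),
        ]
        for o in options:
            print(f"{o.name}: net benefit ${o.net_benefit():,.0f}")
        best = max(options, key=DoseReductionOption.net_benefit)
        print(f"preferred option: {best.name}")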

  13. Towards higher reliability of CMS computing facilities

    International Nuclear Information System (INIS)

    Bagliesi, G; Bloom, K; Brew, C; Flix, J; Kreuzer, P; Sciabà, A

    2012-01-01

    The CMS experiment has adopted a computing system where resources are distributed worldwide in more than 50 sites. The operation of the system requires a stable and reliable behaviour of the underlying infrastructure. CMS has established procedures to extensively test all relevant aspects of a site and its capability to sustain the various CMS computing workflows at the required scale. The Site Readiness monitoring infrastructure has been instrumental in understanding how the system as a whole was improving towards LHC operations, measuring the reliability of sites when running CMS activities, and providing sites with the information they need to troubleshoot any problem. This contribution reviews the complete automation of the Site Readiness program, with the description of monitoring tools and their inclusion into the Site Status Board (SSB), the performance checks, the use of tools like HammerCloud, and the impact in improving the overall reliability of the Grid from the point of view of the CMS computing system. These results are used by CMS to select good sites to conduct workflows, in order to maximize workflow efficiency. Site performance against these tests during the first years of LHC running is also reviewed.

  14. Computer Software for Life Cycle Cost.

    Science.gov (United States)

    1987-04-01

    Air Command and Staff College student report. "…obsolete), physical life (utility before physically wearing out), or application life (utility in a given function)." (7:5) The costs are usually…

  15. The Hidden Cost of Buying a Computer.

    Science.gov (United States)

    Johnson, Michael

    1983-01-01

    In order to process data in a computer, application software must be either developed or purchased. Costs for modifications of the software package and maintenance are often hidden. The decision to buy or develop software packages should be based upon factors of time and maintenance. (MLF)

  16. Higher-Order and Symbolic Computation

    DEFF Research Database (Denmark)

    Danvy, Olivier; Mason, Ian

    2008-01-01

    a series of implementations that properly account for multiple invocations of the derivative-taking operator. In "Adapting Functional Programs to Higher-Order Logic," Scott Owens and Konrad Slind present a variety of examples of termination proofs of functional programs written in HOL proof systems. Since......-calculus programs, historically. The analysis determines the possible locations of ambients and mirrors the temporal sequencing of actions in the structure of types....

  17. Acute costs and predictors of higher treatment costs of trauma in New South Wales, Australia.

    Science.gov (United States)

    Curtis, Kate; Lam, Mary; Mitchell, Rebecca; Black, Deborah; Taylor, Colman; Dickson, Cara; Jan, Stephen; Palmer, Cameron S; Langcake, Mary; Myburgh, John

    2014-01-01

    Accurate economic data are fundamental for improving current funding models and ultimately in promoting the efficient delivery of services. The financial burden of a high trauma casemix to designated trauma centres in Australia has not been previously determined, and there is some evidence that the episode funding model used in Australia results in the underfunding of trauma. To describe the costs of acute trauma admissions in trauma centres, identify predictors of higher treatment costs and cost variance in New South Wales (NSW), Australia. Data linkage of admitted trauma patient and financial data provided by 12 Level 1 NSW trauma centres for the 08/09 financial year was performed. Demographic and injury details and injury scores were obtained from trauma registries. Individual patient general ledger costs (actual trauma patient costs), Australian Refined Diagnostic Related Groups (AR-DRG) and state-wide average costs (which form the basis of funding) were obtained. The actual costs incurred by the hospital were then compared with the state-wide AR-DRG average costs. Multivariable multiple linear regression was used for identifying predictors of costs. There were 17,522 patients; the average per-patient cost was $10,603 and the median was $4,628 (interquartile range: $2,179-10,148). The actual costs incurred by trauma centres were on average $134 per bed day above the AR-DRG-determined costs. Falls, road trauma and violence were the highest causes of total cost. Motor cyclists and pedestrians had higher median costs than motor vehicle occupants. As a result of greater numbers, patients with minor injury had comparable total costs with those generated by patients with severe injury. However, the median cost of severely injured patients was nearly four times greater. The count of body regions injured, sex, length of stay, serious traumatic brain injury and admission to the Intensive Care Unit were significantly associated with increased costs (p<0.001). This…

  18. Cost/Benefit Analysis of Leasing Versus Purchasing Computers

    National Research Council Canada - National Science Library

    Arceneaux, Alan

    1997-01-01

    .... In constructing this model, several factors were considered, including: The purchase cost of computer equipment, annual lease payments, depreciation costs, the opportunity cost of purchasing, tax revenue implications and various leasing terms...

  19. Low cost highly available digital control computer

    International Nuclear Information System (INIS)

    Silvers, M.W.

    1986-01-01

    When designing digital controllers for critical plant control it is important to provide several features. Among these are reliability, availability, maintainability, environmental protection, and low cost. An examination of several applications has led to a design that can be produced for approximately $20,000 (1000 control points). This design is compatible with modern concepts in distributed and hierarchical control. The canonical controller element is a dual-redundant self-checking computer that communicates with a cross-strapped, electrically isolated input/output system. The input/output subsystem comprises multiple intelligent input/output cards. These cards accept commands from the primary processor which are validated, executed, and acknowledged. Each card may be hot-replaced to facilitate sparing. The implementation of the dual-redundant computer architecture is discussed. Called the FS-86, this computer can be used for a variety of applications. It has most recently found application in the upgrade of San Francisco's Bay Area Rapid Transit (BART) train control currently in progress and has been proposed for feedwater control in a boiling water reactor

  20. Computer-Supported Collaborative Learning in Higher Education

    Science.gov (United States)

    Roberts, Tim, Ed.

    2005-01-01

    "Computer-Supported Collaborative Learning in Higher Education" provides a resource for researchers and practitioners in the area of computer-supported collaborative learning (also known as CSCL); particularly those working within a tertiary education environment. It includes articles of relevance to those interested in both theory and practice in…

  1. Computer-aided voice training in higher education: participants ...

    African Journals Online (AJOL)

    The training of performance singing in a multilingual, multicultural educational context presents unique problems and requires inventive teaching strategies. Computer-aided training offers objective visual feedback of voice production that can be implemented as a teaching aid in higher education. This article reports on ...

  2. Cloud Computing and Information Technology Resource Cost Management for SMEs

    DEFF Research Database (Denmark)

    Kuada, Eric; Adanu, Kwame; Olesen, Henning

    2013-01-01

    This paper analyzes the decision-making problem confronting SMEs considering the adoption of cloud computing as an alternative to in-house computing services provision. The economics of choosing between in-house computing and a cloud alternative is analyzed by comparing the total economic costs...... in determining the relative value of cloud computing....

  3. Special issue of Higher-Order and Symbolic Computation

    DEFF Research Database (Denmark)

    solicited from papers presented at ASIAPEPM 02, the 2002 SIGPLAN Symposium on Partial Evaluation and Semantics-Based Program Manipulation [1]. The four articles were subjected to the usual process of journal reviewing. "Cost-Augmented Partial Evaluation of Functional Logic Programs" extends previous......The present issue is dedicated to Partial Evaluation and Semantics-Based Program Manipulation. Its first two articles were solicited from papers presented at PEPM 02, the 2002 ACM SIGPLAN Workshop on Partial Evaluation and Semantics-Based Program Manipulation [2], and its last two articles were...... narrowing-driven techniques of partial evaluation for functional-logic programs by the inclusion of abstract computation costs into the partial-evaluation process. A preliminary version of this work was presented at PEPM 02. "Specialization Scenarios: A Pragmatic Approach to Declaring Program Specialization...

  4. Endoscopic third ventriculostomy has no higher costs than ventriculoperitoneal shunt

    Directory of Open Access Journals (Sweden)

    Benicio Oton de Lima

    2014-07-01

    Objective: To evaluate the cost of endoscopic third ventriculostomy (ETV) compared to ventriculoperitoneal shunt (VPS) in the treatment of hydrocephalus in children. Method: We studied 103 children with hydrocephalus, 52 of which were treated with ETV and 51 with VPS in a prospective cohort. Treatment costs were compared within the first year after surgery, including subsequent surgery or hospitalization. Results: Twenty (38.4%) of the 52 children treated with VPS needed another procedure due to shunt failure, compared to 11 (21.5%) of 51 children in the ETV group. The average cost per patient in the group treated with ETV was USD$ 2,177.66±517.73 compared to USD$ 2,890.68±2,835.02 for the VPS group. Conclusions: In this series there was no significant difference in costs between the ETV and VPS groups.

  5. The Ability of implementing Cloud Computing in Higher Education - KRG

    Directory of Open Access Journals (Sweden)

    Zanyar Ali Ahmed

    2017-06-01

    Cloud computing (CC) is a new technology: an online service that can store and retrieve information without requiring physical access to the files on hard drives. The information is available on a server where it can be accessed by clients when needed. Universities of the Kurdistan Regional Government (KRG), which lack ICT infrastructure, can use this new technology because of its economic advantages, enhanced data management, better maintenance, high performance, and improved availability and accessibility, thereby achieving easy maintenance of organizational institutes. The aim of this research is to assess the ability and possibility of implementing cloud computing in higher education in the KRG. This research will help the universities to start establishing cloud computing in their services. A survey was conducted to evaluate the CC services that KRG universities have applied. The results showed that most KRG universities are using SaaS. MHE-KRG universities and institutions confront many challenges and concerns in terms of security, user privacy, lack of integration with current systems, and data and document ownership.

  6. The Cost-Accounting Mechanism in Higher Educational Institutions.

    Science.gov (United States)

    Lukoshkin, A. P.; Min'ko, E. V.

    1990-01-01

    Examines the need to increase expenditures per student at Soviet technical institutes. Proposes seeking financial assistance from enterprises employing technical specialists. Outlines an experimental program in cost accounting. Suggests stipend and wage allotments and explains some of the contractual obligations involved. (CH)

  7. Higher threat avoidance costs reduce avoidance behaviour which in turn promotes fear extinction in humans.

    Science.gov (United States)

    Rattel, Julina A; Miedl, Stephan F; Blechert, Jens; Wilhelm, Frank H

    2017-09-01

    Theoretical models specifying the underlying mechanisms of the development and maintenance of anxiety and related disorders state that fear responses acquired through classical Pavlovian conditioning are maintained by repeated avoidance behaviour; thus, it is assumed that avoidance prevents fear extinction. The present study investigated behavioural avoidance decisions as a function of avoidance costs in a naturalistic fear conditioning paradigm. Ecologically valid avoidance costs - manipulated between participant groups - were represented via time delays during a detour in a gamified computer task. After differential acquisition of shock expectancy to a predictive conditioned stimulus (CS+), participants underwent extinction, where they could either take a risky shortcut while anticipating shock signaled by the CS+, or choose a costly avoidance option (a lengthy detour); thus, they were faced with an approach-avoidance conflict. Groups with higher avoidance costs (longer detours) showed lower proportions of avoiders. Avoiders gave heightened shock-expectancy ratings post-extinction, demonstrating 'protection from extinction', i.e. failure to extinguish. Moreover, there was an indirect effect of avoidance costs on protection from extinction through avoidance behaviour. No moderating role of trait anxiety was found. Theoretical implications of avoidance behaviour are discussed, considering the involvement of instrumental learning in the maintenance of fear responses. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. The Real University Cost in a ''Free'' Higher Education Country

    Science.gov (United States)

    Psacharopoulos, G.; Papakonstantinou, G.

    2005-01-01

    Using a sample of over 3000 first year university entrants in Greece, we investigate the time and expense incurred in preparation for the highly competitive higher education entry examinations, as well as what students spend privately while attending university. It is shown that in a constitutionally ''free for all'' higher education country,…

  9. Can Online Learning Bend the Higher Education Cost Curve?

    OpenAIRE

    David J. Deming; Claudia Goldin; Lawrence F. Katz; Noam Yuchtman

    2015-01-01

    We examine whether online learning technologies have led to lower prices in higher education. Using data from the Integrated Postsecondary Education Data System, we show that online education is concentrated in large for-profit chains and less-selective public institutions. We find that colleges with a higher share of online students charge lower tuition prices. We present evidence of declining real and relative prices for full-time undergraduate online education from 2006 to 2013. Although t...

  10. Computer tomography: a cost-saving examination?

    International Nuclear Information System (INIS)

    Barneveld Binkhuysen, F.H.; Puijlaert, C.B.A.J.

    1987-01-01

    The research concerns the influence of the body computer tomograph (BCT) on efficiency in radiology and in the hospital as a whole in The Netherlands. Hospitals with CT are compared with hospitals without CT. In radiology, the substitution effect is investigated, using the number of radiological procedures per clinical patient as a parameter. This parameter proves to decrease in hospitals with a CT, in contrast to hospitals without a CT. The often-expressed opinion that the CT should specifically perform complementary examinations appears incorrect. As to efficiency in the hospital, this is related to the average in-patient stay, which proves to be shorter in hospitals with a CT than in those without. The CT has turned out to be a very effective expedient which, however, is being used inefficiently in The Netherlands owing to limited installation. 17 refs.; 6 figs.; 5 tabs

  11. Internationalization of Higher Education: Potential Benefits and Costs

    Science.gov (United States)

    Jibeen, Tahira; Khan, Masha Asad

    2015-01-01

    Internationalization of higher education is the top stage of international relations among universities and it is no longer regarded as a goal in itself, but as a means to improve the quality of education. The knowledge translation and acquisition, mobilization of talent in support of global research and enhancement of the curriculum with…

  12. Cloud Computing-An Ultimate Technique to Minimize Computing cost for Developing Countries

    OpenAIRE

    Narendra Kumar; Shikha Jain

    2012-01-01

    The presented paper deals with how remotely managed computing and IT resources can be beneficial in developing countries like India and other Asian sub-continent countries. This paper not only defines the architectures and functionalities of cloud computing but also strongly indicates the current demand for cloud computing to achieve organizational and personal IT support at very minimal cost with high flexibility. The power of the cloud can be used to reduce the cost of IT - r...

  13. Cloud Computing in Higher Education Sector for Sustainable Development

    Science.gov (United States)

    Duan, Yuchao

    2016-01-01

    Cloud computing is considered a new frontier in the field of computing, as this technology comprises three major entities namely: software, hardware and network. The collective nature of all these entities is known as the Cloud. This research aims to examine the impacts of various aspects namely: cloud computing, sustainability, performance…

  14. Software Requirements for a System to Compute Mean Failure Cost

    Energy Technology Data Exchange (ETDEWEB)

    Aissa, Anis Ben [University of Tunis, Belvedere, Tunisia; Abercrombie, Robert K [ORNL; Sheldon, Frederick T [ORNL; Mili, Ali [New Jersey Insitute of Technology

    2010-01-01

    In earlier works, we presented a computational infrastructure that allows an analyst to estimate the security of a system in terms of the loss that each stakeholder stands to incur as a result of security breakdowns. We also demonstrated this infrastructure through the results of security breakdowns for the e-commerce case. In this paper, we illustrate this infrastructure by an application that supports the computation of the Mean Failure Cost (MFC) for each stakeholder.

  15. Cost-effectiveness analysis of computer-based assessment

    Directory of Open Access Journals (Sweden)

    Pauline Loewenberger

    2003-12-01

    The need for more cost-effective and pedagogically acceptable combinations of teaching and learning methods to sustain increasing student numbers means that the use of innovative methods, using technology, is accelerating. There is an expectation that economies of scale might provide greater cost-effectiveness whilst also enhancing student learning. The difficulties and complexities of these expectations are considered in this paper, which explores the challenges faced by those wishing to evaluate the cost-effectiveness of computer-based assessment (CBA). The paper outlines the outcomes of a survey which attempted to gather information about the costs and benefits of CBA.

  16. Development of computer program for estimating decommissioning cost - 59037

    International Nuclear Information System (INIS)

    Kim, Hak-Soo; Park, Jong-Kil

    2012-01-01

    Programs for estimating decommissioning cost have been developed for many different purposes and applications. Estimating decommissioning cost requires a large amount of data, such as unit cost factors, plant area and its inventory, waste treatment, etc. This makes it difficult to use manual calculation or typical spreadsheet software such as Microsoft Excel. Cost estimation for the eventual decommissioning of nuclear power plants is a prerequisite for safe, timely and cost-effective decommissioning. To estimate decommissioning cost more accurately and systematically, KHNP (Korea Hydro and Nuclear Power Co. Ltd) developed a decommissioning cost estimating computer program called 'DeCAT-Pro', for Decommissioning Cost Assessment Tool - Professional (hereinafter called 'DeCAT'). This program allows users to easily assess the decommissioning cost with various decommissioning options. It also provides detailed reporting for decommissioning funding requirements as well as detailed project schedules, cash flow, staffing plans and levels, and waste volumes by waste classification and type. KHNP is planning to implement functions for estimating the plant inventory using 3-D technology and for classifying the conditions of radwaste disposal and transportation automatically. (authors)

  17. An approximate fractional Gaussian noise model with computational cost

    KAUST Repository

    Sørbye, Sigrunn H.; Myrvoll-Nilsen, Eirik; Rue, Haavard

    2017-01-01

    Fractional Gaussian noise (fGn) is a stationary time series model with long memory properties applied in various fields like econometrics, hydrology and climatology. The computational cost in fitting an fGn model of length $n$ using a likelihood

  18. Cost-effectiveness of PET and PET/Computed Tomography

    DEFF Research Database (Denmark)

    Gerke, Oke; Hermansson, Ronnie; Hess, Søren

    2015-01-01

    measure by means of incremental cost-effectiveness ratios when considering the replacement of the standard regimen by a new diagnostic procedure. This article discusses economic assessments of PET and PET/computed tomography reported until mid-July 2014. Forty-seven studies on cancer and noncancer...

  19. A survey of cost accounting in service-oriented computing

    NARCIS (Netherlands)

    de Medeiros, Robson W.A.; Rosa, Nelson S.; Campos, Glaucia M.M.; Ferreira Pires, Luis

    Nowadays, companies are increasingly offering their business services through computational services on the Internet in order to attract more customers and increase their revenues. However, these services have financial costs that need to be managed in order to maximize profit. Several models and

  20. Low cost spacecraft computers: Oxymoron or future trend?

    Science.gov (United States)

    Manning, Robert M.

    1993-01-01

    Over the last few decades, application of current terrestrial computer technology in embedded spacecraft control systems has been expensive and fraught with many technical challenges. These challenges have centered on overcoming the extreme environmental constraints (protons, neutrons, gamma radiation, cosmic rays, temperature, vibration, etc.) that often preclude direct use of commercial off-the-shelf computer technology. Reliability, fault tolerance and power have also greatly constrained the selection of spacecraft control system computers. More recently, new constraints are being felt, cost and mass in particular, that have again narrowed the degrees of freedom spacecraft designers once enjoyed. This paper discusses these challenges, how they were previously overcome, how future trends in commercial computer technology will simplify (or hinder) selection of computer technology for spacecraft control applications, and what spacecraft electronic system designers can do now to circumvent them.

  1. Cost-Cutting in Higher Education: Lessons Learned from Collaboration, Technology, and Outsourcing Initiatives. Draft.

    Science.gov (United States)

    Kaganoff, Tessa

    This document presents a review of cost-containment initiatives relevant to higher education institutions. Originally commissioned to examine cost containment initiatives carried out by institutions affiliated with the Foundation for Independent Higher Education (FIHE), the paper was expanded to include a sector-wide review of three types of…

  2. Cost-Benefit Analysis of Computer Resources for Machine Learning

    Science.gov (United States)

    Champion, Richard A.

    2007-01-01

    Machine learning describes pattern-recognition algorithms - in this case, probabilistic neural networks (PNNs). These can be computationally intensive, in part because of the nonlinear optimizer, a numerical process that calibrates the PNN by minimizing a sum of squared errors. This report suggests efficiencies that are expressed as cost and benefit. The cost is computer time needed to calibrate the PNN, and the benefit is goodness-of-fit, how well the PNN learns the pattern in the data. There may be a point of diminishing returns where a further expenditure of computer resources does not produce additional benefits. Sampling is suggested as a cost-reduction strategy. One consideration is how many points to select for calibration and another is the geometric distribution of the points. The data points may be nonuniformly distributed across space, so that sampling at some locations provides additional benefit while sampling at other locations does not. A stratified sampling strategy can be designed to select more points in regions where they reduce the calibration error and fewer points in regions where they do not. Goodness-of-fit tests ensure that the sampling does not introduce bias. This approach is illustrated by statistical experiments for computing correlations between measures of roadless area and population density for the San Francisco Bay Area. The alternative to training efficiencies is to rely on high-performance computer systems. These may require specialized programming and algorithms that are optimized for parallel performance.
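
    A toy version of this cost/benefit trade-off, using a Parzen-window (Nadaraya-Watson) regressor as a stand-in for the report's PNN; the data, kernel width, and sample sizes below are invented for illustration:

        # Calibration time (cost) grows with sample size while goodness-of-fit
        # (benefit) shows diminishing returns. Synthetic data, not the report's PNN.
        import time
        import numpy as np

        rng = np.random.default_rng(2)
        X = rng.uniform(-3, 3, (20_000, 1))
        y = np.sin(X[:, 0]) + rng.normal(0, 0.1, len(X))

        def kernel_fit_predict(Xtr, ytr, Xte, sigma=0.3):
            """Parzen-window regression, the kernel idea behind PNNs."""
            d2 = (Xte - Xtr.T) ** 2                 # (n_test, n_train) squared distances
            w = np.exp(-d2 / (2 * sigma**2))
            return (w @ ytr) / w.sum(axis=1)

        Xte = rng.uniform(-3, 3, (1_000, 1))
        yte = np.sin(Xte[:, 0])
        for n in (100, 1_000, 10_000):              # increasing calibration samples
            t0 = time.perf_counter()
            pred = kernel_fit_predict(X[:n], y[:n], Xte)
            dt = time.perf_counter() - t0
            rmse = np.sqrt(np.mean((pred - yte) ** 2))
            print(f"n={n:>6}: {dt*1e3:7.1f} ms, RMSE {rmse:.4f}")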

  3. Higher cost of implementing Xpert(®) MTB/RIF in Ugandan peripheral settings: implications for cost-effectiveness.

    Science.gov (United States)

    Hsiang, E; Little, K M; Haguma, P; Hanrahan, C F; Katamba, A; Cattamanchi, A; Davis, J L; Vassall, A; Dowdy, D

    2016-09-01

    Initial cost-effectiveness evaluations of Xpert(®) MTB/RIF for tuberculosis (TB) diagnosis have not fully accounted for the realities of implementation in peripheral settings. To evaluate costs and diagnostic outcomes of Xpert testing implemented at various health care levels in Uganda. We collected empirical cost data from five health centers utilizing Xpert for TB diagnosis, using an ingredients approach. We reviewed laboratory and patient records to assess outcomes at these sites and 10 sites without Xpert. We also estimated incremental cost-effectiveness of Xpert testing; our primary outcome was the incremental cost of Xpert testing per newly detected TB case. The mean unit cost of an Xpert test was US$21 based on a mean monthly volume of 54 tests per site, although unit cost varied widely (US$16-58) and was primarily determined by testing volume. Total diagnostic costs were 2.4-fold higher in Xpert clinics than in non-Xpert clinics; however, Xpert only increased diagnoses by 12%. The diagnostic costs of Xpert averaged US$119 per newly detected TB case, but were as high as US$885 at the center with the lowest volume of tests. Xpert testing can detect TB cases at reasonable cost, but may double diagnostic budgets for relatively small gains, with cost-effectiveness deteriorating with lower testing volumes.
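
    The volume effect on unit cost can be sketched with a two-parameter model; both cost figures below are invented placeholders, not the study's actual cost ingredients:

        # Back-of-envelope model of why the Xpert unit cost falls with testing
        # volume: fixed site costs are spread over more tests.
        FIXED_MONTHLY_COST = 700.0   # $: assumed equipment amortization, maintenance, staff
        COST_PER_CARTRIDGE = 10.0    # $: assumed consumables per test

        def unit_cost(tests_per_month: int) -> float:
            return FIXED_MONTHLY_COST / tests_per_month + COST_PER_CARTRIDGE

        for volume in (12, 54, 120):
            print(f"{volume:>3} tests/month -> ${unit_cost(volume):.2f} per test")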

  4. Scripting intercultural computer-supported collaborative learning in higher education

    NARCIS (Netherlands)

    Popov, V.

    2013-01-01

    Introduction of computer-supported collaborative learning (CSCL), specifically in an intercultural learning environment, creates both challenges and benefits. Among the challenges are the coordination of different attitudes, styles of communication, and patterns of behaving. Among the benefits are

  5. Granular computing and intelligent systems design with information granules of higher order and higher type

    CERN Document Server

    Pedrycz, Witold; Chen, Shyi-Ming

    2011-01-01

    Information granules are conceptual entities that aid the perception of complex phenomena. This book looks at granular computing techniques such as algorithmic pursuits and includes diverse applications and case studies from fields such as power engineering.

  6. Special issue of Higher-Order and Symbolic Computation

    DEFF Research Database (Denmark)

    Danvy, Olivier

    , they should have a large range of applicability for a large class of specifications or programs. Only general ideas could become the basis for an automatic system for program development. Bob’s APTS system is indeed the incarnation of most of the techniques he proposed (cf. Leonard and Heitmeyer...... specification, expressed in SCR notation, into C. Two translation strategies are discussed in the paper. Both were implemented using Bob Paige’s APTS program-transformation system. “Computational Divided Differencing and Divided-Difference Arithmetics” uses an approach conceptually similar to the Computational...

  7. Special issue of Higher-Order and Symbolic Computation

    DEFF Research Database (Denmark)

    Danvy, Olivier; Sabry, Amr

    This issue of HOSC is dedicated to the general topic of continuations. It grew out of the third ACM SIGPLAN Workshop on Continuations (CW'01), which took place in London, UK on January 16, 2001 [3]. The notion of continuation is ubiquitous in many different areas of computer science, including...... and streamline Filinski's earlier work in the previous special issue of HOSC (then LISP and Symbolic Computation) that grew out of the first ACM SIGPLAN Workshop on Continuations [1, 2]. Hasegawa and Kakutani's article is the journal version of an article presented at FOSSACS 2001 that received the EATCS...

  8. Exploring Issues about Computational Thinking in Higher Education

    Science.gov (United States)

    Czerkawski, Betul C.; Lyman, Eugene W., III

    2015-01-01

    The term computational thinking (CT) has been in academic discourse for decades, but gained new currency in 2006, when Jeannette Wing used it to describe a set of thinking skills that students in all fields may require in order to succeed. Wing's initial article and subsequent writings on CT have been broadly influential; experts in…

  9. User manual for PACTOLUS: a code for computing power costs

    International Nuclear Information System (INIS)

    Huber, H.D.; Bloomster, C.H.

    1979-02-01

    PACTOLUS is a computer code for calculating the cost of generating electricity. Through appropriate definition of the input data, PACTOLUS can calculate the cost of generating electricity from a wide variety of power plants, including nuclear, fossil, geothermal, solar, and other types of advanced energy systems. The purpose of PACTOLUS is to develop cash flows and calculate the unit busbar power cost (mills/kWh) over the entire life of a power plant. The cash flow information is calculated by two principal models: the Fuel Model and the Discounted Cash Flow Model. The Fuel Model is an engineering cost model which calculates the cash flow for the fuel cycle costs over the project lifetime based on input data defining the fuel material requirements, the unit costs of fuel materials and processes, the process lead and lag times, and the schedule of the capacity factor for the plant. For nuclear plants, the Fuel Model calculates the cash flow for the entire nuclear fuel cycle. For fossil plants, the Fuel Model calculates the cash flow for the fossil fuel purchases. The Discounted Cash Flow Model combines the fuel costs generated by the Fuel Model with input data on the capital costs, capital structure, licensing time, construction time, rates of return on capital, tax rates, operating costs, and depreciation method of the plant to calculate the cash flow for the entire lifetime of the project. The financial and tax structure for both investor-owned utilities and municipal utilities can be simulated through varying the rates of return on equity and debt, the debt-equity ratios, and tax rates. The Discounted Cash Flow Model uses the principle that the present worth of the revenues will be equal to the present worth of the expenses including the return on investment over the economic life of the project. This manual explains how to prepare the input data, execute cases, and interpret the output results with the updated version of PACTOLUS. 11 figures, 2 tables
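
    The discounted-cash-flow principle stated above (present worth of revenues equals present worth of expenses) reduces to a levelized busbar cost; the sketch below uses invented example inputs, not PACTOLUS's actual input format or equations:

        # Find the constant busbar price (mills/kWh) whose discounted revenues
        # equal the discounted expenses over plant life. Illustrative inputs only.
        capital_cost = 2.0e9            # $ spent at t=0
        annual_om = 5.0e7               # $/yr operations and maintenance
        annual_fuel = 8.0e7             # $/yr fuel-cycle cash flow (the Fuel Model's role)
        annual_energy_kwh = 7.0e9       # kWh/yr at the assumed capacity factor
        life, rate = 30, 0.08           # years of operation, discount rate

        pv = lambda flow, t: flow / (1 + rate) ** t
        pv_costs = capital_cost + sum(pv(annual_om + annual_fuel, t) for t in range(1, life + 1))
        pv_energy = sum(pv(annual_energy_kwh, t) for t in range(1, life + 1))

        # Setting PV(price * energy) = PV(costs) gives price = PV(costs) / PV(energy).
        levelized_mills_per_kwh = pv_costs / pv_energy * 1000.0   # 1 $ = 1000 mills
        print(f"levelized busbar cost ~ {levelized_mills_per_kwh:.1f} mills/kWh")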

  10. User manual for PACTOLUS: a code for computing power costs.

    Energy Technology Data Exchange (ETDEWEB)

    Huber, H.D.; Bloomster, C.H.

    1979-02-01

    PACTOLUS is a computer code for calculating the cost of generating electricity. Through appropriate definition of the input data, PACTOLUS can calculate the cost of generating electricity from a wide variety of power plants, including nuclear, fossil, geothermal, solar, and other types of advanced energy systems. The purpose of PACTOLUS is to develop cash flows and calculate the unit busbar power cost (mills/kWh) over the entire life of a power plant. The cash flow information is calculated by two principal models: the Fuel Model and the Discounted Cash Flow Model. The Fuel Model is an engineering cost model which calculates the cash flow for the fuel cycle costs over the project lifetime based on input data defining the fuel material requirements, the unit costs of fuel materials and processes, the process lead and lag times, and the schedule of the capacity factor for the plant. For nuclear plants, the Fuel Model calculates the cash flow for the entire nuclear fuel cycle. For fossil plants, the Fuel Model calculates the cash flow for the fossil fuel purchases. The Discounted Cash Flow Model combines the fuel costs generated by the Fuel Model with input data on the capital costs, capital structure, licensing time, construction time, rates of return on capital, tax rates, operating costs, and depreciation method of the plant to calculate the cash flow for the entire lifetime of the project. The financial and tax structure for both investor-owned utilities and municipal utilities can be simulated through varying the rates of return on equity and debt, the debt-equity ratios, and tax rates. The Discounted Cash Flow Model uses the principle that the present worth of the revenues will be equal to the present worth of the expenses including the return on investment over the economic life of the project. This manual explains how to prepare the input data, execute cases, and interpret the output results. (RWR)

  11. Bully University? The Cost of Workplace Bullying and Employee Disengagement in American Higher Education

    OpenAIRE

    Leah P. Hollis

    2015-01-01

    Workplace bullying has a detrimental effect on employees, yet few studies have examined its impact on personnel in American higher education administration. Therefore, two central research questions guided this study: (a) What is the extent of workplace bullying in higher education administration? and (b) What is the cost of workplace bullying specifically to higher education administration? Participants from 175 four-...

  12. Computational identification of candidate nucleotide cyclases in higher plants

    KAUST Repository

    Wong, Aloysius Tze; Gehring, Christoph A

    2013-01-01

    In higher plants guanylyl cyclases (GCs) and adenylyl cyclases (ACs) cannot be identified using BLAST homology searches based on annotated cyclic nucleotide cyclases (CNCs) of prokaryotes, lower eukaryotes, or animals. The reason is that CNCs

  13. Selectively Fortifying Reconfigurable Computing Device to Achieve Higher Error Resilience

    Directory of Open Access Journals (Sweden)

    Mingjie Lin

    2012-01-01

    With the advent of 10 nm CMOS devices and “exotic” nanodevices, the location and occurrence time of hardware defects and design faults become increasingly unpredictable, therefore posing severe challenges to existing techniques for error-resilient computing, because most of them statically assign hardware redundancy and do not account for the error tolerance inherently existing in many mission-critical applications. This work proposes a novel approach to selectively fortifying a target reconfigurable computing device in order to achieve hardware-efficient error resilience for a specific target application. We intend to demonstrate that such error resilience can be significantly improved with effective hardware support. The major contributions of this work include (1) the development of a complete methodology to perform sensitivity and criticality analysis of hardware redundancy, (2) a novel problem formulation and an efficient heuristic methodology to selectively allocate hardware redundancy among a target design’s key components in order to maximize its overall error resilience, and (3) an academic prototype of an SFC computing device that illustrates a 4x improvement of error resilience for an H.264 encoder implemented with an FPGA device.

  14. An approximate fractional Gaussian noise model with computational cost

    KAUST Repository

    Sørbye, Sigrunn H.

    2017-09-18

    Fractional Gaussian noise (fGn) is a stationary time series model with long memory properties applied in various fields like econometrics, hydrology and climatology. The computational cost in fitting an fGn model of length $n$ using a likelihood-based approach is $\mathcal{O}(n^2)$, exploiting the Toeplitz structure of the covariance matrix. In most realistic cases, we do not observe the fGn process directly but only through indirect Gaussian observations, so the Toeplitz structure is easily lost and the computational cost increases to $\mathcal{O}(n^3)$. This paper presents an approximate fGn model of $\mathcal{O}(n)$ computational cost, both with direct or indirect Gaussian observations, with or without conditioning. This is achieved by approximating fGn with a weighted sum of independent first-order autoregressive processes, fitting the parameters of the approximation to match the autocorrelation function of the fGn model. The resulting approximation is stationary despite being Markov and gives a remarkably accurate fit using only four components. The performance of the approximate fGn model is demonstrated in simulations and two real data examples.
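
    The approximation idea can be sketched by fitting weights of a few AR(1) autocorrelations to the fGn autocorrelation function; the chosen AR coefficients and the plain least-squares fit below are illustrative simplifications, not the paper's exact fitting procedure:

        # Match the fGn autocorrelation rho(k) = 0.5*(|k+1|^2H - 2|k|^2H + |k-1|^2H)
        # with a weighted sum of AR(1) autocorrelations phi^|k|.
        import numpy as np

        def fgn_acf(k, H):
            k = np.abs(k).astype(float)
            return 0.5 * ((k + 1) ** (2 * H) - 2 * k ** (2 * H) + np.abs(k - 1) ** (2 * H))

        H, lags = 0.8, np.arange(0, 200)
        target = fgn_acf(lags, H)

        phis = np.array([0.30, 0.74, 0.95, 0.995])       # assumed AR(1) coefficients
        A = phis[None, :] ** lags[:, None]               # acf of each AR(1) component
        w, *_ = np.linalg.lstsq(A, target, rcond=None)   # nonnegativity not enforced here

        approx = A @ w
        print("weights:", np.round(w, 3))
        print("max abs acf error:", np.abs(approx - target).max())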

  15. Technology and the Broken Higher Education Cost Model: Insights from the Delta Cost Project

    Science.gov (United States)

    Kirshstein, Rita; Wellman, Jane

    2012-01-01

    Although U.S. higher education has faced numerous crises and dilemmas in its history, the situation in which colleges and universities find themselves at the moment is indeed different. Shrinking public subsidies coupled with historic rises in tuitions come at the same time that colleges and universities have been tasked to dramatically increase…

  16. Computational identification of candidate nucleotide cyclases in higher plants

    KAUST Repository

    Wong, Aloysius Tze

    2013-09-03

    In higher plants guanylyl cyclases (GCs) and adenylyl cyclases (ACs) cannot be identified using BLAST homology searches based on annotated cyclic nucleotide cyclases (CNCs) of prokaryotes, lower eukaryotes, or animals. The reason is that CNCs are often part of complex multifunctional proteins with different domain organizations and biological functions that are not conserved in higher plants. For this reason, we have developed CNC search strategies based on functionally conserved amino acids in the catalytic center of annotated and/or experimentally confirmed CNCs. Here we detail this method which has led to the identification of >25 novel candidate CNCs in Arabidopsis thaliana, several of which have been experimentally confirmed in vitro and in vivo. We foresee that the application of this method can be used to identify many more members of the growing family of CNCs in higher plants. © Springer Science+Business Media New York 2013.

  17. Client-server computer architecture saves costs and eliminates bottlenecks

    International Nuclear Information System (INIS)

    Darukhanavala, P.P.; Davidson, M.C.; Tyler, T.N.; Blaskovich, F.T.; Smith, C.

    1992-01-01

    This paper reports that a workstation-based client-server architecture saved costs and eliminated bottlenecks that BP Exploration (Alaska) Inc. experienced with mainframe computer systems. In 1991, BP embarked on an ambitious project to change technical computing for its Prudhoe Bay, Endicott, and Kuparuk operations on Alaska's North Slope. This project promised substantial rewards, but also involved considerable risk. The project plan called for reservoir simulations (which historically had run on a Cray Research Inc. X-MP supercomputer in the company's Houston data center) to be run on small computer workstations. Additionally, large Prudhoe Bay, Endicott, and Kuparuk production and reservoir engineering databases and related applications also would be moved to workstations, replacing a Digital Equipment Corp. VAX cluster in Anchorage

  18. Price-Cost Ratios in Higher Education: Subsidy Structure and Policy Implications

    Science.gov (United States)

    Xie, Yan

    2010-01-01

    The diversity of US institutions of higher education is manifested in many ways. This study looks at that diversity from the economic perspective by studying the subsidy structure through the distribution of the institutional price-cost ratio (PCR), defined as net tuition price divided by total supplier cost, which equals one minus…
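
    Restating the abstract's definition as a formula; the second equality assumes, as the abstract's truncated sentence implies, that total supplier cost equals net tuition plus institutional subsidy:

        % Price-cost ratio as defined in the abstract:
        \mathrm{PCR} = \frac{\text{net tuition price}}{\text{total supplier cost}}
                     = 1 - \frac{\text{institutional subsidy}}{\text{total supplier cost}}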

  19. Positive Attitude toward Healthy Eating Predicts Higher Diet Quality at All Cost Levels of Supermarkets

    Science.gov (United States)

    Aggarwal, Anju; Monsivais, Pablo; Cook, Andrea J.; Drewnowski, Adam

    2014-01-01

    Shopping at low-cost supermarkets has been associated with higher obesity rates. This study examined whether attitudes toward healthy eating are independently associated with diet quality among shoppers at low-cost, medium-cost, and high-cost supermarkets. Data on socioeconomic status (SES), attitudes toward healthy eating, and supermarket choice were collected using a telephone survey of a representative sample of adult residents of King County, WA. Dietary intake data were based on a food frequency questionnaire. Thirteen supermarket chains were stratified into three categories: low, medium, and high cost, based on a market basket of 100 commonly eaten foods. Diet-quality measures were energy density, mean adequacy ratio, and total servings of fruits and vegetables. The analytical sample consisted of 963 adults. Multivariable regressions with robust standard errors examined relations between diet quality, supermarket type, attitudes, and SES. Shopping at higher-cost supermarkets was associated with higher-quality diets. These associations persisted after adjusting for SES, but were eliminated after taking attitudinal measures into account. Supermarket shoppers with positive attitudes toward healthy eating had equally high-quality diets whether they shopped at low-, medium-, or high-cost supermarkets, independent of SES and other covariates. These findings imply that shopping at low-cost supermarkets does not prevent consumers from having high-quality diets, as long as they attach importance to good nutrition. Promoting nutrition-education strategies among supermarkets, particularly those catering to low-income groups, can help to improve diet quality. PMID:23916974

  20. COMPUTER EXPERIMENTS WITH FINITE ELEMENTS OF HIGHER ORDER

    Directory of Open Access Journals (Sweden)

    Khomchenko A.

    2017-12-01

    The paper deals with the problem of constructing the basis functions of a fifth-order quadrilateral finite element by means of the computer algebra system Maple. The Lagrangian approximation of such a finite element contains 36 nodes: 20 nodes on the perimeter and 16 internal nodes. Alternative models with a reduced number of internal nodes are considered. Graphs of basis functions and cognitive portraits of zero-level lines are presented. The work is aimed at studying the possibilities of using modern information technologies in the teaching of individual mathematical disciplines.

  1. Computer-Mediated Assessment of Higher-Order Thinking Development

    Science.gov (United States)

    Tilchin, Oleg; Raiyn, Jamal

    2015-01-01

    Solving complicated problems in a contemporary knowledge-based society requires higher-order thinking (HOT). The most productive way to encourage development of HOT in students is through use of the Problem-based Learning (PBL) model. This model organizes learning by solving corresponding problems relative to study courses. Students are directed…

  2. Higher-Order Integral Equation Methods in Computational Electromagnetics

    DEFF Research Database (Denmark)

    Jørgensen, Erik; Meincke, Peter

    Higher-order integral equation methods have been investigated. The study has focused on improving the accuracy and efficiency of the Method of Moments (MoM) applied to electromagnetic problems. A new set of hierarchical Legendre basis functions of arbitrary order is developed. The new basis...

  3. Appraising the Cost Efficiency of Higher Technological and Vocational Education Institutions in Taiwan Using the Metafrontier Cost-Function Model

    Science.gov (United States)

    Lu, Yung-Hsiang; Chen, Ku-Hsieh

    2013-01-01

    This paper aims at appraising the cost efficiency and technology of institutions of higher technological and vocational education. Differing from conventional literature, it considers the potential influence of inherent discrepancies in output quality and characteristics of school systems for institutes of technology (ITs) and universities of…

  4. Addressing the computational cost of large EIT solutions

    International Nuclear Information System (INIS)

    Boyle, Alistair; Adler, Andy; Borsic, Andrea

    2012-01-01

    Electrical impedance tomography (EIT) is a soft field tomography modality based on the application of electric current to a body and measurement of voltages through electrodes at the boundary. The interior conductivity is reconstructed on a discrete representation of the domain using a finite-element method (FEM) mesh and a parametrization of that domain. The reconstruction requires a sequence of numerically intensive calculations. There is strong interest in reducing the cost of these calculations. An improvement in the compute time for current problems would encourage further exploration of computationally challenging problems such as the incorporation of time series data, wide-spread adoption of three-dimensional simulations and correlation of other modalities such as CT and ultrasound. Multicore processors offer an opportunity to reduce EIT computation times but may require some restructuring of the underlying algorithms to maximize the use of available resources. This work profiles two EIT software packages (EIDORS and NDRM) to experimentally determine where the computational costs arise in EIT as problems scale. Sparse matrix solvers, a key component for the FEM forward problem and sensitivity estimates in the inverse problem, are shown to take a considerable portion of the total compute time in these packages. A sparse matrix solver performance measurement tool, Meagre-Crowd, is developed to interface with a variety of solvers and compare their performance over a range of two- and three-dimensional problems of increasing node density. Results show that distributed sparse matrix solvers that operate on multiple cores are advantageous up to a limit that increases as the node density increases. We recommend a selection procedure to find a solver and hardware arrangement matched to the problem and provide guidance and tools to perform that selection. (paper)
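
    A toy version of the solver-comparison idea behind Meagre-Crowd, timing a direct and an iterative scipy solver on a 2-D finite-difference Laplacian (a stand-in for an EIT FEM system); the solver choice and grid sizes are illustrative, and scipy is assumed available:

        # Time two sparse solvers as node density grows, mimicking the profiling
        # described above on a simple symmetric positive-definite test matrix.
        import time
        import numpy as np
        import scipy.sparse as sp
        import scipy.sparse.linalg as spla

        def laplacian_2d(n):
            """Sparse 5-point Laplacian on an n x n grid."""
            main = 4.0 * np.ones(n * n)
            side = -1.0 * np.ones(n * n - 1)
            side[np.arange(1, n * n) % n == 0] = 0.0     # no coupling across grid rows
            updown = -1.0 * np.ones(n * n - n)
            return sp.diags([main, side, side, updown, updown],
                            [0, 1, -1, n, -n], format="csc")

        for n in (50, 100, 200):
            A = laplacian_2d(n)
            b = np.ones(A.shape[0])
            t0 = time.perf_counter(); x_direct = spla.spsolve(A, b)
            t1 = time.perf_counter(); x_cg, info = spla.cg(A, b)
            t2 = time.perf_counter()
            print(f"{n*n:>6} nodes: direct {t1-t0:6.3f}s, CG {t2-t1:6.3f}s (info={info})")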

  5. Addressing the computational cost of large EIT solutions.

    Science.gov (United States)

    Boyle, Alistair; Borsic, Andrea; Adler, Andy

    2012-05-01

    Electrical impedance tomography (EIT) is a soft field tomography modality based on the application of electric current to a body and measurement of voltages through electrodes at the boundary. The interior conductivity is reconstructed on a discrete representation of the domain using a finite-element method (FEM) mesh and a parametrization of that domain. The reconstruction requires a sequence of numerically intensive calculations. There is strong interest in reducing the cost of these calculations. An improvement in the compute time for current problems would encourage further exploration of computationally challenging problems such as the incorporation of time series data, widespread adoption of three-dimensional simulations and correlation with other modalities such as CT and ultrasound. Multicore processors offer an opportunity to reduce EIT computation times but may require some restructuring of the underlying algorithms to maximize the use of available resources. This work profiles two EIT software packages (EIDORS and NDRM) to experimentally determine where the computational costs arise in EIT as problems scale. Sparse matrix solvers, a key component for the FEM forward problem and sensitivity estimates in the inverse problem, are shown to take a considerable portion of the total compute time in these packages. A sparse matrix solver performance measurement tool, Meagre-Crowd, is developed to interface with a variety of solvers and compare their performance over a range of two- and three-dimensional problems of increasing node density. Results show that distributed sparse matrix solvers that operate on multiple cores are advantageous up to a limit that increases as the node density increases. We recommend a selection procedure to find a solver and hardware arrangement matched to the problem and provide guidance and tools to perform that selection.

  6. Estimating pressurized water reactor decommissioning costs: A user's manual for the PWR Cost Estimating Computer Program (CECP) software

    International Nuclear Information System (INIS)

    Bierschbach, M.C.; Mencinsky, G.J.

    1993-10-01

    With the issuance of the Decommissioning Rule (July 27, 1988), nuclear power plant licensees are required to submit their decommissioning plans and cost estimates to the US Nuclear Regulatory Commission (NRC) for review. This user's manual and the accompanying Cost Estimating Computer Program (CECP) software provide a cost-calculating methodology to the NRC staff that will assist them in assessing the adequacy of the licensee submittals. The CECP, designed to be used on a personal computer, provides estimates for the cost of decommissioning PWR power stations to the point of license termination. Such cost estimates include component, piping, and equipment removal costs; packaging costs; decontamination costs; transportation costs; burial costs; and manpower costs. In addition to costs, the CECP also calculates burial volumes, person-hours, crew-hours, and exposure person-hours associated with decommissioning

  7. Investment Evaluation of Higher Education through Cost-Benefit Analysis: Evidence from Adrar University-Algeria

    Science.gov (United States)

    Hocine, Benlaria; Sofiane, Mostéfaoui

    2017-01-01

    This study aims to measure the social and individual rates of return on investment in higher education at Adrar University. The approach adopted investigates the costs and benefits of human capital. The study found that the economic feasibility of investment in higher education exists at both the individual and social levels, where…

  8. The Case for Higher Computational Density in the Memory-Bound FDTD Method within Multicore Environments

    Directory of Open Access Journals (Sweden)

    Mohammed F. Hadi

    2012-01-01

    It is argued here that more accurate though more compute-intensive alternate algorithms to certain computational methods which are deemed too inefficient and wasteful when implemented within serial codes can be more efficient and cost-effective when implemented in parallel codes designed to run on today's multicore and many-core environments. This argument is most germane to methods that involve large data sets with relatively limited computational density—in other words, algorithms with small ratios of floating point operations to memory accesses. The examples chosen here to support this argument represent a variety of high-order finite-difference time-domain algorithms. It will be demonstrated that a three- to eightfold increase in floating-point operations due to higher-order finite-differences will translate to only two- to threefold increases in actual run times using either graphical or central processing units of today. It is hoped that this argument will convince researchers to revisit certain numerical techniques that have long been shelved and reevaluate them for multicore usability.
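
    The flops-to-bytes ratio at the heart of this argument can be illustrated with a toy accounting for one FDTD cell update; the byte and flop counts below are rough assumptions of mine, not figures from the paper.

    ```python
    # Toy model: with good cache reuse a cell is streamed from memory about once
    # per sweep, so bytes per update stay roughly constant while the stencil
    # flops grow with the finite-difference order.
    BYTES_PER_UPDATE = 3 * 4          # assumed: one load, one store, one auxiliary field (fp32)

    def flops_per_update(order: int) -> int:
        m = order // 2                # a (2m)th-order stencil reaches m neighbours per side
        return 3 * 2 * (2 * m)        # 3 axes, one multiply and one add per neighbour

    for order in (2, 4, 8):
        ai = flops_per_update(order) / BYTES_PER_UPDATE
        print(f"order {order}: {flops_per_update(order)} flops/update, ~{ai:.1f} flops/byte")
    # Flops quadruple from order 2 to order 8 while memory traffic stays flat,
    # which is why run times grow far more slowly on memory-bound hardware.
    ```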

  9. Higher Dietary Cost Is Associated with Higher Diet Quality: A Cross-Sectional Study among Selected Malaysian Adults

    Directory of Open Access Journals (Sweden)

    Ibnteesam Pondor

    2017-09-01

    Food price is a determining factor of food choices; however, its relationship with diet quality is unclear in Malaysia. This study aimed to examine socio-economic characteristics and daily dietary cost (DDC) in relation to diet quality in the state of Selangor, Malaysia. Dietary intake was assessed using a Food Frequency Questionnaire (FFQ) and diet quality was estimated using a Malaysian Healthy Eating Index (M-HEI). DDC in Malaysian Ringgit (RM) was calculated from dietary intake and national food prices. Linear regression models were fitted to determine associations between DDC and M-HEI scores and predictors of diet quality. The mean M-HEI score of respondents was 61.31 ± 10.88 and energy adjusted DDC was RM10.71/2000 kcal (USD 2.49). The highest quintile of adjusted DDC had higher M-HEI scores for all respondents (Q1: 57.14 ± 10.07 versus Q5: 63.26 ± 11.54, p = 0.001). There were also positive associations between DDC and M-HEI scores for fruits (p < 0.001) and vegetables (p = 0.017) for all respondents. Predictors of diet quality included carbohydrate (β = 0.290; p < 0.001) and fat intakes (β = −0.242; p < 0.001) and energy adjusted DDC (β = 0.196; p < 0.001). Higher dietary cost is associated with healthy eating among Malaysian adults.

  10. Software network analyzer for computer network performance measurement planning over heterogeneous services in higher educational institutes

    OpenAIRE

    Ismail, Mohd Nazri

    2009-01-01

    In the 21st century, the convergence of technologies and services in heterogeneous environments has produced multiple kinds of traffic. This scenario affects the computer networks that support learning systems in higher educational institutes. Implementing various services can produce different types of content and quality. Higher educational institutes should have a good computer network infrastructure to support the usage of various services. A capable computer network should provide: i) higher bandwidth; ii) ...

  11. Positive attitude toward healthy eating predicts higher diet quality at all cost levels of supermarkets.

    Science.gov (United States)

    Aggarwal, Anju; Monsivais, Pablo; Cook, Andrea J; Drewnowski, Adam

    2014-02-01

    Shopping at low-cost supermarkets has been associated with higher obesity rates. This study examined whether attitudes toward healthy eating are independently associated with diet quality among shoppers at low-cost, medium-cost, and high-cost supermarkets. Data on socioeconomic status (SES), attitudes toward healthy eating, and supermarket choice were collected using a telephone survey of a representative sample of adult residents of King County, WA. Dietary intake data were based on a food frequency questionnaire. Thirteen supermarket chains were stratified into three categories: low, medium, and high cost, based on a market basket of 100 commonly eaten foods. Diet-quality measures were energy density, mean adequacy ratio, and total servings of fruits and vegetables. The analytical sample consisted of 963 adults. Multivariable regressions with robust standard errors examined relations between diet quality, supermarket type, attitudes, and SES. Shopping at higher-cost supermarkets was associated with higher-quality diets. These associations persisted after adjusting for SES, but were eliminated after taking attitudinal measures into account. Supermarket shoppers with positive attitudes toward healthy eating had equally high-quality diets whether they shopped at low-, medium-, or high-cost supermarkets, independent of SES and other covariates. These findings imply that shopping at low-cost supermarkets does not prevent consumers from having high-quality diets, as long as they attach importance to good nutrition. Promoting nutrition-education strategies among supermarkets, particularly those catering to low-income groups, can help to improve diet quality. Copyright © 2014 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.

  12. Prediction of higher cost of antiretroviral therapy (ART) according to clinical complexity. A validated clinical index.

    Science.gov (United States)

    Velasco, Cesar; Pérez, Inaki; Podzamczer, Daniel; Llibre, Josep Maria; Domingo, Pere; González-García, Juan; Puig, Inma; Ayala, Pilar; Martín, Mayte; Trilla, Antoni; Lázaro, Pablo; Gatell, Josep Maria

    2016-03-01

    The financing of antiretroviral therapy (ART) is generally determined by the cost incurred in the previous year, the number of patients on treatment, and the evidence-based recommendations, but not by the clinical characteristics of the population. The aim was to establish a score relating the cost of ART to patient clinical complexity, in order to understand the costing differences between hospitals in the region that could be explained by the clinical complexity of their populations. Retrospective analysis of patients receiving ART in a tertiary hospital between 2009 and 2011. Factors potentially associated with a higher cost of ART were assessed by bivariate and multivariate analysis. Two predictive models of "high cost" were developed. The normalized estimated costs (adjusted for the complexity scores) were calculated and compared with the normalized real costs. In the Hospital Index, 631 (16.8%) of the 3758 patients receiving ART constituted the "high-cost" subgroup, defined as the highest 25% of spending on ART. Baseline variables that were significant predictors of high cost in the Clinic-B model in the multivariate analysis were: route of transmission of HIV, AIDS criteria, Spanish nationality, year of initiation of ART, CD4+ lymphocyte count nadir, and number of hospital admissions. The Clinic-B score ranged from 0 to 13, and the mean value (5.97) was lower than the overall mean value of the four hospitals (6.16). The clinical complexity of the HIV patient influences the cost of ART. The Clinic-B and Clinic-BF scores predicted patients with high cost of ART and could be used to compare and allocate costs corrected for patient clinical complexity. Copyright © 2015 Elsevier España, S.L.U. y Sociedad Española de Enfermedades Infecciosas y Microbiología Clínica. All rights reserved.

  13. The performance of low-cost commercial cloud computing as an alternative in computational chemistry.

    Science.gov (United States)

    Thackston, Russell; Fortenberry, Ryan C

    2015-05-05

    The growth of commercial cloud computing (CCC) as a viable means of computational infrastructure is largely unexplored for the purposes of quantum chemistry. In this work, the PSI4 suite of computational chemistry programs is installed on five different types of Amazon Web Services CCC platforms. The performance for a set of electronically excited state single-point energies is compared between these CCC platforms and typical, "in-house" physical machines. Further considerations are made for the number of cores or virtual CPUs (vCPUs, for the CCC platforms), but no considerations are made for full parallelization of the program (even though parallelization of the BLAS library is implemented), complete high-performance computing cluster utilization, or steal time. Even with this most pessimistic view of the computations, CCC resources are shown to be more cost effective for significant numbers of typical quantum chemistry computations. Large numbers of large computations are still best handled by more traditional means, but smaller-scale research may be more effectively undertaken through CCC services. © 2015 Wiley Periodicals, Inc.
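
    The cost comparison can be caricatured with a back-of-the-envelope model; the prices, lifetime, and power figures below are placeholders, not AWS quotes or values from the study.

    ```python
    # Compare the marginal cost of one job on a rented cloud instance versus an
    # amortized in-house workstation (capital spread over its service life plus power).
    def cloud_cost(hours: float, price_per_hour: float) -> float:
        return hours * price_per_hour

    def inhouse_cost(hours: float, capital: float, lifetime_hours: float,
                     power_kw: float, kwh_price: float) -> float:
        return hours * (capital / lifetime_hours + power_kw * kwh_price)

    job_hours = 12.0
    print(f"cloud:    ${cloud_cost(job_hours, 0.40):.2f}")
    print(f"in-house: ${inhouse_cost(job_hours, 8000.0, 3 * 8760.0, 0.5, 0.12):.2f}")
    ```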

  14. Fixed-point image orthorectification algorithms for reduced computational cost

    Science.gov (United States)

    French, Joseph Clinton

    Imaging systems have been applied to many new applications in recent years. With the advent of low-cost, low-power focal planes and more powerful, lower-cost computers, remote sensing applications have become more widespread. Many of these applications require some form of geolocation, especially when relative distances are desired. However, when greater global positional accuracy is needed, orthorectification becomes necessary. Orthorectification is the process of projecting an image onto a Digital Elevation Map (DEM), which removes terrain distortions and corrects the perspective distortion by changing the viewing angle to be perpendicular to the projection plane. Orthorectification is used in disaster tracking, landscape management, wildlife monitoring and many other applications. However, orthorectification is a computationally expensive process due to floating point operations and divisions in the algorithm. To reduce the computational cost of on-board processing, two novel algorithm modifications are proposed. One modification is projection utilizing fixed-point arithmetic. Fixed-point arithmetic removes the floating point operations and reduces the processing time by operating only on integers. The second modification is replacement of the division inherent in projection with a multiplication of the inverse. The inverse must operate iteratively. Therefore, the inverse is replaced with a linear approximation. As a result of these modifications, the processing time of projection is reduced by a factor of 1.3x with an average pixel position error of 0.2% of a pixel size for 128-bit integer processing and over 4x with an average pixel position error of less than 13% of a pixel size for 64-bit integer processing. A secondary inverse function approximation is also developed that replaces the linear approximation with a quadratic. The quadratic approximation produces a more accurate approximation of the inverse, allowing for an integer multiplication calculation…
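
    The two modifications can be sketched together: a Q16 fixed-point format and a first-order (linear) approximation of 1/z about a nominal depth z0, so the projection needs no floating-point division. The format, z0, and the toy projection below are illustrative assumptions, not the author's implementation.

    ```python
    # Fixed-point projection with division replaced by a multiply: 1/z is
    # linearized about z0 as 1/z ~= 2/z0 - z/z0**2 (first-order Taylor expansion).
    FRAC_BITS = 16
    ONE = 1 << FRAC_BITS

    def to_fixed(x: float) -> int:
        return int(round(x * ONE))

    def approx_inverse(z_fixed: int, z0: float) -> int:
        a = to_fixed(2.0 / z0)                    # constant term, precomputable
        b = to_fixed(1.0 / (z0 * z0))             # slope, precomputable
        return a - ((b * z_fixed) >> FRAC_BITS)   # fixed-point ~1/z, no division

    def project(x: float, z: float, z0: float) -> float:
        xf, zf = to_fixed(x), to_fixed(z)
        inv_z = approx_inverse(zf, z0)
        return ((xf * inv_z) >> FRAC_BITS) / ONE  # integer multiply replaces x / z

    print(project(2.0, 4.1, z0=4.0), 2.0 / 4.1)   # approximation vs exact value
    ```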

  15. Bully University? The Cost of Workplace Bullying and Employee Disengagement in American Higher Education

    Directory of Open Access Journals (Sweden)

    Leah P. Hollis

    2015-06-01

    Workplace bullying has a detrimental effect on employees, yet few studies have examined its impact on personnel in American higher education administration. Therefore, two central research questions guided this study: (a) What is the extent of workplace bullying in higher education administration? and (b) What is the cost of workplace bullying specifically to higher education administration? Participants from 175 four-year colleges and universities were surveyed to reveal that 62% of higher education administrators had experienced or witnessed workplace bullying in the 18 months prior to the study. Race and gender were not parameters considered in the sample. A total of 401 (n = 401) higher education respondents completed the instrument from various departments on a campus: academic affairs, student affairs, athletics, development/advancement, admissions/financial aid, information technology, arts faculty, sciences faculty, and executives. Employment disengagement served as the theoretical lens to analyze the financial cost to higher education when employees mentally disengage from organizational missions and objectives. With this lens, the study examined staff hours lost through employee disengagement and the associated costs.

  16. Public Concepts of the Values and Costs of Higher Education, 1963-1974. A Preliminary Analysis.

    Science.gov (United States)

    Minor, Michael J.; Murray, James R.

    Statistical data are presented on interviews conducted through the Continuous National Survey (CNS) at the National Opinion Research Center in Chicago and based on results reprinted from "Public Concepts of the Values and Costs of Higher Education," by Angus Campbell and William C. Eckerman. The CNS results presented in this report are…

  17. Efficiency, Costs, Rankings and Heterogeneity: The Case of US Higher Education

    Science.gov (United States)

    Agasisti, Tommaso; Johnes, Geraint

    2015-01-01

    Among the major trends in the higher education (HE) sector, the development of rankings as a policy and managerial tool is of particular relevance. However, despite the diffusion of these instruments, it is still not clear how they relate with traditional performance measures, like unit costs and efficiency scores. In this paper, we estimate a…

  18. Multi-Product Total Cost Function for Higher Education: A Case of Bible Colleges.

    Science.gov (United States)

    Koshal, Rajindar K.; Koshal, Manjulika; Gupta, Ashok

    2001-01-01

    This study empirically estimates a multiproduct total cost function and output relationship for comprehensive U.S. universities. Statistical results for 184 Bible colleges suggest that there are both economies of scale and of scope in higher education. Additionally, product-specific economies of scope exist for all output levels and activities.…

  19. Lowering the Cost Barrier to Higher Education for Undocumented Students: A Promising University-Level Intervention

    Science.gov (United States)

    Thangasamy, Andrew; Horan, Deborah

    2016-01-01

    Undocumented students, many of Hispanic origin, face among the strictest cost barriers to higher education in the United States. Lack of legal status excludes them from most state and all federal financial aid programs. Furthermore, most states require them to pay out-of-state tuition rates at publicly supported institutions. In a new direction,…

  20. Computer Aided Design of a Low-Cost Painting Robot

    Directory of Open Access Journals (Sweden)

    SYEDA MARIA KHATOON ZAIDI

    2017-10-01

    The application of robots or robotic systems for painting parts is becoming increasingly conventional; to improve reliability, productivity, consistency and to decrease waste. However, in Pakistan only high-end industries are able to afford the luxury of a robotic system for various purposes. In this study we propose an economical painting robot that a small-scale industry can install in their plant with ease. The importance of this robot is that being cost effective, it can easily be placed in small manufacturing industries and therefore, eliminate health problems occurring to the individual in charge of painting parts on an everyday basis. To achieve this aim, the robot is made with local parts with only few exceptions, to cut costs; and the programming language is kept at a mediocre level. Image processing is used to establish object recognition and it can be programmed to paint various simple geometries. The robot is placed on a conveyor belt to maximize productivity. A four DoF (Degree of Freedom) arm increases the working envelope and accessibility of painting different shaped parts with ease. This robot is capable of painting up, front, back, left and right sides of the part with a single colour. Initially CAD (Computer Aided Design) models of the robot were developed which were analyzed, modified and improved to withstand loading conditions and perform its task efficiently. After design selection, appropriate motors and materials were selected and the robot was developed. Throughout the development phase, minor problems and errors were fixed accordingly as they arose. Lastly the robot was integrated with the computer and image processing for autonomous control. The final results demonstrated that the robot is economical and reduces paint wastage.

  1. Computer aided design of a low-cost painting robot

    International Nuclear Information System (INIS)

    Zaidi, S.M.; Janejo, F.; Mujtaba, S.B.

    2017-01-01

    The application of robots or robotic systems for painting parts is becoming increasingly conventional; to improve reliability, productivity, consistency and to decrease waste. However, in Pakistan only high-end industries are able to afford the luxury of a robotic system for various purposes. In this study we propose an economical painting robot that a small-scale industry can install in their plant with ease. The importance of this robot is that being cost effective, it can easily be placed in small manufacturing industries and therefore, eliminate health problems occurring to the individual in charge of painting parts on an everyday basis. To achieve this aim, the robot is made with local parts with only few exceptions, to cut costs; and the programming language is kept at a mediocre level. Image processing is used to establish object recognition and it can be programmed to paint various simple geometries. The robot is placed on a conveyor belt to maximize productivity. A four DoF (Degree of Freedom) arm increases the working envelope and accessibility of painting different shaped parts with ease. This robot is capable of painting up, front, back, left and right sides of the part with a single colour. Initially CAD (Computer Aided Design) models of the robot were developed which were analyzed, modified and improved to withstand loading conditions and perform its task efficiently. After design selection, appropriate motors and materials were selected and the robot was developed. Throughout the development phase, minor problems and errors were fixed accordingly as they arose. Lastly the robot was integrated with the computer and image processing for autonomous control. The final results demonstrated that the robot is economical and reduces paint wastage. (author)

  2. Decommissioning costing approach based on the standardised list of costing items. Lessons learnt by the OMEGA computer code

    International Nuclear Information System (INIS)

    Daniska, Vladimir; Rehak, Ivan; Vasko, Marek; Ondra, Frantisek; Bezak, Peter; Pritrsky, Jozef; Zachar, Matej; Necas, Vladimir

    2011-01-01

    The document 'A Proposed Standardised List of Items for Costing Purposes' was issued in 1999 by the OECD/NEA, IAEA and European Commission (EC) to promote harmonisation in decommissioning costing. It is a systematic list of decommissioning activities classified in chapters 01 to 11 with three numbered levels. Four cost groups are defined for the cost at each level. The document constitutes a standardised matrix of decommissioning activities and cost groups with definitions of the content of the items. Knowing what is behind the items makes the comparison of costs for decommissioning projects transparent. Two approaches are identified for use of the standardised cost structure. The first approach converts the cost data from existing specific cost structures into the standardised cost structure for the purpose of cost presentation. The second approach uses the standardised cost structure as the base for the cost calculation structure; the calculated cost data are formatted in the standardised cost format directly; several additional advantages may be identified in this approach. The paper presents the costing methodology based on the standardised cost structure and lessons learnt from the last ten years of implementing the standardised cost structure as the cost calculation structure in the computer code OMEGA. The code also includes on-line management of decommissioning waste, decay of radioactivity, evaluation of exposure, and generation and optimisation of the Gantt chart of a decommissioning project, which makes the OMEGA code an effective tool for planning and optimisation of decommissioning processes. (author)
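
    The standardised matrix of numbered activities crossed with cost groups can be pictured as a small data structure; the item codes, activity names, and figures below are invented for illustration, with the four cost groups taken to be labour, capital (investment), expenses, and contingency as in the proposed standardised list.

    ```python
    # A cost item sits at a numbered position in the activity hierarchy and
    # carries one amount per cost group; the project cost is the sum over items.
    from dataclasses import dataclass

    @dataclass
    class CostItem:
        code: str              # e.g. "05.2.1": chapter 05, second level, third level
        activity: str
        labour: float = 0.0
        capital: float = 0.0
        expenses: float = 0.0
        contingency: float = 0.0

        def total(self) -> float:
            return self.labour + self.capital + self.expenses + self.contingency

    items = [
        CostItem("05.2.1", "Decontamination of primary circuit", labour=1.2e5, expenses=3.0e4),
        CostItem("06.1.3", "Dismantling of reactor internals", labour=4.5e5, capital=8.0e4),
    ]
    print(f"project total: {sum(i.total() for i in items):,.0f}")
    ```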

  3. Manual of phosphoric acid fuel cell power plant cost model and computer program

    Science.gov (United States)

    Lu, C. Y.; Alkasab, K. A.

    1984-01-01

    Cost analysis of a phosphoric acid fuel cell power plant includes two parts: a method for estimating system capital costs, and an economic analysis that determines the levelized annual cost of operating the system used in the capital cost estimation. A FORTRAN computer program has been developed for this cost analysis.
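
    A minimal sketch of the levelized annual cost piece, assuming the usual capital-recovery-factor formulation; the parameter values are illustrative, not taken from the manual.

    ```python
    # Levelized annual cost = capital spread into equal annual payments (via the
    # capital recovery factor) plus recurring O&M and fuel expenses.
    def capital_recovery_factor(rate: float, years: int) -> float:
        g = (1.0 + rate) ** years
        return rate * g / (g - 1.0)          # CRF = i(1+i)^n / ((1+i)^n - 1)

    def levelized_annual_cost(capital: float, om_per_year: float,
                              fuel_per_year: float, rate: float, years: int) -> float:
        return capital * capital_recovery_factor(rate, years) + om_per_year + fuel_per_year

    # Example: $25M plant, $0.8M/yr O&M, $1.5M/yr fuel, 8% discount rate, 20-year life.
    print(f"${levelized_annual_cost(25e6, 0.8e6, 1.5e6, 0.08, 20):,.0f} per year")
    ```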

  4. Higher Dietary Cost Is Associated with Higher Diet Quality: A Cross-Sectional Study among Selected Malaysian Adults.

    Science.gov (United States)

    Pondor, Ibnteesam; Gan, Wan Ying; Appannah, Geeta

    2017-09-16

    Food price is a determining factor of food choices; however, its relationship with diet quality is unclear in Malaysia. This study aimed to examine socio-economic characteristics and daily dietary cost (DDC) in relation to diet quality in the state of Selangor, Malaysia. Dietary intake was assessed using a Food Frequency Questionnaire (FFQ) and diet quality was estimated using a Malaysian Healthy Eating Index (M-HEI). DDC in Malaysian Ringgit (RM) was calculated from dietary intake and national food prices. Linear regression models were fitted to determine associations between DDC and M-HEI scores and predictors of diet quality. The mean M-HEI score of respondents was 61.31 ± 10.88 and energy adjusted DDC was RM10.71/2000 kcal (USD 2.49). The highest quintile of adjusted DDC had higher M-HEI scores for all respondents (Q1: 57.14 ± 10.07 versus Q5: 63.26 ± 11.54, p = 0.001). There were also positive associations between DDC and M-HEI scores for fruits (p < 0.001) and vegetables (p = 0.017) for all respondents. Predictors of diet quality included carbohydrate (β = 0.290; p < 0.001) and fat intakes (β = −0.242; p < 0.001) and energy adjusted DDC (β = 0.196; p < 0.001). Higher dietary cost is associated with healthy eating among Malaysian adults.

  5. Cheaper fuel and higher health costs among the poor in rural Nepal

    Energy Technology Data Exchange (ETDEWEB)

    Pant, Krishna Prasad [Ministry of Agriculture and Cooperatives, Vidhya Lane, Devnagar, Kathmandu (Nepal)], email: kppant@yahoo.com

    2012-03-15

    Biomass fuels are used by the majority of resource-poor households in low-income countries. Though biomass fuels such as dung-briquette and firewood are apparently cheaper than modern fuels, indoor pollution from burning biomass fuels incurs high health costs. But the health costs of these conventional fuels, being mostly indirect, are poorly understood. To address this gap, this study develops probit regression models using survey data generated through interviews with households using either dung-briquette or biogas as the primary source of fuel for cooking. The study investigates factors affecting the use of dung-briquette, assesses its impact on human health, and estimates the associated household health costs. Analysis suggests significant effects of dung-briquette on asthma and eye diseases. Despite the perception of it being a cheap fuel, the annual health cost per household due to burning dung-briquette (US$ 16.94) is 61.3% higher than the annual cost of biogas (US$ 10.38), an alternative cleaner fuel for rural households. For reducing the use of dung-briquette and its indirect health costs, the study recommends three interventions: (1) educate women and aboriginal people, in particular, and make them aware of the benefits of switching to biogas; (2) facilitate tree planting in communal as well as private lands; and (3) create rural employment and income generation opportunities.

  6. High School Computer Science Education Paves the Way for Higher Education: The Israeli Case

    Science.gov (United States)

    Armoni, Michal; Gal-Ezer, Judith

    2014-01-01

    The gap between enrollments in higher education computing programs and the high-tech industry's demands is widely reported, and is especially prominent for women. Increasing the availability of computer science education in high school is one of the strategies suggested in order to address this gap. We look at the connection between exposure to…

  7. Academic Computing Facilities and Services in Higher Education--A Survey.

    Science.gov (United States)

    Warlick, Charles H.

    1986-01-01

    Presents statistics about academic computing facilities based on data collected over the past six years from 1,753 institutions in the United States, Canada, Mexico, and Puerto Rico for the "Directory of Computing Facilities in Higher Education." Organizational, functional, and financial characteristics are examined as well as types of…

  8. Development of computer software for pavement life cycle cost analysis.

    Science.gov (United States)

    1988-01-01

    The life cycle cost analysis program (LCCA) is designed to automate and standardize life cycle costing in Virginia. It allows the user to input information necessary for the analysis, and it then completes the calculations and produces a printed copy...

  9. Computer programs for capital cost estimation, lifetime economic performance simulation, and computation of cost indexes for laser fusion and other advanced technology facilities

    International Nuclear Information System (INIS)

    Pendergrass, J.H.

    1978-01-01

    Three FORTRAN programs, CAPITAL, VENTURE, and INDEXER, have been developed to automate computations used in assessing the economic viability of proposed or conceptual laser fusion and other advanced-technology facilities, as well as conventional projects. The types of calculations performed by these programs are, respectively, capital cost estimation, lifetime economic performance simulation, and computation of cost indexes. The codes permit these three topics to be addressed with considerable sophistication commensurate with user requirements and available data

  10. How Much Is Too Much? Controlling Administrative Costs through Effective Oversight. A Guide for Higher Education Trustees

    Science.gov (United States)

    Alacbay, Armand; Barden, Danielle

    2017-01-01

    With recent research from the Institute for Higher Education Policy showing that college is unaffordable for as many as 70% of working- and middle-class students, concerns about college costs are mounting. The cost of operating an institution of higher education, with very few exceptions, is reflected in the price of attendance that students,…

  11. Higher Dietary Cost Is Associated with Higher Diet Quality: A Cross-Sectional Study among Selected Malaysian Adults

    OpenAIRE

    Ibnteesam Pondor; Wan Ying Gan; Geeta Appannah

    2017-01-01

    Food price is a determining factor of food choices; however its relationship with diet quality is unclear in Malaysia. This study aimed to examine socio-economic characteristics and daily dietary cost (DDC) in relation to diet quality in the state of Selangor, Malaysia. Dietary intake was assessed using a Food Frequency Questionnaire (FFQ) and diet quality was estimated using a Malaysian Healthy Eating Index (M-HEI). DDC in Malaysian Ringgit (RM) was calculated from dietary intake and nationa...

  12. Low-Budget Computer Programming in Your School (An Alternative to the Cost of Large Computers). Illinois Series on Educational Applications of Computers. No. 14.

    Science.gov (United States)

    Dennis, J. Richard; Thomson, David

    This paper is concerned with a low-cost alternative for providing computer experience to secondary school students. The brief discussion covers the programmable calculator and its relevance for teaching the concepts and the rudiments of computer programming and for computer problem solving. A list of twenty-five programming activities related to…

  13. DECOST: computer routine for decommissioning cost and funding analysis

    International Nuclear Information System (INIS)

    Mingst, B.C.

    1979-12-01

    One of the major controversies surrounding the decommissioning of nuclear facilities is the lack of financial information on just what the eventual costs will be. The Nuclear Regulatory Commission has studies underway to analyze the costs of decommissioning nuclear fuel cycle facilities, and some other similar studies have been done by other groups. These studies all deal only with the final cost outlays needed to finance decommissioning under an unchangeable set of circumstances. Funding methods and planning to reduce the costs and financial risks are usually not attempted. The DECOST program package is intended to fill this void and allow wide-ranging study of the various options available when planning for the decommissioning of nuclear facilities

  14. The WHATs and HOWs of Maturing Computational and Software Engineering Skills in Russian Higher Education Institutions

    Science.gov (United States)

    Semushin, I. V.; Tsyganova, J. V.; Ugarov, V. V.; Afanasova, A. I.

    2018-01-01

    Russian higher education institutions' tradition of teaching large-enrolled classes is impairing student striving for individual prominence, one-upmanship, and hopes for originality. Intending to convert these drawbacks into benefits, a Project-Centred Education Model (PCEM) has been introduced to deliver Computational Mathematics and…

  15. Productization and Commercialization of IT-Enabled Higher Education in Computer Science: A Systematic Literature Review

    Science.gov (United States)

    Kankaanpää, Irja; Isomäki, Hannakaisa

    2013-01-01

    This paper reviews research literature on the production and commercialization of IT-enabled higher education in computer science. Systematic literature review (SLR) was carried out in order to find out to what extent this area has been studied, more specifically how much it has been studied and to what detail. The results of this paper make a…

  16. Business Models of High Performance Computing Centres in Higher Education in Europe

    Science.gov (United States)

    Eurich, Markus; Calleja, Paul; Boutellier, Roman

    2013-01-01

    High performance computing (HPC) service centres are a vital part of the academic infrastructure of higher education organisations. However, despite their importance for research and the necessary high capital expenditures, business research on HPC service centres is mostly missing. From a business perspective, it is important to find an answer to…

  17. Computational cost of isogeometric multi-frontal solvers on parallel distributed memory machines

    KAUST Repository

    Woźniak, Maciej

    2015-02-01

    This paper derives theoretical estimates of the computational cost for an isogeometric multi-frontal direct solver executed on parallel distributed memory machines. We show theoretically that for the C^(p-1) global continuity of the isogeometric solution, both the computational cost and the communication cost of a direct solver are of order O(log(N) p^2) for the one-dimensional (1D) case, O(N p^2) for the two-dimensional (2D) case, and O(N^(4/3) p^2) for the three-dimensional (3D) case, where N is the number of degrees of freedom and p is the polynomial order of the B-spline basis functions. The theoretical estimates are verified by numerical experiments performed with three parallel multi-frontal direct solvers: MUMPS, PaStiX and SuperLU, available through the PetIGA toolkit built on top of PETSc. Numerical results confirm these theoretical estimates both in terms of p and N. For a given problem size, the strong efficiency rapidly decreases as the number of processors increases, becoming about 20% for 256 processors for a 3D example with 128^3 unknowns and linear B-splines with C^0 global continuity, and 15% for a 3D example with 64^3 unknowns and quartic B-splines with C^3 global continuity. At the same time, one cannot arbitrarily increase the problem size, since the memory required by higher-order continuity spaces is large, quickly consuming all the available memory resources even in the parallel distributed memory version. Numerical results also suggest that the use of distributed parallel machines is highly beneficial when solving higher-order continuity spaces, although the number of processors that one can efficiently employ is somewhat limited.
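
    The quoted orders are easy to evaluate side by side; the helper below simply tabulates them with constant factors dropped, so only ratios between configurations are meaningful.

    ```python
    import math

    def direct_solver_cost(N: int, p: int, dim: int) -> float:
        """Order-of-magnitude cost from the estimates above (constants dropped)."""
        if dim == 1:
            return math.log(N) * p**2       # O(log(N) p^2)
        if dim == 2:
            return N * p**2                 # O(N p^2)
        return N ** (4.0 / 3.0) * p**2      # O(N^(4/3) p^2)

    # Doubling p quadruples the estimate in any dimension; in 3D, growth in N
    # steepens from linear (2D) to N^(4/3).
    N = 64**3
    print(direct_solver_cost(N, p=4, dim=3) / direct_solver_cost(N, p=2, dim=3))  # -> 4.0
    ```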

  18. The WHATs and HOWs of maturing computational and software engineering skills in Russian higher education institutions

    Science.gov (United States)

    Semushin, I. V.; Tsyganova, J. V.; Ugarov, V. V.; Afanasova, A. I.

    2018-05-01

    Russian higher education institutions' tradition of teaching large-enrolled classes is impairing student striving for individual prominence, one-upmanship, and hopes for originality. Intending to convert these drawbacks into benefits, a Project-Centred Education Model (PCEM) has been introduced to deliver Computational Mathematics and Information Science courses. The model combines a Frontal Competitive Approach and a Project-Driven Learning (PDL) framework. The PDL framework has been developed by stating and solving three design problems: (i) enhance the diversity of project assignments on specific computational methods and algorithmic approaches, (ii) balance similarity and dissimilarity of the project assignments, and (iii) develop a software assessment tool suitable for evaluating the technological maturity of students' project deliverables, thus reducing the instructor's workload and the chance of oversight. The positive experience accumulated over 15 years shows that implementing the PCEM keeps students motivated to strive for success in rising to higher levels of their computational and software engineering skills.

  19. Photonic Integrated Circuits for Cost-Effective, High Port Density, and Higher Capacity Optical Communications Systems

    Science.gov (United States)

    Chiappa, Pierangelo

    Bandwidth-hungry services, such as higher-speed Internet, voice over IP (VoIP), and IPTV, allow people to exchange and store huge amounts of data among worldwide locations. In the age of global communications, domestic users, companies, and organizations around the world generate new content, making bandwidth needs grow exponentially, along with the need for new services. These bandwidth and connectivity demands represent a concern for operators, who require innovative technologies to be ready for scaling. To respond efficiently to these demands, Alcatel-Lucent is moving fast toward photonic integrated circuit technologies as the key to delivering the best performance at the lowest cost per bit per second. This article describes Alcatel-Lucent's contributions in strategic directions and achievements, as well as possible new developments.

  20. USAGE AND MAGNETIZATION OF CLOUD COMPUTING IN HIGHER STUDIES – RAJASTHAN

    Directory of Open Access Journals (Sweden)

    Ranjan Upadhyaya

    2013-07-01

    Young India stands on the doorstep of another revolution, that of cloud computing technology, and the whole world has seen the true colors of the Indian information revolution through the global recession. India, a vast and densely populated country (over 1.2 billion people according to the 2011 census), comprises roughly 50% to 60% new-age aspirants, of whom only about 30% are cloud-computing savvy. The uphill task ahead for the motherland is to train this new generation so that they can earn their livelihoods and connect to the outer world. The dream of the late Rajiv Gandhi and of Prof. Yashpal is propagating into reality, but much work remains. The cloud computing revolution is taking its toll and bringing many changes that were never expected in India. Cloud computing is a ladder of success for the nation's untrained youth, as India marches ahead with ubiquitous cloud computing in this era of liberalization, privatization and globalization.

  1. Cost-effective cloud computing: a case study using the comparative genomics tool, roundup.

    Science.gov (United States)

    Kudtarkar, Parul; Deluca, Todd F; Fusaro, Vincent A; Tonellato, Peter J; Wall, Dennis P

    2010-12-22

    Comparative genomics resources, such as ortholog detection tools and repositories are rapidly increasing in scale and complexity. Cloud computing is an emerging technological paradigm that enables researchers to dynamically build a dedicated virtual cluster and may represent a valuable alternative for large computational tools in bioinformatics. In the present manuscript, we optimize the computation of a large-scale comparative genomics resource-Roundup-using cloud computing, describe the proper operating principles required to achieve computational efficiency on the cloud, and detail important procedures for improving cost-effectiveness to ensure maximal computation at minimal costs. Utilizing the comparative genomics tool, Roundup, as a case study, we computed orthologs among 902 fully sequenced genomes on Amazon's Elastic Compute Cloud. For managing the ortholog processes, we designed a strategy to deploy the web service, Elastic MapReduce, and maximize the use of the cloud while simultaneously minimizing costs. Specifically, we created a model to estimate cloud runtime based on the size and complexity of the genomes being compared that determines in advance the optimal order of the jobs to be submitted. We computed orthologous relationships for 245,323 genome-to-genome comparisons on Amazon's computing cloud, a computation that required just over 200 hours and cost $8,000 USD, at least 40% less than expected under a strategy in which genome comparisons were submitted to the cloud randomly with respect to runtime. Our cost savings projections were based on a model that not only demonstrates the optimal strategy for deploying RSD to the cloud, but also finds the optimal cluster size to minimize waste and maximize usage. Our cost-reduction model is readily adaptable for other comparative genomics tools and potentially of significant benefit to labs seeking to take advantage of the cloud as an alternative to local computing infrastructure.
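
    The submission-ordering idea can be sketched as longest-processing-time-first list scheduling driven by a predicted runtime; the runtime model, the constant k, and the genome sizes below are toy assumptions, not the Roundup cost model.

    ```python
    import heapq

    def predict_hours(size_a: int, size_b: int, k: float = 1e-18) -> float:
        """Assumed toy model: runtime grows with the product of genome sizes."""
        return k * size_a * size_b

    def schedule(jobs, n_workers):
        """Assign the longest jobs first, each to the currently least-loaded worker."""
        heap = [(0.0, w) for w in range(n_workers)]       # (accumulated hours, worker)
        plan = {w: [] for w in range(n_workers)}
        for name, hours in sorted(jobs, key=lambda j: j[1], reverse=True):
            load, w = heapq.heappop(heap)
            plan[w].append(name)
            heapq.heappush(heap, (load + hours, w))
        return plan, max(load for load, _ in heap)        # plan and makespan

    jobs = [("human-mouse", predict_hours(3_100_000_000, 2_700_000_000)),
            ("human-yeast", predict_hours(3_100_000_000, 12_000_000)),
            ("ecoli-yeast", predict_hours(4_600_000, 12_000_000))]
    plan, makespan = schedule(jobs, n_workers=2)
    print(plan, f"makespan ~ {makespan:.2f} h")
    ```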

  2. Factors cost effectively improved using computer simulations of ...

    African Journals Online (AJOL)

    LPhidza

    effectively managed using computer simulations in semi-arid conditions pertinent to much of sub-Saharan Africa. ... small scale farmers to obtain optimal crop yields thus ensuring their food security and livelihood is ... those that simultaneously incorporate and simulate processes involved throughout the course of crop ...

  3. Scilab software as an alternative low-cost computing in solving the linear equations problem

    Science.gov (United States)

    Agus, Fahrul; Haviluddin

    2017-02-01

    Numerical computation packages are widely used both in teaching and research. These packages include licensed (proprietary) and open source (non-proprietary) software. One reason to use such a package is the complexity of mathematical functions (e.g., linear problems); moreover, the number of variables in linear and non-linear functions keeps increasing. The aim of this paper was to reflect on key aspects related to method, didactics and creative praxis in the teaching of linear equations in higher education. If implemented, this could contribute to better learning in mathematics (i.e., solving simultaneous linear equations), which is essential for future engineers. The focus of this study was to introduce the numerical computation package Scilab as an alternative low-cost computing environment. In this paper, Scilab was used to propose activities related to mathematical models. In this experiment, four numerical methods, Gaussian elimination, Gauss-Jordan, the inverse matrix, and lower-upper (LU) decomposition, were implemented. The results of this study showed that routines for these numerical methods can be created and explored using Scilab procedures, and that such routines could serve as teaching material for a course.
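
    An analogous exercise in Python/NumPy (standing in for the Scilab routines the paper describes) shows two of the four methods mentioned, LU decomposition and the inverse-matrix method, on a small system.

    ```python
    import numpy as np
    from scipy.linalg import lu_factor, lu_solve

    # Solve 2x + y - z = 8, -3x - y + 2z = -11, -2x + y + 2z = -3.
    A = np.array([[ 2.0,  1.0, -1.0],
                  [-3.0, -1.0,  2.0],
                  [-2.0,  1.0,  2.0]])
    b = np.array([8.0, -11.0, -3.0])

    x_lu  = lu_solve(lu_factor(A), b)   # LU decomposition (the practical choice)
    x_inv = np.linalg.inv(A) @ b        # inverse-matrix method (didactic only)
    print(x_lu, x_inv)                  # both give [ 2.  3. -1.]
    ```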

  4. FOREIGN AND DOMESTIC EXPERIENCE OF INTEGRATING CLOUD COMPUTING INTO PEDAGOGICAL PROCESS OF HIGHER EDUCATIONAL ESTABLISHMENTS

    Directory of Open Access Journals (Sweden)

    Nataliia A. Khmil

    2016-01-01

    In the present article, foreign and domestic experience of integrating cloud computing into the pedagogical process of higher educational establishments (H.E.E.) has been generalized. It has been stated that nowadays a lot of educational services are hosted in the cloud, e.g. infrastructure as a service (IaaS), platform as a service (PaaS) and software as a service (SaaS). The peculiarities of implementing cloud technologies by H.E.E. in Ukraine and abroad have been singled out; the products developed by the leading IT companies for using cloud computing in the higher education system, such as Microsoft for Education, Google Apps for Education and Amazon AWS Educate, have been reviewed. Examples of concrete types, methods and forms of learning and research work based on cloud services have been provided.

  5. Exploring the Benefits and Challenges of Using Laptop Computers in Higher Education Classrooms: A Formative Analysis

    OpenAIRE

    Robin H. Kay; Sharon Lauricella

    2011-01-01

    Because of decreased prices, increased convenience, and wireless access, an increasing number of college and university students are using laptop computers in their classrooms. This recent trend has forced instructors to address the educational consequences of using these mobile devices. The purpose of the current study was to analyze and assess beneficial and challenging laptop behaviours in higher education classrooms. Both quantitative and qualitative data were collected from 177 undergrad...

  6. Measuring the Cost of Quality in Higher Education: A Faculty Perspective

    Science.gov (United States)

    Ruhupatty, LeRoy; Maguad, Ben A.

    2015-01-01

    Most critical activities in colleges and universities are driven by financial considerations. It is thus important that revenues are found to support these activities or that ways are identified to streamline costs. One way to cut costs is to improve the efficiency of schools by addressing the issue of poor quality. In this paper, the cost of poor quality in…

  7. Adaptive Radar Signal Processing-The Problem of Exponential Computational Cost

    National Research Council Canada - National Science Library

    Rangaswamy, Muralidhar

    2003-01-01

    … Extensions to handle the case of non-Gaussian clutter statistics are presented. Current challenges of limited training data support, computational cost, and severely heterogeneous clutter backgrounds are outlined…

  8. Computational cost of isogeometric multi-frontal solvers on parallel distributed memory machines

    KAUST Repository

    Woźniak, Maciej; Paszyński, Maciej R.; Pardo, D.; Dalcin, Lisandro; Calo, Victor M.

    2015-01-01

    This paper derives theoretical estimates of the computational cost for an isogeometric multi-frontal direct solver executed on parallel distributed memory machines. We show theoretically that for the C^(p-1) global continuity of the isogeometric solution…

  9. Higher energy: is it necessary, is it worth the cost for radiation oncology?

    Science.gov (United States)

    Das, I J; Kase, K R

    1992-01-01

    The physical characteristics of the interactions of megavoltage photons and electrons with matter provide distinct advantages, relative to low-energy (orthovoltage) x rays, that lead to better radiation dose distributions in patients. Use of these high-energy radiations has resulted in better patient care, which has been reflected in improved radiation treatment outcomes in recent years. But, as the desire for higher-energy radiation beams increases, it becomes important to determine whether the physical characteristics that make megavoltage beams beneficial continue to provide a net advantage. It is demonstrated that, in fact, there is an energy range from 4 to 15 MV for photons and 4 to 20 MeV for electrons that is optimally suited for the treatment of cancer in humans. Radiation beams that exceed these maximum energies were found to add no advantage. This is because the costs (price of the unit, installation, maintenance, shielding for neutrons and photons) are not justified by either improved physical characteristics of the radiation (penetration, skin sparing, dose distribution) or treatment outcome. In fact, for photon beams some physical characteristics result in less desirable dose distributions, less accurate dosimetry, and increased safety problems as the energy increases: for example, increasingly diffuse beam edges, loss of electron equilibrium, uncertainty in dose perturbations at interfaces, increased neutron contamination, and the potential for higher personnel dose. The special features that make electron beams useful at lower energies, for example, skin sparing and small penetration, are lost at high energies. These physical factors are analyzed together with the economic factors related to radiation therapy patient care using megavoltage beams.

  10. Computer code for the costing and sizing of TNS tokamaks

    International Nuclear Information System (INIS)

    Sink, D.A.; Iwinski, E.M.

    1977-01-01

    A FORTRAN code for the COsting And Sizing of Tokamaks (COAST) is described. The code was written to conduct detailed analyses of the engineering features of the next tokamak fusion device following TFTR. The ORNL/Westinghouse study of TNS (The Next Step) has involved the investigation of a number of device options, each over a wide range of plasma sizes. A generalized description of TNS is incorporated in the code and includes refined modeling of over forty systems and subsystems. Considerable detailed design and analysis has provided the basis for the thermal, electrical, mechanical, nuclear, chemical, vacuum, and facility engineering of the various subsystems. Currently, the code provides a tool for the systematic comparison of four toroidal field (TF) coil technologies, allowing both D-shaped and circular coils. The coil technologies are: (1) copper (both room-temperature and liquid-nitrogen cooled), (2) superconducting NbTi, (3) superconducting Nb3Sn, and (4) a Cu/NbTi hybrid. For the poloidal field (PF) coil systems copper conductors are assumed. The ohmic heating (OH) coils are located within the machine bore and have an air core, while the shaping field (SF) coils are located either within or outside the TF coils. The PF coil self and mutual inductances are calculated from the geometry, and the PF coil power supplies are modeled to account for time-dependent profiles of voltages and currents as governed by input data. Plasma heating is assumed to be by neutral beams, and impurity control is either passive or by a poloidal divertor system. The size modeling allows considerable freedom in specifying physics assumptions, operating scenarios, TF operating margin, and component geometric and performance parameters. Cost relationships have been developed for both plant and capital equipment and for annual utility and fuel expenses. The code has been used successfully to reproduce the sizing and costing of TFTR in order to calibrate the various models

  11. Computing Cost Price for Cataract Surgery by the Activity Based Costing (ABC) Method at Hazrat-e-Zahra Hospital, Isfahan University of Medical Sciences, 2014

    Directory of Open Access Journals (Sweden)

    Masuod Ferdosi

    2016-10-01

    Background: Hospital managers need accurate information about actual costs to make efficient and effective decisions. In the activity-based costing method, activities are first recognized, and then direct and indirect costs are computed based on allocation methods. The aim of this study was to compute the cost price of cataract surgery by the Activity Based Costing (ABC) method at Hazrat-e-Zahra Hospital, Isfahan University of Medical Sciences. Methods: This was a cross-sectional study for computing the costs of cataract surgery by the activity-based costing technique at Hazrat-e-Zahra Hospital, Isfahan University of Medical Sciences, 2014. Data were collected through interviews and direct observation and analyzed with Excel software. Results: According to the results of this study, the total cost of cataract surgery was 8,368,978 Rials. Personnel costs comprised 62.2% (5,213,574 Rials) of the total cost of cataract surgery, the highest share of the surgery costs. The cost of consumables was 7.57% (1,992,852 Rials) of the surgery costs. Conclusion: Based on the results, there was a difference between the cost price of the services and the public tariff, which poses financial risks to the hospital. Therefore, it is recommended to use appropriate methods, such as activity-based costing, to compute costs. The cost price of cataract surgery can be reduced by strategies such as decreasing the cost of consumables.
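
    The mechanics of the method can be pictured briefly: direct costs are traced to the service, and the indirect pool is spread over activities by driver shares. The personnel, consumables, and total figures below are those quoted in the abstract; the activities and driver shares are hypothetical.

    ```python
    # Activity-based costing in miniature: trace direct costs, allocate the
    # indirect pool to activities via cost-driver shares, then sum.
    direct = {"personnel": 5_213_574, "consumables": 1_992_852}        # Rials
    indirect_pool = 8_368_978 - sum(direct.values())                   # remainder of reported total
    drivers = {"admission": 0.2, "operating_room": 0.5, "recovery": 0.3}  # assumed shares

    allocated = {activity: indirect_pool * share for activity, share in drivers.items()}
    total_cost_price = sum(direct.values()) + sum(allocated.values())
    print(f"indirect pool: {indirect_pool:,} Rials; total: {total_cost_price:,.0f} Rials")
    ```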

  12. Cost effective distributed computing for Monte Carlo radiation dosimetry

    International Nuclear Information System (INIS)

    Wise, K.N.; Webb, D.V.

    2000-01-01

    An inexpensive computing facility has been established for performing repetitive Monte Carlo simulations with the BEAM and EGS4/EGSnrc codes of linear accelerator beams, for calculating effective dose from diagnostic imaging procedures, and for modelling the ion chambers and phantoms used for the Australian high-energy absorbed dose standards. The facility currently consists of 3 dual-processor 450 MHz PCs linked by a high-speed LAN. The 3 PCs can be accessed either locally from a single keyboard/monitor/mouse combination using a SwitchView controller or remotely via a computer network from PCs with suitable communications software (e.g. Telnet, Kermit etc). All 3 PCs are identically configured with the Red Hat Linux 6.0 operating system. A Fortran compiler and the BEAM and EGS4/EGSnrc codes are available on the 3 PCs. The preparation of sequences of jobs utilising the Monte Carlo codes is simplified using load-distributing software (enFuzion 6.0, marketed by TurboLinux Inc, formerly Cluster from Active Tools) which efficiently distributes the computing load amongst all 6 processors. We describe 3 applications of the system: (a) energy spectra from radiotherapy sources, (b) mean mass-energy absorption coefficients and stopping powers for absolute absorbed dose standards, and (c) dosimetry for diagnostic procedures; (a) and (b) are based on the transport codes BEAM and FLURZnrc while (c) is a Fortran/EGS code developed at ARPANSA. Efficiency gains ranged from 3 for (c) to close to the theoretical maximum of 6 for (a) and (b), with the gain depending on the amount of 'bookkeeping' to begin each task and the time taken to complete a single task. We have found the use of a load-balancing batch processing system with many PCs to be an economical way of achieving greater productivity for Monte Carlo calculations or for any computer-intensive task requiring many runs with different parameters. Copyright (2000) Australasian College of Physical Scientists and Engineers in Medicine
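
    The reported range of gains (3 to nearly 6 on six processors) is consistent with a simple farm model in which the per-task 'bookkeeping' stays serial; the model and the timings below are an illustration, not measurements from the facility.

    ```python
    # Speedup of a task farm when each task has a serial setup phase: as setup
    # shrinks relative to the task body, the speedup approaches the processor count.
    def farm_speedup(p: int, task_time: float, setup_time: float) -> float:
        return (setup_time + task_time) / (setup_time + task_time / p)

    print(farm_speedup(6, task_time=600.0, setup_time=1.0))   # ~5.95, little bookkeeping
    print(farm_speedup(6, task_time=60.0, setup_time=10.0))   # ~3.5, heavy bookkeeping
    ```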

  13. Is computer aided detection (CAD) cost effective in screening mammography? A model based on the CADET II study

    Science.gov (United States)

    2011-01-01

    Background: Single reading with computer-aided detection (CAD) is an alternative to double reading for detecting cancer in screening mammograms. The aim of this study is to investigate whether the use of a single reader with CAD is more cost-effective than double reading. Methods: Based on data from the CADET II study, the cost-effectiveness of single reading with CAD versus double reading was measured in terms of cost per cancer detected. The cost (£, year 2007/08) of single reading with CAD versus double reading was estimated assuming a health and social service perspective and a 7-year time horizon. As the equipment cost varies according to the unit size, a separate analysis was conducted for high-, average- and low-volume screening units. One-way sensitivity analyses were performed by varying the reading time, equipment and assessment cost, recall rate and reader qualification. Results: CAD is cost-increasing for all sizes of screening unit. The introduction of CAD is cost-increasing compared to double reading because the cost of CAD equipment, staff training and the higher assessment cost associated with CAD are greater than the saving in reading costs. The introduction of single reading with CAD, in place of double reading, would produce an additional cost of £227 and £253 per 1,000 women screened in high- and average-volume units respectively. In low-volume screening units, the high cost of purchasing the equipment will result in an additional cost of £590 per 1,000 women screened. One-way sensitivity analysis showed that the factors having the greatest effect on the cost-effectiveness of CAD with single reading compared with double reading were the reading time and the reader's professional qualification (radiologist versus advanced practitioner). Conclusions: Without improvements in CAD effectiveness (e.g. a decrease in the recall rate) CAD is unlikely to be a cost-effective alternative to double reading for mammography screening in the UK. This study…

  14. Costs incurred by applying computer-aided design/computer-aided manufacturing techniques for the reconstruction of maxillofacial defects.

    Science.gov (United States)

    Rustemeyer, Jan; Melenberg, Alex; Sari-Rieger, Aynur

    2014-12-01

    This study aims to evaluate the additional costs incurred by using a computer-aided design/computer-aided manufacturing (CAD/CAM) technique for reconstructing maxillofacial defects by analyzing typical cases. The medical charts of 11 consecutive patients who underwent the CAD/CAM technique were considered, and invoices from the companies providing the CAD/CAM devices were reviewed for every case. The number of devices used was significantly correlated with cost (r = 0.880). Costs were significantly higher in cases in which prebent reconstruction plates were used (€3346.00 ± €29.00) than in cases in which they were not (€2534.22 ± €264.48), and differed significantly between the use of two, three and four devices, even when ignoring the cost of reconstruction plates. Additional fees provided by statutory health insurance covered a mean of 171.5% ± 25.6% of the cost of the CAD/CAM devices. Since the additional fees provide financial compensation, we believe that the CAD/CAM technique is suited for wide application and not restricted to complex cases. Where additional fees/funds are not available, the CAD/CAM technique might be unprofitable, so the decision whether or not to use it remains a case-to-case decision with respect to cost versus benefit. Copyright © 2014 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.

  15. hPIN/hTAN: Low-Cost e-Banking Secure against Untrusted Computers

    Science.gov (United States)

    Li, Shujun; Sadeghi, Ahmad-Reza; Schmitz, Roland

    We propose hPIN/hTAN, a low-cost token-based e-banking protection scheme for the setting in which the adversary has full control over the user's computer. Compared with existing hardware-based solutions, hPIN/hTAN requires neither a second trusted channel, nor a secure keypad, nor a computationally expensive encryption module.

  16. Modelling the Intention to Adopt Cloud Computing Services: A Transaction Cost Theory Perspective

    Directory of Open Access Journals (Sweden)

    Ogan Yigitbasioglu

    2014-11-01

    Full Text Available This paper uses transaction cost theory to study cloud computing adoption. A model is developed and tested with data from an Australian survey. According to the results, perceived vendor opportunism and perceived legislative uncertainty around cloud computing were significantly associated with perceived cloud computing security risk. There was also a significant negative relationship between perceived cloud computing security risk and the intention to adopt cloud services. This study also reports on adoption rates of cloud computing in terms of applications, as well as the types of services used.

  17. Estimating boiling water reactor decommissioning costs. A user's manual for the BWR Cost Estimating Computer Program (CECP) software: Draft report for comment

    International Nuclear Information System (INIS)

    Bierschbach, M.C.

    1994-12-01

    With the issuance of the Decommissioning Rule (July 27, 1988), nuclear power plant licensees are required to submit decommissioning plans and cost estimates to the U.S. Nuclear Regulatory Commission (NRC) for review. This user's manual and the accompanying Cost Estimating Computer Program (CECP) software provide a cost-calculating methodology to the NRC staff that will assist them in assessing the adequacy of the licensee submittals. The CECP, designed to be used on a personal computer, provides estimates of the cost of decommissioning BWR power stations to the point of license termination. Such cost estimates include component, piping, and equipment removal costs; packaging costs; decontamination costs; transportation costs; burial costs; and manpower costs. In addition to costs, the CECP also calculates burial volumes, person-hours, crew-hours, and exposure person-hours associated with decommissioning.
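
    A minimal sketch of the kind of cost roll-up the CECP performs is given below. The categories mirror those listed in the abstract; the data structure and the numbers are hypothetical, not the program's actual data model.

```python
# Toy roll-up of decommissioning activity costs in the spirit of the CECP.
# All figures are invented placeholders.
from dataclasses import dataclass

@dataclass
class ActivityCost:
    removal: float          # component/piping/equipment removal ($)
    packaging: float
    decontamination: float
    transportation: float
    burial: float
    manpower: float
    burial_volume_m3: float
    person_hours: float

    def total_cost(self) -> float:
        return (self.removal + self.packaging + self.decontamination
                + self.transportation + self.burial + self.manpower)

activities = [
    ActivityCost(1.2e6, 3.0e5, 4.5e5, 2.0e5, 8.0e5, 6.0e5, 140.0, 9_500.0),
    ActivityCost(7.5e5, 1.8e5, 2.2e5, 1.1e5, 5.4e5, 4.1e5, 90.0, 6_200.0),
]
print(f"total cost  : ${sum(a.total_cost() for a in activities):,.0f}")
print(f"burial vol  : {sum(a.burial_volume_m3 for a in activities):,.1f} m^3")
print(f"person-hours: {sum(a.person_hours for a in activities):,.0f}")
```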

  18. Higher Education Cloud Computing in South Africa: Towards Understanding Trust and Adoption issues

    Directory of Open Access Journals (Sweden)

    Karl Van Der Schyff

    2014-12-01

    Full Text Available This paper sets out to study the views of key stakeholders on the issue of cloud information security within institutions of Higher Education. A specific focus is on understanding trust and the adoption of cloud computing in the context of the unique operational requirements of South African universities. Contributions are made on both a methodological and theoretical level. Methodologically, the study contributes by employing an Interpretivist approach and using Thematic Analysis in a topic area often studied quantitatively, thus affording researchers the opportunity to gain the necessary in-depth insight into how key stakeholders view cloud security and trust. A theoretical contribution is made in the form of a trust-centric conceptual framework that illustrates how the qualitative data relate to concepts innate to cloud computing trust and adoption. Both these contributions lend credence to the fact that there is a need to address cloud information security with a specific focus on the contextual elements that surround South African universities. The paper concludes with some considerations for implementing and investigating cloud computing services in Higher Education contexts in South Africa.

  19. Reflections on Costing, Pricing and Income Measurement at UK Higher Education Institutions

    Science.gov (United States)

    Oduoza, Chike F.

    2009-01-01

    In these days of radical contraction of funding and expansion in student numbers, universities are under pressure to prioritise their resources, as well as to achieve effective costing and pricing to support judgement and decision making for funding and any external work undertaken. This study reviews costing, pricing and income measurement in…

  20. The Cost of Chaos in the Curriculum. Perspectives on Higher Education

    Science.gov (United States)

    Capaldi Phillips, Elizabeth D.; Poliakoff, Michael B.

    2015-01-01

    ACTA's report "The Cost of Chaos in the Curriculum" reveals that the vast array of course choices given to college students is a cause of exploding costs and poor academic outcomes. And a bloated undergraduate curriculum is particularly detrimental to the success of students from lower socioeconomic backgrounds. The report documents how…

  1. Counting the Cost, Reconciling the Benefits: Understanding Employer Investment in Higher Apprenticeships in Accounting

    Science.gov (United States)

    Gambin, Lynn; Hogarth, Terence

    2016-01-01

    Lack of progression to higher education amongst those who complete an Advanced Apprenticeship in England and the country's need for higher level skills led to the introduction of Higher Apprenticeships in 2009. Whilst Higher Apprenticeships would be expected to facilitate learner progression, the volume of these has remained low. In this paper,…

  2. Replacement power costs due to nuclear-plant outages: a higher standard of care

    International Nuclear Information System (INIS)

    Gransee, M.F.

    1982-01-01

    This article examines recent state public utility commission cases that deal with the high costs of replacement power that utilities must purchase after a nuclear power plant outage. Although most commissions have approved such expenses, there may be a trend toward splitting these costs between ratepayers and stockholders. Commissions are demanding a management prudence test, to determine the cause of the outage and whether it meets the reasonable-man standard, before allowing these costs to be passed along to ratepayers. Unless the standard is applied with flexibility, however, utility companies could invoke the defenses of traditional common-law negligence.

  3. Plant process computer replacements - techniques to limit installation schedules and costs

    International Nuclear Information System (INIS)

    Baker, M.D.; Olson, J.L.

    1992-01-01

    Plant process computer systems, a standard fixture in all nuclear power plants, are used to monitor and display important plant process parameters. Scanning thousands of field sensors and alarming out-of-limit values, these computer systems are heavily relied on by control room operators. The original nuclear steam supply system (NSSS) vendor for the power plant often supplied the plant process computer. Designed using sixties and seventies technology, a plant's original process computer has been obsolete for some time. Driven by increased maintenance costs and new US Nuclear Regulatory Commission regulations such as NUREG-0737, Suppl. 1, many utilities have replaced their process computers with more modern computer systems. Given that computer systems are by their nature prone to rapid obsolescence, this replacement cycle will likely repeat. A process computer replacement project can be a significant capital expenditure and must be performed during a scheduled refueling outage. The object of the installation process is to install a working system on schedule. Experience gained by supervising several computer replacement installations has taught lessons that, if applied, will shorten the schedule and limit the risk of costly delays. Examples illustrating this technique are given. This paper and these examples deal only with the installation process and assume that the replacement computer system has been adequately designed and has been development- and factory-tested.

  4. Low rank approach to computing first and higher order derivatives using automatic differentiation

    International Nuclear Information System (INIS)

    Reed, J. A.; Abdel-Khalik, H. S.; Utke, J.

    2012-01-01

    This manuscript outlines a new approach for increasing the efficiency of applying automatic differentiation (AD) to large scale computational models. By using the principles of the Efficient Subspace Method (ESM), low rank approximations of the derivatives for first and higher orders can be calculated using minimized computational resources. The output obtained from nuclear reactor calculations typically has a much smaller numerical rank compared to the number of inputs and outputs. This rank deficiency can be exploited to reduce the number of derivatives that need to be calculated using AD. The effective rank can be determined according to ESM by computing derivatives with AD at random inputs. Reduced or pseudo variables are then defined and new derivatives are calculated with respect to the pseudo variables. Two different AD packages are used: OpenAD and Rapsodia. OpenAD is used to determine the effective rank and the subspace that contains the derivatives. Rapsodia is then used to calculate derivatives with respect to the pseudo variables for the desired order. The overall approach is applied to two simple problems and to MATWS, a safety code for sodium cooled reactors. (authors)
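
    The ESM procedure described above can be illustrated with a toy dense Jacobian standing in for the AD-instrumented model. The probe count, tolerance and matrix sizes below are arbitrary; in the actual approach the products J^T w would come from reverse-mode AD (OpenAD), and the reduced derivatives from Rapsodia.

```python
# Sketch of rank-revealing probing and pseudo variables, with a synthetic
# low-rank Jacobian standing in for derivatives supplied by AD.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_out, true_rank = 200, 150, 5
J = rng.standard_normal((n_out, true_rank)) @ rng.standard_normal((true_rank, n_in))

# 1) Reverse-mode AD would supply J^T @ w for random adjoint seeds w.
k = 20                                        # a few more probes than the expected rank
probes = J.T @ rng.standard_normal((n_out, k))

# 2) Effective rank from the decay of the probes' singular values.
s = np.linalg.svd(probes, compute_uv=False)
r = int(np.sum(s > 1e-10 * s[0]))
print("effective rank:", r)                   # recovers 5

# 3) Pseudo variables: an orthonormal basis Q of the active input subspace.
#    Derivatives are then taken w.r.t. the r pseudo variables only.
Q, _ = np.linalg.qr(probes)
J_pseudo = J @ Q[:, :r]                       # n_out x r instead of n_out x n_in
print("reduced Jacobian shape:", J_pseudo.shape)
```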

  5. Optimization of economic load dispatch of higher order general cost polynomials and its sensitivity using modified particle swarm optimization

    International Nuclear Information System (INIS)

    Saber, Ahmed Yousuf; Chakraborty, Shantanu; Abdur Razzak, S.M.; Senjyu, Tomonobu

    2009-01-01

    This paper presents a modified particle swarm optimization (MPSO) for the constrained economic load dispatch (ELD) problem. Real cost functions are more complex than conventional second-order cost functions when multi-fuel operations, valve-point effects, accurate curve fitting, etc., are considered in a deregulated, changing market. The proposed modified particle swarm optimization (PSO) consists of a problem-dependent variable number of promising values (in the velocity vector), a unit vector, and an error- and iteration-dependent step length. It reliably and accurately tracks the continuously changing solution of the complex cost function, and no extra effort is needed for the complex higher-order cost polynomials in ELD. Constraint management is incorporated in the modified PSO. The modified PSO balances local and global searching abilities, and an appropriate fitness function helps it converge quickly. To keep the method from freezing, stagnated/idle particles are reset. The sensitivity of the higher-order cost polynomials is also analyzed visually to show their importance in the optimization of ELD. Finally, benchmark data sets and methods are used to show the effectiveness of the proposed method. (author)
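
    To make the role of the higher-order polynomials concrete, here is a toy PSO for a two-generator dispatch with quartic cost curves. The coefficients, bounds, PSO constants and the crude power-balance repair are all illustrative; they are not the paper's modified PSO operators.

```python
# Toy PSO for a two-generator economic load dispatch with quartic
# (higher-order) cost polynomials. All numbers are illustrative.
import numpy as np

rng = np.random.default_rng(1)
coeffs = np.array([[5.0, 2.2, 0.004, 1e-5, 2e-8],    # a0..a4 for unit 1
                   [4.0, 2.5, 0.003, 2e-5, 1e-8]])   # a0..a4 for unit 2
pmin, pmax, demand = np.array([20., 20.]), np.array([200., 200.]), 300.0

def cost(P):
    """sum_i a0 + a1*P_i + a2*P_i^2 + a3*P_i^3 + a4*P_i^4 for each particle."""
    return sum(np.polyval(c[::-1], P[:, i]) for i, c in enumerate(coeffs))

def repair(P):
    """Crude power-balance handling: rescale to demand, then clip to limits."""
    P = P * (demand / P.sum(axis=1, keepdims=True))
    return np.clip(P, pmin, pmax)

n = 30
P = repair(rng.uniform(pmin, pmax, (n, 2)))
V = np.zeros_like(P)
pbest, pbest_f = P.copy(), cost(P)
for _ in range(200):
    gbest = pbest[np.argmin(pbest_f)]
    V = (0.7 * V + 1.5 * rng.random(P.shape) * (pbest - P)
               + 1.5 * rng.random(P.shape) * (gbest - P))
    P = repair(P + V)
    f = cost(P)
    better = f < pbest_f
    pbest[better], pbest_f[better] = P[better], f[better]
print("best dispatch:", pbest[np.argmin(pbest_f)], "cost:", pbest_f.min())
```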

  6. Exploring perceptions and beliefs about the cost of fruit and vegetables and whether they are barriers to higher consumption.

    Science.gov (United States)

    Chapman, Kathryn; Goldsbury, David; Watson, Wendy; Havill, Michelle; Wellard, Lyndal; Hughes, Clare; Bauman, Adrian; Allman-Farinelli, Margaret

    2017-06-01

    Fruit and vegetable (F&V) consumption is below recommendations, and cost may be a barrier to meeting recommendations. Limited evidence exists on individual perceptions about the cost, actual spending and consumption of F&V. This study investigated perceptions and beliefs about cost of F&V and whether this is a barrier to higher consumption. An online survey of Australian adults (n = 2474) measured F&V consumption; expenditure on F&V and food; and perceived barriers to consumption. Multivariable logistic regression examined associations between participants' responses about cost of F&V and demographic factors, and with actual consumption and expenditure on F&V. Cost was identified as a barrier for 29% of people not meeting recommended fruit servings and for 14% of people not meeting recommendations for vegetables. Cost was a more common barrier for those on lower incomes (fruit aOR 1.89; 95% CI 1.20-2.98 and vegetables aOR 2.94; 95% CI 1.97-4.39) and less common for older participants (fruit aOR 0.33; 95% CI 0.17-0.62 and vegetables aOR 0.31; 95% CI 0.18-0.52). There was no association between the perceived barriers and actual F&V spending. Twenty percent of participants said F&V were not affordable; 39% said cost made it difficult to buy F&V, and for 23% the cost of F&V meant they bought less than desired. A minority reported F&V were not affordable where they shopped and that cost was a barrier to higher consumption. However, it is apparent that young adults and those on low incomes eat less than they would like because of cost. Strategies that remove financial impediments to consumption are indicated for these population sub-groups. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Energy- and cost-efficient lattice-QCD computations using graphics processing units

    Energy Technology Data Exchange (ETDEWEB)

    Bach, Matthias

    2014-07-01

    Quarks and gluons are the building blocks of all hadronic matter, like protons and neutrons. Their interaction is described by Quantum Chromodynamics (QCD), a theory under test by large-scale experiments like the Large Hadron Collider (LHC) at CERN and, in the future, the Facility for Antiproton and Ion Research (FAIR) at GSI. However, perturbative methods can only be applied to QCD at high energies. Studies from first principles are possible via a discretization onto an Euclidean space-time grid. This discretization of QCD is called Lattice QCD (LQCD) and is the only ab-initio option outside of the high-energy regime. LQCD is extremely compute and memory intensive. In particular, it is by definition always bandwidth limited. Thus - despite the complexity of LQCD applications - it led to the development of several specialized compute platforms and influenced the development of others. However, in recent years General-Purpose computation on Graphics Processing Units (GPGPU) came up as a new means for parallel computing. Contrary to machines traditionally used for LQCD, graphics processing units (GPUs) are a mass-market product. This promises advantages in both the pace at which higher-performing hardware becomes available and its price. CL2QCD is an OpenCL-based implementation of LQCD using Wilson fermions that was developed within this thesis. It operates on GPUs by all major vendors as well as on central processing units (CPUs). On the AMD Radeon HD 7970 it provides the fastest double-precision D kernel for a single GPU, achieving 120 GFLOPS. D - the most compute-intensive kernel in LQCD simulations - is commonly used to compare LQCD platforms. This performance is enabled by an in-depth analysis of optimization techniques for bandwidth-limited codes on GPUs. Further, analysis of the communication between GPU and CPU, as well as between multiple GPUs, enables high-performance Krylov space solvers and linear scaling to multiple GPUs within a single system. LQCD

  8. Energy- and cost-efficient lattice-QCD computations using graphics processing units

    International Nuclear Information System (INIS)

    Bach, Matthias

    2014-01-01

    Quarks and gluons are the building blocks of all hadronic matter, like protons and neutrons. Their interaction is described by Quantum Chromodynamics (QCD), a theory under test by large-scale experiments like the Large Hadron Collider (LHC) at CERN and, in the future, the Facility for Antiproton and Ion Research (FAIR) at GSI. However, perturbative methods can only be applied to QCD at high energies. Studies from first principles are possible via a discretization onto an Euclidean space-time grid. This discretization of QCD is called Lattice QCD (LQCD) and is the only ab-initio option outside of the high-energy regime. LQCD is extremely compute and memory intensive. In particular, it is by definition always bandwidth limited. Thus - despite the complexity of LQCD applications - it led to the development of several specialized compute platforms and influenced the development of others. However, in recent years General-Purpose computation on Graphics Processing Units (GPGPU) came up as a new means for parallel computing. Contrary to machines traditionally used for LQCD, graphics processing units (GPUs) are a mass-market product. This promises advantages in both the pace at which higher-performing hardware becomes available and its price. CL2QCD is an OpenCL-based implementation of LQCD using Wilson fermions that was developed within this thesis. It operates on GPUs by all major vendors as well as on central processing units (CPUs). On the AMD Radeon HD 7970 it provides the fastest double-precision D kernel for a single GPU, achieving 120 GFLOPS. D - the most compute-intensive kernel in LQCD simulations - is commonly used to compare LQCD platforms. This performance is enabled by an in-depth analysis of optimization techniques for bandwidth-limited codes on GPUs. Further, analysis of the communication between GPU and CPU, as well as between multiple GPUs, enables high-performance Krylov space solvers and linear scaling to multiple GPUs within a single system. LQCD

  9. Tuberculosis screening of travelers to higher-incidence countries: A cost-effectiveness analysis

    Directory of Open Access Journals (Sweden)

    Menzies Dick

    2008-06-01

    Full Text Available Abstract Background Travelers to countries with high tuberculosis incidence can acquire infection during travel. We sought to compare four screening interventions for travelers from low-incidence countries who visit countries with varying tuberculosis incidence. Methods Decision analysis model: We considered hypothetical cohorts of 1,000 travelers, 21 years old, visiting Mexico, the Dominican Republic, or Haiti for three months. Travelers departed from and returned to the United States or Canada; they were born in the United States, Canada, or the destination countries. The time horizon was 20 years, with 3% annual discounting of future costs and outcomes. The analysis was conducted from the health care system perspective. Screening involved tuberculin skin testing (post-travel in three strategies, with baseline pre-travel tests in two) or chest radiography post-travel (one strategy). Returning travelers with tuberculin conversion (one strategy) or other evidence of latent tuberculosis (three strategies) were offered treatment. The main outcome was cost (in 2005 US dollars) per tuberculosis case prevented. Results For all travelers, a single post-trip tuberculin test was most cost-effective. The associated cost estimate per case prevented ranged from $21,406 for Haitian-born travelers to Haiti, to $161,196 for US-born travelers to Mexico. In all sensitivity analyses, the single post-trip tuberculin test remained most cost-effective. For US-born travelers to Haiti, this strategy was associated with cost savings for trips over 22 months. Screening was more cost-effective with increasing trip duration and infection risk, and less so with poorer treatment adherence. Conclusion A single post-trip tuberculin skin test was the most cost-effective strategy considered, for travelers from the United States or Canada. The analysis did not evaluate the use of interferon-gamma release assays, which would be most relevant for travelers who received BCG
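
    The skeleton of the cost-effectiveness arithmetic is shown below: incremental program cost divided by cases prevented in the cohort. Every input is a placeholder rather than a parameter from the paper's decision model; the numbers are merely chosen to land in the same order of magnitude as the quoted estimates.

```python
# Hedged sketch of "cost per TB case prevented" arithmetic. All inputs
# are invented placeholders, not values from the study's decision model.
def cost_per_case_prevented(n, test_cost, infection_risk, test_sensitivity,
                            treat_uptake, treat_cost, treat_efficacy,
                            lifetime_tb_risk):
    program_cost = (n * test_cost
                    + n * infection_risk * test_sensitivity * treat_uptake * treat_cost)
    cases_prevented = (n * infection_risk * test_sensitivity * treat_uptake
                       * treat_efficacy * lifetime_tb_risk)
    return program_cost / cases_prevented

print(f"${cost_per_case_prevented(1000, 15, 0.02, 0.9, 0.7, 350, 0.65, 0.10):,.0f}"
      " per case prevented")
```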

  10. Comparative cost analysis -- computed tomography vs. alternative diagnostic procedures, 1977-1980

    International Nuclear Information System (INIS)

    Gempel, P.A.; Harris, G.H.; Evans, R.G.

    1977-12-01

    In comparing the total national cost of utilizing computed tomography (CT) for medically indicated diagnoses with that of conventional x-ray, ultrasonography, nuclear medicine, and exploratory surgery, this investigation concludes that there was little, if any, added net cost from CT use in 1977, nor will there be in 1980. Computed tomography, generally recognized as a reliable and useful diagnostic modality, has the potential to reduce net costs provided that an optimal number of units can be made available to physicians and patients to achieve projected reductions in alternative procedures. This study examines the actual cost impact of CT on both cranial and body diagnostic procedures. For abdominal and mediastinal disorders, CT scanning is just beginning to emerge as a diagnostic modality. As such, clinical experience is somewhat limited and the authors assume that no significant reduction in conventional procedures took place in 1977. It is estimated that the approximately 375,000 CT body procedures performed in 1977 represent only a 5 percent cost increase over use of other diagnostic modalities. It is projected that 2,400,000 CT body procedures will be performed in 1980 and, depending on assumptions used, total body diagnostic costs will increase only slightly or be reduced. Thirty-one tables appear throughout the text presenting cost data broken down by type of diagnostic procedure used and projections by year. Appendixes present technical cost components for diagnostic procedures, the comparative efficacy of CT as revealed in abstracts of published literature, selected medical diagnoses, and references.

  11. Costs of cloud computing for a biometry department. A case study.

    Science.gov (United States)

    Knaus, J; Hieke, S; Binder, H; Schwarzer, G

    2013-01-01

    "Cloud" computing providers, such as the Amazon Web Services (AWS), offer stable and scalable computational resources based on hardware virtualization, with short, usually hourly, billing periods. The idea of pay-as-you-use seems appealing for biometry research units which have only limited access to university or corporate data center resources or grids. This case study compares the costs of an existing heterogeneous on-site hardware pool in a Medical Biometry and Statistics department to a comparable AWS offer. The "total cost of ownership", including all direct costs, is determined for the on-site hardware, and hourly prices are derived, based on actual system utilization during the year 2011. Indirect costs, which are difficult to quantify are not included in this comparison, but nevertheless some rough guidance from our experience is given. To indicate the scale of costs for a methodological research project, a simulation study of a permutation-based statistical approach is performed using AWS and on-site hardware. In the presented case, with a system utilization of 25-30 percent and 3-5-year amortization, on-site hardware can result in smaller costs, compared to hourly rental in the cloud dependent on the instance chosen. Renting cloud instances with sufficient main memory is a deciding factor in this comparison. Costs for on-site hardware may vary, depending on the specific infrastructure at a research unit, but have only moderate impact on the overall comparison and subsequent decision for obtaining affordable scientific computing resources. Overall utilization has a much stronger impact as it determines the actual computing hours needed per year. Taking this into ac count, cloud computing might still be a viable option for projects with limited maturity, or as a supplement for short peaks in demand.

  12. Marketing Policy and Its Cost in a College of Higher Education.

    Science.gov (United States)

    Riley, Eric

    1984-01-01

    Discusses the development of advertising and publicity strategies and policy for student recruitment purposes at a college of education in the United Kingdom between 1972 and 1982. Covers changes in staff attitudes, selection of media, organization of administration, and cost factors. (PGD)

  13. Value Added: The Costs and Benefits of College Preparatory Programs. American Higher Education Report Series

    Science.gov (United States)

    Swail, Watson Scott

    2004-01-01

    Rarely do stakeholders ask about the effectiveness of outreach programs or whether they are an efficient use of tax dollars and philanthropic funds. As government budgets continue to be constrained and philanthropic investment gets more competitive, there is a growing acknowledgment of the need to look at the cost/benefit of these programs and…

  14. The Future Train Wreck: Paying for Medical Costs for Higher Education's Retirees

    Science.gov (United States)

    Biggs, John H.

    2006-01-01

    Trustees and administrators today confront one of two problems with post-retirement medical care. First, if institutions provide no support for their retirees' medical care, they implicitly offer a powerful incentive for senior faculty to stay on. The compensation and opportunity costs of this effect are obviously very high. But, second, if they…

  15. The concept of computer software designed to identify and analyse logistics costs in agricultural enterprises

    Directory of Open Access Journals (Sweden)

    Karol Wajszczyk

    2009-01-01

    Full Text Available The study comprised research, development and computer programming work concerning the development of a concept for an IT tool to be used in the identification and analysis of logistics costs in agricultural enterprises in terms of a process-based approach. As a result of this research and programming work, an overall functional and IT concept of software was developed for the identification and analysis of logistics costs in agricultural enterprises.

  16. A low-cost vector processor boosting compute-intensive image processing operations

    Science.gov (United States)

    Adorf, Hans-Martin

    1992-01-01

    Low-cost vector processing (VP) is within reach of everyone seriously engaged in scientific computing. The advent of affordable add-on VP boards for standard workstations, complemented by mathematical/statistical libraries, is beginning to impact compute-intensive tasks such as image processing. A case in point is the restoration of distorted images from the Hubble Space Telescope. A low-cost implementation is presented of the standard Tarasko-Richardson-Lucy restoration algorithm on an Intel i860-based VP board which is seamlessly interfaced to a commercial, interactive image processing system. First experience is reported (including some benchmarks for standalone FFTs) and some conclusions are drawn.
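
    For reference, the compute-intensive core ported to the vector board is a loop of FFT-based convolutions. A compact Richardson-Lucy deconvolution of this general kind (a sketch, not the article's i860 implementation) looks like:

```python
# Minimal FFT-based Richardson-Lucy deconvolution; a generic sketch,
# not the article's vectorized i860 code.
import numpy as np

def richardson_lucy(image, psf, iterations=30):
    """Richardson-Lucy deconvolution with cyclic (FFT) convolution.
    Assumes the PSF is stored wrapped, i.e. with its origin at index [0, 0]."""
    image = np.asarray(image, dtype=float)
    psf = np.asarray(psf, dtype=float)
    psf = psf / psf.sum()
    otf = np.fft.rfft2(psf, s=image.shape)        # optical transfer function
    estimate = np.full(image.shape, image.mean())
    for _ in range(iterations):
        blurred = np.fft.irfft2(np.fft.rfft2(estimate) * otf, s=image.shape)
        ratio = image / np.maximum(blurred, 1e-12)
        # Correlation with the PSF = convolution with its conjugate spectrum.
        estimate *= np.fft.irfft2(np.fft.rfft2(ratio) * np.conj(otf), s=image.shape)
    return estimate
```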

  17. Political Economy of Cost-Sharing in Higher Education: The Case of Jordan

    Science.gov (United States)

    Kanaan, Taher H.; Al-Salamat, Mamdouh N.; Hanania, May D.

    2011-01-01

    This article analyzes patterns of expenditure on higher education in Jordan, explores the current system's adequacy, efficiency, and equity, and identifies its strengths and weaknesses in light of current constraints and future challenges. Among the constraints are the relatively low public expenditure on higher education, leaving households to…

  18. Cost Economies in the Provision of Higher Education for International Students: Australian Evidence

    Science.gov (United States)

    Zhang, Liang-Cheng; Worthington, Andrew C.; Hu, Mingyan

    2017-01-01

    In the past few decades, the additional revenues available via higher education exports (through both relatively higher prices and increased enrolments) have attracted the attention of providers in many developed countries, not least in Anglophone countries like the USA, the UK, Canada and Australia. However, while the revenue case is strong, the…

  19. Implementation of Cost Sharing in the Ethiopian Higher Education Landscape: Critical Assessment and the Way Forward

    Science.gov (United States)

    Yizengaw, Teshome

    2007-01-01

    Higher education participation in Ethiopia is very low (about 1.5 per cent) and is the major source of the critical shortage of educated and skilled human resource. The higher education system in Ethiopia is moving away from exclusive and dismally low enrolments towards increasing participation. To expand access, to redress inequitable subsidies…

  20. Strategies for compensating for higher costs of geothermal electricity with environmental benefits

    International Nuclear Information System (INIS)

    Murphy, H.; Niitsuma, Hiroaki

    1999-01-01

    After very high growth in the 1980s, geothermal electricity production slowed in the mid- and late 1990s. While Japanese, Indonesian and Philippine geothermal growth has remained high as a consequence of supportive government policies, geothermal electricity production has been flat or reduced in much of Europe and North America. Low prices for coal and natural gas, combined with deregulation, mean that in much of the world electricity from new fuel-burning plants can be provided at half the cost of new geothermal electricity. Cost-cutting must be pursued, but is unlikely to close the price gap by itself. Geothermal production is widely perceived as being environmentally clean, but this is not unambiguously true, and reinjection is required for this advantage to be fully realized. Strategies for monetizing the environmental advantages of geothermal energy, including a carbon tax, are discussed. (author)

  1. The Rapid Transit System That Achieves Higher Performance with Lower Life-Cycle Costs

    Science.gov (United States)

    Sone, Satoru; Takagi, Ryo

    In the age of traction systems built from inverters and AC traction motors, distributed traction with pure electric braking in regenerative mode has been recognised as very advantageous. This paper proposes a new system as the lowest life-cycle-cost system for high-performance rapid transit: a new architecture and optimum parameters for the power feeding system, and a new method of running trains. In Japan, some components of this proposal, i.e. pure electric braking and various countermeasures for reducing regeneration losses, are already popular, but not yet the new running method, which improves utilisation of the equipment and lowers life-cycle cost. One example of what is proposed in this paper will be realised as the Tsukuba Express, which is under construction as the most modern commuter railway in the Greater Tokyo area.

  2. Exploring the Benefits and Challenges of Using Laptop Computers in Higher Education Classrooms: A Formative Analysis

    Directory of Open Access Journals (Sweden)

    Robin H. Kay

    2011-04-01

    Full Text Available Because of decreased prices, increased convenience, and wireless access, an increasing number of college and university students are using laptop computers in their classrooms. This recent trend has forced instructors to address the educational consequences of using these mobile devices. The purpose of the current study was to analyze and assess beneficial and challenging laptop behaviours in higher education classrooms. Both quantitative and qualitative data were collected from 177 undergraduate university students (89 males, 88 females). Key benefits observed include note-taking activities, in-class laptop-based academic tasks, collaboration, increased focus, improved organization and efficiency, and addressing special needs. Key challenges noted include other students' distracting laptop behaviours, instant messaging, surfing the web, playing games, watching movies, and decreased focus. Nearly three-quarters of the students claimed that laptops were useful in supporting their academic experience. Twice as many benefits were reported compared to challenges. It is speculated that the integration of meaningful laptop activities is a critical determinant of the benefits and challenges experienced in higher education classrooms.

  3. Managing the higher risks of low-cost high-efficiency advanced power generation technologies

    International Nuclear Information System (INIS)

    Pearson, M.

    1997-01-01

    Independent power producers operate large coal-fired installations and gas turbine combined-cycle (GTCC) facilities. Combined-cycle units are complex, and their reliability and availability are greatly influenced by mechanical, instrumentation and control weaknesses. It was suggested that these weaknesses could be avoided by tighter specifications and more rigorous functional testing before acceptance by the owner. For the present, the difficulty of developing reliable, more efficient GTCC designs with lower installed cost/kW, together with pressure for lower NOx emissions with 'dry' combustors, continues to be the most difficult challenge for all GT manufacturers.

  4. An Alternative Method for Computing Unit Costs and Productivity Ratios. AIR 1984 Annual Forum Paper.

    Science.gov (United States)

    Winstead, Wayland H.; And Others

    An alternative measure for evaluating the performance of academic departments was studied. A comparison was made with the traditional manner of computing unit costs and productivity ratios: prorating the salary and effort of each faculty member to each course level based on the personal mix of courses taught. The alternative method used averaging…

  5. Effectiveness of Multimedia Elements in Computer Supported Instruction: Analysis of Personalization Effects, Students' Performances and Costs

    Science.gov (United States)

    Zaidel, Mark; Luo, XiaoHui

    2010-01-01

    This study investigates the efficiency of multimedia instruction at the college level by comparing the effectiveness of multimedia elements used in computer-supported learning with the cost of their preparation. Among the various technologies that advance learning, instructors and students generally identify interactive multimedia elements as…

  6. The cognitive dynamics of computer science cost-effective large scale software development

    CERN Document Server

    De Gyurky, Szabolcs Michael; John Wiley & Sons

    2006-01-01

    This book has three major objectives: To propose an ontology for computer software; To provide a methodology for development of large software systems to cost and schedule that is based on the ontology; To offer an alternative vision regarding the development of truly autonomous systems.

  7. Low-cost addition-subtraction sequences for the final exponentiation computation in pairings

    DEFF Research Database (Denmark)

    Guzmán-Trampe, Juan E; Cruz-Cortéz, Nareli; Dominguez Perez, Luis

    2014-01-01

    In this paper, we address the problem of finding low cost addition–subtraction sequences for situations where a doubling step is significantly cheaper than a non-doubling one. One application of this setting appears in the computation of the final exponentiation step of the reduced Tate pairing d...
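
    The setting can be illustrated with the non-adjacent form (NAF) of the exponent, a classic signed-digit recoding that minimizes the number of non-doubling steps. The toy version below works in Z/pZ, where a modular inversion stands in for the nearly free conjugation available in pairing groups; it illustrates the setting, not the paper's addition-subtraction sequences.

```python
# Signed-digit (NAF) exponentiation: mostly cheap doubling steps, with a
# few multiply/divide steps. The Fermat inversion is a stand-in for the
# almost-free conjugation used in pairing-friendly groups.
def naf(e):
    """Non-adjacent form, least significant digit first, digits in {-1, 0, 1}."""
    digits = []
    while e > 0:
        if e & 1:
            d = 2 - (e % 4)   # 1 if e % 4 == 1, else -1
            e -= d
        else:
            d = 0
        digits.append(d)
        e //= 2
    return digits

def pow_naf(x, e, p):
    xinv = pow(x, p - 2, p)   # stand-in for a cheap inversion/conjugation
    result = 1
    for d in reversed(naf(e)):
        result = result * result % p      # cheap doubling step
        if d == 1:
            result = result * x % p
        elif d == -1:
            result = result * xinv % p
    return result

assert pow_naf(7, 1000003, 10**9 + 7) == pow(7, 1000003, 10**9 + 7)
```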

  8. Low cost, scalable proteomics data analysis using Amazon's cloud computing services and open source search algorithms.

    Science.gov (United States)

    Halligan, Brian D; Geiger, Joey F; Vallejos, Andrew K; Greene, Andrew S; Twigger, Simon N

    2009-06-01

    One of the major difficulties for many laboratories setting up proteomics programs has been obtaining and maintaining the computational infrastructure required for the analysis of the large flow of proteomics data. We describe a system that combines distributed cloud computing and open source software to allow laboratories to set up scalable virtual proteomics analysis clusters without investment in computational hardware or software licensing fees. Additionally, the pricing structure of distributed computing providers, such as Amazon Web Services, allows laboratories or even individuals to have large-scale computational resources at their disposal at a very low cost per run. We provide detailed step-by-step instructions on how to implement the virtual proteomics analysis clusters, as well as a list of currently available preconfigured Amazon machine images containing the OMSSA and X!Tandem search algorithms and sequence databases, on the Medical College of Wisconsin Proteomics Center Web site (http://proteomics.mcw.edu/vipdac).

  9. Low-cost computer mouse for the elderly or disabled in Taiwan.

    Science.gov (United States)

    Chen, C-C; Chen, W-L; Chen, B-N; Shih, Y-Y; Lai, J-S; Chen, Y-L

    2014-01-01

    A mouse is an important communication interface between a human and a computer, but it is still difficult for the elderly or disabled to use. The aim of this work was to develop a low-cost computer mouse auxiliary tool. The principal structure of the tool is an IR (infrared) array module and a Wii icon sensor module, combined with reflective tape and an SQL Server database. This brings several benefits, including low hardware cost, fluent control, prompt response, adaptive adjustment and portability. The tool also carries a game module with training and evaluation functions; for the trainee, it is genuinely helpful in improving sensory responsiveness and concentration. In the intervention and maintenance phases, clicking accuracy and time on task reached statistical significance (p < 0.05). The development of the low-cost adaptive computer mouse auxiliary tool was completed during the study, and it was verified as being low-cost, easy to operate and adaptable. The tool is suitable for patients with physical disabilities who retain independent control of some part of their limbs: the user only needs to attach the reflective tape to an independently controllable body part to operate the mouse auxiliary tool.

  10. A Low Cost Ferritic Stainless Steel Microalloyed by Higher Nb for Automotive Exhaust System

    Science.gov (United States)

    Chen, Erhu; Wang, Xuelin; Shang, Chengjia

    An automotive engine emits exhaust gas after the combustion of fuel, and this gas condenses in the rear of the automotive exhaust system. The many corrosive anions present in the condensate corrode the exhaust system materials; once pitting perforates a component, the exhaust system fails directly. In the 1980s, the automotive exhaust manifold was made of Si-Mo ductile iron, while mufflers and tail pipes were made of carbon steel or aluminized steel. But with stricter emission standards in force, improved engine performance, higher exhaust temperatures and the need for automotive light-weighting, materials with higher corrosion resistance are needed for automotive exhaust systems.

  11. Integrating Social Media Technologies in Higher Education: Costs-Benefits Analysis

    Science.gov (United States)

    Okoro, Ephraim

    2012-01-01

    Social networking and electronic channels of communication are effective tools in the process of teaching and learning and have increasingly improved the quality of students' learning outcomes in higher education in recent years. The process encourages students' active engagement, collaboration, and participation in class activities and group…

  12. British Asian Women and the Costs of Higher Education in England

    Science.gov (United States)

    Bhopal, Kalwant

    2016-01-01

    This article will examine Asian women's experiences of financial support in higher education. The article is based on 30 in-depth interviews with Asian women who were studying at a "new" (post-1992) university in the South East of England. Women identified themselves as Muslim, Hindu and Sikh. The findings reveal that women's religious…

  13. Development of a computer program for the cost analysis of spent fuel management

    International Nuclear Information System (INIS)

    Choi, Heui Joo; Lee, Jong Youl; Choi, Jong Won; Cha, Jeong Hun; Whang, Joo Ho

    2009-01-01

    So far, a substantial amount of spent fuel has been generated from the PWR and CANDU reactors. It is being temporarily stored at the nuclear power plant sites. It is expected that the temporary storage facilities will be full of spent fuel by around 2016. The government plans to solve the problem by constructing an interim storage facility soon. The Radioactive Waste Management Act was enacted in 2008 to manage spent fuel safely in Korea. According to the act, the radioactive waste management fund, which will be used for the transportation, interim storage, and final disposal of spent fuel, has been established. The cost of spent fuel management is remarkably high and subject to considerable uncertainty. KAERI and Kyunghee University have developed cost estimation tools to evaluate the cost of spent fuel management based on engineering design and calculation. It is not easy to develop a tool for cost estimation while the national policy on spent fuel management has not yet been fixed. Thus, the current version of the computer program is based on the current conceptual design of each management system. The main purpose of this paper is to introduce the computer program developed for the cost analysis of spent fuel management. In order to show the application of the program, a spent fuel management scenario is prepared, and the cost for the scenario is estimated.

  14. Reshaping Computer Literacy Teaching in Higher Education: Identification of Critical Success Factors

    Science.gov (United States)

    Taylor, Estelle; Goede, Roelien; Steyn, Tjaart

    2011-01-01

    Purpose: Acquiring computer skills is more important today than ever before, especially in a developing country. Teaching of computer skills, however, has to adapt to new technology. This paper aims to model factors influencing the success of the learning of computer literacy by means of an e-learning environment. The research question for this…

  15. Opinions on Computing Education in Korean K-12 System: Higher Education Perspective

    Science.gov (United States)

    Kim, Dae-Kyoo; Jeong, Dongwon; Lu, Lunjin; Debnath, Debatosh; Ming, Hua

    2015-01-01

    The need for computing education in the K-12 curriculum has grown globally. The Republic of Korea is not an exception. In response to the need, the Korean Ministry of Education has announced an outline for software-centric computing education in the K-12 system, which aims at enhancing the current computing education with software emphasis. In…

  16. A low cost computer-controlled electrochemical measurement system for education and research

    International Nuclear Information System (INIS)

    Cottis, R.A.

    1989-01-01

    With the advent of low-cost computers of significant processing power, it has become economically attractive, as well as offering practical advantages, to replace conventional electrochemical instrumentation with computer-based equipment. For example, the equipment to be described can perform all of the functions required for the measurement of a potentiodynamic polarization curve, replacing the conventional arrangement of sweep generator, potentiostat and chart recorder at a cost (based on the purchase cost of parts) which is less than that of most chart recorders alone. Additionally, the use of computer control at a relatively low level provides a versatility (assuming the development of suitable software) which cannot easily be matched by conventional instruments. As a result of these considerations a simple computer-controlled electrochemical measurement system has been developed, with a primary aim being its use in teaching an MSc class in corrosion science and engineering, with additional applications in MSc and PhD research. For educational reasons the design of the user interface has tried to make the internal operation of the unit as obvious as possible, and thereby minimize the tendency for students to treat the unit as a 'black box' with incomprehensible inner workings. This has resulted in a unit in which the three main components of function generator, potentiostat and recorder are presented as independent areas on the front panel, and can be configured by the user in exactly the same way as conventional instruments. (author) 11 figs

  17. A low cost computer-controlled electrochemical measurement system for education and research

    Energy Technology Data Exchange (ETDEWEB)

    Cottis, R A [Manchester Univ. (UK). Inst. of Science and Technology

    1989-01-01

    With the advent of low-cost computers of significant processing power, it has become economically attractive, as well as offering practical advantages, to replace conventional electrochemical instrumentation with computer-based equipment. For example, the equipment to be described can perform all of the functions required for the measurement of a potentiodynamic polarization curve, replacing the conventional arrangement of sweep generator, potentiostat and chart recorder at a cost (based on the purchase cost of parts) which is less than that of most chart recorders alone. Additionally, the use of computer control at a relatively low level provides a versatility (assuming the development of suitable software) which cannot easily be matched by conventional instruments. As a result of these considerations a simple computer-controlled electrochemical measurement system has been developed, with a primary aim being its use in teaching an MSc class in corrosion science and engineering, with additional applications in MSc and PhD research. For educational reasons the design of the user interface has tried to make the internal operation of the unit as obvious as possible, and thereby minimize the tendency for students to treat the unit as a 'black box' with incomprehensible inner workings. This has resulted in a unit in which the three main components of function generator, potentiostat and recorder are presented as independent areas on the front panel, and can be configured by the user in exactly the same way as conventional instruments. (author) 11 figs.

  18. Hybrid Cloud Computing Architecture Optimization by Total Cost of Ownership Criterion

    Directory of Open Access Journals (Sweden)

    Elena Valeryevna Makarenko

    2014-12-01

    Full Text Available Achieving the goals of information security is a key factor in the decision to outsource information technology and, in particular, in deciding to migrate organizational data, applications, and other resources to an infrastructure based on cloud computing. The key issue in selecting an optimal architecture, and in the subsequent migration of business applications and data to the cloud, is the total cost of ownership of the IT infrastructure. This paper focuses on solving the problem of minimizing the total cost of ownership of a hybrid cloud.
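
    The flavor of the optimization can be conveyed by a toy placement model: each workload runs either on-premises or in a public cloud, and the total cost of ownership is minimized subject to a rule that sensitive workloads stay on-premises. The workloads and prices below are invented for illustration.

```python
# Toy hybrid-cloud placement minimizing monthly TCO under a simple
# security constraint. All names and costs are hypothetical.
workloads = [
    # (name, on_prem_cost, cloud_cost, sensitive)
    ("hr-database",  900, 400, True),
    ("web-frontend", 700, 250, False),
    ("batch-ETL",    500, 650, False),
    ("analytics",    800, 300, False),
]

placement, tco = {}, 0
for name, on_prem, cloud, sensitive in workloads:
    if sensitive or on_prem <= cloud:     # security constraint first, then price
        placement[name], tco = "on-prem", tco + on_prem
    else:
        placement[name], tco = "cloud", tco + cloud

print(placement)           # the hybrid split
print("monthly TCO:", tco)
```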

  19. Low cost phantom for computed radiology; Objeto de teste de baixo custo para radiologia computadorizada

    Energy Technology Data Exchange (ETDEWEB)

    Travassos, Paulo Cesar B.; Magalhaes, Luis Alexandre G., E-mail: pctravassos@ufrj.br [Universidade do Estado do Rio de Janeiro (IBRGA/UERJ), RJ (Brazil). Laboratorio de Ciencias Radiologicas; Augusto, Fernando M.; Sant' Yves, Thalis L.A.; Goncalves, Elicardo A.S. [Instituto Nacional de Cancer (INCA), Rio de Janeiro, RJ (Brazil); Botelho, Marina A. [Hospital Universitario Pedro Ernesto (UERJ), Rio de Janeiro, RJ (Brazil)

    2012-08-15

    This article presents the results obtained with a low-cost phantom used to analyze computed radiology (CR) equipment. The phantom was constructed to test a few parameters related to image quality, as described in [1-9]. Materials which can be easily purchased were used in the construction of the phantom, for a total cost of approximately US$100.00. A bar pattern was included only to verify the efficacy of the grids in determining spatial resolution, and was not included in the budget because the data were acquired from the grids. (author)

  20. Cost-effective computational method for radiation heat transfer in semi-crystalline polymers

    Science.gov (United States)

    Boztepe, Sinan; Gilblas, Rémi; de Almeida, Olivier; Le Maoult, Yannick; Schmidt, Fabrice

    2018-05-01

    This paper introduces a cost-effective numerical model for infrared (IR) heating of semi-crystalline polymers. For the numerical and experimental studies presented here, semi-crystalline polyethylene (PE) was used. The optical properties of PE were experimentally analyzed under varying temperature, and the obtained results were used as input in the numerical studies. The model was built on the assumption of an optically homogeneous medium, while the strong variation in the thermo-optical properties of semi-crystalline PE under heating was taken into account. Thus, the change in the amount of radiative energy absorbed by the PE medium, induced by its temperature-dependent thermo-optical properties, was introduced in the model. The computational study was carried out as an iterative closed-loop computation, in which the absorbed radiation was computed using an in-house radiation heat transfer algorithm, RAYHEAT, and the results were transferred into the commercial software COMSOL Multiphysics to solve the transient heat transfer problem and predict the temperature field. The predicted temperature field was used to iterate the thermo-optical properties of PE, which vary under heating. To analyze the accuracy of the numerical model, experimental analyses were carried out using IR-thermographic measurements during the heating of a PE plate. The applicability of the model in terms of computational cost, number of numerical inputs and accuracy was highlighted.
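
    A one-dimensional toy version of the closed-loop scheme is sketched below: the absorbed IR power is recomputed from Beer-Lambert attenuation each step as the temperature-dependent absorption coefficient changes, and the heat equation is then advanced. The material values and the kappa(T) law are invented; the real model couples RAYHEAT with COMSOL and measured PE properties.

```python
# 1-D toy of the iterative closed loop: update optical properties, recompute
# absorbed power, advance the heat equation. All parameters are invented.
import numpy as np

nx, L = 50, 2e-3                       # 2 mm thick PE plate, 50 cells
dx = L / nx
rho, cp, k = 950.0, 2300.0, 0.35       # PE-like density, heat capacity, conductivity
I0 = 5e4                               # incident IR flux at the front face, W/m^2
alpha = k / (rho * cp)
dt = 0.4 * dx**2 / (2 * alpha)         # stable explicit time step

def kappa(T):
    """Hypothetical absorption coefficient (1/m), rising as crystallites melt."""
    return 800.0 + 2.0 * (T - 300.0)

T = np.full(nx, 300.0)
for _ in range(4000):
    kap = kappa(T)                                               # optics update ...
    tau_in = np.concatenate(([0.0], np.cumsum(kap * dx)[:-1]))   # depth to cell front
    q_abs = I0 * np.exp(-tau_in) * (1 - np.exp(-kap * dx)) / dx  # Beer-Lambert, W/m^3
    lap = np.empty(nx)
    lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
    lap[0] = 2 * (T[1] - T[0]) / dx**2                           # insulated boundaries
    lap[-1] = 2 * (T[-2] - T[-1]) / dx**2
    T += dt * (k * lap + q_abs) / (rho * cp)                     # ... then heat equation
print(f"after {4000 * dt:.1f} s: front {T[0]:.0f} K, back {T[-1]:.0f} K")
```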

  1. A Performance/Cost Evaluation for a GPU-Based Drug Discovery Application on Volunteer Computing

    Science.gov (United States)

    Guerrero, Ginés D.; Imbernón, Baldomero; García, José M.

    2014-01-01

    Bioinformatics is an interdisciplinary research field that develops tools for the analysis of large biological databases, and, thus, the use of high performance computing (HPC) platforms is mandatory for the generation of useful biological knowledge. The latest generation of graphics processing units (GPUs) has democratized the use of HPC as they push desktop computers to cluster-level performance. Many applications within this field have been developed to leverage these powerful and low-cost architectures. However, these applications still need to scale to larger GPU-based systems to enable remarkable advances in the fields of healthcare, drug discovery, genome research, etc. The inclusion of GPUs in HPC systems exacerbates power and temperature issues, increasing the total cost of ownership (TCO). This paper explores the benefits of volunteer computing to scale bioinformatics applications as an alternative to owning large GPU-based local infrastructures. We use as a benchmark a GPU-based drug discovery application called BINDSURF whose computational requirements go beyond a single desktop machine. Volunteer computing is presented as a cheap and valid HPC system for those bioinformatics applications that need to process huge amounts of data and where the response time is not a critical factor. PMID:25025055

  2. A Performance/Cost Evaluation for a GPU-Based Drug Discovery Application on Volunteer Computing

    Directory of Open Access Journals (Sweden)

    Ginés D. Guerrero

    2014-01-01

    Full Text Available Bioinformatics is an interdisciplinary research field that develops tools for the analysis of large biological databases, and, thus, the use of high performance computing (HPC) platforms is mandatory for the generation of useful biological knowledge. The latest generation of graphics processing units (GPUs) has democratized the use of HPC as they push desktop computers to cluster-level performance. Many applications within this field have been developed to leverage these powerful and low-cost architectures. However, these applications still need to scale to larger GPU-based systems to enable remarkable advances in the fields of healthcare, drug discovery, genome research, etc. The inclusion of GPUs in HPC systems exacerbates power and temperature issues, increasing the total cost of ownership (TCO). This paper explores the benefits of volunteer computing to scale bioinformatics applications as an alternative to owning large GPU-based local infrastructures. We use as a benchmark a GPU-based drug discovery application called BINDSURF whose computational requirements go beyond a single desktop machine. Volunteer computing is presented as a cheap and valid HPC system for those bioinformatics applications that need to process huge amounts of data and where the response time is not a critical factor.

  3. Low-cost autonomous perceptron neural network inspired by quantum computation

    Science.gov (United States)

    Zidan, Mohammed; Abdel-Aty, Abdel-Haleem; El-Sadek, Alaa; Zanaty, E. A.; Abdel-Aty, Mahmoud

    2017-11-01

    Achieving low-cost learning with reliable accuracy is an important goal in building intelligent machines that save time and energy and can learn on machines with limited computational resources. In this paper, we propose an efficient algorithm, inspired by quantum computing, for a perceptron neural network composed of a single neuron, which classifies linearly separable problems after a single training iteration, O(1). The algorithm is applied to a real-world data set and the results outperform other state-of-the-art algorithms.
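
    The paper's quantum-inspired learner is not reproduced here, but the single-pass idea can be illustrated with a plain classical baseline: a centroid rule that fixes the neuron's weights in one sweep over linearly separable data.

```python
# Classical single-pass baseline (centroid rule), not the paper's
# quantum-inspired algorithm; the data are synthetic and separable.
import numpy as np

rng = np.random.default_rng(3)
X_pos = rng.normal(+2.0, 0.5, (50, 2))
X_neg = rng.normal(-2.0, 0.5, (50, 2))

w = X_pos.mean(axis=0) - X_neg.mean(axis=0)               # one "training" pass
b = -w @ (X_pos.mean(axis=0) + X_neg.mean(axis=0)) / 2    # threshold at midpoint

X = np.vstack([X_pos, X_neg])
y = np.hstack([np.ones(50), -np.ones(50)])
acc = np.mean(np.sign(X @ w + b) == y)
print(f"accuracy after one pass: {acc:.2%}")
```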

  4. The Optimal Pricing of Computer Software and Other Products with High Switching Costs

    OpenAIRE

    Pekka Ahtiala

    2004-01-01

    The paper studies the determinants of the optimum prices of computer programs and their upgrades. It is based on the notion that because of the human capital invested in the use of a computer program by its user, this product has high switching costs, and on the finding that pirates are responsible for generating over 80 per cent of new software sales. A model to maximize the present value of the program to the program house is constructed to determine the optimal prices of initial programs a...

  5. The costs and benefits of self-monitoring for higher functioning children and adolescents with autism.

    Science.gov (United States)

    Henderson, Heather A; Ono, Kim E; McMahon, Camilla M; Schwartz, Caley B; Usher, Lauren V; Mundy, Peter C

    2015-02-01

    The ability to regulate behaviors and emotions depends in part on the ability to flexibly monitor one's own progress toward a goal. Atypical patterns of response monitoring have been reported in individuals with autism spectrum disorders (ASD). In the current study we examined the error related negativity (ERN), an electrophysiological index of response monitoring, in relation to behavioral, social cognitive, and emotional presentation in higher functioning children (8-16 years) diagnosed with autism (HFA: N = 38) and an age- and IQ-matched sample of children without autism (COM: N = 36). Both HFA and COM participants displayed larger amplitude responses to error compared to correct response trials and these amplitudes did not differ by diagnostic group. For participants with HFA, larger ERN amplitudes were associated with more parent-reported autistic symptoms and more self-reported internalizing problems. However, across the full sample, larger ERN amplitudes were associated with better performance on theory of mind tasks. The results are discussed in terms of the utility of electrophysiological measures for understanding essential moderating processes that contribute to the spectrum of behavioral expression in the development of ASD.

  6. Computational cost for detecting inspiralling binaries using a network of laser interferometric detectors

    International Nuclear Information System (INIS)

    Pai, Archana; Bose, Sukanta; Dhurandhar, Sanjeev

    2002-01-01

    We extend a coherent network data-analysis strategy developed earlier for detecting Newtonian waveforms to the case of post-Newtonian (PN) waveforms. Since the PN waveform depends on the individual masses of the inspiralling binary, the parameter-space dimension increases by one from that of the Newtonian case. We obtain the number of templates and estimate the computational costs for PN waveforms: for a lower mass limit of 1 M⊙, for LIGO-I noise and with 3% maximum mismatch, the online computational speed requirement for a single detector is a few Gflops; for a two-detector network it is hundreds of Gflops, and for a three-detector network it is tens of Tflops. Apart from idealistic networks, we obtain results for realistic networks comprising LIGO and VIRGO. Finally, we compare costs incurred in a coincidence detection strategy with those incurred in the coherent strategy detailed above.

  7. Computational cost for detecting inspiralling binaries using a network of laser interferometric detectors

    CERN Document Server

    Pai, A; Dhurandhar, S V

    2002-01-01

    We extend a coherent network data-analysis strategy developed earlier for detecting Newtonian waveforms to the case of post-Newtonian (PN) waveforms. Since the PN waveform depends on the individual masses of the inspiralling binary, the parameter-space dimension increases by one from that of the Newtonian case. We obtain the number of templates and estimate the computational costs for PN waveforms: for a lower mass limit of 1 M⊙, for LIGO-I noise and with 3% maximum mismatch, the online computational speed requirement for a single detector is a few Gflops; for a two-detector network it is hundreds of Gflops, and for a three-detector network it is tens of Tflops. Apart from idealistic networks, we obtain results for realistic networks comprising LIGO and VIRGO. Finally, we compare costs incurred in a coincidence detection strategy with those incurred in the coherent strategy detailed above.

  8. Introducing Computer-Based Testing in High-Stakes Exams in Higher Education: Results of a Field Experiment

    Science.gov (United States)

    Boevé, Anja J.; Meijer, Rob R.; Albers, Casper J.; Beetsma, Yta; Bosker, Roel J.

    2015-01-01

    The introduction of computer-based testing in high-stakes examining in higher education is developing rather slowly due to institutional barriers (the need for extra facilities, ensuring test security) and teacher and student acceptance. From the existing literature it is unclear whether computer-based exams yield results similar to paper-based exams and whether student acceptance can change as a result of administering computer-based exams. In this study, we compared results from a computer-based and a paper-based exam in a sample of psychology students and found no differences in total scores across the two modes. Furthermore, we investigated student acceptance and change in acceptance of computer-based examining. After taking the computer-based exam, fifty percent of the students preferred paper-and-pencil exams over computer-based exams and about a quarter preferred a computer-based exam. We conclude that computer-based exam total scores are similar to paper-based exam scores, but that for the acceptance of high-stakes computer-based exams it is important that students practice and get familiar with this new mode of test administration. PMID:26641632

  9. Introducing Computer-Based Testing in High-Stakes Exams in Higher Education: Results of a Field Experiment.

    Science.gov (United States)

    Boevé, Anja J; Meijer, Rob R; Albers, Casper J; Beetsma, Yta; Bosker, Roel J

    2015-01-01

    The introduction of computer-based testing in high-stakes examining in higher education is developing rather slowly due to institutional barriers (the need for extra facilities, ensuring test security) and teacher and student acceptance. From the existing literature it is unclear whether computer-based exams yield results similar to paper-based exams and whether student acceptance can change as a result of administering computer-based exams. In this study, we compared results from a computer-based and a paper-based exam in a sample of psychology students and found no differences in total scores across the two modes. Furthermore, we investigated student acceptance and change in acceptance of computer-based examining. After taking the computer-based exam, fifty percent of the students preferred paper-and-pencil exams over computer-based exams and about a quarter preferred a computer-based exam. We conclude that computer-based exam total scores are similar to paper-based exam scores, but that for the acceptance of high-stakes computer-based exams it is important that students practice and get familiar with this new mode of test administration.

  10. POPCYCLE: a computer code for calculating nuclear and fossil plant levelized life-cycle power costs

    International Nuclear Information System (INIS)

    Hardie, R.W.

    1982-02-01

    POPCYCLE, a computer code designed to calculate levelized life-cycle power costs for nuclear and fossil electrical generating plants, is described. Included are (1) derivations of the equations and a discussion of the methodology used by POPCYCLE, (2) a description of the input required by the code, (3) a listing of the input for a sample case, and (4) the output for a sample case
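    The report derives POPCYCLE's equations in full; as a generic illustration of levelizing only (not the code's actual formulation), lifetime costs and lifetime generation can each be discounted to present value and divided:

        def levelized_cost(costs, generation, discount_rate):
            # Generic levelized power cost: discounted lifetime costs divided
            # by discounted lifetime output (e.g., $ per MWh).
            pv_costs = sum(c / (1 + discount_rate) ** t for t, c in enumerate(costs))
            pv_gen = sum(g / (1 + discount_rate) ** t for t, g in enumerate(generation))
            return pv_costs / pv_gen

        # Hypothetical 3-year plant: capital outlay in year 0, then O&M and fuel.
        print(levelized_cost([1000.0, 100.0, 100.0], [0.0, 500.0, 500.0], 0.05))  # ~1.28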

  11. Reduced computational cost in the calculation of worst case response time for real time systems

    OpenAIRE

    Urriza, José M.; Schorb, Lucas; Orozco, Javier D.; Cayssials, Ricardo

    2009-01-01

    Modern Real Time Operating Systems require reducing computational costs even though microprocessors become more powerful each day. It is usual for Real Time Operating Systems for embedded systems to have advanced features to administer the resources of the applications they support. In order to guarantee either the schedulability of the system or the schedulability of a new task in a dynamic Real Time System, it is necessary to know the Worst Case Response Time of the Real Time tasks ...

  12. A feasibility study on direct methanol fuel cells for laptop computers based on a cost comparison with lithium-ion batteries

    International Nuclear Information System (INIS)

    Wee, Jung-Ho

    2007-01-01

    This paper compares the total cost of direct methanol fuel cell (DMFC) and lithium (Li)-ion battery systems when applied as the power supply for laptop computers in the Korean environment. The average power output and operational time of the laptop computers were assumed to be 20 W and 3000 h, respectively. Considering the status of their technologies and with certain conditions assumed, the total costs were calculated to be US$140 for the Li-ion battery and US$362 for the DMFC. The manufacturing costs of the DMFC and Li-ion battery systems were calculated to be $16.65 W⁻¹ and $0.77 Wh⁻¹, and the energy consumption costs to be $0.00051 Wh⁻¹ and $0.00032 Wh⁻¹, respectively. The higher fuel consumption cost of the DMFC system was due to the methanol (MeOH) crossover loss. Therefore, the requirements for DMFCs to be able to compete with Li-ion batteries in terms of energy cost include reducing the crossover level to the order of 10⁻⁹ and the MeOH price to under $0.5 kg⁻¹. Under these conditions, if the DMFC manufacturing cost could be reduced to $6.30 W⁻¹, the DMFC system would become at least as competitive as the Li-ion battery system for powering laptop computers in Korea. (author)
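    The quoted totals can be roughly reproduced from the unit costs in the abstract; the back-of-the-envelope check below is a sketch of that arithmetic only (the paper's full model includes terms not shown here):

        # Operating profile assumed in the abstract: 20 W average for 3000 h.
        power_w, hours = 20.0, 3000.0
        energy_wh = power_w * hours                    # 60,000 Wh over the laptop's life

        # DMFC: manufacturing cost per watt plus fuel cost per watt-hour.
        dmfc_total = 16.65 * power_w + 0.00051 * energy_wh
        print(round(dmfc_total))                       # ~364, close to the reported US$362

        # Li-ion recharging energy alone, at the quoted $0.00032 per Wh:
        print(round(0.00032 * energy_wh, 1))           # 19.2 of the reported US$140 total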

  13. Wigner higher-order spectra: definition, properties, computation and application to transient signal analysis

    OpenAIRE

    Rodríguez Fonollosa, Javier; Nikias, Chrysostomos L.

    1993-01-01

    The Wigner higher-order moment spectra (WHOS) are defined as extensions of the Wigner-Ville distribution (WD) to higher-order moment spectra domains. A general class of time-frequency higher-order moment spectra is also defined in terms of arbitrary higher-order moments of the signal, as generalizations of Cohen's general class of time-frequency representations. The properties of the general class of time-frequency higher-order moment spectra can be related to the properties...

  14. Influence of studying in higher educational establishment on students’ harmful computer habits

    Directory of Open Access Journals (Sweden)

    M.D. Kudryavtsev

    2016-10-01

    Full Text Available Purpose: to determine the influence of the educational process on the prevalence of students' harmful computer habits. Material: 1st-3rd year students (803 boys and 596 girls) participated in the research. All specialized in the discipline Physical Culture and had no health disorders. Results: it was found that, on average, each student has two computer habits. The habits most likely to lead to addiction are internet use and computer games: a student with these habits spends more than 4 hours a day on them. 33% of 1st year boys and 16% of 1st year girls spend more than 2 hours a day on computer games. 15-20% of boys and 25-30% of girls spend more than 4 hours a day on the internet, and 10-15% of boys spend more than 4 hours a day on computer games. It is very probable that these students already have a computer games addiction. Conclusions: recently, a dangerous tendency toward watching anime has appeared. Physical culture faculties and departments should take additional measures to reduce students' computer addictions. Teachers of all disciplines should organize the educational process with electronic resources in a way that does not encourage the progression of students' computer habits.

  15. Above-Campus Services: Shaping the Promise of Cloud Computing for Higher Education

    Science.gov (United States)

    Wheeler, Brad; Waggener, Shelton

    2009-01-01

    The concept of today's cloud computing may date back to 1961, when John McCarthy, retired Stanford professor and Turing Award winner, delivered a speech at MIT's Centennial. In that speech, he predicted that in the future, computing would become a "public utility." Yet for colleges and universities, the recent growth of pervasive, very high speed…

  16. A practical technique for benefit-cost analysis of computer-aided design and drafting systems

    International Nuclear Information System (INIS)

    Shah, R.R.; Yan, G.

    1979-03-01

    Analyses of the benefits and costs associated with the operation of Computer-Aided Design and Drafting Systems (CADDS) are needed to derive economic justification for acquiring new systems, as well as to evaluate the performance of existing installations. In practice, however, such analyses are difficult to perform since most technical and economic advantages of CADDS are 'irreducibles', i.e. they cannot be readily translated into monetary terms. In this paper, a practical technique for economic analysis of CADDS in a drawing office environment is presented. A 'worst case' approach is taken, since an increase in the productivity of existing manpower is the only benefit considered, while all foreseen costs are taken into account. Methods of estimating benefits and costs are described. The procedure for performing the analysis is illustrated by a case study based on the drawing office activities at Atomic Energy of Canada Limited. (auth)

  17. 12 CFR Appendix K to Part 226 - Total Annual Loan Cost Rate Computations for Reverse Mortgage Transactions

    Science.gov (United States)

    2010-01-01

    ... Appendix K to Part 226—Total Annual Loan Cost Rate Computations for Reverse Mortgage Transactions (a... loan cost rate for various transactions, as well as instructions, explanations, and examples for.... (2) Term of the transaction. For purposes of total annual loan cost disclosures, the term of a...

  18. Cost-Effectiveness Analysis (CEA) of Intravenous Urography (IVU) and Unenhanced Multidetector Computed Tomography (MDCT) for Initial Investigation of Suspected Acute Ureterolithiasis

    International Nuclear Information System (INIS)

    Eikefjord, E.; Askildsen, J.E.; Roervik, J.

    2008-01-01

    Background: It is important to compare the cost and effectiveness of multidetector computed tomography (MDCT) and intravenous urography (IVU) to determine the most cost-effective alternative for the initial investigation of acute ureterolithiasis. Purpose: To analyze the task-specific variable costs combined with the diagnostic effect of MDCT and IVU for patients with acute flank pain, and to determine which is more cost-effective. Material and Methods: 119 patients with acute flank pain suggestive of stone disease (ureterolithiasis) were examined by both MDCT and IVU. Variable costs related to medical equipment, consumable materials, equipment control, and personnel were calculated, and the diagnostic effect was assessed. Results: The variable costs of MDCT versus IVU were EUR 32 and EUR 117, respectively. This significant difference was mainly due to savings in examination time, higher annual examination frequency, lower material costs, and no use of contrast media. As for diagnostic effect, MDCT proved considerably more accurate than IVU in the diagnosis of stone disease and markedly more accurate concerning differential diagnoses. Conclusion: MDCT had lower differential costs and a higher capacity to correctly determine stone disease and differential diagnoses than IVU in patients with acute flank pain. Consequently, MDCT is a dominant alternative to IVU when evaluated exclusively from a cost-effectiveness perspective

  19. Cost and resource utilization associated with use of computed tomography to evaluate chest pain in the emergency department: the Rule Out Myocardial Infarction using Computer Assisted Tomography (ROMICAT) study.

    Science.gov (United States)

    Hulten, Edward; Goehler, Alexander; Bittencourt, Marcio Sommer; Bamberg, Fabian; Schlett, Christopher L; Truong, Quynh A; Nichols, John; Nasir, Khurram; Rogers, Ian S; Gazelle, Scott G; Nagurney, John T; Hoffmann, Udo; Blankstein, Ron

    2013-09-01

    Coronary computed tomographic angiography (cCTA) allows rapid, noninvasive exclusion of obstructive coronary artery disease (CAD). However, concern exists whether implementation of cCTA in the assessment of patients presenting to the emergency department with acute chest pain will lead to increased downstream testing and costs compared with alternative strategies. Our aim was to compare observed actual costs of usual care (UC) with projected costs of a strategy including early cCTA in the evaluation of patients with acute chest pain in the Rule Out Myocardial Infarction Using Computer Assisted Tomography I (ROMICAT I) study. We compared cost and hospital length of stay of UC observed among 368 patients enrolled in the ROMICAT I study with projected costs of management based on cCTA. Costs of UC were determined by an electronic cost accounting system. Notably, UC was not influenced by cCTA results because patients and caregivers were blinded to the cCTA results. Costs after early implementation of cCTA were estimated assuming changes in management based on cCTA findings of the presence and severity of CAD. Sensitivity analysis was used to test the influence of key variables on both outcomes and costs. We determined that, in comparison with UC, cCTA-guided triage, whereby patients with no CAD are discharged, could reduce total hospital costs by 23%. However, as disease prevalence rises, cost increases such that when the prevalence of ≥50% stenosis is greater than 28% to 33%, the use of cCTA becomes more costly than UC. cCTA may be a cost-saving tool in acute chest pain populations with a low prevalence of potentially obstructive CAD, whereas higher costs would be anticipated in populations with a higher prevalence of disease.

  20. A model for evaluating the institutional costs and benefits of ICT initiatives in teaching and learning in higher education

    Directory of Open Access Journals (Sweden)

    David Nicol

    2003-12-01

    Full Text Available Significant investments are being made in the application of new information and communications technologies (ICT) to teaching and learning in higher education. However, until recently, there has been little progress in devising an integrated cost-benefit model that decision-makers can use to appraise ICT investment options from the wider institutional perspective. This paper describes and illustrates a model that has been developed to enable evaluations of the costs and benefits of the use of ICT. The strengths and limitations of the model are highlighted and discussed.

  1. Collaborative Creativity: A Computational Approach: Raw Shaping Form Finding in Higher Education Domain

    NARCIS (Netherlands)

    Wendrich, Robert E.; Guerrero, J.E.

    2013-01-01

    This paper examines the conceptual synthesis processes in conjunction with assistive computational support for individual and collaborative interaction. We present findings from two educational design interaction experiments in product creation processing (PCP). We focus on metacognitive aspects of

  2. COMPUTER SYSTEM FOR DETERMINATION OF COST DAILY SUGAR PRODUCTION AND INCIDENTS DECISIONS FOR COMPANIES SUGAR (SACODI)

    Directory of Open Access Journals (Sweden)

    Alejandro Álvarez-Navarro

    2016-01-01

    Full Text Available The process of sugar production is complex; anything that affects this chain has direct repercussions on the cost of sugar production, a synthetic and decisive indicator for decision-making. Currently, Cuban sugar factories determine this cost weekly, which hampers their decision-making process. Looking for solutions to this problem, the present work, part of a territorial project approved by CITMA, calculates the cost of production daily, weekly, monthly, and cumulatively up to a given date, following an adaptation of the methodology used by the National Costs System for sugarcane created by MINAZ. It is supported by a computer system named SACODI. This adaptation registers the physical and economic indicators of all direct and indirect expenses of sugarcane and, from this information, generates an economic-mathematical goal-programming model whose solution indicates the best short-term balance of sugar output among the entities of the sugar factory. The implementation of the system in the sugar factory «Julio A. Mella» in Santiago de Cuba during the 2008-09 sugarcane season yielded an estimated cost decrease of up to 3.5%, supporting better decision-making.

  3. Cost-effectiveness of implementing computed tomography screening for lung cancer in Taiwan.

    Science.gov (United States)

    Yang, Szu-Chun; Lai, Wu-Wei; Lin, Chien-Chung; Su, Wu-Chou; Ku, Li-Jung; Hwang, Jing-Shiang; Wang, Jung-Der

    2017-06-01

    A screening program for lung cancer requires more empirical evidence. Based on the experience of the National Lung Screening Trial (NLST), we developed a method to adjust lead-time bias and quality-of-life changes for estimating the cost-effectiveness of implementing computed tomography (CT) screening in Taiwan. The target population was high-risk (≥30 pack-years) smokers between 55 and 75 years of age. From a nation-wide, 13-year follow-up cohort, we estimated quality-adjusted life expectancy (QALE), loss-of-QALE, and lifetime healthcare expenditures per case of lung cancer stratified by pathology and stage. Cumulative stage distributions for CT-screening and no-screening were assumed equal to those for CT-screening and radiography-screening in the NLST to estimate the savings of loss-of-QALE and additional costs of lifetime healthcare expenditures after CT screening. Costs attributable to screen-negative subjects, false-positive cases and radiation-induced lung cancer were included to obtain the incremental cost-effectiveness ratio from the public payer's perspective. The incremental costs were US$22,755 per person. After dividing this by savings of loss-of-QALE (1.16 quality-adjusted life year (QALY)), the incremental cost-effectiveness ratio was US$19,683 per QALY. This ratio would fall to US$10,947 per QALY if the stage distribution for CT-screening was the same as that of screen-detected cancers in the NELSON trial. Low-dose CT screening for lung cancer among high-risk smokers would be cost-effective in Taiwan. As only about 5% of our women are smokers, future research is necessary to identify the high-risk groups among non-smokers and increase the coverage. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.

  4. MONITOR: A computer model for estimating the costs of an integral monitored retrievable storage facility

    International Nuclear Information System (INIS)

    Reimus, P.W.; Sevigny, N.L.; Schutz, M.E.; Heller, R.A.

    1986-12-01

    The MONITOR model is a FORTRAN 77 based computer code that provides parametric life-cycle cost estimates for a monitored retrievable storage (MRS) facility. MONITOR is very flexible in that it can estimate the costs of an MRS facility operating under almost any conceivable nuclear waste logistics scenario. The model can also accommodate input data of varying degrees of complexity and detail (ranging from very simple to more complex), which makes it ideal for use in the MRS program, where new designs and new cost data are frequently offered for consideration. MONITOR can be run as an independent program, or it can be interfaced with the Waste System Transportation and Economic Simulation (WASTES) model, a program that simulates the movement of waste through a complete nuclear waste disposal system. The WASTES model drives the MONITOR model by providing it with the annual quantities of waste that are received, stored, and shipped at the MRS facility. Three runs of MONITOR are documented in this report. Two of the runs were made with Version 1 of the MONITOR code: a simulation using the costs developed by the Ralph M. Parsons Company in the 2A (backup) version of the MRS cost estimate. In one of these runs MONITOR was run as an independent model, and in the other run MONITOR was run using an input file generated by the WASTES model. The two runs correspond to identical cases, and the fact that they gave identical results verified that the code performs the same calculations in both modes of operation. The third run was made with Version 2 of the MONITOR code: a simulation using the costs developed by the Ralph M. Parsons Company in the 2B (integral) version of the MRS cost estimate. This run was made with MONITOR run as an independent model. The results of several cases have been verified by hand calculations

  5. Cost-Effectiveness of Computed Tomographic Colonography: A Prospective Comparison with Colonoscopy

    International Nuclear Information System (INIS)

    Arnesen, R.B.; Ginnerup-Pedersen, B.; Poulsen, P.B.; Benzon, K. von; Adamsen, S.; Laurberg, S.; Hart-Hansen, O.

    2007-01-01

    Purpose: To estimate the cost-effectiveness of detecting colorectal polyps with computed tomographic colonography (CTC) and subsequent polypectomy with primary colonoscopy (CC), using CC as the alternative strategy. Material and Methods: A marginal analysis was performed on 103 patients who had had CTC prior to same-day CC at two hospitals, H-I (n = 53) and H-II (n = 50). The patients were randomly chosen from surveillance and symptomatic study populations (148 at H-I and 231 at H-II). Populations, organizations, and procedures were compared. Cost data on time consumption, medication, and minor equipment were collected prospectively, while data on salaries and major equipment were collected retrospectively. The effect was the (previously published) sensitivities of CTC and CC for detection of colorectal polyps ≥6 mm (H-I, n = 148) or ≥5 mm (H-II, n = 231). Results: Thirteen patients at each center had at least one colorectal polyp ≥6 mm or ≥5 mm. CTC was the cost-effective alternative at H-I (Euro 187 vs. Euro 211), while CC was the cost-effective alternative at H-II (Euro 239 vs. Euro 192). The cost-effectiveness (costs per finding) mainly depended on the sensitivity of CTC and CC, but the depreciation of equipment and the staff's use of time were highly influential as well. Conclusion: Detection of colorectal polyps ≥6 mm or ≥5 mm with CTC, followed by polypectomy by CC, can be performed cost-effectively at some institutions with the appropriate hardware and organization

  6. Comparison of different strategies in prenatal screening for Down's syndrome: cost effectiveness analysis of computer simulation.

    Science.gov (United States)

    Gekas, Jean; Gagné, Geneviève; Bujold, Emmanuel; Douillard, Daniel; Forest, Jean-Claude; Reinharz, Daniel; Rousseau, François

    2009-02-13

    To assess and compare the cost effectiveness of three different strategies for prenatal screening for Down's syndrome (integrated test, sequential screening, and contingent screening) and to determine the most useful cut-off values for risk. Computer simulations were used to study integrated, sequential, and contingent screening strategies with various cut-offs, leading to 19 potential screening algorithms. The computer simulation was populated with data from the Serum Urine and Ultrasound Screening Study (SURUSS), real unit costs for healthcare interventions, and a population of 110 948 pregnancies from the province of Québec for the year 2001. Outcome measures were cost effectiveness ratios, incremental cost effectiveness ratios, and screening options' outcomes. The contingent screening strategy dominated all other screening options: it had the best cost effectiveness ratio ($C26,833 per case of Down's syndrome) with fewer procedure-related euploid miscarriages and unnecessary terminations (respectively, 6 and 16 per 100,000 pregnancies). It also outperformed serum screening in the second trimester. In terms of the incremental cost effectiveness ratio, contingent screening was still dominant: compared with screening based on maternal age alone, the savings were $C30,963 per additional birth with Down's syndrome averted. Contingent screening was the only screening strategy that offered early reassurance to the majority of women (77.81%) in the first trimester and minimised costs by limiting retesting during the second trimester (21.05%). For the contingent and sequential screening strategies, the choice of cut-off value for risk in the first trimester test significantly affected the cost effectiveness ratios (respectively, from $C26,833 to $C37,260 and from $C35,215 to $C45,314 per case of Down's syndrome), the number of procedure-related euploid miscarriages (from 6 to 46 and from 6 to 45 per 100,000 pregnancies), and the number of unnecessary terminations (from 16 to 26 and from 16 to 25 per 100,000 pregnancies).

  7. Use of an Interculturally Enriched Collaboration Script in Computer-Supported Collaborative Learning in Higher Education

    Science.gov (United States)

    Popov, Vitaliy; Biemans, Harm J. A.; Kuznetsov, Andrei N.; Mulder, Martin

    2014-01-01

    In this exploratory study, the authors introduced an interculturally enriched collaboration script (IECS) for working in culturally diverse groups within a computer-supported collaborative learning (CSCL) environment and then assessed student online collaborative behaviour, learning performance and experiences. The question was if and how these…

  8. Collaborative and Competitive Video Games for Teaching Computing in Higher Education

    Science.gov (United States)

    Smith, Spencer; Chan, Samantha

    2017-01-01

    This study measures the success of using a collaborative and competitive video game, named Space Race, to teach computing to first year engineering students. Space Race is played by teams of four, each with their own tablet, collaborating to compete against the other teams in the class. The impact of the game on student learning was studied…

  9. Exploring the Benefits and Challenges of Using Laptop Computers in Higher Education Classrooms: A Formative Analysis

    Science.gov (United States)

    Kay, Robin H.; Lauricella, Sharon

    2011-01-01

    Because of decreased prices, increased convenience, and wireless access, an increasing number of college and university students are using laptop computers in their classrooms. This recent trend has forced instructors to address the educational consequences of using these mobile devices. The purpose of the current study was to analyze and assess…

  10. Cost-effectiveness of routine computed tomography in the evaluation of idiopathic unilateral vocal fold paralysis.

    Science.gov (United States)

    Hojjat, Houmehr; Svider, Peter F; Folbe, Adam J; Raza, Syed N; Carron, Michael A; Shkoukani, Mahdi A; Merati, Albert L; Mayerhoff, Ross M

    2017-02-01

    To evaluate the cost-effectiveness of routine computed tomography (CT) in individuals with unilateral vocal fold paralysis (UVFP). STUDY DESIGN: Health economics decision tree analysis. METHODS: A decision tree was constructed to determine the incremental cost-effectiveness ratio (ICER) of CT imaging in UVFP patients. Univariate sensitivity analysis was utilized to calculate how high the probability of discovering an etiology for the paralysis would have to be to make CT with contrast more cost-effective than no imaging. We used two studies examining findings in UVFP patients. The decision pathways were CT of the neck with intravenous contrast after diagnostic laryngoscopy versus laryngoscopy alone. The probability of detecting an etiology for UVFP and the associated costs were extracted to construct the decision tree. The only incorrect diagnosis was missing a mass in the no-imaging decision branch, which rendered an effectiveness of 0. The ICER of using CT was $3,306, below most acceptable willingness-to-pay (WTP) thresholds. Additionally, univariate sensitivity analysis indicated that at the WTP threshold of $30,000, obtaining CT imaging was the most cost-effective choice when the probability of having a lesion was above 1.7%. Multivariate probabilistic sensitivity analysis with Monte Carlo simulations also showed that at a WTP of $30,000, CT scanning is more cost-effective, with 99.5% certainty. Particularly in the current healthcare environment, characterized by increasing consciousness of utilization and defensive medicine, economic evaluations represent evidence-based findings that can be employed to facilitate appropriate decision-making and enhance physician-patient communication. This economic evaluation strongly supports obtaining CT imaging in patients with newly diagnosed UVFP. Level of evidence: 2c. Laryngoscope, 127:440-444, 2017. © 2016 The American Laryngological, Rhinological and Otological Society, Inc.
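    The core quantity in such an analysis is the incremental cost-effectiveness ratio; a minimal sketch with deliberately hypothetical numbers (not the study's inputs) follows:

        def icer(cost_new, eff_new, cost_old, eff_old):
            # Incremental cost-effectiveness ratio: extra cost per extra unit of effect.
            return (cost_new - cost_old) / (eff_new - eff_old)

        # Hypothetical two-branch tree. Per the abstract, the only incorrect
        # diagnosis is a missed mass in the no-imaging branch (effectiveness 0 there).
        p_lesion = 0.05                  # assumed probability of a causative mass
        cost_ct, cost_none = 300.0, 0.0  # assumed incremental cost of CT
        eff_ct = 1.0                     # CT finds the lesion whenever present
        eff_none = 1.0 - p_lesion        # no imaging misses every lesion

        print(icer(cost_ct, eff_ct, cost_none, eff_none))
        # ~6000 -- cost per additional correct diagnosis at these assumed inputs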

  11. Social incidence and economic costs of carbon limits; A computable general equilibrium analysis for Switzerland

    Energy Technology Data Exchange (ETDEWEB)

    Stephan, G.; Van Nieuwkoop, R.; Wiedmer, T. (Institute for Applied Microeconomics, Univ. of Bern (Switzerland))

    1992-01-01

    Both the distributional and allocational effects of limiting carbon dioxide emissions in a small and open economy are discussed. The analysis starts from the assumption that Switzerland attempts to stabilize its greenhouse gas emissions over the next 25 years, and evaluates the costs and benefits of the respective reduction programme. From a methodological viewpoint, it illustrates how a computable general equilibrium approach can be adopted to identify the economic effects of cutting greenhouse gas emissions at the national level. From a political economy point of view, it considers the social incidence of a greenhouse policy. It shows in particular that public acceptance can be increased, and the economic costs of greenhouse policies reduced, if carbon taxes are accompanied by revenue redistribution. 8 tabs., 1 app., 17 refs.

  12. On the role of cost-sensitive learning in multi-class brain-computer interfaces.

    Science.gov (United States)

    Devlaminck, Dieter; Waegeman, Willem; Wyns, Bart; Otte, Georges; Santens, Patrick

    2010-06-01

    Brain-computer interfaces (BCIs) present an alternative means of communication for people with severe disabilities. One of the shortcomings of current BCI systems, recently put forward in the fourth BCI competition, is the asynchronous detection of motor imagery versus the resting state. We investigated this extension to the three-class case, in which the resting state is considered to lie virtually between the two motor classes, resulting in a large penalty when one motor task is misclassified as the other motor class. We particularly focus on the behavior of different machine-learning techniques and on the role of multi-class cost-sensitive learning in such a context. To this end, four different kernel methods are empirically compared, namely pairwise multi-class support vector machines (SVMs), two cost-sensitive multi-class SVMs, and kernel-based ordinal regression. The experimental results illustrate that ordinal regression performs better than the other three approaches when a cost-sensitive performance measure such as the mean-squared error is considered. By contrast, multi-class cost-sensitive learning enables us to control the number of large errors made between two motor tasks.
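    A cost-sensitive setup of this kind can be expressed with an asymmetric cost matrix in which motor-to-motor confusions are penalized more heavily than confusions with rest; the sketch below is illustrative and does not reproduce the paper's kernel methods or cost values:

        import numpy as np

        # Classes: 0 = left motor imagery, 1 = rest, 2 = right motor imagery.
        # Rest lies 'between' the motor classes, so motor->motor errors cost most.
        cost = np.array([[0, 1, 4],
                         [1, 0, 1],
                         [4, 1, 0]])

        def expected_cost(y_true, y_pred):
            return np.mean([cost[t, p] for t, p in zip(y_true, y_pred)])

        y_true = np.array([0, 0, 1, 2, 2, 1])
        y_pred = np.array([0, 2, 1, 2, 0, 1])   # two motor->motor confusions
        print(expected_cost(y_true, y_pred))    # ~1.33; plain accuracy (4/6) hides the severity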

  13. Computational Comparison of Several Greedy Algorithms for the Minimum Cost Perfect Matching Problem on Large Graphs

    DEFF Research Database (Denmark)

    Wøhlk, Sanne; Laporte, Gilbert

    2017-01-01

    The aim of this paper is to computationally compare several algorithms for the Minimum Cost Perfect Matching Problem on an undirected complete graph. Our work is motivated by the need to solve large instances of the Capacitated Arc Routing Problem (CARP) arising in the optimization of garbage collection in Denmark. Common heuristics for the CARP involve the optimal matching of the odd-degree nodes of a graph. The algorithms used in the comparison include the CPLEX solution of an exact formulation, the LEDA matching algorithm, a recent implementation of the Blossom algorithm, as well as six...
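    For a flavor of the greedy heuristics being compared (a generic sketch, not any specific one of the six), the simplest approach sorts all pairs by cost and repeatedly matches the cheapest pair whose endpoints are still free:

        import itertools

        def greedy_matching(nodes, dist):
            # Greedy heuristic for minimum-cost perfect matching on a complete graph.
            pairs = sorted(itertools.combinations(nodes, 2), key=lambda e: dist(*e))
            free, matching = set(nodes), []
            for u, v in pairs:
                if u in free and v in free:
                    matching.append((u, v))
                    free -= {u, v}
            return matching

        # Illustrative: four points in the plane with Euclidean costs.
        pts = {1: (0, 0), 2: (1, 0), 3: (10, 0), 4: (11, 1)}
        d = lambda u, v: ((pts[u][0] - pts[v][0]) ** 2 + (pts[u][1] - pts[v][1]) ** 2) ** 0.5
        print(greedy_matching(list(pts), d))   # [(1, 2), (3, 4)]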

  14. Expedited Holonomic Quantum Computation via Net Zero-Energy-Cost Control in Decoherence-Free Subspace.

    Science.gov (United States)

    Pyshkin, P V; Luo, Da-Wei; Jing, Jun; You, J Q; Wu, Lian-Ao

    2016-11-25

    Holonomic quantum computation (HQC) may not show its full potential in quantum speedup due to the prerequisite of a long coherent runtime imposed by the adiabatic condition. Here we show that the conventional HQC can be dramatically accelerated by using external control fields, of which the effectiveness is exclusively determined by the integral of the control fields in the time domain. This control scheme can be realized with net zero energy cost and it is fault-tolerant against fluctuation and noise, significantly relaxing the experimental constraints. We demonstrate how to realize the scheme via decoherence-free subspaces. In this way we unify quantum robustness merits of this fault-tolerant control scheme, the conventional HQC and decoherence-free subspace, and propose an expedited holonomic quantum computation protocol.

  15. Counting loop diagrams: computational complexity of higher-order amplitude evaluation

    International Nuclear Information System (INIS)

    Eijk, E. van; Kleiss, R.; Lazopoulos, A.

    2004-01-01

    We discuss the computational complexity of the perturbative evaluation of scattering amplitudes, both by the Caravaglios-Moretti algorithm and by direct evaluation of the individual diagrams. For a self-interacting scalar theory, we determine the complexity as a function of the number of external legs. We describe a method for obtaining the number of topologically inequivalent Feynman graphs containing closed loops, and apply this to 1- and 2-loop amplitudes. We also compute the number of graphs weighted by their symmetry factors, thus arriving at exact and asymptotic estimates for the average symmetry factor of diagrams. We present results for the asymptotic number of diagrams up to 10 loops, and prove that the average symmetry factor approaches unity as the number of external legs becomes large. (orig.)

  16. Cost-effective computations with boundary interface operators in elliptic problems

    International Nuclear Information System (INIS)

    Khoromskij, B.N.; Mazurkevich, G.E.; Nikonov, E.G.

    1993-01-01

    The numerical algorithm for fast computations with interface operators associated with the elliptic boundary value problems (BVP) defined on step-type domains is presented. The algorithm is based on the asymptotically almost optimal technique developed for treatment of the discrete Poincare-Steklov (PS) operators associated with the finite-difference Laplacian on rectangles when using the uniform grid with a 'displacement by h/2'. The approach can be regarded as an extension of the method proposed for the partial solution of the finite-difference Laplace equation to the case of displaced grids and mixed boundary conditions. It is shown that the action of the PS operator for the Dirichlet problem and mixed BVP can be computed with expenses of the order of O(N log^2 N) both for arithmetical operations and computer memory needs, where N is the number of unknowns on the rectangle boundary. The single domain algorithm is applied to solving the multidomain elliptic interface problems with piecewise constant coefficients. The numerical experiments presented confirm almost linear growth of the computational costs and memory needs with respect to the dimension of the discrete interface problem. 14 refs., 3 figs., 4 tabs

  17. Direct costs and cost-effectiveness of dual-source computed tomography and invasive coronary angiography in patients with an intermediate pretest likelihood for coronary artery disease.

    Science.gov (United States)

    Dorenkamp, Marc; Bonaventura, Klaus; Sohns, Christian; Becker, Christoph R; Leber, Alexander W

    2012-03-01

    The study aims to determine the direct costs and comparative cost-effectiveness of latest-generation dual-source computed tomography (DSCT) and invasive coronary angiography for diagnosing coronary artery disease (CAD) in patients suspected of having this disease. The study was based on a previously elaborated cohort with an intermediate pretest likelihood for CAD and on complementary clinical data. Cost calculations were based on a detailed analysis of direct costs, and generally accepted accounting principles were applied. Based on Bayes' theorem, a mathematical model was used to compare the cost-effectiveness of both diagnostic approaches. Total costs included direct costs, induced costs and costs of complications. Effectiveness was defined as the ability of a diagnostic test to accurately identify a patient with CAD. Direct costs amounted to €98.60 for DSCT and to €317.75 for invasive coronary angiography. Analysis of model calculations indicated that cost-effectiveness grew hyperbolically with increasing prevalence of CAD. Given the prevalence of CAD in the study cohort (24%), DSCT was found to be more cost-effective than invasive coronary angiography (€970 vs €1354 for one patient correctly diagnosed as having CAD). At a disease prevalence of 49%, DSCT and invasive angiography were equally effective with costs of €633. Above a threshold value of disease prevalence of 55%, proceeding directly to invasive coronary angiography was more cost-effective than DSCT. With proper patient selection and consideration of disease prevalence, DSCT coronary angiography is cost-effective for diagnosing CAD in patients with an intermediate pretest likelihood for it. However, the range of eligible patients may be smaller than previously reported.

  18. Feasibility Study and Cost Benefit Analysis of Thin-Client Computer System Implementation Onboard United States Navy Ships

    National Research Council Canada - National Science Library

    Arbulu, Timothy D; Vosberg, Brian J

    2007-01-01

    The purpose of this MBA project was to conduct a feasibility study and a cost benefit analysis of using thin-client computer systems instead of traditional networks onboard United States Navy ships...

  19. Proposing Hybrid Architecture to Implement Cloud Computing in Higher Education Institutions Using a Meta-synthesis Approach

    Directory of Open Access Journals (Sweden)

    hamid reza bazi

    2017-12-01

    Full Text Available Cloud computing is a new technology that considerably helps Higher Education Institutions (HEIs) develop and create competitive advantage, with inherent characteristics such as flexibility, scalability, accessibility, reliability, fault tolerance, and economic efficiency. Due to the numerous advantages of cloud computing, and in order to take advantage of cloud computing infrastructure, the services of universities and HEIs need to migrate to the cloud. However, this transition involves many challenges, one of which is the lack of an appropriate architecture for migration to the technology. Using a reliable architecture for migration enables managers to mitigate risks in the cloud computing technology. Therefore, organizations always search for suitable cloud computing architecture. In previous studies, these important features have received less attention and have not been addressed in a comprehensive way. The aim of this study is to use a meta-synthesis method for the first time to analyze the previously published studies and to suggest an appropriate hybrid cloud migration architecture (IUHEC). We reviewed many papers from relevant journals and conference proceedings. The concepts extracted from these papers were classified into related categories and sub-categories. Then, we developed our proposed hybrid architecture based on these concepts and categories. The proposed architecture was validated by a panel of experts, and Lawshe's model was used to determine the content validity. Due to its innovative yet user-friendly nature, comprehensiveness, and high security, this architecture can help HEIs migrate effectively to a cloud computing environment.
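    Lawshe's content validity ratio, used above to validate the architecture, has a simple closed form, CVR = (n_e - N/2) / (N/2), where n_e panelists out of N rate an item essential; a sketch with hypothetical panel numbers:

        def content_validity_ratio(n_essential, n_panelists):
            # Lawshe's CVR ranges from -1 (no one says essential) to +1 (everyone does).
            half = n_panelists / 2
            return (n_essential - half) / half

        # Hypothetical: 9 of 12 experts rate an architecture component essential.
        print(content_validity_ratio(9, 12))   # 0.5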

  20. Computational cost estimates for parallel shared memory isogeometric multi-frontal solvers

    KAUST Repository

    Woźniak, Maciej

    2014-06-01

    In this paper we present computational cost estimates for parallel shared memory isogeometric multi-frontal solvers. The estimates show that the ideal isogeometric shared memory parallel direct solver scales as O(p^2 log(N/p)) for one-dimensional problems, O(N p^2) for two-dimensional problems, and O(N^{4/3} p^2) for three-dimensional problems, where N is the number of degrees of freedom and p is the polynomial order of approximation. The computational costs of the shared memory parallel isogeometric direct solver are compared with those of the sequential isogeometric direct solver, the latter being equal to O(N p^2) for the one-dimensional case, O(N^{1.5} p^3) for the two-dimensional case, and O(N^2 p^3) for the three-dimensional case. The shared memory version significantly reduces the cost in terms of both N and p. Theoretical estimates are compared with numerical experiments performed with linear, quadratic, cubic, quartic, and quintic B-splines, in one and two spatial dimensions. © 2014 Elsevier Ltd. All rights reserved.
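    The quoted estimates are easy to tabulate; the sketch below evaluates the sequential and ideal shared-memory cost formulas for sample N and p (constants are omitted, so only relative growth is meaningful):

        from math import log2

        def seq_cost(N, p, dim):   # sequential direct solver estimates
            return {1: N * p**2, 2: N**1.5 * p**3, 3: N**2 * p**3}[dim]

        def par_cost(N, p, dim):   # ideal shared-memory parallel estimates
            return {1: p**2 * log2(N / p), 2: N * p**2, 3: N**(4 / 3) * p**2}[dim]

        N, p = 10**6, 3            # a million degrees of freedom, cubic B-splines
        for dim in (1, 2, 3):
            print(dim, seq_cost(N, p, dim) / par_cost(N, p, dim))   # ideal speedup factor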

  1. Computational cost estimates for parallel shared memory isogeometric multi-frontal solvers

    KAUST Repository

    Woźniak, Maciej; Kuźnik, Krzysztof M.; Paszyński, Maciej R.; Calo, Victor M.; Pardo, D.

    2014-01-01

    In this paper we present computational cost estimates for parallel shared memory isogeometric multi-frontal solvers. The estimates show that the ideal isogeometric shared memory parallel direct solver scales as O(p^2 log(N/p)) for one-dimensional problems, O(N p^2) for two-dimensional problems, and O(N^{4/3} p^2) for three-dimensional problems, where N is the number of degrees of freedom and p is the polynomial order of approximation. The computational costs of the shared memory parallel isogeometric direct solver are compared with those of the sequential isogeometric direct solver, the latter being equal to O(N p^2) for the one-dimensional case, O(N^{1.5} p^3) for the two-dimensional case, and O(N^2 p^3) for the three-dimensional case. The shared memory version significantly reduces the cost in terms of both N and p. Theoretical estimates are compared with numerical experiments performed with linear, quadratic, cubic, quartic, and quintic B-splines, in one and two spatial dimensions. © 2014 Elsevier Ltd. All rights reserved.

  2. Computational sensing of herpes simplex virus using a cost-effective on-chip microscope

    KAUST Repository

    Ray, Aniruddha

    2017-07-03

    Caused by the herpes simplex virus (HSV), herpes is a viral infection that is one of the most widespread diseases worldwide. Here we present a computational sensing technique for specific detection of HSV using both viral immuno-specificity and the physical size range of the viruses. This label-free approach involves a compact and cost-effective holographic on-chip microscope and a surface-functionalized glass substrate prepared to specifically capture the target viruses. To enhance the optical signatures of individual viruses and increase their signal-to-noise ratio, self-assembled polyethylene glycol based nanolenses are rapidly formed around each virus particle captured on the substrate using a portable interface. Holographic shadows of specifically captured viruses that are surrounded by these self-assembled nanolenses are then reconstructed, and the phase image is used for automated quantification of the size of each particle within our large field-of-view, ~30 mm². The combination of viral immuno-specificity due to surface functionalization and the physical size measurements enabled by holographic imaging is used to sensitively detect and enumerate HSV particles using our compact and cost-effective platform. This computational sensing technique can find numerous uses in global health related applications in resource-limited environments.

  3. Cost Savings Associated with the Adoption of a Cloud Computing Data Transfer System for Trauma Patients.

    Science.gov (United States)

    Feeney, James M; Montgomery, Stephanie C; Wolf, Laura; Jayaraman, Vijay; Twohig, Michael

    2016-09-01

    Among transferred trauma patients, challenges with the transfer of radiographic studies include problems loading or viewing the studies at the receiving hospitals, and problems manipulating, reconstructing, or evaluating the transferred images. Cloud-based image transfer systems may address some of these problems. We reviewed the charts of patients transferred during the one year surrounding the adoption of a cloud computing data transfer system and compared the rates of repeat imaging before (precloud) and after (postcloud) its adoption. During the precloud period, 28 out of 100 patients required 90 repeat studies. With the cloud computing transfer system in place, three out of 134 patients required seven repeat films. There was a statistically significant decrease in the proportion of patients requiring repeat films (28% to 2.2%, P < .0001). Based on an annualized volume of 200 trauma patient transfers, the cost savings estimated using three methods of cost analysis is between $30,272 and $192,453.

  4. Collaborative and Competitive Video Games for Teaching Computing in Higher Education

    Science.gov (United States)

    Smith, Spencer; Chan, Samantha

    2017-08-01

    This study measures the success of using a collaborative and competitive video game, named Space Race, to teach computing to first year engineering students. Space Race is played by teams of four, each with their own tablet, collaborating to compete against the other teams in the class. The impact of the game on student learning was studied through measurements using 485 students, over one term. Surveys were used to gauge student reception of the game. Pre- and post-tests, and in-course examinations were used to quantify student performance. The game was well received, with at least 82% of the students that played it recommending it to others. In some cases, game participants outperformed non-participants on course exams. On the final course exam, all of the statistically significant differences favored game participants, with a maximum grade improvement of 41%. The findings also suggest that some students retain the knowledge obtained from Space Race for at least 7 weeks. The results of this study provide strong evidence that a collaborative and competitive video game can be an effective tool for teaching computing in post-secondary education.

  5. A novel cost based model for energy consumption in cloud computing.

    Science.gov (United States)

    Horri, A; Dastghaibyfard, Gh

    2015-01-01

    Cloud data centers consume enormous amounts of electrical energy. To support green cloud computing, providers also need to minimize cloud infrastructure energy consumption while maintaining QoS. In this study, an energy consumption model is proposed for the time-shared policy in the virtualization layer of cloud environments. The cost and energy usage of the time-shared policy were modeled in the CloudSim simulator based upon results obtained from a real system, and the proposed model was then evaluated under different scenarios. The proposed model takes account of cache interference costs, which depend upon the size of the data. The model was implemented in the CloudSim simulator, and the simulation results indicate that energy consumption may be considerable and can vary with parameters such as the quantum parameter, data size, and the number of VMs on a host. Measured results validate the model and demonstrate that there is a tradeoff between energy consumption and QoS in the cloud environment.
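    Simulators of this kind typically build on a linear host-power model plus workload-dependent overheads; the sketch below is illustrative and is not the paper's exact model (the cache-interference term in particular is a placeholder):

        def host_energy(p_idle, p_max, utilization, seconds, cache_penalty=0.0):
            # Linear host power model: idle power plus a utilization-proportional
            # term, with an additive penalty standing in for cache interference
            # between co-located, time-shared VMs. Returns energy in joules.
            power = p_idle + (p_max - p_idle) * utilization + cache_penalty
            return power * seconds

        # Illustrative host: 100 W idle, 250 W loaded, 70% utilized for one hour,
        # plus a hypothetical 5 W cache-interference overhead.
        print(host_energy(100.0, 250.0, 0.7, 3600.0, cache_penalty=5.0))  # 756000.0 J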

  6. Omniscopes: Large area telescope arrays with only NlogN computational cost

    International Nuclear Information System (INIS)

    Tegmark, Max; Zaldarriaga, Matias

    2010-01-01

    We show that the class of antenna layouts for telescope arrays allowing cheap analysis hardware (with correlator cost scaling as N log N rather than N^2 with the number of antennas N) is encouragingly large, including not only previously discussed rectangular grids but also arbitrary hierarchies of such grids, with arbitrary rotations and shears at each level. We show that all correlations for such a 2D array with an n-level hierarchy can be efficiently computed via a fast Fourier transform in not two but 2n dimensions. This can allow major correlator cost reductions for science applications requiring exquisite sensitivity at widely separated angular scales, for example, 21 cm tomography (where short baselines are needed to probe the cosmological signal and long baselines are needed for point source removal), helping enable future 21 cm experiments with thousands or millions of cheap dipolelike antennas. Such hierarchical grids combine the angular resolution advantage of traditional array layouts with the cost advantage of a rectangular fast Fourier transform telescope. We also describe an algorithm for how a subclass of hierarchical arrays can efficiently use rotation synthesis to produce global sky maps with minimal noise and a well-characterized synthesized beam.
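    The core trick is visible already in one dimension: for antennas on a regular grid, every pairwise correlation is a lag of the spatial autocorrelation, which an FFT delivers in N log N operations. The toy sketch below checks this equivalence for a single snapshot (real correlators also average over time and handle non-grid effects):

        import numpy as np

        rng = np.random.default_rng(0)
        N = 8
        s = rng.standard_normal(N) + 1j * rng.standard_normal(N)  # antenna voltages

        # Brute force: every baseline (lag) separately, O(N^2) overall.
        brute = np.array([np.sum(s[k:] * np.conj(s[:N - k])) for k in range(N)])

        # FFT route: zero-pad to avoid circular wrap-around, then take the
        # inverse transform of the power spectrum, O(N log N) overall.
        S = np.fft.fft(s, 2 * N)
        fast = np.fft.ifft(S * np.conj(S))[:N]

        print(np.allclose(brute, fast))   # True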

  7. A Comprehensive and Cost-Effective Computer Infrastructure for K-12 Schools

    Science.gov (United States)

    Warren, G. P.; Seaton, J. M.

    1996-01-01

    Since 1993, NASA Langley Research Center has been developing and implementing a low-cost Internet connection model, including system architecture, training, and support, to provide Internet access for an entire network of computers. This infrastructure allows local area networks which exceed 50 machines per school to independently access the complete functionality of the Internet by connecting to a central site, using state-of-the-art commercial modem technology, through a single standard telephone line. By locating high-cost resources at this central site and sharing these resources and their costs among the school districts throughout a region, a practical, efficient, and affordable infrastructure for providing scalable Internet connectivity has been developed. As the demand for faster Internet access grows, the model has a simple expansion path that eliminates the need to replace major system components and re-train personnel. Observations of Internet usage within an environment, particularly school classrooms, have shown that after an initial period of 'surfing,' the Internet traffic becomes repetitive. By automatically storing requested Internet information on a high-capacity networked disk drive at the local site (network-based disk caching), then updating this information only when it changes, well over 80 percent of the Internet traffic that leaves a location can be eliminated by retrieving the information from the local disk cache.
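    The caching effect is easy to simulate: with a repetitive request stream, even a plain store-everything cache absorbs most upstream traffic (a toy model, not the Langley system; the request stream is invented):

        def cache_hit_rate(requests):
            # Fraction of requests served locally by a cache that keeps every
            # response it sees (no eviction or expiry -- deliberately simple).
            cache, hits = set(), 0
            for url in requests:
                if url in cache:
                    hits += 1
                else:
                    cache.add(url)   # fetch upstream once, keep a local copy
            return hits / len(requests)

        # Illustrative classroom-like stream: a few pages requested repeatedly.
        stream = ["/home", "/lesson1", "/home", "/lesson1", "/quiz", "/home",
                  "/lesson1", "/quiz", "/home", "/lesson1"]
        print(cache_hit_rate(stream))    # 0.7 -- 70% of traffic never leaves the site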

  8. INTERACTIONS: DESIGN, IMPLEMENTATION AND EVALUATION OF A COMPUTATIONAL TOOL FOR TEACHING INTERMOLECULAR FORCES IN HIGHER EDUCATION

    Directory of Open Access Journals (Sweden)

    Francisco Geraldo Barbosa

    2015-12-01

    Full Text Available Intermolecular forces are a useful concept that can explain the attraction between particulate matter as well as numerous phenomena in our lives such as viscosity, solubility, drug interactions, and the dyeing of fibers. However, studies show that students have difficulty understanding this important concept, which has led us to develop free educational software in English and Portuguese. The software can be used interactively by teachers and students, thus facilitating better understanding. Professors and students, both graduate and undergraduate, were questioned, using a Likert scale, about the software's quality, intuitiveness of use, ease of navigation, and pedagogical applicability. The results led to the conclusion that the developed computer application can serve as an auxiliary tool to assist teachers in their lectures and students in their process of learning intermolecular forces.

  9. Net Shape Spin Formed Cryogenic Aluminum Lithium Cryogenic Tank Domes for Lower Cost Higher Performance Launch Vehicles

    Science.gov (United States)

    Curreri, Peter A.; Hoffman, Eric; Domack, Marcia; Brewster, Jeb; Russell, Carolyn

    2013-01-01

    With the goals of lower cost (simplified manufacturing and lower part count) and higher performance (higher strength-to-weight alloys), the NASA Technical Maturation Program in 2006 funded a proposal to investigate spin forming of space launch vehicle cryogenic tank domes. The project funding continued under the NASA Exploration Technology Development Program through completion in FY12. The first phase of the project involved spin forming eight 1 meter diameter "path finder" domes. Half of these were processed using a concave spin form process (MT Aerospace, Augsburg, Germany) and the other half using a convex process (Spincraft, Boston, MA). The convex process has been used to produce the Ares Common Bulkhead, and the concave process has been used to produce dome caps for the Space Shuttle lightweight external tank and domes for the NASDA H2. Aluminum Lithium material was chosen because of its higher strength-to-weight ratio than the Aluminum 2219 baseline. Aluminum Lithium, in order to obtain the desired temper (T8), requires a cold stretch after the solution heat treatment and quench. This requirement favors the concave spin form process, which was selected for scale-up. This paper describes the results of processing four 5.5 meter diameter (upper stage scale) net-shape spin-formed Aluminum Lithium domes. In order to allow scalability beyond the limits of foundry and rolling mills (about 12 foot width), the circular blank contained one friction stir weld (heavy lifter scales require a flat blank containing two welds). Mechanical properties data (tensile, fracture toughness, stress corrosion, and simulated service testing) for the parent metal and weld will also be discussed.

  10. Assessing Tax Form Distribution Costs: A Proposed Method for Computing the Dollar Value of Tax Form Distribution in a Public Library.

    Science.gov (United States)

    Casey, James B.

    1998-01-01

    Explains how a public library can compute the actual cost of distributing tax forms to the public by listing all direct and indirect costs and demonstrating the formulae and necessary computations. Supplies directions for calculating costs involved for all levels of staff as well as associated public relations efforts, space, and utility costs.…
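
    The roll-up described reduces to summing direct staff time with allocated indirect costs. A minimal sketch with invented placeholder figures (hours, wages and allocation shares are not from the article):

        # Illustrative cost roll-up: direct staff costs at each staff level
        # plus allocated space, utility and publicity costs.
        STAFF_HOURS = {"librarian": 40.0, "clerk": 120.0}   # hours on tax forms
        HOURLY_RATE = {"librarian": 28.50, "clerk": 15.75}  # wage + benefits, $/h

        space_cost = 0.08 * 12_000.0   # assumed 8% share of annual space cost
        utility_cost = 0.08 * 3_000.0  # same share of utilities
        publicity_cost = 250.0         # signage and announcements

        staff_cost = sum(STAFF_HOURS[k] * HOURLY_RATE[k] for k in STAFF_HOURS)
        total = staff_cost + space_cost + utility_cost + publicity_cost
        print(f"Estimated tax-form distribution cost: ${total:,.2f}")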

  11. Use of several Cloud Computing approaches for climate modelling: performance, costs and opportunities

    Science.gov (United States)

    Perez Montes, Diego A.; Añel Cabanelas, Juan A.; Wallom, David C. H.; Arribas, Alberto; Uhe, Peter; Caderno, Pablo V.; Pena, Tomas F.

    2017-04-01

    Cloud Computing is a technological option that offers great possibilities for modelling in geosciences. We have studied how two different climate models, HadAM3P-HadRM3P and CESM-WACCM, can be adapted in two different ways to run on Cloud Computing environments from three different vendors: Amazon, Google and Microsoft. We have also evaluated qualitatively how the use of Cloud Computing can affect the allocation of resources by funding bodies, as well as issues related to computing security, including scientific reproducibility. Our first experiments were developed using the well-known ClimatePrediction.net (CPDN), which uses BOINC, over the infrastructure of two cloud providers, namely Microsoft Azure and Amazon Web Services (hereafter AWS). For this comparison we ran a set of thirteen-month climate simulations for CPDN in Azure and AWS using a range of different virtual machines (VMs) for HadRM3P (50 km resolution over the South America CORDEX region) nested in the global atmosphere-only model HadAM3P. These simulations were run on a single processor and took between 3 and 5 days to compute depending on the VM type. The last part of our simulation experiments was running WACCM on different VMs on the Google Compute Engine (GCE) and comparing the results with the supercomputer (SC) Finisterrae1 from the Centro de Supercomputacion de Galicia. GCE gives better performance than the SC for smaller numbers of cores/MPI tasks, but the model throughput shows clearly that the SC performance is better beyond approximately 100 cores (related to differences in network speed and latency). From a cost point of view, Cloud Computing moves researchers from a traditional approach, where experiments were limited by the available hardware resources, to one limited by monetary resources (how many resources can be afforded). As there is an increasing movement and recommendation for budgeting HPC projects on this technology (budgets can be calculated in a more realistic way) we could see a shift on
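
    The trade-off the authors describe, throughput versus money, comes down to two small formulas. A sketch with invented prices and runtimes (not the paper's measurements):

        # Cloud cost scales with wall-clock time and vCPU price; throughput
        # is simulated time completed per wall-clock day.
        def cloud_cost(run_hours: float, n_vcpus: int, price_per_vcpu_hour: float) -> float:
            return run_hours * n_vcpus * price_per_vcpu_hour

        def throughput(sim_months: float, run_hours: float) -> float:
            """Simulated months completed per wall-clock day."""
            return sim_months / (run_hours / 24.0)

        # e.g. a 13-month HadRM3P run taking 4 days on one vCPU at $0.05/vCPU-h
        hours = 4 * 24
        print(f"cost: ${cloud_cost(hours, 1, 0.05):.2f}, "
              f"throughput: {throughput(13.0, hours):.2f} sim-months/day")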

  12. Costs and role of ultrasound follow-up of polytrauma patients after initial computed tomography

    International Nuclear Information System (INIS)

    Maurer, M.H.; Winkler, A.; Powerski, M.J.; Elgeti, F.; Huppertz, A.; Roettgen, R.; Marnitz, T.; Wichlas, F.

    2012-01-01

    Purpose: To assess the costs and diagnostic gain of abdominal ultrasound follow-up of polytrauma patients initially examined by whole-body computed tomography (CT). Materials and Methods: A total of 176 patients with suspected multiple trauma (126 men, 50 women; age 43.5 ± 17.4 years) were retrospectively analyzed with regard to supplementary and new findings obtained by ultrasound follow-up compared with the results of exploratory FAST (focused assessment with sonography for trauma) at admission and the findings of whole-body CT. A process model was used to document the staff, materials, and total costs of the ultrasound follow-up examinations. Results: FAST yielded 26 abdominal findings (organ injury and/or free intra-abdominal fluid) in 19 patients, while the abdominal scan of whole-body CT revealed 32 findings in 25 patients. FAST had 81 % sensitivity and 100 % specificity. Follow-up ultrasound examinations revealed new findings in 2 of the 25 patients with abdominal injuries detected with initial CT. In the 151 patients without abdominal injuries in the initial CT scan, ultrasound follow-up did not yield any supplementary or new findings. The total costs of an ultrasound follow-up examination were EUR 28.93. The total costs of all follow-up ultrasound examinations performed in the study population were EUR 5658.23. Conclusion: Follow-up abdominal ultrasound yields only a low overall diagnostic gain in polytrauma patients in whom initial CT fails to detect any abdominal injuries but incurs high personnel expenses for radiological departments. (orig.)

  13. Server Operation and Virtualization to Save Energy and Cost in Future Sustainable Computing

    Directory of Open Access Journals (Sweden)

    Jun-Ho Huh

    2018-06-01

    Full Text Available Since the introduction of the LTE (Long Term Evolution) service, we have lived in a time of expanding amounts of data. The amount of data produced has increased every year, driven in particular by the growing distribution of smartphones. Telecommunication service providers have to struggle to secure sufficient network capacity in order to maintain quick access to necessary data by consumers. Nonetheless, maintaining the maximum capacity and bandwidth at all times requires considerable cost and excessive equipment. Therefore, to solve such a problem, telecommunication service providers need to maintain an appropriate level of network capacity and to provide sustainable service to customers through quick network expansion in case of shortage. So far, telecommunication service providers have bought and used network equipment produced by manufacturers such as Ericsson, Nokia, Cisco, and Samsung. Because this equipment is specialized for networking and built with advanced technologies, it delivers the excellent performance that satisfies consumers, but it is very costly. Moreover, procurement takes much time, since the service provider places an order and the manufacturer then produces and delivers the equipment. In addition, the diversity of IoT devices creates cases that require signaling and two-way data traffic as well as raw capacity. For these purposes, the need for NFV (Network Function Virtualization) arises. Equipment is virtualized so that it runs on x86-based compatible servers instead of on the network equipment manufacturer's dedicated hardware. By operating on compatible servers, NFV reduces wasted hardware and copes with change thanks to rapid hardware development. This study proposed an efficient system for reducing the cost of network server operation using such NFV technology and found that the cost was reduced by 24

  14. The cost-effectiveness of the RSI QuickScan intervention programme for computer workers: Results of an economic evaluation alongside a randomised controlled trial.

    Science.gov (United States)

    Speklé, Erwin M; Heinrich, Judith; Hoozemans, Marco J M; Blatter, Birgitte M; van der Beek, Allard J; van Dieën, Jaap H; van Tulder, Maurits W

    2010-11-11

    The costs of arm, shoulder and neck symptoms are high. In order to decrease these costs, employers implement interventions aimed at reducing these symptoms. One frequently used intervention is the RSI QuickScan intervention programme. It establishes a risk profile of the target population and subsequently advises interventions following a decision tree based on that risk profile. The purpose of this study was to perform an economic evaluation, from both the societal and the companies' perspective, of the RSI QuickScan intervention programme for computer workers. In this study, effectiveness was defined at three levels: exposure to risk factors, prevalence of arm, shoulder and neck symptoms, and days of sick leave. The economic evaluation was conducted alongside a randomised controlled trial (RCT). Participating computer workers from 7 companies (N = 638) were assigned to either the intervention group (N = 320) or the usual care group (N = 318) by means of cluster randomisation (N = 50). The intervention consisted of a tailor-made programme, based on a previously established risk profile. At baseline, 6 and 12 month follow-up, the participants completed the RSI QuickScan questionnaire. Analyses to estimate the effect of the intervention were done according to the intention-to-treat principle. To compare costs between groups, confidence intervals for cost differences were computed by bias-corrected and accelerated bootstrapping. The mean intervention costs, paid by the employer, were 59 euro per participant in the intervention group and 28 euro in the usual care group. Mean total health care and non-health care costs per participant were 108 euro in both groups. As to cost-effectiveness, improvements in received information on healthy computer use, as well as in work posture and movement, were observed at higher costs. With regard to the other risk factors, symptoms and sick leave, only small and non-significant effects were found. In this study, the RSI Quick
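
    The bias-corrected and accelerated bootstrap used for the cost-difference confidence intervals is available off the shelf; a minimal sketch on synthetic cost data (the gamma parameters are invented, chosen only because cost data are typically right-skewed):

        # BCa bootstrap CI for a between-group difference in mean costs.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        costs_intervention = rng.gamma(shape=2.0, scale=55.0, size=320)  # synthetic euros
        costs_usual_care = rng.gamma(shape=2.0, scale=50.0, size=318)

        def mean_difference(x, y):
            return np.mean(x) - np.mean(y)

        res = stats.bootstrap((costs_intervention, costs_usual_care), mean_difference,
                              method="BCa", n_resamples=5000, random_state=rng)
        print("95% BCa CI for the cost difference:", res.confidence_interval)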

  15. A precise goniometer/tensiometer using a low cost single-board computer

    Science.gov (United States)

    Favier, Benoit; Chamakos, Nikolaos T.; Papathanasiou, Athanasios G.

    2017-12-01

    Measuring the surface tension and the Young contact angle of a droplet is extremely important for many industrial applications. Here, considering the booming interest for small and cheap but precise experimental instruments, we have constructed a low-cost contact angle goniometer/tensiometer, based on a single-board computer (Raspberry Pi). The device runs an axisymmetric drop shape analysis (ADSA) algorithm written in Python. The code, here named DropToolKit, was developed in-house. We initially present the mathematical framework of our algorithm and then we validate our software tool against other well-established ADSA packages, including the commercial ramé-hart DROPimage Advanced as well as the DropAnalysis plugin in ImageJ. After successfully testing for various combinations of liquids and solid surfaces, we concluded that our prototype device would be highly beneficial for industrial applications as well as for scientific research in wetting phenomena compared to the commercial solutions.
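
    At the core of any ADSA tool is the forward problem: integrating the axisymmetric Young-Laplace (Bashforth-Adams) equations to obtain a drop profile for a given apex curvature and capillary constant. A minimal sketch of that step with placeholder parameter values; the fitting loop that a tool like DropToolKit wraps around it (matching the computed profile to the drop edge extracted from the camera image) is omitted:

        # Integrate dx/ds = cos(phi), dz/ds = sin(phi),
        # dphi/ds = 2/b + c*z - sin(phi)/x along the arc length s.
        import numpy as np
        from scipy.integrate import solve_ivp

        def profile(b: float, c: float, s_max: float):
            """b: apex radius of curvature (m); c = delta_rho*g/gamma (1/m^2)."""
            def rhs(s, y):
                x, z, phi = y
                # at the apex x -> 0 and sin(phi)/x -> 1/b (l'Hopital), so guard it
                curv = np.sin(phi) / x if x > 1e-12 else 1.0 / b
                return [np.cos(phi), np.sin(phi), 2.0 / b + c * z - curv]
            return solve_ivp(rhs, (0.0, s_max), [0.0, 0.0, 0.0],
                             max_step=s_max / 2000.0).y

        # water-air at a 1 mm apex radius: c = 1000 * 9.81 / 0.072 m^-2
        x, z, phi = profile(b=1.0e-3, c=1.36e5, s_max=5.0e-3)

    Fitting b and c so that (x, z) matches the imaged contour yields the surface tension via gamma = delta_rho * g / c, and the contact angle follows from phi at the contact line.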

  16. Unenhanced computed tomography in acute renal colic reduces cost outside radiology department

    DEFF Research Database (Denmark)

    Lauritsen, J.; Andersen, J.R.; Nordling, J.

    2008-01-01

    BACKGROUND: Unenhanced multidetector computed tomography (UMDCT) is well established as the procedure of choice for radiologic evaluation of patients with renal colic. The procedure has both clinical and financial consequences for departments of surgery and radiology. However, the financial effect outside the radiology department is poorly elucidated. PURPOSE: To evaluate the financial consequences outside of the radiology department, a retrospective study comparing the ward occupation of patients examined with UMDCT to that of intravenous urography (IVU) was performed. MATERIAL AND METHODS: … The use of UMDCT saved the hospital USD 265,000 every 6 months compared to the use of IVU. CONCLUSION: Use of UMDCT compared to IVU in patients with renal colic leads to cost savings outside the radiology department.

  17. Low-cost, high-performance and efficiency computational photometer design

    Science.gov (United States)

    Siewert, Sam B.; Shihadeh, Jeries; Myers, Randall; Khandhar, Jay; Ivanov, Vitaly

    2014-05-01

    Researchers at the University of Alaska Anchorage and University of Colorado Boulder have built a low-cost, high-performance and efficiency, drop-in-place Computational Photometer (CP) to test in field applications ranging from port security and safety monitoring to environmental compliance monitoring and surveying. The CP integrates off-the-shelf visible spectrum cameras with near to long wavelength infrared detectors and high resolution digital snapshots in a single device. The proof of concept combines three or more detectors into a single multichannel imaging system that can time correlate read-out, capture, and image process all of the channels concurrently with high performance and energy efficiency. The dual-channel continuous read-out is combined with a third high definition digital snapshot capability and has been designed using an FPGA (Field Programmable Gate Array) to capture, decimate, down-convert, re-encode, and transform images from two standard definition CCD (Charge Coupled Device) cameras at 30 Hz. The continuous stereo vision can be time correlated to megapixel high definition snapshots. This proof of concept has been fabricated as a four-layer PCB (Printed Circuit Board) suitable for use in education and research for low cost high efficiency field monitoring applications that need multispectral and three dimensional imaging capabilities. Initial testing is in progress and includes field testing in ports, potential test flights in unmanned aerial systems, and future planned missions to image harsh environments in the arctic including volcanic plumes, ice formation, and arctic marine life.

  18. Computational Sensing Using Low-Cost and Mobile Plasmonic Readers Designed by Machine Learning

    KAUST Repository

    Ballard, Zachary S.

    2017-01-27

    Plasmonic sensors have been used for a wide range of biological and chemical sensing applications. Emerging nanofabrication techniques have enabled these sensors to be cost-effectively mass manufactured onto various types of substrates. To accompany these advances, major improvements in sensor read-out devices must also be achieved to fully realize the broad impact of plasmonic nanosensors. Here, we propose a machine learning framework which can be used to design low-cost and mobile multispectral plasmonic readers that do not use traditionally employed bulky and expensive stabilized light sources or high-resolution spectrometers. By training a feature selection model over a large set of fabricated plasmonic nanosensors, we select the optimal set of illumination light-emitting diodes needed to create a minimum-error refractive index prediction model, which statistically takes into account the varied spectral responses and fabrication-induced variability of a given sensor design. This computational sensing approach was experimentally validated using a modular mobile plasmonic reader. We tested different plasmonic sensors with hexagonal and square periodicity nanohole arrays and revealed that the optimal illumination bands differ from those that are “intuitively” selected based on the spectral features of the sensor, e.g., transmission peaks or valleys. This framework provides a universal tool for the plasmonics community to design low-cost and mobile multispectral readers, helping the translation of nanosensing technologies to various emerging applications such as wearable sensing, personalized medicine, and point-of-care diagnostics. Beyond plasmonics, other types of sensors that operate based on spectral changes can broadly benefit from this approach, including e.g., aptamer-enabled nanoparticle assays and graphene-based sensors, among others.
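
    The band-selection step can be approximated with a greedy wrapper around a regression model. A conceptual sketch on synthetic stand-in data (the authors' actual training procedure and measured sensor spectra are not reproduced here):

        # Greedily pick the subset of candidate LED bands that best predicts
        # refractive index with a ridge regression model.
        import numpy as np
        from sklearn.feature_selection import SequentialFeatureSelector
        from sklearn.linear_model import Ridge

        rng = np.random.default_rng(1)
        X = rng.normal(size=(500, 20))  # transmission at 20 candidate LED bands
        y = X[:, [2, 7, 11, 16]] @ np.array([0.4, 0.3, 0.2, 0.1]) \
            + 0.01 * rng.normal(size=500)

        selector = SequentialFeatureSelector(Ridge(alpha=1.0),
                                             n_features_to_select=4,
                                             direction="forward", cv=5)
        selector.fit(X, y)
        print("selected LED bands:", np.flatnonzero(selector.get_support()))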

  19. Performance Assessment of a Custom, Portable, and Low-Cost Brain-Computer Interface Platform.

    Science.gov (United States)

    McCrimmon, Colin M; Fu, Jonathan Lee; Wang, Ming; Lopes, Lucas Silva; Wang, Po T; Karimi-Bidhendi, Alireza; Liu, Charles Y; Heydari, Payam; Nenadic, Zoran; Do, An Hong

    2017-10-01

    Conventional brain-computer interfaces (BCIs) are often expensive, complex to operate, and lack portability, which confines their use to laboratory settings. Portable, inexpensive BCIs can mitigate these problems, but it remains unclear whether their low-cost design compromises their performance. Therefore, we developed a portable, low-cost BCI and compared its performance to that of a conventional BCI. The BCI was assembled by integrating a custom electroencephalogram (EEG) amplifier with an open-source microcontroller and a touchscreen. The function of the amplifier was first validated against a commercial bioamplifier, followed by a head-to-head comparison between the custom BCI (using four EEG channels) and a conventional 32-channel BCI. Specifically, five able-bodied subjects were cued to alternate between hand opening/closing and remaining motionless while the BCI decoded their movement state in real time and provided visual feedback through a light-emitting diode. Subjects repeated the above task for a total of 10 trials, and were unaware of which system was being used. The performance in each trial was defined as the temporal correlation between the cues and the decoded states. The EEG data simultaneously acquired with the custom and commercial amplifiers were visually similar and highly correlated (ρ = 0.79). The decoding performances of the custom and conventional BCIs averaged across trials and subjects were 0.70 ± 0.12 and 0.68 ± 0.10, respectively, and were not significantly different. The performance of our portable, low-cost BCI is comparable to that of the conventional BCIs. Platforms, such as the one developed here, are suitable for BCI applications outside of a laboratory.
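
    The per-trial performance metric, the temporal correlation between the cue sequence and the decoded state, is a plain Pearson correlation over time samples. A sketch on synthetic data (the decoder update rate is an assumption):

        # Correlate the binary cue sequence with the decoded movement state.
        import numpy as np

        fs = 4                                         # decoder updates/s (assumed)
        cues = np.tile(np.repeat([0, 1], 10 * fs), 3)  # alternate rest/move every 10 s
        noise = np.random.default_rng(2).normal(0.0, 0.4, cues.size)
        decoded = np.clip(cues + noise, 0, 1).round()  # noisy decoded state
        performance = np.corrcoef(cues, decoded)[0, 1]
        print(f"trial performance: {performance:.2f}")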

  20. Evolution of product lifespan and implications for environmental assessment and management: a case study of personal computers in higher education.

    Science.gov (United States)

    Babbitt, Callie W; Kahhat, Ramzy; Williams, Eric; Babbitt, Gregory A

    2009-07-01

    Product lifespan is a fundamental variable in understanding the environmental impacts associated with the life cycle of products. Existing life cycle and materials flow studies of products, almost without exception, consider lifespan to be constant over time. To determine the validity of this assumption, this study provides an empirical documentation of the long-term evolution of personal computer lifespan, using a major U.S. university as a case study. Results indicate that over the period 1985-2000, computer lifespan (purchase to "disposal") decreased steadily from a mean of 10.7 years in 1985 to 5.5 years in 2000. The distribution of lifespan also evolved, becoming narrower over time. Overall, however, lifespan distribution was broader than normally considered in life cycle assessments or materials flow forecasts of electronic waste management for policy. We argue that these results suggest that at least for computers, the assumption of constant lifespan is problematic and that it is important to work toward understanding the dynamics of use patterns. We modify an age-structured model of population dynamics from biology as a modeling approach to describe product life cycles. Lastly, the purchase share and generation of obsolete computers from the higher education sector is estimated using different scenarios for the dynamics of product lifespan.
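
    The age-structured accounting the authors adapt from population biology can be sketched as cohort bookkeeping: each year every age class loses units according to an age-specific survival curve, the oldest cohort retires, and new purchases enter at age zero. Survival rates and purchase volumes below are illustrative, not the study's estimates:

        # Track computers by age; retired units are the obsolete flow.
        import numpy as np

        survival = np.array([0.99, 0.97, 0.93, 0.85, 0.70, 0.50, 0.30, 0.15, 0.05])
        stock = np.zeros(len(survival) + 1)  # units of age 0..9 years

        obsolete_per_year = []
        for year in range(15):
            retired = stock * np.append(1.0 - survival, 1.0)  # oldest cohort always retires
            survivors = stock[:-1] * survival
            stock = np.concatenate(([1000.0], survivors))     # purchases enter at age 0
            obsolete_per_year.append(retired.sum())

        print([round(v) for v in obsolete_per_year])  # approaches 1000/yr at steady state

    Shrinking the survival curve over time reproduces the observed drift from a 10.7-year to a 5.5-year mean lifespan and its effect on the flow of obsolete computers.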

  1. Computer-mediated communication and time pressure induce higher cardiovascular responses in the preparatory and execution phases of cooperative tasks.

    Science.gov (United States)

    Costa Ferrer, Raquel; Serrano Rosa, Miguel Ángel; Zornoza Abad, Ana; Salvador Fernández-Montejo, Alicia

    2010-11-01

    The cardiovascular (CV) response to social challenge and stress is associated with the etiology of cardiovascular diseases. New ways of communication, time pressure and different types of information are common in our society. In this study, the cardiovascular response to two different tasks (open vs. closed information) was examined employing different communication channels (computer-mediated vs. face-to-face) and different pace control (self vs. external). Our results indicate that there was a higher CV response in the computer-mediated condition, on the closed information task and in the externally paced condition. The role of these factors should be considered when studying the consequences of social stress and their underlying mechanisms.

  2. Computation of higher spherical harmonics moments of the angular flux for neutron transport problems in spherical geometry

    International Nuclear Information System (INIS)

    Sahni, D.C.; Sharma, A.

    2000-01-01

    The integral form of one-speed, spherically symmetric neutron transport equation with isotropic scattering is considered. Two standard problems are solved using normal mode expansion technique. The expansion coefficients are obtained by solving their singular integral equations. It is shown that these expansion coefficients provide a representation of all spherical harmonics moments of the angular flux as a superposition of Bessel functions. It is seen that large errors occur in the computation of higher moments unless we take certain precautions. The reasons for this phenomenon are explained. They throw some light on the failure of spherical harmonics method in treating spherical geometry problems as observed by Aronsson

  3. Development of a low-cost biogas filtration system to achieve higher-power efficient AC generator

    Science.gov (United States)

    Mojica, Edison E.; Ardaniel, Ar-Ar S.; Leguid, Jeanlou G.; Loyola, Andrea T.

    2018-02-01

    The paper focuses on the development of a low-cost biogas filtration system for an alternating current generator to achieve higher efficiency in terms of power production. Raw biogas comprises 57% combustible elements and 43% non-combustible elements, the latter containing carbon dioxide (36%), water vapor (5%), hydrogen sulfide (0.5%), nitrogen (1%), oxygen (0-2%), and ammonia (0-1%). The filtration system consists of six stages: stage 1 is a water scrubber filter intended to remove carbon dioxide and traces of hydrogen sulfide; stage 2 is a silica gel filter intended to reduce water vapor; stage 3 is an iron sponge filter intended to remove the remaining hydrogen sulfide; stage 4 is a sodium hydroxide solution filter intended to remove the elemental sulfur formed during the interaction of the hydrogen sulfide and the iron sponge and to further remove carbon dioxide; stage 5 is a silica gel filter intended to further eliminate the water vapor gained in stage 4; and stage 6 is an activated carbon filter intended to remove carbon dioxide. The filtration system was able to lower the non-combustible elements by 72%, thus increasing the combustible element by 54.38%. The unfiltered biogas is capable of generating 16.3 kW, while the filtered biogas is capable of generating 18.6 kW. The increase in methane concentration resulted in a 14.11% increase in power output, which translated into better engine performance in the generation of electricity.

  4. Analysis of Unit Costs in a University. The Fribourg Example. Program on Institutional Management in Higher Education.

    Science.gov (United States)

    Pasquier, Jacques; Sachse, Matthias

    Costing principles are applied to a university by estimating unit costs and their component factors for the university's different inputs, activities, and outputs. The information system used is designed for Fribourg University but could be applicable to other Swiss universities and could serve Switzerland's university policy. In general, it…

  5. High performance computing enabling exhaustive analysis of higher order single nucleotide polymorphism interaction in Genome Wide Association Studies.

    Science.gov (United States)

    Goudey, Benjamin; Abedini, Mani; Hopper, John L; Inouye, Michael; Makalic, Enes; Schmidt, Daniel F; Wagner, John; Zhou, Zeyu; Zobel, Justin; Reumann, Matthias

    2015-01-01

    Genome-wide association studies (GWAS) are a common approach for systematic discovery of single nucleotide polymorphisms (SNPs) which are associated with a given disease. Univariate analysis approaches commonly employed may miss important SNP associations that only appear through multivariate analysis in complex diseases. However, multivariate SNP analysis is currently limited by its inherent computational complexity. In this work, we present a computational framework that harnesses supercomputers. Based on our results, we estimate a three-way interaction analysis on 1.1 million SNP GWAS data requiring over 5.8 years on the full "Avoca" IBM Blue Gene/Q installation at the Victorian Life Sciences Computation Initiative. This is hundreds of times faster than estimates for other CPU based methods and four times faster than runtimes estimated for GPU methods, indicating how the improvement in the level of hardware applied to interaction analysis may alter the types of analysis that can be performed. Furthermore, the same analysis would take under 3 months on the currently largest IBM Blue Gene/Q supercomputer "Sequoia" at the Lawrence Livermore National Laboratory assuming linear scaling is maintained as our results suggest. Given that the implementation used in this study can be further optimised, this runtime means it is becoming feasible to carry out exhaustive analysis of higher order interaction studies on large modern GWAS.
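
    The combinatorics behind those runtimes are easy to check: exhaustive k-way analysis of m SNPs requires C(m, k) tests. A quick order-of-magnitude sketch (the throughput figure is an assumed placeholder, not a benchmark):

        # Number of k-way SNP combinations and a naive runtime estimate.
        from math import comb

        m = 1_100_000                  # SNPs in the GWAS data set
        pairs = comb(m, 2)             # ~6.0e11 two-way tests
        triples = comb(m, 3)           # ~2.2e17 three-way tests

        tests_per_second = 1e9         # assumed aggregate machine throughput
        years = triples / tests_per_second / 3.15e7
        print(f"2-way: {pairs:.2e} tests; 3-way: {triples:.2e} tests "
              f"(~{years:.1f} years at {tests_per_second:.0e} tests/s)")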

  6. Manipulative therapy in addition to usual medical care accelerates recovery of shoulder complaints at higher costs: economic outcomes of a randomized trial

    Directory of Open Access Journals (Sweden)

    Bergman Gert JD

    2010-09-01

    Full Text Available Abstract Background Shoulder complaints are common in primary care and have an unfavourable long-term prognosis. Our objective was to evaluate the clinical effectiveness of manipulative therapy of the cervicothoracic spine and the adjacent ribs in addition to usual medical care (UMC) by the general practitioner in the treatment of shoulder complaints. Methods This economic evaluation was conducted alongside a randomized trial in primary care. Included were 150 patients with shoulder complaints and a dysfunction of the cervicothoracic spine and adjacent ribs. Patients were treated with UMC (NSAIDs, corticosteroid injection or referral to physical therapy) and were allocated at random (yes/no) to manipulative therapy (manipulation and mobilization). Patient-perceived recovery, severity of the main complaint, shoulder pain, disability and general health were outcome measures. Data about direct and indirect costs were collected by means of a cost diary. Results Manipulative therapy as an add-on to UMC accelerated recovery on all outcome measures included. At 26 weeks after randomization, both groups reported similar recovery rates (41% vs. 38%), but the between-group differences in improvement of severity of the main complaint, shoulder pain and disability persisted. Compared to the UMC group, the total costs were higher in the manipulative group (€1167 vs. €555). This is explained mainly by the costs of the manipulative therapy itself and the higher costs due to sick leave from work. The cost-effectiveness ratio showed that additional manipulative treatment is more costly but also more effective than UMC alone. The cost-effectiveness acceptability curve shows that a 50% probability of recovery with AMT within 6 months after initiation of treatment is achieved at €2876. Conclusion Manipulative therapy in addition to UMC accelerates recovery and is more effective than UMC alone in the long term, but is associated with higher costs. International Standard
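
    The two headline quantities of such an evaluation, the incremental cost-effectiveness ratio (ICER) and a point on the cost-effectiveness acceptability curve, can be sketched from replicates of the between-group cost and effect differences. The €612 mean cost difference below mirrors the €1167 vs. €555 figures above; the effect parameters are synthetic stand-ins chosen so the sketch reproduces a roughly 50% probability at the €2876 ceiling:

        # ICER and probability of cost-effectiveness at a ceiling ratio.
        import numpy as np

        rng = np.random.default_rng(3)
        d_cost = rng.normal(612.0, 150.0, 5000)    # cost difference, euros
        d_effect = rng.normal(0.213, 0.10, 5000)   # gain in recovery probability

        icer = d_cost.mean() / d_effect.mean()
        ceiling = 2876.0                           # willingness to pay per recovery
        p_cost_effective = np.mean(ceiling * d_effect - d_cost > 0)
        print(f"ICER: {icer:.0f} euros per recovery; "
              f"P(cost-effective at {ceiling:.0f} euros): {p_cost_effective:.2f}")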

  7. Low Computational-Cost Footprint Deformities Diagnosis Sensor through Angles, Dimensions Analysis and Image Processing Techniques

    Directory of Open Access Journals (Sweden)

    J. Rodolfo Maestre-Rendon

    2017-11-01

    Full Text Available Manual measurements of foot anthropometry can lead to errors, since this task depends on the experience of the specialist who performs them, resulting in different subjective measures from the same footprint. Moreover, some of the diagnoses that are given to classify a footprint deformity are based on a qualitative interpretation by the physician; there is no quantitative interpretation of the footprint. The importance of providing a correct and accurate diagnosis lies in the need to ensure that an appropriate treatment is provided for the improvement of the patient without risking his or her health. Therefore, this article presents a smart sensor that integrates the capture of the footprint, a low computational-cost analysis of the image and the interpretation of the results through a quantitative evaluation. The smart sensor required a camera (Logitech C920) connected to a Raspberry Pi 3, on which a graphical interface was built for the capture and processing of the image, and it was adapted to a podoscope of the kind conventionally used by specialists such as orthopedists, physiotherapists and podiatrists. The footprint diagnosis smart sensor (FPDSS) has proven to be robust to different types of deformity, precise, sensitive and correlated at 0.99 with measurements from the digitalized image of the ink mat.
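
    A low computational-cost segmentation of this kind can be sketched with standard tools; the file name and the reported descriptor below are placeholders, not the FPDSS metrics:

        # Threshold a podoscope image and measure the largest contour.
        import cv2

        img = cv2.imread("footprint.png", cv2.IMREAD_GRAYSCALE)  # placeholder file
        _, mask = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        foot = max(contours, key=cv2.contourArea)       # largest blob = footprint
        x, y, w, h = cv2.boundingRect(foot)
        print(f"area: {cv2.contourArea(foot):.0f} px^2, length/width: {h / w:.2f}")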

  8. Incentive motivation deficits in schizophrenia reflect effort computation impairments during cost-benefit decision-making.

    Science.gov (United States)

    Fervaha, Gagan; Graff-Guerrero, Ariel; Zakzanis, Konstantine K; Foussias, George; Agid, Ofer; Remington, Gary

    2013-11-01

    Motivational impairments are a core feature of schizophrenia and although there are numerous reports studying this feature using clinical rating scales, objective behavioural assessments are lacking. Here, we use a translational paradigm to measure incentive motivation in individuals with schizophrenia. Sixteen stable outpatients with schizophrenia and sixteen matched healthy controls completed a modified version of the Effort Expenditure for Rewards Task that accounts for differences in motoric ability. Briefly, subjects were presented with a series of trials where they may choose to expend a greater amount of effort for a larger monetary reward versus less effort for a smaller reward. Additionally, the probability of receiving money for a given trial was varied at 12%, 50% and 88%. Clinical and other reward-related variables were also evaluated. Patients opted to expend greater effort significantly less than controls for trials of high, but uncertain (i.e. 50% and 88% probability) incentive value, which was related to amotivation and neurocognitive deficits. Other abnormalities were also noted but were related to different clinical variables such as impulsivity (low reward and 12% probability). These motivational deficits were not due to group differences in reward learning, reward valuation or hedonic capacity. Our findings offer novel support for incentive motivation deficits in schizophrenia. Clinical amotivation is associated with impairments in the computation of effort during cost-benefit decision-making. This objective translational paradigm may guide future investigations of the neural circuitry underlying these motivational impairments. Copyright © 2013 Elsevier Ltd. All rights reserved.

  9. A Low-Cost Computer-Controlled Arduino-Based Educational Laboratory System for Teaching the Fundamentals of Photovoltaic Cells

    Science.gov (United States)

    Zachariadou, K.; Yiasemides, K.; Trougkakos, N.

    2012-01-01

    We present a low-cost, fully computer-controlled, Arduino-based, educational laboratory (SolarInsight) to be used in undergraduate university courses concerned with electrical engineering and physics. The major goal of the system is to provide students with the necessary instrumentation, software tools and methodology in order to learn fundamental…

  10. Estimating the cost of referral and willingness to pay for referral to higher-level health facilities: a case series study from an integrated community case management programme in Uganda.

    Science.gov (United States)

    Nanyonjo, Agnes; Bagorogoza, Benson; Kasteng, Frida; Ayebale, Godfrey; Makumbi, Fredrick; Tomson, Göran; Källander, Karin

    2015-08-28

    Integrated community case management (iCCM) relies on community health workers (CHWs) managing children with malaria, pneumonia and diarrhoea, and referring children when management is not possible. This study sought to establish the cost per sick child referred to seek care from a higher-level health facility by a CHW and to estimate caregivers' willingness to pay (WTP) for referral. Caregivers of 203 randomly selected children referred to higher-level health facilities by CHWs were interviewed in four Midwestern Uganda districts. Questionnaires and document reviews were used to capture direct, indirect and opportunity costs incurred by caregivers, CHWs and health facilities managing referred children. WTP for referral was assessed through the 'bidding game' approach followed by an open-ended question on maximum WTP. Descriptive analysis was conducted for factors associated with referral completion and WTP using logistic and linear regression methods, respectively. The cost per case referred to higher-level health facilities was computed from a societal perspective. Reasons for referral included having fever with a negative malaria test (46.8%), danger signs (29.6%) and drug shortage (37.4%). Among the referred, less than half completed referral (45.8%). Referral completion was 2.8 times higher among children with danger signs (p = 0.004) relative to those without danger signs, and 0.27 times lower among children who received pre-referral treatment (p …). The average cost per case referred was US$4.89, and US$7.35 per case completing referral. For each unit cost per case referred, caregiver out-of-pocket expenditure contributed 33.7%, caregivers' and CHWs' opportunity costs contributed 29.2% and 5.1%, respectively, and health facility costs contributed 39.6%. The mean (SD) out-of-pocket expenditure was US$1.65 (3.25). The mean WTP for referral was US$8.25 (14.70) and was positively associated with having received pre-referral treatment, completing referral and increasing

  11. On the computation of the higher-order statistics of the channel capacity over generalized fading channels

    KAUST Repository

    Yilmaz, Ferkan

    2012-12-01

    The higher-order statistics (HOS) of the channel capacity, μ_n = E[log^n(1 + γ_end)], where n ∈ N denotes the order of the statistics, has received relatively little attention in the literature, due in part to the intractability of its analysis. In this letter, we propose a novel and unified analysis, which is based on the moment generating function (MGF) technique, to exactly compute the HOS of the channel capacity. More precisely, our mathematical formalism can be readily applied to maximal-ratio-combining (MRC) receivers operating in generalized fading environments. The mathematical formalism is illustrated by some numerical examples focusing on the correlated generalized fading environments. © 2012 IEEE.

  12. On the computation of the higher-order statistics of the channel capacity over generalized fading channels

    KAUST Repository

    Yilmaz, Ferkan; Alouini, Mohamed-Slim

    2012-01-01

    The higher-order statistics (HOS) of the channel capacity, μ_n = E[log^n(1 + γ_end)], where n ∈ N denotes the order of the statistics, has received relatively little attention in the literature, due in part to the intractability of its analysis. In this letter, we propose a novel and unified analysis, which is based on the moment generating function (MGF) technique, to exactly compute the HOS of the channel capacity. More precisely, our mathematical formalism can be readily applied to maximal-ratio-combining (MRC) receivers operating in generalized fading environments. The mathematical formalism is illustrated by some numerical examples focusing on the correlated generalized fading environments. © 2012 IEEE.
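
    For intuition, μ_n = E[log^n(1 + γ_end)] is easy to cross-check by Monte Carlo in a simple special case. The sketch below assumes Rayleigh fading, i.e. an exponentially distributed end-to-end SNR, whereas the letters obtain such values analytically through the MGF-based formalism:

        # Monte Carlo estimate of the first three higher-order capacity
        # statistics for an exponentially distributed SNR (Rayleigh fading).
        import numpy as np

        rng = np.random.default_rng(4)
        mean_snr = 10.0                               # average SNR, linear scale
        gamma_end = rng.exponential(mean_snr, size=1_000_000)

        for n in (1, 2, 3):
            mu_n = np.mean(np.log1p(gamma_end) ** n)  # in nats^n
            print(f"mu_{n} ~= {mu_n:.4f}")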

  13. On the computation of the higher order statistics of the channel capacity for amplify-and-forward multihop transmission

    KAUST Repository

    Yilmaz, Ferkan; Tabassum, Hina; Alouini, Mohamed-Slim

    2014-01-01

    Higher order statistics (HOS) of the channel capacity provide useful information regarding the level of reliability of signal transmission at a particular rate. In this paper, we propose a novel and unified analysis, which is based on the moment-generating function (MGF) approach, to efficiently and accurately compute the HOS of the channel capacity for amplify-and-forward (AF) multihop transmission over generalized fading channels. More precisely, our easy-to-use and tractable mathematical formalism requires only the reciprocal MGFs of the transmission hop signal-to-noise ratio (SNR). Numerical and simulation results, which are performed to exemplify the usefulness of the proposed MGF-based analysis, are shown to be in perfect agreement. © 2013 IEEE.

  14. Strength and Reliability of Wood for the Components of Low-cost Wind Turbines: Computational and Experimental Analysis and Applications

    DEFF Research Database (Denmark)

    Mishnaevsky, Leon; Freere, Peter; Sharma, Ranjan

    2009-01-01

    This paper reports the latest results of the comprehensive program of experimental and computational analysis of strength and reliability of wooden parts of low cost wind turbines. The possibilities of prediction of strength and reliability of different types of wood are studied in the series of experiments and computational investigations. Low cost testing machines have been designed, and employed for the systematic analysis of different sorts of Nepali wood, to be used for the wind turbine construction. At the same time, computational micromechanical models of deformation and strength of wood are developed, which should provide the basis for microstructure-based correlating of observable and service properties of wood. Some correlations between microstructure, strength and service properties of wood have been established.

  15. Cost-effectiveness modeling of colorectal cancer: Computed tomography colonography vs colonoscopy or fecal occult blood tests

    International Nuclear Information System (INIS)

    Lucidarme, Olivier; Cadi, Mehdi; Berger, Genevieve; Taieb, Julien; Poynard, Thierry; Grenier, Philippe; Beresniak, Ariel

    2012-01-01

    Objectives: To assess the cost-effectiveness of three colorectal-cancer (CRC) screening strategies in France: fecal-occult-blood tests (FOBT), computed-tomography-colonography (CTC) and optical-colonoscopy (OC). Methods: Ten-year simulation modeling was used to assess a virtual asymptomatic, average-risk population 50–74 years old. Negative OC was repeated 10 years later, and OC positive for advanced or non-advanced adenoma 3 or 5 years later, respectively. FOBT was repeated biennially. Negative CTC was repeated 5 years later. Positive CTC and FOBT led to triennial OC. Total cost and CRC rate after 10 years for each screening strategy and 0–100% adherence rates with 10% increments were computed. Transition probabilities were programmed using distribution ranges to account for uncertainty parameters. Direct medical costs were estimated using the French national health insurance prices. Probabilistic sensitivity analyses used 5000 Monte Carlo simulations generating model outcomes and standard deviations. Results: For a given adherence rate, CTC screening was always the most effective but not the most cost-effective. FOBT was the least effective but most cost-effective strategy. OC was of intermediate efficacy and the least cost-effective strategy. Without screening, treatment of 123 CRC per 10,000 individuals would cost €3,444,000. For 60% adherence, the respective costs of preventing and treating, respectively 49 and 74 FOBT-detected, 73 and 50 CTC-detected and 63 and 60 OC-detected CRC would be €2,810,000, €6,450,000 and €9,340,000. Conclusion: Simulation modeling helped to identify what would be the most effective (CTC) and cost-effective screening (FOBT) strategy in the setting of mass CRC screening in France.

  16. Resource utilization and costs during the initial years of lung cancer screening with computed tomography in Canada.

    Science.gov (United States)

    Cressman, Sonya; Lam, Stephen; Tammemagi, Martin C; Evans, William K; Leighl, Natasha B; Regier, Dean A; Bolbocean, Corneliu; Shepherd, Frances A; Tsao, Ming-Sound; Manos, Daria; Liu, Geoffrey; Atkar-Khattra, Sukhinder; Cromwell, Ian; Johnston, Michael R; Mayo, John R; McWilliams, Annette; Couture, Christian; English, John C; Goffin, John; Hwang, David M; Puksa, Serge; Roberts, Heidi; Tremblay, Alain; MacEachern, Paul; Burrowes, Paul; Bhatia, Rick; Finley, Richard J; Goss, Glenwood D; Nicholas, Garth; Seely, Jean M; Sekhon, Harmanjatinder S; Yee, John; Amjadi, Kayvan; Cutz, Jean-Claude; Ionescu, Diana N; Yasufuku, Kazuhiro; Martel, Simon; Soghrati, Kamyar; Sin, Don D; Tan, Wan C; Urbanski, Stefan; Xu, Zhaolin; Peacock, Stuart J

    2014-10-01

    It is estimated that millions of North Americans would qualify for lung cancer screening and that billions of dollars of national health expenditures would be required to support population-based computed tomography lung cancer screening programs. The decision to implement such programs should be informed by data on resource utilization and costs. Resource utilization data were collected prospectively from 2059 participants in the Pan-Canadian Early Detection of Lung Cancer Study using low-dose computed tomography (LDCT). Participants who had 2% or greater lung cancer risk over 3 years using a risk prediction tool were recruited from seven major cities across Canada. A cost analysis was conducted from the Canadian public payer's perspective for resources that were used for the screening and treatment of lung cancer in the initial years of the study. The average per-person cost for screening individuals with LDCT was $453 (95% confidence interval [CI], $400-$505) for the initial 18-months of screening following a baseline scan. The screening costs were highly dependent on the detected lung nodule size, presence of cancer, screening intervention, and the screening center. The mean per-person cost of treating lung cancer with curative surgery was $33,344 (95% CI, $31,553-$34,935) over 2 years. This was lower than the cost of treating advanced-stage lung cancer with chemotherapy, radiotherapy, or supportive care alone, ($47,792; 95% CI, $43,254-$52,200; p = 0.061). In the Pan-Canadian study, the average cost to screen individuals with a high risk for developing lung cancer using LDCT and the average initial cost of curative intent treatment were lower than the average per-person cost of treating advanced stage lung cancer which infrequently results in a cure.

  17. The Cost-Effectiveness of Dual Mobility Implants for Primary Total Hip Arthroplasty: A Computer-Based Cost-Utility Model.

    Science.gov (United States)

    Barlow, Brian T; McLawhorn, Alexander S; Westrich, Geoffrey H

    2017-05-03

    Dislocation remains a clinically important problem following primary total hip arthroplasty, and it is a common reason for revision total hip arthroplasty. Dual mobility (DM) implants decrease the risk of dislocation but can be more expensive than conventional implants and have idiosyncratic failure mechanisms. The purpose of this study was to investigate the cost-effectiveness of DM implants compared with conventional bearings for primary total hip arthroplasty. Markov model analysis was conducted from the societal perspective with use of direct and indirect costs. Costs, expressed in 2013 U.S. dollars, were derived from the literature, the National Inpatient Sample, and the Centers for Medicare & Medicaid Services. Effectiveness was expressed in quality-adjusted life years (QALYs). The model was populated with health state utilities and state transition probabilities derived from previously published literature. The analysis was performed for a patient's lifetime, and costs and effectiveness were discounted at 3% annually. The principal outcome was the incremental cost-effectiveness ratio (ICER), with a willingness-to-pay threshold of $100,000/QALY. Sensitivity analyses were performed to explore relevant uncertainty. In the base case, DM total hip arthroplasty showed absolute dominance over conventional total hip arthroplasty, with lower accrued costs ($39,008 versus $40,031 U.S. dollars) and higher accrued utility (13.18 versus 13.13 QALYs) indicating cost-savings. DM total hip arthroplasty ceased being cost-saving when its implant costs exceeded those of conventional total hip arthroplasty by $1,023, and the cost-effectiveness threshold for DM implants was $5,287 greater than that for conventional implants. DM was not cost-effective when the annualized incremental probability of revision from any unforeseen failure mechanism or mechanisms exceeded 0.29%. The probability of intraprosthetic dislocation exerted the most influence on model results. This model
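
    The mechanics of such a model are compact: cycle annually, discount costs and QALYs at 3%, and compare the incremental result to the $100,000/QALY threshold. A minimal two-state sketch whose transition probabilities, costs and utilities are invented placeholders, not the study's inputs:

        # Two-strategy Markov cost-utility sketch with 3% annual discounting.
        def run_markov(p_revision: float, implant_cost: float, years: int = 30):
            well, revised = 1.0, 0.0
            cost, qaly = implant_cost, 0.0
            for t in range(years):
                d = 1.03 ** -t                              # discount factor
                new_revisions = well * p_revision
                cost += d * new_revisions * 25_000.0        # assumed revision cost
                qaly += d * (well * 0.85 + revised * 0.75)  # assumed utilities
                well -= new_revisions
                revised += new_revisions
            return cost, qaly

        c_dm, q_dm = run_markov(p_revision=0.005, implant_cost=6_000.0)  # dual mobility
        c_cv, q_cv = run_markov(p_revision=0.010, implant_cost=5_000.0)  # conventional
        # Lower cost with higher QALYs means dominance; otherwise compare the
        # ICER (cost difference per QALY gained) to the $100,000/QALY threshold.
        print(f"DM: ${c_dm:,.0f} / {q_dm:.2f} QALYs; "
              f"conventional: ${c_cv:,.0f} / {q_cv:.2f} QALYs")
        print(f"ICER: ${(c_dm - c_cv) / (q_dm - q_cv):,.0f}/QALY")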

  18. Workforce Investments: State Strategies to Preserve Higher-Cost Career Education Programs in Community and Technical Colleges

    Science.gov (United States)

    Shulock, Nancy; Lewis, Jodi; Tan, Connie

    2013-01-01

    In today's highly-skilled economy, rewarding career pathways are available to those who acquire technical skills by enrolling in certificate and associate degree programs in a community or technical college. Such programs are often more costly to offer than liberal arts and sciences programs that prepare students to transfer to four-year…

  19. Do fragmented landholdings have higher production costs? Evidence from rice farmers in Northeastern Jiangxi province, P.R. China

    NARCIS (Netherlands)

    Tan, S.; Heerink, N.; Kruseman, G.; Qu, F.

    2008-01-01

    Land fragmentation is generally seen as an obstacle to agricultural productivity improvements, but it can also facilitate labor smoothing and risk diversification. In this paper we examine the impact of land fragmentation on the variable production costs of rice farmers in three villages in Jiangxi

  20. Numerically stable, scalable formulas for parallel and online computation of higher-order multivariate central moments with arbitrary weights

    Energy Technology Data Exchange (ETDEWEB)

    Pebay, Philippe [Sandia National Laboratories (SNL-CA), Livermore, CA (United States); Terriberry, Timothy B. [Xiph.Org Foundation, Arlington, VA (United States); Kolla, Hemanth [Sandia National Laboratories (SNL-CA), Livermore, CA (United States); Bennett, Janine [Sandia National Laboratories (SNL-CA), Livermore, CA (United States)

    2016-03-29

    Formulas for incremental or parallel computation of second order central moments have long been known, and recent extensions of these formulas to univariate and multivariate moments of arbitrary order have been developed. Formulas such as these are of key importance in scenarios where incremental results are required and in parallel and distributed systems where communication costs are high. We survey these recent results and improve them with arbitrary-order, numerically stable one-pass formulas, which we further extend with weighted and compound variants. We also develop a generalized correction factor for standard two-pass algorithms that enables the maintenance of accuracy over nearly the full representable range of the input, avoiding the need for extended-precision arithmetic. We then empirically examine algorithm correctness for pairwise update formulas up to order four as well as condition number and relative error bounds for eight different central moment formulas, each up to degree six, to address the trade-offs between numerical accuracy and speed of the various algorithms. Finally, we demonstrate the use of the most elaborate among the above mentioned formulas, with the utilization of the compound moments for a practical large-scale scientific application.
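
    The pairwise-update form that makes such formulas parallel- and distribution-friendly can be sketched for weighted central moments up to third order; the coefficients below follow the standard pairwise-combination identities that the paper generalizes to arbitrary order and compound variants:

        # Weighted one-pass central moments with a pairwise combine step.
        from dataclasses import dataclass

        @dataclass
        class Moments:
            w: float = 0.0     # total weight
            mean: float = 0.0
            m2: float = 0.0    # sum of w_i * (x_i - mean)**2
            m3: float = 0.0    # sum of w_i * (x_i - mean)**3

        def combine(a: Moments, b: Moments) -> Moments:
            w = a.w + b.w
            if w == 0.0:
                return Moments()
            d = b.mean - a.mean
            mean = a.mean + d * b.w / w
            m2 = a.m2 + b.m2 + d * d * a.w * b.w / w
            m3 = (a.m3 + b.m3
                  + d ** 3 * a.w * b.w * (a.w - b.w) / w ** 2
                  + 3.0 * d * (a.w * b.m2 - b.w * a.m2) / w)
            return Moments(w, mean, m2, m3)

        def push(agg: Moments, x: float, weight: float = 1.0) -> Moments:
            # a single observation is just a tiny aggregate
            return combine(agg, Moments(weight, x, 0.0, 0.0))

    Because partial aggregates computed on different nodes can be merged with combine in any grouping, communication stays cheap in parallel and distributed settings, which is exactly the scenario the paper targets.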

  1. Computational Stimulation of the Basal Ganglia Neurons with Cost Effective Delayed Gaussian Waveforms.

    Science.gov (United States)

    Daneshzand, Mohammad; Faezipour, Miad; Barkana, Buket D

    2017-01-01

    Deep brain stimulation (DBS) has compelling results in the desynchronization of the basal ganglia neuronal activities and thus is used in treating the motor symptoms of Parkinson's disease (PD). Accurate definition of DBS waveform parameters could avert tissue or electrode damage, increase the neuronal activity and reduce energy cost, which will prolong the battery life, hence avoiding device replacement surgeries. This study considers the use of a charge-balanced Gaussian waveform pattern as a method to disrupt the firing patterns of neuronal cell activity. A computational model was created to simulate ganglia cells and their interactions with thalamic neurons. From the model, we investigated the effects of modified DBS pulse shapes and proposed a delay period between the cathodic and anodic parts of the charge-balanced Gaussian waveform to desynchronize the firing patterns of the GPe and GPi cells. The results of the proposed Gaussian waveform with delay outperformed those of rectangular DBS waveforms used in in-vivo experiments. The Gaussian Delay Gaussian (GDG) waveforms achieved a lower number of misses in eliciting action potentials while having a lower amplitude and a shorter delay compared to numerous different pulse shapes. The amount of energy consumed in the basal ganglia network due to GDG waveforms was dropped by 22% in comparison with charge-balanced Gaussian waveforms without any delay between the cathodic and anodic parts, and was also 60% lower than for a rectangular charge-balanced pulse with a delay between the cathodic and anodic parts of the waveform. Furthermore, by defining a Synchronization Level metric, we observed that the GDG waveform was able to reduce the synchronization of GPi neurons more effectively than any other waveform. The promising results of GDG waveforms in terms of eliciting action potentials, desynchronization of the basal ganglia neurons and reduction of energy consumption can potentially enhance the performance of DBS

  2. Computational Stimulation of the Basal Ganglia Neurons with Cost Effective Delayed Gaussian Waveforms

    Directory of Open Access Journals (Sweden)

    Mohammad Daneshzand

    2017-08-01

    Full Text Available Deep brain stimulation (DBS) has compelling results in the desynchronization of the basal ganglia neuronal activities and thus is used in treating the motor symptoms of Parkinson's disease (PD). Accurate definition of DBS waveform parameters could avert tissue or electrode damage, increase the neuronal activity and reduce energy cost, which will prolong the battery life, hence avoiding device replacement surgeries. This study considers the use of a charge-balanced Gaussian waveform pattern as a method to disrupt the firing patterns of neuronal cell activity. A computational model was created to simulate ganglia cells and their interactions with thalamic neurons. From the model, we investigated the effects of modified DBS pulse shapes and proposed a delay period between the cathodic and anodic parts of the charge-balanced Gaussian waveform to desynchronize the firing patterns of the GPe and GPi cells. The results of the proposed Gaussian waveform with delay outperformed those of rectangular DBS waveforms used in in-vivo experiments. The Gaussian Delay Gaussian (GDG) waveforms achieved a lower number of misses in eliciting action potentials while having a lower amplitude and a shorter delay compared to numerous different pulse shapes. The amount of energy consumed in the basal ganglia network due to GDG waveforms was dropped by 22% in comparison with charge-balanced Gaussian waveforms without any delay between the cathodic and anodic parts, and was also 60% lower than for a rectangular charge-balanced pulse with a delay between the cathodic and anodic parts of the waveform. Furthermore, by defining a Synchronization Level metric, we observed that the GDG waveform was able to reduce the synchronization of GPi neurons more effectively than any other waveform. The promising results of GDG waveforms in terms of eliciting action potentials, desynchronization of the basal ganglia neurons and reduction of energy consumption can potentially enhance the
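
    The waveform family itself is straightforward to construct: a cathodic Gaussian phase, an inter-phase delay, then an anodic Gaussian phase of equal area, so the net injected charge is zero. A sketch with illustrative amplitudes, widths and delay (not the paper's fitted values):

        # Charge-balanced Gaussian-Delay-Gaussian (GDG) stimulus sketch.
        import numpy as np

        def gdg_waveform(amp_ua=100.0, sigma_us=60.0, delay_us=500.0, fs_hz=1e6):
            half = 4.0 * sigma_us                       # truncate at 4 sigma
            t = np.arange(-half, half, 1e6 / fs_hz)     # microseconds
            phase = amp_ua * np.exp(-0.5 * (t / sigma_us) ** 2)
            gap = np.zeros(int(delay_us * fs_hz / 1e6))
            # cathodic phase, inter-phase delay, equal-and-opposite anodic phase
            return np.concatenate((-phase, gap, phase))

        w = gdg_waveform()
        assert abs(w.sum()) < 1e-6 * np.abs(w).sum()    # net charge ~ zero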

  3. Does obesity along with major depression or anxiety lead to higher use of health care and costs? : A 6-year follow-up study

    NARCIS (Netherlands)

    Nigatu, Yeshambel T.; Bultmann, Ute; Schoevers, Robert A.; Penninx, Brenda W. J. H.; Reijneveld, Sijmen A.

    2017-01-01

    Background: Evidence is lacking on whether obesity along with major depression (MD)/anxiety leads to higher health care use (HCU) and health care-related costs (HCC) compared with either condition alone. The objective of the study was to examine the longitudinal associations of obesity, MD/anxiety, and

  4. Effects on costs of frontline diagnostic evaluation in patients suspected of angina: coronary computed tomography angiography vs. conventional ischaemia testing

    DEFF Research Database (Denmark)

    Nielsen, Lene H; Olsen, Jens; Markenvard, John

    2013-01-01

    AIMS: The aim of this study was to investigate in patients with stable angina the effects on costs of frontline diagnostics by exercise-stress testing (ex-test) vs. coronary computed tomography angiography (CTA). METHODS AND RESULTS: In two coronary units at Lillebaelt Hospital, Denmark, 498 patients were identified in whom either ex-test (n = 247) or CTA (n = 251) were applied as the frontline diagnostic strategy in symptomatic patients with a low-intermediate pre-test probability of coronary artery disease (CAD). During 12 months of follow-up, death, myocardial infarction and costs… The mean (SD) total costs per patient at the end of the follow-up were 14% lower in the CTA group than in the ex-test group, €1510 (3474) vs. €1777 (3746) (P = 0.03). CONCLUSION: Diagnostic assessment of symptomatic patients with a low-intermediate probability of CAD by CTA incurred lower costs…

  5. Performance and Cost-Effectiveness of Computed Tomography Lung Cancer Screening Scenarios in a Population-Based Setting: A Microsimulation Modeling Analysis in Ontario, Canada.

    Directory of Open Access Journals (Sweden)

    Kevin Ten Haaf

    2017-02-01

    Full Text Available The National Lung Screening Trial (NLST) results indicate that computed tomography (CT) lung cancer screening for current and former smokers with three annual screens can be cost-effective in a trial setting. However, the cost-effectiveness in a population-based setting with >3 screening rounds is uncertain. Therefore, the objective of this study was to estimate the cost-effectiveness of lung cancer screening in a population-based setting in Ontario, Canada, and evaluate the effects of screening eligibility criteria. This study used microsimulation modeling informed by various data sources, including the Ontario Health Insurance Plan (OHIP), Ontario Cancer Registry, smoking behavior surveys, and the NLST. Persons, born between 1940 and 1969, were examined from a third-party health care payer perspective across a lifetime horizon. Starting in 2015, 576 CT screening scenarios were examined, varying by age to start and end screening, smoking eligibility criteria, and screening interval. Among the examined outcome measures were lung cancer deaths averted, life-years gained, percentage ever screened, costs (in 2015 Canadian dollars), and overdiagnosis. The results of the base-case analysis indicated that annual screening was more cost-effective than biennial screening. Scenarios with eligibility criteria that required as few as 20 pack-years were dominated by scenarios that required higher numbers of accumulated pack-years. In general, scenarios that applied stringent smoking eligibility criteria (i.e., requiring higher levels of accumulated smoking exposure) were more cost-effective than scenarios with less stringent smoking eligibility criteria, with modest differences in life-years gained. Annual screening between ages 55-75 for persons who smoked ≥40 pack-years and who currently smoke or quit ≤10 y ago yielded an incremental cost-effectiveness ratio of $41,136 Canadian dollars ($33,825 in May 1, 2015, United States dollars) per life-year gained

  6. Meeting UK dietary recommendations is associated with higher estimated consumer food costs: an analysis using the National Diet and Nutrition Survey and consumer expenditure data, 2008-2012.

    Science.gov (United States)

    Jones, Nicholas Rv; Tong, Tammy Yn; Monsivais, Pablo

    2018-04-01

    To test whether diets achieving recommendations from the UK's Scientific Advisory Committee on Nutrition (SACN) were associated with higher monetary costs in a nationally representative sample of UK adults. A cross-sectional study linking 4 d diet diaries in the National Diet and Nutrition Survey (NDNS) to contemporaneous food price data from a market research firm. The monetary cost of diets was assessed in relation to whether or not they met eight food- and nutrient-based recommendations from SACN. Regression models adjusted for potential confounding factors. The primary outcome measure was individual dietary cost per day and per 2000 kcal (8368 kJ). UK. Adults (n = 2045) sampled between 2008 and 2012 in the NDNS. On an isoenergetic basis, diets that met the recommendations for fruit and vegetables, oily fish, non-milk extrinsic sugars, fat, saturated fat and salt were estimated to be between 3 and 17 % more expensive. Diets meeting the recommendation for red and processed meats were 4 % less expensive, while meeting the recommendation for fibre was cost-neutral. Meeting multiple targets was also associated with higher costs; on average, diets meeting six or more SACN recommendations were estimated to be 29 % more costly than isoenergetic diets that met no recommendations. Food costs may be a population-level barrier limiting the adoption of dietary recommendations in the UK. Future research should focus on identifying systems- and individual-level strategies to enable consumers achieve dietary recommendations without increasing food costs. Such strategies may improve the uptake of healthy eating in the population.

  7. Incremental cost of department-wide implementation of a picture archiving and communication system and computed radiography.

    Science.gov (United States)

    Pratt, H M; Langlotz, C P; Feingold, E R; Schwartz, J S; Kundel, H L

    1998-01-01

    To determine the incremental cash flows associated with department-wide implementation of a picture archiving and communication system (PACS) and computed radiography (CR) at a large academic medical center. The authors determined all capital and operational costs associated with PACS implementation during an 8-year time horizon. Economic effects were identified, adjusted for time value, and used to calculate net present values (NPVs) for each section of the department of radiology and for the department as a whole. The chest-bone section used the most resources. Changes in cost assumptions for the chest-bone section had a dominant effect on the department-wide NPV. The base-case NPV (i.e., that determined by using the initial assumptions) was negative, indicating that additional net costs are incurred by the radiology department from PACS implementation. PACS and CR provide cost savings only when a 12-year hardware life span is assumed, when CR equipment is removed from the analysis, or when digitized long-term archives are compressed at a rate of 10:1. Full PACS-CR implementation would not provide cost savings for a large, subspecialized department. However, institutions that are committed to CR implementation (for whom CR implementation would represent a sunk cost) or institutions that are able to archive images by using image compression will experience cost savings from PACS.
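
    The net-present-value calculation at the heart of this kind of analysis can be sketched in a few lines. The cash-flow figures, discount rate, and horizon below are invented for illustration and are not the study's data; only the shape of the computation is meant to match.

    ```python
    # Hypothetical incremental cash flows for a PACS rollout: a capital outlay in
    # year 0 followed by yearly operating savings over an 8-year horizon.
    def npv(cash_flows, rate):
        """Discount yearly cash flows (year 0 first) to present value."""
        return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

    flows = [-4_000_000] + [450_000] * 8          # illustrative numbers only
    print(f"NPV at 5%: {npv(flows, 0.05):,.0f}")  # negative => net cost, as in the base case
    ```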

  8. Interdisciplinary Approaches at Institutions of Higher Education: Teaching Information Systems Concepts to Students of Non-Computer Science Programs

    Directory of Open Access Journals (Sweden)

    Roland Schwald

    2011-07-01

    Full Text Available The aim of this paper is to present a curriculum development concept for teaching information systems content to students enrolled in non-computer science programs, drawing on examples from the Business Administration programs at Albstadt-Sigmaringen University, a state university located in Southern Germany. The main focus of this paper is therefore to describe this curriculum development concept. Since the concept involves two disciplines, i.e. business administration and computer science, the author argues that it is necessary to define the roles of one discipline for the other and gives an example of how this could be done. The paper acknowledges that the starting point for the development of a curriculum, such as one for a business administration program, will be the requirements of the potential employers of the graduates. The paper goes on to recommend assigning categorized skills and qualifications, such as knowledge, social, methodological, and decision-making skills, to the different parts of the curricula in question. After the mapping of skills to courses, the paper describes how specific information systems can be used in courses, especially those with a specific focus on methodological skills; two examples from Albstadt-Sigmaringen University are given. At the end of the paper the author explains the implications and limitations of such a concept, especially for programs that build on each other, as is the case for some Bachelor and Master programs. The paper concludes that although some elements of this concept are transferable, every institution of higher education still has to take its own situation into consideration when developing curricula concepts, and it provides recommendations on which issues each institution should resolve for itself.

  9. Selecting an Architecture for a Safety-Critical Distributed Computer System with Power, Weight and Cost Considerations

    Science.gov (United States)

    Torres-Pomales, Wilfredo

    2014-01-01

    This report presents an example of the application of multi-criteria decision analysis to the selection of an architecture for a safety-critical distributed computer system. The design problem includes constraints on minimum system availability and integrity, and the decision is based on the optimal balance of power, weight and cost. The analysis process includes the generation of alternative architectures, evaluation of individual decision criteria, and the selection of an alternative based on overall value. In the example presented here, iterative application of the quantitative evaluation process made it possible to deliberately generate an alternative architecture that is superior to all others regardless of the relative importance of cost.
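
    The weighted-sum scoring step of such an analysis can be illustrated with a toy sketch. The candidate architectures, attribute values, and weights below are invented, and the availability/integrity constraint screening is assumed to have already been done.

    ```python
    # Score each surviving candidate by a weighted sum of normalized criteria
    # (lower is better for power, weight, and cost) and pick the best overall value.
    candidates = {
        "arch_A": {"power_w": 120, "weight_kg": 9.0, "cost_k": 450},
        "arch_B": {"power_w": 150, "weight_kg": 7.5, "cost_k": 380},
        "arch_C": {"power_w": 100, "weight_kg": 11.0, "cost_k": 520},
    }
    weights = {"power_w": 0.3, "weight_kg": 0.3, "cost_k": 0.4}  # relative importance

    def score(attrs):
        # Normalize each criterion by the worst candidate so the terms are comparable.
        worst = {k: max(c[k] for c in candidates.values()) for k in weights}
        return sum(weights[k] * attrs[k] / worst[k] for k in weights)

    best = min(candidates, key=lambda name: score(candidates[name]))
    print(best, {name: round(score(a), 3) for name, a in candidates.items()})
    ```

    Sweeping the weights, as the abstract describes, would reveal whether one architecture stays best regardless of the relative importance of cost.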

  10. Higher Dairy Food Intake Is Associated With Higher Spine Quantitative Computed Tomography (QCT) Bone Measures in the Framingham Study for Men But Not Women

    NARCIS (Netherlands)

    Dongen, van Laura H.; Kiel, Douglas P.; Soedamah-Muthu, Sabita S.; Bouxsein, Mary L.; Hannan, Marian T.; Sahni, Shivani

    2018-01-01

    Previous studies found that dairy foods were associated with higher areal bone mineral density (BMD). However, data on bone geometry or compartment-specific bone density is lacking. In this cross-sectional study, the association of milk, yogurt, cheese, cream, milk+yogurt, and milk+yogurt+cheese

  11. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger at roughly 11 MB per event of RAW. The central collisions are more complex and...

  12. Implications of higher energy - summary of benefits, issues, commissioning cost, SEU, Cryo, QPS margins, Potential availability issues

    CERN Document Server

    Alemany, R

    2012-01-01

    The LHC is technically almost ready to run at 4 TeV per beam in 2012. Nevertheless, a review of the advantages and disadvantages of such an energy step should be carefully made before taking this decision. Therefore, this paper summarizes the benefits from the physics point of view and the potential issues, such as a possible increase in Single Event Errors and Unidentified Flying Objects, or a significant decrease of the quench margin from beam losses, which, all in all, could lead to availability issues compromising the integrated luminosity. Last but not least, the commissioning cost is addressed.

  13. The application of cloud computing to scientific workflows: a study of cost and performance.

    Science.gov (United States)

    Berriman, G Bruce; Deelman, Ewa; Juve, Gideon; Rynge, Mats; Vöckler, Jens-S

    2013-01-28

    The current model of transferring data from data centres to desktops for analysis will soon be rendered impractical by the accelerating growth in the volume of science datasets. Processing will instead often take place on high-performance servers co-located with data. Evaluations of how new technologies such as cloud computing would support such a new distributed computing model are urgently needed. Cloud computing is a new way of purchasing computing and storage resources on demand through virtualization technologies. We report here the results of investigations of the applicability of commercial cloud computing to scientific computing, with an emphasis on astronomy, including investigations of what types of applications can be run cheaply and efficiently on the cloud, and an example of an application well suited to the cloud: processing a large dataset to create a new science product.

  14. Thoracoabdominal computed tomography in trauma patients: a cost-consequences analysis

    NARCIS (Netherlands)

    Vugt, R. van; Kool, D.R.; Brink, M.; Dekker, H.M.; Deunk, J.; Edwards, M.J.R.

    2014-01-01

    BACKGROUND: CT is increasingly used during the initial evaluation of blunt trauma patients. In this era of increasing cost-awareness, the pros and cons of CT have to be assessed. OBJECTIVES: This study was performed to evaluate cost-consequences of different diagnostic algorithms that use

  15. Computer software to estimate timber harvesting system production, cost, and revenue

    Science.gov (United States)

    Dr. John E. Baumgras; Dr. Chris B. LeDoux

    1992-01-01

    Large variations in timber harvesting cost and revenue can result from the differences between harvesting systems, the variable attributes of harvesting sites and timber stands, or changing product markets. Consequently, system and site specific estimates of production rates and costs are required to improve estimates of harvesting revenue. This paper describes...

  16. Computation of piecewise affine terminal cost functions for model predictive control

    NARCIS (Netherlands)

    Brunner, F.D.; Lazar, M.; Allgöwer, F.; Fränzle, Martin; Lygeros, John

    2014-01-01

    This paper proposes a method for the construction of piecewise affine terminal cost functions for model predictive control (MPC). The terminal cost function is constructed on a predefined partition by solving a linear program for a given piecewise affine system, a stabilizing piecewise affine

  17. Is Higher Education Ready to Switch to Digital Course Materials? The Cost of Textbooks Is Driving Electronic Solutions

    Science.gov (United States)

    Nelson, Mark A.

    2008-01-01

    Each year one of the biggest debates in higher education seems to be: Is this the year that electronic textbooks take off? E-reader devices are getting better. The inventory of digital content is expanding. Business models are emerging to support the needs of students, faculty members, and publishers. People are getting more comfortable with new…

  18. Cost-effectiveness of computed tomography colonography in colorectal cancer screening: a systematic review.

    Science.gov (United States)

    Hanly, Paul; Skally, Mairead; Fenlon, Helen; Sharp, Linda

    2012-10-01

    The European Code Against Cancer recommends that individuals aged ≥50 participate in colorectal cancer screening. CT-colonography (CTC) is one of several screening tests available. We systematically reviewed evidence on, and identified key factors influencing, cost-effectiveness of CTC screening. PubMed, Medline, and the Cochrane library were searched for cost-effectiveness or cost-utility analyses of CTC-based screening, published in English, January 1999 to July 2010. Data were abstracted on setting, model type and horizon, screening scenario(s), comparator(s), participants, uptake, CTC performance and cost, effectiveness, ICERs, and whether extra-colonic findings and medical complications were considered. Sixteen studies were identified from the United States (n = 11), Canada (n = 2), and France, Italy, and the United Kingdom (1 each). Markov state-transition (n = 14) or microsimulation (n = 2) models were used. Eleven considered direct medical costs only; five included indirect costs. Fourteen compared CTC with no screening; fourteen compared CTC with colonoscopy-based screening; fewer compared CTC with sigmoidoscopy (8) or fecal tests (4). Outcomes assessed were life-years gained/saved (13), QALYs (2), or both (1). Three considered extra-colonic findings; seven considered complications. CTC appeared cost-effective versus no screening and, in general, flexible sigmoidoscopy and fecal occult blood testing. Results were mixed comparing CTC to colonoscopy. Parameters most influencing cost-effectiveness included: CTC costs, screening uptake, threshold for polyp referral, and extra-colonic findings. Evidence on cost-effectiveness of CTC screening is heterogeneous, due largely to between-study differences in comparators and parameter values. Future studies should: compare CTC with currently favored tests, especially fecal immunochemical tests; consider extra-colonic findings; and conduct comprehensive sensitivity analyses.

  19. A nearly-linear computational-cost scheme for the forward dynamics of an N-body pendulum

    Science.gov (United States)

    Chou, Jack C. K.

    1989-01-01

    The dynamic equations of motion of an n-body pendulum with spherical joints are derived to be a mixed system of differential and algebraic equations (DAEs). The DAEs are kept in implicit form to save arithmetic and preserve the sparsity of the system, and are solved by a robust implicit integration method. At each solution point, the predicted solution is corrected to its exact solution within a given tolerance using Newton's iterative method. For each iteration, a linear system of the form JΔx = E has to be solved. The computational cost of solving this linear system directly by LU factorization is O(n^3), and it can be reduced significantly by exploiting the structure of J. It is shown that by recognizing the recursive patterns and exploiting the sparsity of the system, the multiplicative and additive computational costs for solving JΔx = E are O(n) and O(n^2), respectively. The formulation and solution method for an n-body pendulum are presented, and the computational cost is shown to be nearly linearly proportional to the number of bodies.
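
    The corrector step described here can be sketched as Newton iteration with a sparse factorization in place of dense LU. The tridiagonal test problem below merely stands in for the pendulum's sparse Jacobian structure; it is not the paper's DAE formulation.

    ```python
    # Newton's method on F(x) = 0 where each iteration solves J(x) dx = -F(x);
    # a sparse LU avoids the dense O(n^3) cost when J is structured.
    import numpy as np
    from scipy.sparse import diags
    from scipy.sparse.linalg import splu

    def newton_sparse(F, J, x0, tol=1e-10, max_iter=20):
        x = x0.copy()
        for _ in range(max_iter):
            r = F(x)
            if np.linalg.norm(r) < tol:
                break
            x += splu(J(x).tocsc()).solve(-r)  # sparse solve exploits the structure
        return x

    n = 1000  # toy banded system standing in for the corrector equations
    A = diags([-1, 2, -1], [-1, 0, 1], shape=(n, n))
    x = newton_sparse(lambda x: A @ x - np.ones(n), lambda x: A, np.zeros(n))
    print(np.linalg.norm(A @ x - np.ones(n)))  # ~0 after convergence
    ```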

  20. Design and implementation of a reliable and cost-effective cloud computing infrastructure: the INFN Napoli experience

    International Nuclear Information System (INIS)

    Capone, V; Esposito, R; Pardi, S; Taurino, F; Tortone, G

    2012-01-01

    Over the last few years we have seen an increasing number of services and applications needed to manage and maintain cloud computing facilities. This is particularly true for computing in high energy physics, which often requires complex configurations and distributed infrastructures. In this scenario a cost effective rationalization and consolidation strategy is the key to success in terms of scalability and reliability. In this work we describe an IaaS (Infrastructure as a Service) cloud computing system, with high availability and redundancy features, which is currently in production at INFN-Naples and ATLAS Tier-2 data centre. The main goal we intended to achieve was a simplified method to manage our computing resources and deliver reliable user services, reusing existing hardware without incurring heavy costs. A combined usage of virtualization and clustering technologies allowed us to consolidate our services on a small number of physical machines, reducing electric power costs. As a result of our efforts we developed a complete solution for data and computing centres that can be easily replicated using commodity hardware. Our architecture consists of 2 main subsystems: a clustered storage solution, built on top of disk servers running GlusterFS file system, and a virtual machines execution environment. GlusterFS is a network file system able to perform parallel writes on multiple disk servers, providing this way live replication of data. High availability is also achieved via a network configuration using redundant switches and multiple paths between hypervisor hosts and disk servers. We also developed a set of management scripts to easily perform basic system administration tasks such as automatic deployment of new virtual machines, adaptive scheduling of virtual machines on hypervisor hosts, live migration and automated restart in case of hypervisor failures.

  2. A computational model for determining the minimal cost expansion alternatives in transmission systems planning

    International Nuclear Information System (INIS)

    Pinto, L.M.V.G.; Pereira, M.V.F.; Nunes, A.

    1989-01-01

    A computational model for determining an economical transmission expansion plan, based on decomposition techniques, is presented. The algorithm was applied to the Brazilian South System and was able to find an optimal solution with modest computational resources. Extensions of this methodology are being investigated: a probabilistic version and expansion under financial restrictions. (C.G.C.). 4 refs, 7 figs

  3. Paper Circuits: A Tangible, Low Threshold, Low Cost Entry to Computational Thinking

    Science.gov (United States)

    Lee, Victor R.; Recker, Mimi

    2018-01-01

    In this paper, we propose that paper circuitry provides a productive space for exploring aspects of computational thinking, an increasingly critical 21st-century skill for all students. We argue that the creation and operation of paper circuits involve learning about computational concepts such as rule-based constraints, operations, and defined…

  4. Intravenous thrombolysis of large vessel occlusions is associated with higher hospital costs than small vessel strokes: a rationale for developing stroke severity-based financial models.

    Science.gov (United States)

    Rai, Ansaar T; Evans, Kim; Riggs, Jack E; Hobbs, Gerald R

    2016-04-01

    Owing to their severity, large vessel occlusion (LVO) strokes may be associated with higher costs that are not reflected in current coding systems. This study aimed to determine whether intravenous thrombolysis costs are related to the presence or absence of LVO. Patients who had undergone intravenous thrombolysis over a 9-year period were divided into LVO and no LVO (nLVO) groups based on admission CT angiography. The primary outcome was hospital cost per admission. Secondary outcomes included admission duration, 90-day clinical outcome, and discharge destination. 119 patients (53%) had LVO and 104 (47%) had nLVO. Total mean±SD cost per LVO patient was $18,815±14,262 compared with $15,174±11,769 per nLVO patient (p=0.04). Hospital payments per admission were $17,338±13,947 and $15,594±16,437 for LVO and nLVO patients, respectively (p=0.4). A good outcome was seen in 33 LVO patients (27.7%) and in 69 nLVO patients (66.4%) (OR 0.2, 95% CI 0.1 to 0.3, p<0.001). Regression analysis after controlling for comorbidities showed the presence of LVO to be an independent predictor of higher total hospital costs. The presence or absence of LVO is associated with significant differences in hospital costs, outcomes, admission duration, and home discharge. These differences can be important when developing systems of care models for acute ischemic stroke.

  5. Computed tomographic colonography to screen for colorectal cancer, extracolonic cancer, and aortic aneurysm: model simulation with cost-effectiveness analysis.

    Science.gov (United States)

    Hassan, Cesare; Pickhardt, Perry J; Laghi, Andrea; Kim, Daniel H; Zullo, Angelo; Iafrate, Franco; Di Giulio, Lorenzo; Morini, Sergio

    2008-04-14

    In addition to detecting colorectal neoplasia, abdominal computed tomography (CT) with colonography technique (CTC) can also detect unsuspected extracolonic cancers and abdominal aortic aneurysms (AAA). The efficacy and cost-effectiveness of this combined abdominal CT screening strategy are unknown. A computerized Markov model was constructed to simulate the occurrence of colorectal neoplasia, extracolonic malignant neoplasm, and AAA in a hypothetical cohort of 100,000 subjects from the United States who were 50 years of age. Simulated screening with CTC, using a 6-mm polyp size threshold for reporting, was compared with a competing model of optical colonoscopy (OC), both without and with abdominal ultrasonography for AAA detection (OC-US strategy). In the simulated population, CTC was the dominant screening strategy, gaining an additional 1458 and 462 life-years compared with the OC and OC-US strategies and being less costly, with a savings of $266 and $449 per person, respectively. The additional gains for CTC were largely due to a decrease in AAA-related deaths, whereas the modeled benefit from extracolonic cancer downstaging was a relatively minor factor. At sensitivity analysis, OC-US became more cost-effective only when the CTC sensitivity for large polyps dropped to 61% or when broad variations of costs were simulated, such as an increase in CTC cost from $814 to $1300 or a decrease in OC cost from $1100 to $500. With the OC-US approach, suboptimal compliance had a strong negative influence on efficacy and cost-effectiveness. The estimated mortality from CT-induced cancer was less than estimated colonoscopy-related mortality (8 vs 22 deaths), both of which were minor compared with the positive benefit from screening. When detection of extracolonic findings such as AAA and extracolonic cancer are considered in addition to colorectal neoplasia in our model simulation, CT colonography is a dominant screening strategy (ie, more clinically effective and more cost-effective).
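
    The engine of such a simulation is a cohort vector advanced each cycle by a transition matrix. The states and annual transition probabilities below are invented placeholders, not the article's calibrated inputs; they only show the mechanics.

    ```python
    # Minimal Markov cohort model: 100,000 50-year-olds move yearly between
    # states; life-years are accumulated over the horizon.
    import numpy as np

    states = ["well", "colorectal_ca", "aaa", "dead"]
    P = np.array([            # rows: from-state, cols: to-state (per year)
        [0.986, 0.004, 0.002, 0.008],
        [0.000, 0.900, 0.000, 0.100],
        [0.000, 0.000, 0.950, 0.050],
        [0.000, 0.000, 0.000, 1.000],
    ])
    cohort = np.array([100_000.0, 0.0, 0.0, 0.0])

    life_years = 0.0
    for year in range(50):
        life_years += cohort[:3].sum()  # person-years lived outside the dead state
        cohort = cohort @ P
    print(f"total life-years over 50 cycles: {life_years:,.0f}")
    ```

    Comparing strategies then amounts to running the same cohort through matrices whose transition probabilities reflect each screening test's performance, and attaching costs to states and events.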

  6. Near DC eddy current measurement of aluminum multilayers using MR sensors and commodity low-cost computer technology

    Science.gov (United States)

    Perry, Alexander R.

    2002-06-01

    Low Frequency Eddy Current (EC) probes are capable of measurement from 5 MHz down to DC through the use of Magnetoresistive (MR) sensors. Choosing components with appropriate electrical specifications allows them to be matched to the power and impedance characteristics of standard computer connectors. This permits direct attachment of the probe to inexpensive computers, thereby eliminating external power supplies, amplifiers and modulators that have heretofore precluded very low system purchase prices. Such price reduction is key to increased market penetration in General Aviation maintenance and consequent reduction in recurring costs. This paper examines our computer software CANDETECT, which implements this approach and permits effective probe operation. Results are presented to show the intrinsic sensitivity of the software and demonstrate its practical performance when seeking cracks in the underside of a thick aluminum multilayer structure. The majority of the General Aviation light aircraft fleet uses rivets and screws to attach sheet aluminum skin to the airframe, resulting in similar multilayer lap joints.

  7. Minimizing total costs of forest roads with computer-aided design ...

    Indian Academy of Sciences (India)

    The method seeks minimum total road costs while conforming to design specifications, protecting environmental quality, and enhancing fish and wildlife habitat.

  8. A probabilistic approach to the computation of the levelized cost of electricity

    International Nuclear Information System (INIS)

    Geissmann, Thomas

    2017-01-01

    This paper sets forth a novel approach to calculate the levelized cost of electricity (LCOE) using a probabilistic model that accounts for endogenous input parameters. The approach is applied to the example of a nuclear and gas power project. Monte Carlo simulation results show that a correlation between input parameters has a significant effect on the model outcome. By controlling for endogeneity, a statistically significant difference in the mean LCOE estimate and a change in the order of input leverages is observed. Moreover, the paper discusses the role of discounting options and external costs in detail. In contrast to the gas power project, the economic viability of the nuclear project is considerably weaker. - Highlights: • First model of levelized cost of electricity accounting for uncertainty and endogeneities in input parameters. • Allowance for endogeneities significantly affects results. • Role of discounting options and external costs is discussed and modelled.
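
    The core of a probabilistic LCOE is straightforward: sample uncertain inputs (here with a correlation between two of them, standing in for the endogeneity the paper models) and divide discounted lifetime costs by discounted lifetime generation. All distributions and magnitudes below are illustrative, not the paper's calibration.

    ```python
    # Monte Carlo LCOE with correlated fuel and O&M draws.
    import numpy as np

    rng = np.random.default_rng(0)
    n, years, r = 10_000, 40, 0.05
    disc_sum = (1 / (1 + r) ** np.arange(1, years + 1)).sum()

    cov = [[0.04, 0.01], [0.01, 0.04]]               # correlated inputs
    fuel, om = rng.multivariate_normal([20.0, 15.0], cov, size=n).T  # $/MWh
    capex = rng.normal(5_000.0, 500.0, n)            # $/kW, spent at year 0
    gen = 8.0                                        # MWh per kW per year

    lcoe = capex / (gen * disc_sum) + fuel + om      # $/MWh
    print(f"mean LCOE: {lcoe.mean():.1f} $/MWh, 90% interval: "
          f"[{np.quantile(lcoe, 0.05):.1f}, {np.quantile(lcoe, 0.95):.1f}]")
    ```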

  9. Model implementation for dynamic computation of system cost for advanced life support

    Science.gov (United States)

    Levri, J. A.; Vaccari, D. A.

    2004-01-01

    Life support system designs for long-duration space missions have a multitude of requirements drivers, such as mission objectives, political considerations, cost, crew wellness, inherent mission attributes, as well as many other influences. Evaluation of requirements satisfaction can be difficult, particularly at an early stage of mission design. Because launch cost is a critical factor and relatively easy to quantify, it is a point of focus in early mission design. The method used to determine launch cost influences the accuracy of the estimate. This paper discusses the appropriateness of dynamic mission simulation in estimating the launch cost of a life support system. This paper also provides an abbreviated example of a dynamic simulation life support model and possible ways in which such a model might be utilized for design improvement.

  10. An estimation of the effect of 100% Compliance with Diabetes Treatment: Can we reduce cost of illness with higher compliance rates?

    Directory of Open Access Journals (Sweden)

    Guvenc Kockaya

    2011-01-01

    Full Text Available Introduction: The current study was designed to estimate the direct cost of noncompliance of diabetes patients to the US health system. Understanding these expenses can inform screening and education budget policy regarding expenditure levels that can be calculated to be cost-beneficial. Materials and Method: The study was conducted in three parts. First, a computer search of National Institutes of Health websites and professional society websites for organizations with members that treat diabetes, and a PubMed search, were performed to obtain the numbers required for calculations. Second, formulas were developed to estimate the risk of non-compliance and undiagnosed diabetes. Third, risk calculations were performed using the information obtained in part one and the formulas developed in part two. Results: Direct risk reductions for diabetes-related kidney disease, stroke, heart disease, and amputation were estimated for 100% compliance with diabetes treatment. The risk, case, and yearly cost reductions calculated for 100% compliance with diabetes treatment were 13.6%, 0.9 million, and US$9.3 billion, respectively. Conclusion: Society, insurers, policy makers and other stakeholders could invest up to these amounts in screening, education and prevention efforts in an effort to reduce these costly and traumatic sequelae of noncompliant diabetes patients.

  11. Computation of spot prices and congestion costs in large interconnected power systems

    International Nuclear Information System (INIS)

    Mukerji, R.; Jordan, G.A.; Clayton, R.; Haringa, G.E.

    1995-01-01

    Foremost among the new paradigms for the US utility industry is the "poolco" concept proposed by Prof. William W. Hogan of Harvard University. This concept uses a central pool or power exchange in which physical power is traded based on spot prices or market clearing prices. The rapid and accurate calculation of these "spot" prices and associated congestion costs for large interconnected power systems is the central tenet upon which the poolco concept is based. The market clearing price would be the same throughout the system if there were no system losses and transmission limitations did not exist. System losses cause small differences in market clearing prices, as the cost of supplying a MW at various load buses includes the cost of losses. Transmission limits may cause large differences in market clearing prices between regions, as low-cost generation is blocked by the transmission constraints from serving certain loads. In models currently in use in the electric power industry, spot price calculations range from "bubble diagram" contract-path models to full electrical representations such as GE-MAPS. The modeling aspects of the full electrical representation are included in the Appendix. The problem with the bubble diagram representation is that these models are liable to produce unacceptably large errors in the calculation of spot prices and congestion costs. The subtleties of the calculation of spot prices and congestion costs are illustrated in this paper.
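
    How a binding transmission limit separates regional prices can be seen in a deliberately tiny dispatch problem. The two-bus system, generator prices, and line limit below are invented; a production model such as GE-MAPS uses a full electrical representation, but the price mechanics are the same.

    ```python
    # Cheap generation at bus A ($10/MWh) serves bus B through a 50 MW line;
    # expensive local generation at B costs $40/MWh. The spot price at B is the
    # marginal cost of serving one more MW there.
    from scipy.optimize import linprog

    def dispatch_cost(load_b, line_limit=50.0):
        # minimize 10*gA + 40*gB  s.t.  gA + gB = load_b,  gA <= line_limit
        res = linprog(c=[10.0, 40.0],
                      A_ub=[[1.0, 0.0]], b_ub=[line_limit],
                      A_eq=[[1.0, 1.0]], b_eq=[load_b],
                      bounds=[(0, None), (0, None)])
        return res.fun

    base, bumped = dispatch_cost(80.0), dispatch_cost(81.0)
    print(f"spot price at B: ${bumped - base:.2f}/MWh")  # $40: the line is congested
    ```

    With the line uncongested, the marginal MW would come from bus A and both buses would clear near $10/MWh; congestion is what opens the gap between regions.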

  12. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  13. Noise Threshold and Resource Cost of Fault-Tolerant Quantum Computing with Majorana Fermions in Hybrid Systems.

    Science.gov (United States)

    Li, Ying

    2016-09-16

    Fault-tolerant quantum computing in systems composed of both Majorana fermions and topologically unprotected quantum systems, e.g., superconducting circuits or quantum dots, is studied in this Letter. Errors caused by topologically unprotected quantum systems need to be corrected with error-correction schemes, for instance, the surface code. We find that the error-correction performance of such a hybrid topological quantum computer is not superior to a normal quantum computer unless the topological charge of Majorana fermions is insusceptible to noise. If errors changing the topological charge are rare, the fault-tolerance threshold is much higher than the threshold of a normal quantum computer and a surface-code logical qubit could be encoded in only tens of topological qubits instead of about 1,000 normal qubits.

  14. Computer-assisted cognitive remediation therapy in schizophrenia: Durability of the effects and cost-utility analysis.

    Science.gov (United States)

    Garrido, Gemma; Penadés, Rafael; Barrios, Maite; Aragay, Núria; Ramos, Irene; Vallès, Vicenç; Faixa, Carlota; Vendrell, Josep M

    2017-08-01

    The durability of computer-assisted cognitive remediation (CACR) therapy over time and the cost-effectiveness of the treatment remain unclear. The aim of the current study is to investigate the effectiveness of CACR and to examine the use and cost of acute psychiatric admissions before and after CACR. Sixty-seven participants were initially recruited. For the follow-up study a total of 33 participants were enrolled, 20 in the CACR condition group and 13 in the active control condition group. All participants were assessed at baseline, post-therapy and 12 months post-therapy on neuropsychology, QoL and self-esteem measurements. The use and cost of acute psychiatric admissions were collected retrospectively at four assessment points: baseline, 12 months post-therapy, 24 months post-therapy, and 36 months post-therapy. The results indicated that treatment effectiveness persisted in the CACR group one year post-therapy on neuropsychological and well-being outcomes. The CACR group showed a clear decrease in the use of acute psychiatric admissions at 12, 24 and 36 months post-therapy, which lowered the global costs of acute psychiatric admissions at 12, 24 and 36 months post-therapy. The effects of CACR are durable over at least a 12-month period, and CACR may be helping to reduce health care costs for schizophrenia patients.

  15. Chest Computed Tomographic Image Screening for Cystic Lung Diseases in Patients with Spontaneous Pneumothorax Is Cost Effective.

    Science.gov (United States)

    Gupta, Nishant; Langenderfer, Dale; McCormack, Francis X; Schauer, Daniel P; Eckman, Mark H

    2017-01-01

    Patients without a known history of lung disease presenting with a spontaneous pneumothorax are generally diagnosed as having primary spontaneous pneumothorax. However, occult diffuse cystic lung diseases such as Birt-Hogg-Dubé syndrome (BHD), lymphangioleiomyomatosis (LAM), and pulmonary Langerhans cell histiocytosis (PLCH) can also first present with a spontaneous pneumothorax, and their early identification by high-resolution computed tomographic (HRCT) chest imaging has implications for subsequent management. The objective of our study was to evaluate the cost-effectiveness of HRCT chest imaging to facilitate early diagnosis of LAM, BHD, and PLCH. We constructed a Markov state-transition model to assess the cost-effectiveness of screening HRCT to facilitate early diagnosis of diffuse cystic lung diseases in patients presenting with an apparent primary spontaneous pneumothorax. Baseline data for prevalence of BHD, LAM, and PLCH and rates of recurrent pneumothoraces in each of these diseases were derived from the literature. Costs were extracted from 2014 Medicare data. We compared a strategy of HRCT screening followed by pleurodesis in patients with LAM, BHD, or PLCH versus conventional management with no HRCT screening. In our base-case analysis, screening for the presence of BHD, LAM, or PLCH in patients presenting with a spontaneous pneumothorax was cost effective, with a marginal cost-effectiveness ratio of $1,427 per quality-adjusted life-year gained. Sensitivity analysis showed that screening HRCT remained cost effective for a prevalence of diffuse cystic lung diseases as low as 0.01%. HRCT image screening for BHD, LAM, and PLCH in patients with apparent primary spontaneous pneumothorax is cost effective. Clinicians should consider performing a screening HRCT in patients presenting with apparent primary spontaneous pneumothorax.

  16. In the Clouds: The Implications of Cloud Computing for Higher Education Information Technology Governance and Decision Making

    Science.gov (United States)

    Dulaney, Malik H.

    2013-01-01

    Emerging technologies challenge the management of information technology in organizations. Paradigm changing technologies, such as cloud computing, have the ability to reverse the norms in organizational management, decision making, and information technology governance. This study explores the effects of cloud computing on information technology…

  17. Computed tomography versus intravenous urography in diagnosis of acute flank pain from urolithiasis: a randomized study comparing imaging costs and radiation dose

    International Nuclear Information System (INIS)

    Thomson, J.M.Z.; Maling, T.M.J.; Glocer, J.; Mark, S.; Abbott, C.

    2001-01-01

    The equivalent sensitivity of non-contrast computed tomography (NCCT) and intravenous urography (IVU) in the diagnosis of suspected ureteric colic has been established. Approximately 50% of patients with suspected ureteric colic do not have a nephro-urological cause for pain. Because many such patients require further imaging studies, NCCT may obviate the need for these studies and, in so doing, be more cost effective and involve less overall radiation exposure. The present study compares the total imaging cost and radiation dose of NCCT versus IVU in the diagnosis of acute flank pain. Two hundred and twenty-four patients (157 men; mean age 45 years; age range 19-79 years) with suspected renal colic were randomized either to NCCT or IVU. The number of additional diagnostic imaging studies, cost (IVU A$136; CTU A$173), radiation exposure and imaging times were compared. Of 119 (53%) patients with renal obstruction, 105 had no nephro-urological cause of pain. For 21 (20%) of these patients an alternative diagnosis was made at the initial imaging, 10 of which were significant. Of 118 IVU patients, 28 (24%) required 32 additional imaging tests to reach a diagnosis, whereas seven of 106 (6%) NCCT patients required seven additional imaging studies. The average total diagnostic imaging cost for the NCCT group was A$181.94 and A$175.46 for the IVU group (P = 0.43). Mean radiation dose to diagnosis was 5.00 mSv (NCCT) versus 3.50 mSv (IVU) (P < 0.001). Mean imaging time was 30 min (NCCT) versus 75 min (IVU) (P < 0.001). Diagnostic imaging costs were remarkably similar. Although NCCT involves a higher radiation dose than IVU, its advantages of faster diagnosis, the avoidance of additional diagnostic imaging tests and its ability to diagnose other causes make it the study of choice for acute flank pain at Christchurch Hospital.

  18. Computational Intelligence in Software Cost Estimation: Evolving Conditional Sets of Effort Value Ranges

    OpenAIRE

    Papatheocharous, Efi; Andreou, Andreas S.

    2008-01-01

    In this approach we aimed at addressing the problem of large variances found in available historical data that are used in software cost estimation. Project data is expensive to collect, manage and maintain. Therefore, if we wish to lower the dependence of the estimation to

  19. Robotic lower limb prosthesis design through simultaneous computer optimizations of human and prosthesis costs

    Science.gov (United States)

    Handford, Matthew L.; Srinivasan, Manoj

    2016-02-01

    Robotic lower limb prostheses can improve the quality of life for amputees. Development of such devices, currently dominated by long prototyping periods, could be sped up by predictive simulations. In contrast to some amputee simulations which track experimentally determined non-amputee walking kinematics, here, we explicitly model the human-prosthesis interaction to produce a prediction of the user’s walking kinematics. We obtain simulations of an amputee using an ankle-foot prosthesis by simultaneously optimizing human movements and prosthesis actuation, minimizing a weighted sum of human metabolic and prosthesis costs. The resulting Pareto optimal solutions predict that increasing prosthesis energy cost, decreasing prosthesis mass, and allowing asymmetric gaits all decrease human metabolic rate for a given speed and alter human kinematics. The metabolic rates increase monotonically with speed. Remarkably, by performing an analogous optimization for a non-amputee human, we predict that an amputee walking with an appropriately optimized robotic prosthesis can have a lower metabolic cost - even lower than assuming that the non-amputee’s ankle torques are cost-free.
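
    The weighted-sum construction of a Pareto front can be sketched independently of the gait model: sweep the weight between the two cost terms and keep each minimizer. The candidate "designs" below are a random toy set, not simulated prostheses.

    ```python
    # Sweep w in (0,1) over the objective w*metabolic + (1-w)*prosthesis cost.
    import numpy as np

    rng = np.random.default_rng(1)
    designs = rng.uniform(0.0, 1.0, size=(200, 2))  # columns: metabolic, prosthesis

    front = set()
    for w in np.linspace(0.05, 0.95, 19):
        scores = w * designs[:, 0] + (1 - w) * designs[:, 1]
        front.add(tuple(designs[np.argmin(scores)]))

    for metab, pros in sorted(front):
        print(f"metabolic {metab:.3f}  prosthesis {pros:.3f}")  # the trade-off curve
    ```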

  20. Quantitative evaluation of low-cost frame-grabber boards for personal computers.

    Science.gov (United States)

    Kofler, J M; Gray, J E; Fuelberth, J T; Taubel, J P

    1995-11-01

    Nine moderately priced frame-grabber boards for both Macintosh (Apple Computers, Cupertino, CA) and IBM-compatible computers were evaluated using a Society of Motion Picture and Television Engineers (SMPTE) pattern and a video signal generator for dynamic range, gray-scale reproducibility, and spatial integrity of the captured image. The degradation of the video information ranged from minor to severe. Some boards are of reasonable quality for applications in diagnostic imaging and education. However, price and quality are not necessarily directly related.

  1. Cost-effectiveness of computed tomographic colonography screening for colorectal cancer in the medicare population

    NARCIS (Netherlands)

    A.B. Knudsen (Amy); I. Lansdorp-Vogelaar (Iris); C.M. Rutter (Carolyn); J.E. Savarino (James); M. van Ballegooijen (Marjolein); K.M. Kuntz (Karen); A. Zauber (Ann)

    2010-01-01

    textabstractBackground The Centers for Medicare and Medicaid Services (CMS) considered whether to reimburse computed tomographic colonography (CTC) for colorectal cancer screening of Medicare enrollees. To help inform its decision, we evaluated the reimbursement rate at which CTC screening could be

  2. Cost analysis of hash collisions : will quantum computers make SHARCS obsolete?

    NARCIS (Netherlands)

    Bernstein, D.J.

    2009-01-01

    Current proposals for special-purpose factorization hardware will become obsolete if large quantum computers are built: the number-field sieve scales much more poorly than Shor's quantum algorithm for factorization. Will all special-purpose cryptanalytic hardware become obsolete in a post-quantum
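
    The style of price-performance accounting behind the title's question can be made concrete with coarse asymptotics: for an n-bit hash, parallel classical collision search (van Oorschot-Wiener) on M machines runs in about 2^(n/2)/M steps, so its hardware-time product stays near 2^(n/2), while the quantum Brassard-Hoyer-Tapp algorithm needs roughly 2^(n/3) time and 2^(n/3) hardware. These are textbook approximations, not the paper's detailed accounting.

    ```python
    # Compare hardware x time cost exponents for finding a collision in an n-bit hash.
    from math import log2

    def classical_cost(n, machines):
        time = 2 ** (n / 2) / machines
        return machines * time              # product is independent of M

    def quantum_bht_cost(n):
        return 2 ** (n / 3) * 2 ** (n / 3)  # time x hardware

    n = 256
    print(f"classical: 2^{log2(classical_cost(n, 2 ** 64)):.0f}")  # 2^128
    print(f"quantum:   2^{log2(quantum_bht_cost(n)):.0f}")         # ~2^171
    ```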

  3. Low cost SCR lamp driver indicates contents of digital computer registers

    Science.gov (United States)

    Cliff, R. A.

    1967-01-01

    Silicon Controlled Rectifier /SCR/ lamp driver is adapted for use in integrated circuit digital computers where it indicates the contents of the various registers. The threshold voltage at which visual indication begins is very sharply defined and can be adjusted to suit particular system requirements.

  4. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns lead by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  5. Females with a mutation in a nuclear-encoded mitochondrial protein pay a higher cost of survival than do males in Drosophila.

    Science.gov (United States)

    Melvin, Richard G; Ballard, J William O

    2011-07-01

    Males and females age at different rates in a variety of species, but the mechanisms underlying the difference are not understood. In this study, we investigated sex-specific costs of a naturally occurring mildly deleterious deletion (ΔTrp85, ΔVal86) in cytochrome c oxidase subunit 7A (cox7A) in Drosophila simulans. We observed that females and males homozygous for the mutation had 30% and 26% reduced Cox activity, respectively, compared with wild type. Furthermore, 4-day-old females had 34%-42% greater physical activity than males. Greater physical activity in mutant females was correlated with a 19% lower 50% survival compared with wild-type females. Mutant and wild-type males had equal survival. These data suggest that females paid a higher cost of the mutation than did males. They also demonstrate how linking population genetics and structural modeling to experimental manipulations can lead to functional predictions of mitochondrial bioenergetics and organism aging.

  6. The use of 3D CADD (Computer Aided Design and Drafting) models in operation and maintenance cost reduction

    International Nuclear Information System (INIS)

    Didsbury, R.; Bains, N.; Cho, U.Y.

    1998-01-01

    The use of three-dimensional (3D) computer-aided design and drafting (CADD) models, and the associated information technology and databases, in the engineering and construction phases of large projects is well established and yielding significant improvements in project cost, schedule and quality. The information contained in these models can also be extremely valuable to operating plants, particularly when the visual and spatial information contained in the 3D models is interfaced to other plant information databases. Indeed many plant owners and operators in the process and power industries are already using this technology to assist with such activities as plant configuration management, staff training, work planning and radiation protection. This paper will explore the application of 3D models and the associated databases in an operating plant environment and describe the resulting operational benefits and cost reduction benefits. Several industrial experience case studies will be presented along with suggestions for further applications. (author). 4 refs., 1 tab., 8 figs

  7. User's guide to SERICPAC: A computer program for calculating electric-utility avoided costs rates

    Energy Technology Data Exchange (ETDEWEB)

    Wirtshafter, R.; Abrash, M.; Koved, M.; Feldman, S.

    1982-05-01

    SERICPAC is a computer program developed to calculate average avoided-cost rates for decentralized power producers and cogenerators that sell electricity to electric utilities. SERICPAC works in tandem with SERICOST, a program to calculate avoided costs, and determines the appropriate rates for the buying and selling of electricity between electric utilities and qualifying facilities (QFs) as stipulated under Section 210 of PURPA. SERICPAC contains simulation models for eight technologies, including wind, hydro, biogas, and cogeneration. The simulations are converted into a diversified utility production profile, which can be either gross production or net production; the latter accounts for internal electricity usage by the QF. The program allows adjustments to the production to be made for scheduled and forced outages. The final output of the model is a technology-specific average annual rate. The report contains a description of the technologies and the simulations as well as a complete user's guide to SERICPAC.

  8. The weighted average cost of capital over the lifecycle of the firm: Is the overinvestment problem of mature firms intensified by a higher WACC?

    Directory of Open Access Journals (Sweden)

    Carlos S. Garcia

    2016-08-01

    Full Text Available Firm lifecycle theory predicts that the Weighted Average Cost of Capital (WACC) will tend to fall over the lifecycle of the firm (Mueller, 2003, p. 80-81). However, given that previous research finds that corporate governance deteriorates as firms get older (Mueller and Yun, 1998; Saravia, 2014), there is good reason to suspect that the opposite could be the case, that is, that the WACC is higher for older firms. Since our literature review indicates that no direct tests to clarify this question have been carried out until now, this paper aims to fill the gap by testing this prediction empirically. Our findings support the proposition that the WACC of younger firms is higher than that of mature firms. Thus, we find that the mature-firm overinvestment problem is not intensified by a higher cost of capital; on the contrary, our results suggest that mature firms manage to invest in negative net present value projects even though they have access to cheaper capital. This finding sheds new light on the magnitude of the corporate governance problems found in mature firms.

  9. Costs associated with implementation of computer-assisted clinical decision support system for antenatal and delivery care: case study of Kassena-Nankana district of northern Ghana.

    Science.gov (United States)

    Dalaba, Maxwell Ayindenaba; Akweongo, Patricia; Williams, John; Saronga, Happiness Pius; Tonchev, Pencho; Sauerborn, Rainer; Mensah, Nathan; Blank, Antje; Kaltschmidt, Jens; Loukanova, Svetla

    2014-01-01

    This study analyzed the cost of implementing a computer-assisted Clinical Decision Support System (CDSS) in selected health care centres in Ghana. A descriptive cross-sectional study was conducted in the Kassena-Nankana district (KND). CDSS was deployed in selected health centres in KND as an intervention to manage patients attending antenatal clinics and the labour ward. The CDSS users were mainly nurses, who were trained. Activities and associated costs involved in the implementation of CDSS (pre-intervention and intervention) were collected for the period between 2009-2013 from the provider perspective. The ingredients approach was used for the cost analysis. Costs were grouped into personnel, training, overheads (recurrent costs) and equipment costs (capital cost). We calculated cost without annualizing capital cost to represent financial cost and cost with annualized capital costs to represent economic cost. Twenty-two trained CDSS users (at least 2 users per health centre) participated in the study. Between April 2012 and March 2013, users managed 5,595 antenatal clients and 872 labour clients using the CDSS. We observed a decrease in the proportion of complications during delivery (pre-intervention 10.74% versus post-intervention 9.64%) and a reduction in the number of maternal deaths (pre-intervention 4 deaths versus post-intervention 1 death). The overall financial cost of CDSS implementation was US$23,316, approximately US$1,060 per CDSS user trained. Of the total cost of implementation, 48% (US$11,272) was pre-intervention cost and 52% (US$12,044) was intervention cost. Equipment costs accounted for the largest proportion of financial cost: 34% (US$7,917). When economic cost was considered, the total cost of implementation was US$17,128, lower than the financial cost by 26.5%. The study provides useful information on the implementation of CDSS at health facilities to enhance health workers' adherence to practice guidelines and taking accurate decisions to improve
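
    The financial-versus-economic distinction comes down to annualizing the capital items. Using the study's reported equipment (US$7,917) and total (US$23,316) figures, an assumed 3% discount rate and 5-year useful life happen to reproduce the reported economic cost of about US$17,128; those two parameters are our assumption, not stated in the abstract.

    ```python
    # Financial cost counts equipment at full purchase price; economic cost
    # replaces it with one year's annualized share.
    def annualized(capital, rate=0.03, life_years=5):
        """Equivalent annual cost of a capital item (assumed rate and life)."""
        return capital * rate / (1 - (1 + rate) ** -life_years)

    equipment = 7_917               # US$, reported equipment cost
    recurrent = 23_316 - equipment  # personnel, training, overheads

    print(f"financial: ${recurrent + equipment:,}")
    print(f"economic:  ${recurrent + annualized(equipment):,.0f}")  # ~17,128
    ```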

  10. Development and implementation of a low cost micro computer system for LANDSAT analysis and geographic data base applications

    Science.gov (United States)

    Faust, N.; Jordon, L.

    1981-01-01

    Since the implementation of the GRID and IMGRID computer programs for multivariate spatial analysis in the early 1970's, geographic data analysis has moved from large computers to minicomputers and now to microcomputers, with a radical reduction in the costs associated with planning analyses. Programs designed to process LANDSAT data for use as one element in a geographic data base were employed once NIMGRID (new IMGRID), a raster-oriented geographic information system, was implemented on the microcomputer. Programs for training field selection, supervised and unsupervised classification, and image enhancement were added. Enhancements to the color graphics capabilities of the microsystem allow display of three channels of LANDSAT data in color-infrared format. The basic microcomputer hardware needed to run NIMGRID and most LANDSAT analyses is listed, as well as the software available for LANDSAT processing.

  11. The Department of Defense and the Power of Cloud Computing: Weighing Acceptable Cost Versus Acceptable Risk

    Science.gov (United States)

    2016-04-01

    Front-matter figures depict the IAAS, PAAS, and SAAS component stacks and their scopes of control. Cloud offerings are delivered as infrastructure as a service (IAAS), platform as a service (PAAS), or software as a service (SAAS); regardless of the model or service selected, these build on each other, providing more service to the customer while limiting customers' abilities to operate, maintain, and

  12. A low-cost, low-energy tangible programming system for computer illiterates in developing regions

    CSIR Research Space (South Africa)

    Smith, Andrew C

    2008-07-01

    Full Text Available Our approach is to first develop, in the illiterate population, the cognitive process of logical thinking required in the IT field. Having developed this ability, the illiterate person has a tool for potentially controlling a number of objects... functionality. Because of these tangible and visual properties, the cognitive burden on the user is reduced as compared with text-only input systems. We therefore hypothesise that our input devices are well suited for computer-illiterate people.

  13. Pricing the Services of the Computer Center at the Catholic University of Louvain. Program on Institutional Management in Higher Education.

    Science.gov (United States)

    Hecquet, Ignace; And Others

    Principles are outlined that are used as a basis for the system of pricing the services of the Computer Centre. The system illustrates the use of a management method to secure better utilization of university resources. Departments decide how to use the appropriations granted to them and establish a system of internal prices that reflect the cost…

  14. Cost-optimized configuration of computing instances for large sized cloud systems

    Directory of Open Access Journals (Sweden)

    Woo-Chan Kim

    2017-09-01

    Full Text Available Cloud computing services are becoming more popular for various reasons which include ‘having no need for capital expenditure’ and ‘the ability to quickly meet business demands’. However, what seems to be an attractive option may become a substantial expenditure as more projects are moved into the cloud. Cloud service companies provide different pricing options to their customers that can potentially lower the customers’ spending on the cloud. Choosing the right combination of pricing options can be formulated as a linear mixed integer programming problem, which can be solved using optimization.
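
    A miniature version of the decision the abstract formulates: choose how many reserved instances to commit to (an upfront fee plus a low hourly rate) and cover the residual demand on demand. The prices and demand profile below are invented, and a real formulation would hand the integer program to a MIP solver rather than enumerate.

    ```python
    # Enumerate the reserved-instance count that minimizes one month's bill.
    HOURS_FLAT, HOURS_PEAK = 500, 230            # 730 hours in the month
    demand = [40] * HOURS_FLAT + [90] * HOURS_PEAK

    ON_DEMAND = 0.10                             # $ per instance-hour
    RES_HOURLY, RES_FIXED = 0.04, 30.0           # $ per hour, monthly fee per reserved

    def monthly_cost(n_reserved):
        cost = n_reserved * RES_FIXED
        for d in demand:
            covered = min(d, n_reserved)
            cost += covered * RES_HOURLY + (d - covered) * ON_DEMAND
        return cost

    best = min(range(101), key=monthly_cost)
    print(f"buy {best} reserved instances -> ${monthly_cost(best):,.2f}/month")
    ```

    With these numbers the optimum covers the flat base load (a reserved instance pays for itself over the full month) but leaves the short peak on demand.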

  15. Computed radiography and the workstation in a study of the cervical spine. Technical and cost implications

    International Nuclear Information System (INIS)

    Garcia, J. M.; Lopez-Galiacho, N.; Martinez, M.

    1999-01-01

    To demonstrate the advantages of computed radiography and the workstation in assessing the images acquired in a study of the cervical spine. Lateral projections of the cervical spine obtained using a computed radiography system in 63 ambulatory patients were studied in a workstation. Images of the tip of the odontoid process, C1-C2, basion-opisthion and C7 were visualized prior to and after their transmission and processing, and the overall improvement in their diagnostic quality was assessed. The rates of detection of the tip of the odontoid process, C1-C2, the foramen magnum and C7 increased by 17, 6, 11 and 14 percentage points, respectively. Image processing improved the diagnostic quality in over 75% of cases. Image processing in a workstation improved the visualization of the anatomical points being studied and the diagnostic quality of the images. These advantages, as well as the possibility of transferring the images to a picture archiving and communication system (PACS), are convincing reasons for using digital radiography. (Author) 7 refs

  16. Clinical and cost effectiveness of computer treatment for aphasia post stroke (Big CACTUS): study protocol for a randomised controlled trial.

    Science.gov (United States)

    Palmer, Rebecca; Cooper, Cindy; Enderby, Pam; Brady, Marian; Julious, Steven; Bowen, Audrey; Latimer, Nicholas

    2015-01-27

    Aphasia affects the ability to speak, comprehend spoken language, read and write. One third of stroke survivors experience aphasia. Evidence suggests that aphasia can continue to improve after the first few months with intensive speech and language therapy, which is frequently beyond what resources allow. The development of computer software for language practice provides an opportunity for self-managed therapy. This pragmatic randomised controlled trial will investigate the clinical and cost effectiveness of a computerised approach to long-term aphasia therapy post stroke. A total of 285 adults with aphasia at least four months post stroke will be randomly allocated to either usual care, computerised intervention in addition to usual care or attention and activity control in addition to usual care. Those in the intervention group will receive six months of self-managed word finding practice on their home computer with monthly face-to-face support from a volunteer/assistant. Those in the attention control group will receive puzzle activities, supplemented by monthly telephone calls. Study delivery will be coordinated by 20 speech and language therapy departments across the United Kingdom. Outcome measures will be made at baseline, six, nine and 12 months after randomisation by blinded speech and language therapist assessors. Primary outcomes are the change in number of words (of personal relevance) named correctly at six months and improvement in functional conversation. Primary outcomes will be analysed using a Hochberg testing procedure. Significance will be declared if differences in both word retrieval and functional conversation at six months are significant at the 5% level, or if either comparison is significant at 2.5%. A cost utility analysis will be undertaken from the NHS and personal social service perspective. Differences between costs and quality-adjusted life years in the three groups will be described and the incremental cost effectiveness ratio
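
    For two co-primary endpoints, the Hochberg testing procedure cited in the protocol reduces to a short rule, sketched below with invented p-values: reject both hypotheses if the larger p-value is at most 5%, otherwise reject only the smaller one if it is at most 2.5%.

        # Hochberg step-up rule for two co-primary endpoints, as stated above.
        def hochberg_two(p_words, p_conv, alpha=0.05):
            """Return which endpoints are declared significant."""
            if max(p_words, p_conv) <= alpha:      # larger p-value <= 5%: reject both
                return {"words": True, "conversation": True}
            # otherwise a single endpoint can still be declared at alpha/2 = 2.5%
            return {"words": p_words <= alpha / 2,
                    "conversation": p_conv <= alpha / 2}

        print(hochberg_two(0.012, 0.08))   # invented p-values: only word retrieval passes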

  17. CLOUD COMPUTING ARCHITECTURE FOR HIGHER EDUCATION IN THE THIRD WORLD COUNTRIES (REPUBLIC OF THE SUDAN AS MODEL)

    OpenAIRE

    Mohmed Sirelkhtem Adrees; Majzoob Kamal Aldein Omer; Osama E. Sheta

    2015-01-01

    The exponential growth in the volume of data and information leads to problems of management and control and to high storage and operating costs, with organizations struggling with data retrieval, preparation, backups and other data-handling tasks. Companies and business organizations therefore currently seek to achieve the highest return on their investments in technology through the planning and implementation of virtualization technologies and cloud computing...

  18. SUCCESS OF IMPLEMENTATION OF COMPUTER CRIME ACT (UU ITE NO.11 2008 (A Case Study in the Higher Education Institution in Indonesia

    Directory of Open Access Journals (Sweden)

    Rizki Yudhi Dewantara

    2017-06-01

    Full Text Available Computer crime is growing rapidly along with the development of the digital world, which has touched almost all aspects of human life, and institutions of higher education are not exempt from it. This paper analyses the implementation of the Indonesian Computer Crime Act (UU ITE No. 11, 2008) in higher education institutions in Indonesia. It aims to investigate the level of computer crime occurring in the higher education institution environment and how successfully the act has been applied to prevent such crime. The analysis uses descriptive statistics and binary logistic regression. The paper also describes the successful implementation of an Information System Security Policy (ISSP) as a computer crime prevention policy in higher education institutions in Indonesia. Regarding the act itself, clarity about the objectives and purpose of UU ITE 11, 2008 was low; communication and socialization to society, and especially to higher education institutions, remain limited; and although a control process for UU ITE 11, 2008 has been running, it operates at a low level. Keywords: computer crime, computer crime act, public policy implementation  ABSTRACT (translated from Indonesian)  Computer crime develops rapidly in line with the development of the digital world, and higher education institutions cannot be separated from it. This study analyses the success of the implementation of the computer crime act (UU ITE 11, 2008) in higher education institutions in Indonesia. It aims to determine the level of computer crime occurring in the higher education environment and the success of the act in preventing crimes that may occur as well as handling crimes in progress. In line with these objectives, a quantitative approach was used with several statistical tests, including descriptive statistical analysis...
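
    As an illustration of the kind of binary logistic regression the study reports, the sketch below fits a logit model on synthetic data; the predictor names (clarity, socialization, control) merely echo the factors discussed above and are not the study's actual variables.

        # Sketch of a binary logistic regression (synthetic data; names illustrative).
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 200
        clarity = rng.normal(size=n)        # perceived clarity of the act's objectives
        socialization = rng.normal(size=n)  # exposure to communication/socialization
        control = rng.normal(size=n)        # perceived enforcement/control
        logit = -0.5 + 0.8 * clarity + 0.6 * socialization + 0.4 * control
        y = rng.binomial(1, 1 / (1 + np.exp(-logit)))   # 1 = act judged successful

        X = sm.add_constant(np.column_stack([clarity, socialization, control]))
        model = sm.Logit(y, X).fit(disp=0)
        print(model.summary(xname=["const", "clarity", "socialization", "control"]))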

  19. Potentially Low Cost Solution to Extend Use of Early Generation Computed Tomography

    Directory of Open Access Journals (Sweden)

    Tonna, Joseph E

    2010-12-01

    Full Text Available In preparing a case report on Brown-Séquard syndrome for publication, we made the incidental finding that the inexpensive, commercially available three-dimensional (3D) rendering software we were using could produce high quality 3D spinal cord reconstructions from any series of two-dimensional (2D) computed tomography (CT) images. This finding raises the possibility that spinal cord imaging capabilities can be expanded where bundled 2D multi-planar reformats and 3D reconstruction software for CT are not available, and in situations where magnetic resonance imaging (MRI) is either not available or not appropriate (e.g. metallic implants). Given the worldwide burden of trauma and considering the limited availability of MRI and advanced generation CT scanners, we propose an alternative, potentially useful approach to imaging the spinal cord that might be useful in areas where technical capabilities and support are limited. [West J Emerg Med. 2010; 11(5):463-466.]

  20. Computing Cost Price by Using Activity Based Costing (ABC Method in Dialysis Ward of Shahid Rajaei Medical & Education Center, in Alborz University of Medical Sciences Karaj in 2015

    Directory of Open Access Journals (Sweden)

    H. Derafshi

    2016-08-01

    Full Text Available Background: Analysis of hospital costs is one of the key subjects for resource allocation. Activity-based costing is an applicable tool for recognizing accurate costs. The aim of this study is to use the activity-based costing method to estimate the cost of the dialysis unit of Shahid Rajaei hospital in 2015. Methods: This is an applied, cross-sectional descriptive study. The required data were collected from the dialysis unit, the accounting unit, discharge records and the medical equipment inventory of Shahid Rajaei hospital for the first six months of 2015, and costs were calculated in Excel. Results and Conclusion: Each month, an average of 1,238 patients received dialysis services in Shahid Rajaei hospital. Consumable materials accounted for 47.6% of allocated costs, the largest share; the smallest share, about 2.27%, related to insurance deductions. After calculating the various costs of dialysis services, we find that personnel costs cover only 32% of the total, and other ongoing overhead costs about 11.94%. Each dialysis session therefore costs 2,017,131 rials, while the tariff per session is 1,838,871 rials, so the center loses 178,260 rials per session. The results show that the cost of providing dialysis services exceeds the corresponding revenue in Shahid Rajaei hospital. Reforming the procurement of consumables, revising the tariffs for chronic dialysis (especially for the filter and consumable materials) and controlling human resource costs could decrease the unit's costs; making use of private-sector capacity is also recommended.
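
    The per-session loss reported above follows directly from the activity-based cost components; the sketch below reproduces the arithmetic from the figures quoted in the abstract.

        # Reproducing the per-session figures reported in the abstract.
        cost_per_session = 2_017_131    # rials, full activity-based cost
        tariff_per_session = 1_838_871  # rials, reimbursed tariff

        loss = cost_per_session - tariff_per_session
        print(f"loss per session: {loss:,} rials")   # 178,260 rials

        # Reported cost shares (percent of total allocated cost):
        shares = {"consumables": 47.6, "personnel": 32.0,
                  "other overhead": 11.94, "insurance deductions": 2.27}
        for item, pct in shares.items():
            print(f"{item:>22}: {pct:5.2f}% = {cost_per_session * pct / 100:>12,.0f} rials")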

  1. Leveraging Cloud Computing to Improve Storage Durability, Availability, and Cost for MER Maestro

    Science.gov (United States)

    Chang, George W.; Powell, Mark W.; Callas, John L.; Torres, Recaredo J.; Shams, Khawaja S.

    2012-01-01

    The Maestro for MER (Mars Exploration Rover) software is the premiere operation and activity planning software for the Mars rovers, and it is required to deliver all of the processed image products to scientists on demand. These data span multiple storage arrays sized at 2 TB, and a backup scheme ensures data is not lost. In a catastrophe, these data would currently recover at 20 GB/hour, taking several days for a restoration. A seamless solution provides access to highly durable, highly available, scalable, and cost-effective storage capabilities. This approach also employs a novel technique that enables storage of the majority of data on the cloud and some data locally. This feature is used to store the most recent data locally in order to guarantee utmost reliability in case of an outage or disconnect from the Internet. This also obviates any changes to the software that generates the most recent data set, as it still has the same interface to the file system as it did before updates.
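
    The tiering policy described, bulk data in durable cloud storage with the most recent products kept on local disk, can be sketched as a simple age-based rule. The bucket name, path and 30-day threshold below are hypothetical, and boto3 is assumed as the cloud storage client.

        # Illustrative sketch of the "recent data local, bulk in the cloud" policy
        # described above (bucket, path and age threshold are hypothetical).
        import os
        import time
        import boto3

        BUCKET = "example-maestro-archive"   # hypothetical bucket
        KEEP_LOCAL_DAYS = 30                 # most recent data stays on local disk

        s3 = boto3.client("s3")
        cutoff = time.time() - KEEP_LOCAL_DAYS * 86400

        def archive_old_products(root):
            """Push older image products to durable cloud storage; leave recent ones local."""
            for dirpath, _, filenames in os.walk(root):
                for name in filenames:
                    path = os.path.join(dirpath, name)
                    if os.path.getmtime(path) < cutoff:
                        key = os.path.relpath(path, root)
                        s3.upload_file(path, BUCKET, key)   # durable, highly available copy

        archive_old_products("/data/mer/products")           # hypothetical local array mount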

  2. Providing a Spanish interpreter using low-cost videoconferencing in a community health clinic

    Directory of Open Access Journals (Sweden)

    James L Wofford

    2013-03-01

    Full Text Available Background The advent of more mobile, more reliable, and more affordable videoconferencing technology finally makes it realistic to offer remote foreign language interpretation in the office setting. Still, such technologies deserve proof of acceptability to clinicians and patients before there is widespread acceptance and routine use. Objective We sought to examine: (1) the audio and video technical fidelity of iPad/FaceTime software, and (2) the acceptability of videoconferencing to patients and clinicians. Methods The convenience sample included Spanish-speaking adult patients at a community health care medicine clinic in 2011. Videoconferencing was conducted using two iPads connecting the patient/physician, located in the clinic examination room, and the interpreter, in a remote/separate office in the same building. A five-item survey was used to solicit opinions on the overall quality of the videoconferencing device, audio/video integrity/fidelity, perception of encounter duration, and attitude toward future use. Results Twenty-five patients, 18 clinicians and 5 interpreters participated in the project. Most patients (24/25) rated the overall quality of videoconferencing as good/excellent, with only one ‘fair’ rating. Eleven patients rated the amount of time as no longer than in person, and nine reported it as shorter than in person. Most patients, 94.0% (24/25), favoured using videoconferencing during future visits. For the 18 clinicians, the results were similar. Conclusions Based on our experience at a single-site community health centre, the videoconferencing technology appeared to be flawless, and both patients and clinicians were satisfied. Expansion of videoconferencing to other off-site healthcare professionals should be considered in the search for more cost-effective healthcare.

  3. Quantitative computed tomography of lung parenchyma in patients with emphysema: analysis of higher-density lung regions

    Science.gov (United States)

    Lederman, Dror; Leader, Joseph K.; Zheng, Bin; Sciurba, Frank C.; Tan, Jun; Gur, David

    2011-03-01

    Quantitative computed tomography (CT) has been widely used to detect and evaluate the presence (or absence) of emphysema by applying density masks at specific thresholds, e.g., -910 or -950 Hounsfield units (HU). However, it has also been observed that subjects with similar density-mask based emphysema scores can have varying lung function, possibly indicating differences in disease severity. To assess this possible discrepancy, we investigated whether the density distribution of "viable" lung parenchyma regions with pixel values > -910 HU correlates with lung function. A dataset of 38 subjects, who underwent both pulmonary function testing and CT examinations in a COPD SCCOR study, was assembled. After the lung regions depicted on CT images were automatically segmented by a computerized scheme, we systematically divided the lung parenchyma into different density groups (bins) and computed a number of statistical features (i.e., mean, standard deviation (STD), skewness of the pixel value distributions) in these density bins. We then analyzed the correlations between each feature and lung function. A correlation was observed between diffusion lung capacity (DLCO) and the STD of pixel values in the > -910 HU lung parenchyma bins, which indicates that, similar to the conventional density mask method, the pixel value distribution features in "viable" lung parenchyma areas may also provide clinically useful information to improve assessments of lung disease severity as measured by lung function tests.
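
    The per-bin statistics described are straightforward to compute; the sketch below, on synthetic voxel values, applies the same idea: keep "viable" parenchyma voxels (> -910 HU), split them into density bins, and report mean, STD and skewness per bin. The bin edges are illustrative, not the study's.

        # Sketch of per-density-bin statistics for "viable" lung parenchyma (> -910 HU).
        import numpy as np
        from scipy.stats import skew

        rng = np.random.default_rng(1)
        hu = rng.normal(-800, 120, size=100_000)   # synthetic lung-voxel HU values

        viable = hu[hu > -910]                     # exclude emphysema-like voxels
        edges = np.arange(-910, -490, 70)          # illustrative density bins (HU)

        for lo, hi in zip(edges[:-1], edges[1:]):
            vals = viable[(viable >= lo) & (viable < hi)]
            if vals.size:
                print(f"[{lo:5.0f}, {hi:5.0f}) HU: mean={vals.mean():7.1f} "
                      f"STD={vals.std():6.1f} skew={skew(vals):+.2f} n={vals.size}")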

  4. Modeling the economic costs of disasters and recovery: analysis using a dynamic computable general equilibrium model

    Science.gov (United States)

    Xie, W.; Li, N.; Wu, J.-D.; Hao, X.-L.

    2014-04-01

    Disaster damages have negative effects on the economy, whereas reconstruction investment has positive effects. The aim of this study is to model the economic costs of disasters and recovery, taking into account the positive effects of reconstruction activities. A computable general equilibrium (CGE) model is a promising approach because it can incorporate these two kinds of shocks into a unified framework and, furthermore, avoid the double-counting problem. In order to factor both shocks into the CGE model, direct loss is set as the amount of capital stock reduced on the supply side of the economy; a portion of investments restores the capital stock in an existing period; an investment-driven dynamic model is formulated according to available reconstruction data, and the rest of a given country's saving is set as an endogenous variable to balance the fixed investment. The 2008 Wenchuan Earthquake is selected as a case study to illustrate the model, and three scenarios are constructed: S0 (no disaster occurs), S1 (disaster occurs with reconstruction investment) and S2 (disaster occurs without reconstruction investment). S0 is taken as business as usual, and the differences between S1 and S0 and between S2 and S0 can be interpreted as economic losses including and excluding reconstruction, respectively. Output from S1 is found to be closer to real data than that from S2. Economic loss under S2 is roughly 1.5 times that under S1. The gap in the economic aggregate between S1 and S0 is reduced to 3% at the end of government-led reconstruction activity, a level that should take another four years to achieve under S2.

  5. Radioimmunoassay evaluation and quality control by use of a simple computer program for a low cost desk top calculator

    International Nuclear Information System (INIS)

    Schwarz, S.

    1980-01-01

    A simple computer program for the data processing and quality control of radioimmunoassays is presented. It is written for a low-cost programmable desk top calculator (Hewlett Packard 97), which can be afforded by smaller laboratories. The untreated counts from the scintillation spectrometer are entered manually; the printout gives the following results: initial data, logit-log transformed calibration points, parameters of goodness of fit and of the position of the standard curve, dose estimates for control and unknown samples (mean value from single dose interpolations and scatter of replicates), together with the automatic calculation of within-assay variance and, by use of magnetic cards holding the control parameters of all previous assays, between-assay variance. (orig.) [de]
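
    The core of such a program is the logit-log standard curve: the bound fraction B/B0 of each standard is logit-transformed and fitted linearly against log dose, and unknowns are read off the fitted line. The sketch below, with invented standards, shows the calculation in modern form.

        # Minimal logit-log RIA calibration in the spirit of the program described
        # (standards and bound fractions are invented for illustration).
        import numpy as np

        dose = np.array([5, 10, 25, 50, 100, 250])                  # standard doses
        b_over_b0 = np.array([0.82, 0.71, 0.55, 0.41, 0.28, 0.14])  # bound fraction

        logit = np.log(b_over_b0 / (1 - b_over_b0))                 # logit transform
        slope, intercept = np.polyfit(np.log(dose), logit, 1)       # linear fit

        def estimate_dose(b_ratio):
            """Interpolate an unknown's dose from its bound fraction via the fit."""
            y = np.log(b_ratio / (1 - b_ratio))
            return float(np.exp((y - intercept) / slope))

        print(f"fit: logit = {slope:.3f}*ln(dose) + {intercept:.3f}")
        print(f"unknown with B/B0 = 0.50 -> dose ~ {estimate_dose(0.50):.1f}")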

  6. TOWARDS A LOW-COST, REAL-TIME PHOTOGRAMMETRIC LANDSLIDE MONITORING SYSTEM UTILISING MOBILE AND CLOUD COMPUTING TECHNOLOGY

    Directory of Open Access Journals (Sweden)

    P. Chidburee

    2016-06-01

    Full Text Available Close-range photogrammetric techniques offer a potentially low-cost approach in terms of implementation and operation for initial assessment and monitoring of landslide processes over small areas. In particular, the Structure-from-Motion (SfM) pipeline is now extensively used to help overcome many constraints of traditional digital photogrammetry, offering increased user-friendliness to nonexperts, as well as lower costs. However, a landslide monitoring approach based on the SfM technique also presents some potential drawbacks due to the difficulty in managing and processing a large volume of data in real-time. This research addresses the aforementioned issues by attempting to combine a mobile device with cloud computing technology to develop a photogrammetric measurement solution as part of a monitoring system for landslide hazard analysis. The research presented here focusses on (i) the development of an Android mobile application; (ii) the implementation of SfM-based open-source software in the Amazon cloud computing web service, and (iii) performance assessment through a simulated environment using data collected at a recognized landslide test site in North Yorkshire, UK. Whilst the landslide monitoring mobile application is under development, this paper describes experiments carried out to ensure effective performance of the system in the future. Investigations presented here describe the initial assessment of a cloud-implemented approach, which is developed around the well-known VisualSFM algorithm. Results are compared to point clouds obtained from alternative SfM 3D reconstruction approaches considering a commercial software solution (Agisoft PhotoScan) and a web-based system (Autodesk 123D Catch). Investigations demonstrate that the cloud-based photogrammetric measurement system is capable of providing results of centimeter-level accuracy, evidencing its potential to provide an effective approach for quantifying and analyzing landslide hazard at a local-scale.

  7. Towards a Low-Cost Real-Time Photogrammetric Landslide Monitoring System Utilising Mobile and Cloud Computing Technology

    Science.gov (United States)

    Chidburee, P.; Mills, J. P.; Miller, P. E.; Fieber, K. D.

    2016-06-01

    Close-range photogrammetric techniques offer a potentially low-cost approach in terms of implementation and operation for initial assessment and monitoring of landslide processes over small areas. In particular, the Structure-from-Motion (SfM) pipeline is now extensively used to help overcome many constraints of traditional digital photogrammetry, offering increased user-friendliness to nonexperts, as well as lower costs. However, a landslide monitoring approach based on the SfM technique also presents some potential drawbacks due to the difficulty in managing and processing a large volume of data in real-time. This research addresses the aforementioned issues by attempting to combine a mobile device with cloud computing technology to develop a photogrammetric measurement solution as part of a monitoring system for landslide hazard analysis. The research presented here focusses on (i) the development of an Android mobile application; (ii) the implementation of SfM-based open-source software in the Amazon cloud computing web service, and (iii) performance assessment through a simulated environment using data collected at a recognized landslide test site in North Yorkshire, UK. Whilst the landslide monitoring mobile application is under development, this paper describes experiments carried out to ensure effective performance of the system in the future. Investigations presented here describe the initial assessment of a cloud-implemented approach, which is developed around the well-known VisualSFM algorithm. Results are compared to point clouds obtained from alternative SfM 3D reconstruction approaches considering a commercial software solution (Agisoft PhotoScan) and a web-based system (Autodesk 123D Catch). Investigations demonstrate that the cloud-based photogrammetric measurement system is capable of providing results of centimeter-level accuracy, evidencing its potential to provide an effective approach for quantifying and analyzing landslide hazard at a local-scale.

  8. Low-Dose Chest Computed Tomography for Lung Cancer Screening Among Hodgkin Lymphoma Survivors: A Cost-Effectiveness Analysis

    International Nuclear Information System (INIS)

    Wattson, Daniel A.; Hunink, M.G. Myriam; DiPiro, Pamela J.; Das, Prajnan; Hodgson, David C.; Mauch, Peter M.; Ng, Andrea K.

    2014-01-01

    Purpose: Hodgkin lymphoma (HL) survivors face an increased risk of treatment-related lung cancer. Screening with low-dose computed tomography (LDCT) may allow detection of early stage, resectable cancers. We developed a Markov decision-analytic and cost-effectiveness model to estimate the merits of annual LDCT screening among HL survivors. Methods and Materials: Population databases and HL-specific literature informed key model parameters, including lung cancer rates and stage distribution, cause-specific survival estimates, and utilities. Relative risks accounted for radiation therapy (RT) technique, smoking status (>10 pack-years or current smokers vs not), age at HL diagnosis, time from HL treatment, and excess radiation from LDCTs. LDCT assumptions, including expected stage-shift, false-positive rates, and likely additional workup were derived from the National Lung Screening Trial and preliminary results from an internal phase 2 protocol that performed annual LDCTs in 53 HL survivors. We assumed a 3% discount rate and a willingness-to-pay (WTP) threshold of $50,000 per quality-adjusted life year (QALY). Results: Annual LDCT screening was cost effective for all smokers. A male smoker treated with mantle RT at age 25 achieved maximum QALYs by initiating screening 12 years post-HL, with a life expectancy benefit of 2.1 months and an incremental cost of $34,841/QALY. Among nonsmokers, annual screening produced a QALY benefit in some cases, but the incremental cost was not below the WTP threshold for any patient subsets. As age at HL diagnosis increased, earlier initiation of screening improved outcomes. Sensitivity analyses revealed that the model was most sensitive to the lung cancer incidence and mortality rates and expected stage-shift from screening. Conclusions: HL survivors are an important high-risk population that may benefit from screening, especially those treated in the past with large radiation fields including mantle or involved-field RT. Screening

  9. Low-Dose Chest Computed Tomography for Lung Cancer Screening Among Hodgkin Lymphoma Survivors: A Cost-Effectiveness Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wattson, Daniel A., E-mail: dwattson@partners.org [Harvard Radiation Oncology Program, Boston, Massachusetts (United States); Hunink, M.G. Myriam [Departments of Radiology and Epidemiology, Erasmus Medical Center, Rotterdam, the Netherlands and Center for Health Decision Science, Harvard School of Public Health, Boston, Massachusetts (United States); DiPiro, Pamela J. [Department of Imaging, Dana-Farber Cancer Institute, Boston, Massachusetts (United States); Das, Prajnan [Department of Radiation Oncology, University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Hodgson, David C. [Department of Radiation Oncology, University of Toronto, Toronto, Ontario (Canada); Mauch, Peter M.; Ng, Andrea K. [Department of Radiation Oncology, Brigham and Women's Hospital and Dana-Farber Cancer Institute, Boston, Massachusetts (United States)]

    2014-10-01

    Purpose: Hodgkin lymphoma (HL) survivors face an increased risk of treatment-related lung cancer. Screening with low-dose computed tomography (LDCT) may allow detection of early stage, resectable cancers. We developed a Markov decision-analytic and cost-effectiveness model to estimate the merits of annual LDCT screening among HL survivors. Methods and Materials: Population databases and HL-specific literature informed key model parameters, including lung cancer rates and stage distribution, cause-specific survival estimates, and utilities. Relative risks accounted for radiation therapy (RT) technique, smoking status (>10 pack-years or current smokers vs not), age at HL diagnosis, time from HL treatment, and excess radiation from LDCTs. LDCT assumptions, including expected stage-shift, false-positive rates, and likely additional workup were derived from the National Lung Screening Trial and preliminary results from an internal phase 2 protocol that performed annual LDCTs in 53 HL survivors. We assumed a 3% discount rate and a willingness-to-pay (WTP) threshold of $50,000 per quality-adjusted life year (QALY). Results: Annual LDCT screening was cost effective for all smokers. A male smoker treated with mantle RT at age 25 achieved maximum QALYs by initiating screening 12 years post-HL, with a life expectancy benefit of 2.1 months and an incremental cost of $34,841/QALY. Among nonsmokers, annual screening produced a QALY benefit in some cases, but the incremental cost was not below the WTP threshold for any patient subsets. As age at HL diagnosis increased, earlier initiation of screening improved outcomes. Sensitivity analyses revealed that the model was most sensitive to the lung cancer incidence and mortality rates and expected stage-shift from screening. Conclusions: HL survivors are an important high-risk population that may benefit from screening, especially those treated in the past with large radiation fields including mantle or involved-field RT. Screening

  10. Low-dose chest computed tomography for lung cancer screening among Hodgkin lymphoma survivors: a cost-effectiveness analysis.

    Science.gov (United States)

    Wattson, Daniel A; Hunink, M G Myriam; DiPiro, Pamela J; Das, Prajnan; Hodgson, David C; Mauch, Peter M; Ng, Andrea K

    2014-10-01

    Hodgkin lymphoma (HL) survivors face an increased risk of treatment-related lung cancer. Screening with low-dose computed tomography (LDCT) may allow detection of early stage, resectable cancers. We developed a Markov decision-analytic and cost-effectiveness model to estimate the merits of annual LDCT screening among HL survivors. Population databases and HL-specific literature informed key model parameters, including lung cancer rates and stage distribution, cause-specific survival estimates, and utilities. Relative risks accounted for radiation therapy (RT) technique, smoking status (>10 pack-years or current smokers vs not), age at HL diagnosis, time from HL treatment, and excess radiation from LDCTs. LDCT assumptions, including expected stage-shift, false-positive rates, and likely additional workup were derived from the National Lung Screening Trial and preliminary results from an internal phase 2 protocol that performed annual LDCTs in 53 HL survivors. We assumed a 3% discount rate and a willingness-to-pay (WTP) threshold of $50,000 per quality-adjusted life year (QALY). Annual LDCT screening was cost effective for all smokers. A male smoker treated with mantle RT at age 25 achieved maximum QALYs by initiating screening 12 years post-HL, with a life expectancy benefit of 2.1 months and an incremental cost of $34,841/QALY. Among nonsmokers, annual screening produced a QALY benefit in some cases, but the incremental cost was not below the WTP threshold for any patient subsets. As age at HL diagnosis increased, earlier initiation of screening improved outcomes. Sensitivity analyses revealed that the model was most sensitive to the lung cancer incidence and mortality rates and expected stage-shift from screening. HL survivors are an important high-risk population that may benefit from screening, especially those treated in the past with large radiation fields including mantle or involved-field RT. Screening may be cost effective for all smokers but possibly not
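
    The decision rule applied in these analyses is the incremental cost-effectiveness ratio (ICER) compared against the $50,000/QALY willingness-to-pay threshold. The sketch below shows that final comparison step with invented per-patient totals; it is not the study's Markov model.

        # Incremental cost-effectiveness ratio (ICER) against the $50,000/QALY
        # willingness-to-pay threshold used above (per-patient totals invented).
        WTP = 50_000.0

        def icer(cost_new, qaly_new, cost_old, qaly_old):
            return (cost_new - cost_old) / (qaly_new - qaly_old)

        # Hypothetical discounted totals per patient:
        no_screen = {"cost": 12_000.0, "qaly": 14.20}
        ldct = {"cost": 18_000.0, "qaly": 14.37}

        ratio = icer(ldct["cost"], ldct["qaly"], no_screen["cost"], no_screen["qaly"])
        print(f"ICER = ${ratio:,.0f}/QALY ->",
              "cost effective" if ratio <= WTP else "not cost effective")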

  11. Low-cost computer classification of land cover in the Portland area, Oregon, by signature extension techniques

    Science.gov (United States)

    Gaydos, Leonard

    1978-01-01

    Computer-aided techniques for interpreting multispectral data acquired by Landsat offer economies in the mapping of land cover. Even so, the actual establishment of the statistical classes, or "signatures," is one of the relatively more costly operations involved. Analysts have therefore been seeking cost-saving signature extension techniques that would accept training data acquired for one time or place and apply them to another. Opportunities to extend signatures occur in preprocessing steps and in the classification steps that follow. In the present example, land cover classes were derived by the simplest and most direct form of signature extension: Classes statistically derived from a Landsat scene for the Puget Sound area, Wash., were applied to the Portland area, Oreg., using data for the next Landsat scene acquired less than 25 seconds down orbit. Many features can be recognized on the reduced-scale version of the Portland land cover map shown in this report, although no statistical assessment of its accuracy is available.
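
    Signature extension amounts to applying class statistics estimated on one scene, unchanged, to pixels of another. A common classifier for this is Gaussian maximum likelihood; the sketch below demonstrates the idea on synthetic four-band "signatures" and pixels (the class means and covariances are invented).

        # Sketch of signature extension: Gaussian maximum-likelihood classification
        # with class statistics from one scene applied to another (synthetic data).
        import numpy as np

        # "Signatures" from the training scene: per-class mean and covariance
        # over four spectral bands.
        signatures = {
            "water":  (np.array([20., 12., 8., 4.]),   np.eye(4) * 4.0),
            "forest": (np.array([30., 25., 60., 70.]), np.eye(4) * 9.0),
            "urban":  (np.array([55., 50., 45., 40.]), np.eye(4) * 16.0),
        }

        def log_likelihood(x, mean, cov):
            d = x - mean
            return -0.5 * (np.log(np.linalg.det(cov)) + d @ np.linalg.solve(cov, d))

        def classify(pixel):
            return max(signatures, key=lambda c: log_likelihood(pixel, *signatures[c]))

        # Pixels from the "down-orbit" scene, classified with the extended signatures:
        for px in [np.array([21., 13., 9., 5.]), np.array([52., 48., 44., 41.])]:
            print(px, "->", classify(px))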

  12. Computed tomography for preoperative planning in minimal-invasive total hip arthroplasty: Radiation exposure and cost analysis

    Energy Technology Data Exchange (ETDEWEB)

    Huppertz, Alexander, E-mail: Alexander.Huppertz@charite.de [Imaging Science Institute Charite Berlin, Robert-Koch-Platz 7, D-10115 Berlin (Germany); Department of Radiology, Medical Physics, Charite-University Hospitals of Berlin, Chariteplatz 1, D-10117 Berlin (Germany); Radmer, Sebastian, E-mail: s.radmer@immanuel.de [Department of Orthopedic Surgery and Rheumatology, Immanuel-Krankenhaus, Koenigstr. 63, D-14109, Berlin (Germany); Asbach, Patrick, E-mail: Patrick.Asbach@charite.de [Department of Radiology, Medical Physics, Charite-University Hospitals of Berlin, Chariteplatz 1, D-10117 Berlin (Germany); Juran, Ralf, E-mail: ralf.juran@charite.de [Department of Radiology, Medical Physics, Charite-University Hospitals of Berlin, Chariteplatz 1, D-10117 Berlin (Germany); Schwenke, Carsten, E-mail: carsten.schwenke@scossis.de [Biostatistician, Scossis Statistical Consulting, Zeltinger Str. 58G, D-13465 Berlin (Germany); Diederichs, Gerd, E-mail: gerd.diederichs@charite.de [Department of Radiology, Medical Physics, Charite-University Hospitals of Berlin, Chariteplatz 1, D-10117 Berlin (Germany); Hamm, Bernd, E-mail: Bernd.Hamm@charite.de [Department of Radiology, Medical Physics, Charite-University Hospitals of Berlin, Chariteplatz 1, D-10117 Berlin (Germany); Sparmann, Martin, E-mail: m.sparmann@immanuel.de [Department of Orthopedic Surgery and Rheumatology, Immanuel-Krankenhaus, Koenigstr. 63, D-14109, Berlin (Germany)

    2011-06-15

    Computed tomography (CT) was used for preoperative planning of minimal-invasive total hip arthroplasty (THA). 92 patients (50 males, 42 females, mean age 59.5 years) with a mean body-mass-index (BMI) of 26.5 kg/m² underwent 64-slice CT to depict the pelvis, the knee and the ankle in three independent acquisitions using combined x-, y-, and z-axis tube current modulation. Arthroplasty planning was performed using 3D-Hip Plan (Symbios, Switzerland) and patient radiation dose exposure was determined. The effects of BMI, gender, and contralateral THA on the effective dose were evaluated by an analysis-of-variance. A process-cost-analysis from the hospital perspective was done. All CT examinations were of sufficient image quality for 3D-THA planning. A mean effective dose of 4.0 mSv (SD 0.9 mSv) modeled by the BMI (p < 0.0001) was calculated. The presence of a contralateral THA (9/92 patients; p = 0.15) and the difference between males and females were not significant (p = 0.08). Personnel involved were the radiologist (4 min), the surgeon (16 min), the radiographer (12 min), and administrative personnel (4 min). A CT operation time of 11 min and direct per-patient costs of 52.80 Euro were recorded. Preoperative CT for THA was associated with a slight and justifiable increase of radiation exposure in comparison to conventional radiographs and low per-patient costs.

  13. A low-cost computer-controlled Arduino-based educational laboratory system for teaching the fundamentals of photovoltaic cells

    International Nuclear Information System (INIS)

    Zachariadou, K; Yiasemides, K; Trougkakos, N

    2012-01-01

    We present a low-cost, fully computer-controlled, Arduino-based, educational laboratory (SolarInsight) to be used in undergraduate university courses concerned with electrical engineering and physics. The major goal of the system is to provide students with the necessary instrumentation, software tools and methodology in order to learn fundamental concepts of semiconductor physics by exploring the process of an experimental physics inquiry. The system runs under the Windows operating system and is composed of a data acquisition/control board, a power supply and processing boards, sensing elements, a graphical user interface and data analysis software. The data acquisition/control board is based on the Arduino open source electronics prototyping platform. The graphical user interface and communication with the Arduino are developed in C# and C++ programming languages respectively, by using IDE Microsoft Visual Studio 2010 Professional, which is freely available to students. Finally, the data analysis is performed by using the open source, object-oriented framework ROOT. Currently the system supports five teaching activities, each one corresponding to an independent tab in the user interface. SolarInsight has been partially developed in the context of a diploma thesis conducted within the Technological Educational Institute of Piraeus under the co-supervision of the Physics and Electronic Computer Systems departments’ academic staff. (paper)

  14. A low-cost computer-controlled Arduino-based educational laboratory system for teaching the fundamentals of photovoltaic cells

    Energy Technology Data Exchange (ETDEWEB)

    Zachariadou, K; Yiasemides, K; Trougkakos, N [Technological Educational Institute of Piraeus, P Ralli and Thivon 250, 12244 Egaleo (Greece)

    2012-11-15

    We present a low-cost, fully computer-controlled, Arduino-based, educational laboratory (SolarInsight) to be used in undergraduate university courses concerned with electrical engineering and physics. The major goal of the system is to provide students with the necessary instrumentation, software tools and methodology in order to learn fundamental concepts of semiconductor physics by exploring the process of an experimental physics inquiry. The system runs under the Windows operating system and is composed of a data acquisition/control board, a power supply and processing boards, sensing elements, a graphical user interface and data analysis software. The data acquisition/control board is based on the Arduino open source electronics prototyping platform. The graphical user interface and communication with the Arduino are developed in C# and C++ programming languages respectively, by using IDE Microsoft Visual Studio 2010 Professional, which is freely available to students. Finally, the data analysis is performed by using the open source, object-oriented framework ROOT. Currently the system supports five teaching activities, each one corresponding to an independent tab in the user interface. SolarInsight has been partially developed in the context of a diploma thesis conducted within the Technological Educational Institute of Piraeus under the co-supervision of the Physics and Electronic Computer Systems departments' academic staff. (paper)
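
    A typical teaching computation in such a laboratory is extracting the key photovoltaic quantities from a measured I-V curve. The sketch below does this for a synthetic curve; it is a generic illustration, not one of SolarInsight's five activities.

        # Extracting I_sc, V_oc, the maximum power point and fill factor from
        # sampled I-V data (the curve below is synthetic, not measured).
        import numpy as np

        v = np.linspace(0.0, 0.6, 61)                                          # volts
        i = 0.5 * (1 - (np.exp(v / 0.035) - 1) / (np.exp(0.6 / 0.035) - 1))    # amps

        i_sc = i[0]                          # short-circuit current (V = 0)
        v_oc = v[np.argmin(np.abs(i))]       # open-circuit voltage (I ~ 0)
        p = v * i
        k = int(np.argmax(p))                # maximum power point
        ff = p[k] / (i_sc * v_oc)            # fill factor

        print(f"Isc={i_sc:.3f} A  Voc={v_oc:.2f} V  "
              f"Pmax={p[k]:.3f} W at {v[k]:.2f} V  FF={ff:.2f}")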

  15. Beat-ID: Towards a computationally low-cost single heartbeat biometric identity check system based on electrocardiogram wave morphology

    Science.gov (United States)

    Paiva, Joana S.; Dias, Duarte

    2017-01-01

    In recent years, safer and more reliable biometric methods have been developed. Apart from the need for enhanced security, the media and entertainment sectors have also been applying biometrics in the emerging market of user-adaptable objects/systems to make these systems more user-friendly. However, the complexity of some state-of-the-art biometric systems (e.g., iris recognition) or their high false rejection rate (e.g., fingerprint recognition) is neither compatible with the simple hardware architecture required by reduced-size devices nor the new trend of implementing smart objects within the dynamic market of the Internet of Things (IoT). It was recently shown that an individual can be recognized by extracting features from their electrocardiogram (ECG). However, most current ECG-based biometric algorithms are computationally demanding and/or rely on relatively large (several seconds) ECG samples, which are incompatible with the aforementioned application fields. Here, we present a computationally low-cost method (patent pending), including simple mathematical operations, for identifying a person using only three ECG morphology-based characteristics from a single heartbeat. The algorithm was trained/tested using ECG signals of different duration from the Physionet database on more than 60 different training/test datasets. The proposed method achieved maximal averaged accuracy of 97.450% in distinguishing each subject from a ten-subject set and false acceptance and rejection rates (FAR and FRR) of 5.710±1.900% and 3.440±1.980%, respectively, placing Beat-ID in a very competitive position in terms of the FRR/FAR among state-of-the-art methods. Furthermore, the proposed method can identify a person using an average of 1.020 heartbeats. It therefore has FRR/FAR behavior similar to obtaining a fingerprint, yet it is simpler and requires less expensive hardware. This method targets low-computational/energy-cost scenarios, such as tiny wearable devices (e.g., a
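
    While the patented algorithm itself is not given in the abstract, the general scheme of a morphology-based single-beat check can be illustrated as nearest-template matching with a rejection threshold; all features, templates and the threshold below are invented.

        # Illustrative single-beat identity check: three morphology features per beat,
        # nearest enrolled template with a rejection threshold. This is a generic
        # sketch, not the patented Beat-ID algorithm; all numbers are invented.
        import numpy as np

        # Enrolled templates: subject -> mean (QRS width, R amplitude, T amplitude).
        templates = {
            "alice": np.array([0.085, 1.10, 0.30]),
            "bob":   np.array([0.100, 0.85, 0.22]),
        }
        THRESHOLD = 0.08   # max normalized distance to accept an identification

        def identify(beat_features):
            dists = {s: np.linalg.norm((beat_features - t) / t)
                     for s, t in templates.items()}
            subject = min(dists, key=dists.get)
            return subject if dists[subject] <= THRESHOLD else None  # None = rejected

        print(identify(np.array([0.084, 1.08, 0.31])))   # close to alice -> "alice"
        print(identify(np.array([0.150, 0.40, 0.05])))   # unlike anyone -> None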

  16. Improving the mixing performances of rice straw anaerobic digestion for higher biogas production by computational fluid dynamics (CFD) simulation.

    Science.gov (United States)

    Shen, Fei; Tian, Libin; Yuan, Hairong; Pang, Yunzhi; Chen, Shulin; Zou, Dexun; Zhu, Baoning; Liu, Yanping; Li, Xiujin

    2013-10-01

    As a lignocellulose-based substrate for anaerobic digestion, rice straw is characterized by low density, high water absorbability, and poor fluidity. Its mixing performances in digestion are completely different from traditional substrates such as animal manures. Computational fluid dynamics (CFD) simulation was employed to investigate mixing performances and determine suitable stirring parameters for efficient biogas production from rice straw. The results from CFD simulation were applied in the anaerobic digestion tests to further investigate their reliability. The results indicated that the mixing performances could be improved by triple impellers with pitched blades, and complete mixing was easily achieved at a stirring rate of 80 rpm, as compared to 20-60 rpm. However, mixing could not be significantly improved when the stirring rate was further increased from 80 to 160 rpm. The simulation results agreed well with the experimental results. The determined mixing parameters could achieve the highest biogas yields of 370 mL (g TS)⁻¹ (729 mL (g TS digested)⁻¹) and 431 mL (g TS)⁻¹ (632 mL (g TS digested)⁻¹), with the shortest technical digestion time (T80) of 46 days. The results obtained in this work could provide useful guides for the design and operation of biogas plants using rice straw as substrate.

  17. Early assessment of the likely cost-effectiveness of a new technology: A Markov model with probabilistic sensitivity analysis of computer-assisted total knee replacement.

    Science.gov (United States)

    Dong, Hengjin; Buxton, Martin

    2006-01-01

    The objective of this study is to apply a Markov model to compare the cost-effectiveness of total knee replacement (TKR) using computer-assisted surgery (CAS) with that of TKR using a conventional manual method in the absence of formal clinical trial evidence. A structured search was carried out to identify evidence relating to the clinical outcome, cost, and effectiveness of TKR. Nine Markov states were identified based on the progress of the disease after TKR. Effectiveness was expressed by quality-adjusted life years (QALYs). The simulation was carried out initially for 120 cycles of a month each, starting with 1,000 TKRs. A discount rate of 3.5 percent was used for both cost and effectiveness in the incremental cost-effectiveness analysis. Then, a probabilistic sensitivity analysis was carried out using a Monte Carlo approach with 10,000 iterations. Computer-assisted TKR was a long-term cost-effective technology, but the QALYs gained were small. After the first 2 years, computer-assisted TKR was dominant, being both cheaper and yielding more QALYs. The incremental cost-effectiveness ratio (ICER) was sensitive to the "effect of CAS," to the CAS extra cost, and to the utility of the state "Normal health after primary TKR," but it was not sensitive to the utilities of other Markov states. Both probabilistic and deterministic analyses produced similar cumulative serious or minor complication rates and complex or simple revision rates. They also produced similar ICERs. Compared with conventional TKR, computer-assisted TKR is a cost-saving technology in the long term and may offer small additional QALYs. The "effect of CAS" is to reduce revision rates and complications through more accurate and precise alignment, and although the conclusions from the model, even when allowing for a full probabilistic analysis of uncertainty, are clear, the "effect of CAS" on the rate of revisions awaits long-term clinical evidence.
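
    The probabilistic sensitivity analysis described, a Monte Carlo approach with 10,000 iterations, can be sketched as follows. The parameter distributions, costs and the willingness-to-pay value below are invented for illustration and are not the study's inputs.

        # Sketch of a probabilistic sensitivity analysis: draw uncertain inputs,
        # recompute incremental cost and QALYs each iteration, and report the
        # probability that CAS-TKR is cost effective (all values invented).
        import numpy as np

        rng = np.random.default_rng(3)
        N, WTP = 10_000, 30_000.0    # iterations; willingness to pay per QALY

        extra_cost = rng.normal(1_500, 300, N)        # CAS equipment/time premium
        revision_risk_reduction = rng.beta(2, 18, N)  # "effect of CAS" on revisions
        revision_cost, revision_qaly_loss = 12_000.0, 0.5

        inc_cost = extra_cost - revision_risk_reduction * revision_cost
        inc_qaly = revision_risk_reduction * revision_qaly_loss
        net_benefit = WTP * inc_qaly - inc_cost       # incremental net monetary benefit

        print(f"P(cost effective at WTP) = {np.mean(net_benefit > 0):.2%}")
        print(f"mean incremental cost = {inc_cost.mean():,.0f}, "
              f"mean incremental QALYs = {inc_qaly.mean():.3f}")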

  18. Theoretical and computational study of the energy dependence of the muon transfer rate from hydrogen to higher-Z gases

    Energy Technology Data Exchange (ETDEWEB)

    Bakalov, Dimitar, E-mail: dbakalov@inrne.bas.bg [Institute for Nuclear Research and Nuclear Energy, Bulgarian Academy of Sciences, Tsarigradsko chaussée 72, Sofia 1784 (Bulgaria); Adamczak, Andrzej [Institute of Nuclear Physics, Polish Academy of Sciences, ul. Radzikowskiego 152, 31-342 Krakow (Poland); Stoilov, Mihail [Institute for Nuclear Research and Nuclear Energy, Bulgarian Academy of Sciences, Tsarigradsko chaussée 72, Sofia 1784 (Bulgaria); Vacchi, Andrea [Istituto Nazionale di Fisica Nucleare, Sezione di Trieste, Via A. Valerio 2, 34127 Trieste (Italy)

    2015-01-23

    The recent PSI Lamb shift experiment and the controversy about the proton size have revived interest in measuring the hyperfine splitting in muonic hydrogen as an alternative possibility for comparing ordinary and muonic hydrogen spectroscopy data on the proton's electromagnetic structure. This measurement critically depends on the energy dependence of the muon transfer rate to heavier gases in the epithermal range. The available data provide only qualitative information, and the theoretical predictions have not been verified. We propose a new method based on measurements of the transfer rate in a thermalized target at different temperatures, estimate its accuracy, and investigate the optimal experimental conditions. - Highlights: • Method for measuring the energy dependence of the muon transfer rate to higher-Z gases. • Thermalization and depolarization of muonic hydrogen studied by the Monte Carlo method. • Optimal experimental conditions determined by Monte Carlo simulations. • Mathematical model for estimating the uncertainty of the experimental results.

  19. Costs associated with implementation of computer-assisted clinical decision support system for antenatal and delivery care: case study of Kassena-Nankana district of northern Ghana.

    Directory of Open Access Journals (Sweden)

    Maxwell Ayindenaba Dalaba

    Full Text Available This study analyzed the cost of implementing a computer-assisted Clinical Decision Support System (CDSS) in selected health care centres in Ghana. A descriptive cross-sectional study was conducted in the Kassena-Nankana district (KND). CDSS was deployed in selected health centres in KND as an intervention to manage patients attending antenatal clinics and the labour ward. The CDSS users were mainly nurses, who were trained. Activities and associated costs involved in the implementation of CDSS (pre-intervention and intervention) were collected for the period 2009-2013 from the provider perspective. The ingredients approach was used for the cost analysis. Costs were grouped into personnel, training, overheads (recurrent costs) and equipment costs (capital cost). We calculated cost without annualizing capital cost to represent financial cost and cost with annualized capital cost to represent economic cost. Twenty-two trained CDSS users (at least 2 users per health centre) participated in the study. Between April 2012 and March 2013, users managed 5,595 antenatal clients and 872 labour clients using the CDSS. We observed a decrease in the proportion of complications during delivery (pre-intervention 10.74% versus post-intervention 9.64%) and a reduction in the number of maternal deaths (pre-intervention 4 deaths versus post-intervention 1 death). The overall financial cost of CDSS implementation was US$23,316, approximately US$1,060 per CDSS user trained. Of the total cost of implementation, 48% (US$11,272) was pre-intervention cost and 52% (US$12,044) was intervention cost. Equipment costs accounted for the largest proportion of financial cost: 34% (US$7,917). When economic cost was considered, the total cost of implementation was US$17,128, 26.5% lower than the financial cost. The study provides useful information on implementing CDSS at health facilities to enhance health workers' adherence to practice guidelines and support accurate decisions.
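
    The gap between the financial and economic cost figures comes from annualizing capital (equipment) costs. A standard equivalent-annual-cost calculation is sketched below; the 3% discount rate and five-year equipment life are assumptions for illustration, not parameters reported in the abstract.

        # How annualizing capital cost lowers the economic cost figure relative to
        # the financial cost (discount rate and equipment life are assumed here).
        def annualized_cost(capital, rate=0.03, years=5):
            """Equivalent annual cost: capital * r / (1 - (1 + r)**-n)."""
            return capital * rate / (1 - (1 + rate) ** -years)

        equipment = 7_917.0   # US$, equipment share of the financial cost (from abstract)
        print(f"financial (one-off) equipment cost: ${equipment:,.0f}")
        print(f"economic (annualized over 5 yrs at 3%): ${annualized_cost(equipment):,.2f}/yr")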

  20. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  1. The necessity of the use of social networks as an ingredient of computer-integrated marketing communications for advancement of higher educational establishments

    Directory of Open Access Journals (Sweden)

    Kostiuk Mariia

    2016-04-01

    Full Text Available Demand for and supply of educational services are growing constantly, and education is becoming a promising sector of the Ukrainian economy. Under conditions of constantly increasing competition between educational establishments, marketing, and specifically the marketing of educational services, has become indispensable. The article substantiates the necessity of using computer-integrated marketing communications in the promotion of a higher educational establishment. It considers the promotion of higher educational establishments and educational services on the Internet, and analyses indicators of the promotion of a higher educational establishment in the «VKontakte» social network. Based on the results of the study, recommendations are formulated for the promotion of universities in social networks.

  2. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  3. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  4. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  5. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources like the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 TB. (Figure 3: Number of events per month, data.) In LS1, our emphasis is to increase efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  6. [Are Higher Prices for Larger Femoral Heads in Total Hip Arthroplasty Justified from the Perspective of Health Care Economics? An Analysis of Costs and Effects in Germany].

    Science.gov (United States)

    Grunert, R; Schleifenbaum, S; Möbius, R; Sommer, G; Zajonz, D; Hammer, N; Prietzel, T

    2017-02-01

    Background: In total hip arthroplasty (THA), femoral head diameter has not been regarded as a key parameter which should be restored when reconstructing joint biomechanics and geometry. Apart from the controversial discussion on the advantages and disadvantages of using larger diameter heads, their higher cost is another important reason that they have only been used to a limited extent. The goal of this study was to analyse the price structure of prosthetic heads in comparison to other components used in THA. A large group of patients with hip endoprostheses were evaluated with respect to the implanted socket diameter and thus the theoretically attainable head diameter. Materials and Methods: The relative prices of various THA components (cups, inserts, stems and ball heads) distributed by two leading German manufacturers were determined and analysed. Special attention was paid to different sizes and varieties in a series of components. A large patient population treated with THA was evaluated with respect to the implanted cup diameter and therefore the theoretically attainable head diameter. Results: The pricing analysis of the THA components of two manufacturers showed identical prices for cups, inserts and stems in a series. In contrast to this, the prices for prosthetic heads with a diameter of 36-44 mm were 11-50 % higher than for 28 mm heads. Identical prices for larger heads were the exception. The distribution of the head diameter in 2719 THA cases showed significant differences between the actually implanted and the theoretically attainable heads. Conclusion: There are proven advantages in using larger diameter ball heads in THA and the remaining problems can be solved. It is therefore desirable to correct the current pricing practice of charging higher prices for larger components. Instead, identical prices should be charged for all head diameters in a series, as is currently established practice for all other THA components. Thus when

  7. Straightening the Hierarchical Staircase for Basis Set Extrapolations: A Low-Cost Approach to High-Accuracy Computational Chemistry

    Science.gov (United States)

    Varandas, António J. C.

    2018-04-01

    Because the one-electron basis set limit is difficult to reach in correlated post-Hartree-Fock ab initio calculations, the low-cost route of using methods that extrapolate to the estimated basis set limit attracts immediate interest. The situation is somewhat more satisfactory at the Hartree-Fock level because numerical calculation of the energy is often affordable at nearly converged basis set levels. Still, extrapolation schemes for the Hartree-Fock energy are addressed here, although the focus is on the more slowly convergent and computationally demanding correlation energy. Because they are frequently based on the gold-standard coupled-cluster theory with single, double, and perturbative triple excitations [CCSD(T)], correlated calculations are often affordable only with the smallest basis sets, and hence single-level extrapolations from one raw energy could attain maximum usefulness. This possibility is examined. Whenever possible, this review uses raw data from second-order Møller-Plesset perturbation theory, as well as CCSD, CCSD(T), and multireference configuration interaction methods. Inescapably, the emphasis is on work done by the author's research group. Certain issues in need of further research or review are pinpointed.
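
    The flavour of such extrapolations can be shown with the simplest two-point scheme, which assumes the correlation energy converges as E(X) = E_CBS + A/X^3 with cardinal number X; the review itself discusses more refined variants, including single-level extrapolations from one raw energy. The energies below are invented.

        # Two-point complete-basis-set extrapolation of the correlation energy,
        # assuming the common E(X) = E_CBS + A / X**3 form (illustrative numbers).
        def cbs_two_point(e_x, x, e_y, y):
            """Extrapolate correlation energies obtained at cardinal numbers x < y."""
            return (y**3 * e_y - x**3 * e_x) / (y**3 - x**3)

        # Hypothetical CCSD(T)/cc-pVTZ and cc-pVQZ correlation energies (hartree):
        e_tz, e_qz = -0.27012, -0.27561
        print(f"E_CBS ~ {cbs_two_point(e_tz, 3, e_qz, 4):.5f} hartree")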

  8. COMPUTING

    CERN Multimedia

    M. Kasemann, P. McBride; edited by M-C. Sawley, with contributions from P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini and M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  9. The Internet--Flames, Firewalls and the Future. Proceedings for the 1995 Conference of the Council for Higher Education Computing Services (CHECS) (Roswell, New Mexico, November 8-10, 1995).

    Science.gov (United States)

    Suiter, Martha, Ed.

    This set of proceedings assembles papers presented at the 1995 Council for Higher Education Computing Services (CHECS) conference, held at the New Mexico Military Institute in Roswell, New Mexico. CHECS members are higher education computing services organizations within the state of New Mexico. The main focus of the conference was the Internet…

  10. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  11. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time, operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave a good indication of the readiness of the WLCG infrastructure, with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated that they are fully capable of performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  12. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  13. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  14. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  15. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  16. Computer simulation of energy use, greenhouse gas emissions, and costs for alternative methods of processing fluid milk.

    Science.gov (United States)

    Tomasula, P M; Datta, N; Yee, W C F; McAloon, A J; Nutter, D W; Sampedro, F; Bonnaillie, L M

    2014-07-01

    Computer simulation is a useful tool for benchmarking electrical and fuel energy consumption and water use in a fluid milk plant. In this study, a computer simulation model of the fluid milk process based on high temperature, short time (HTST) pasteurization was extended to include models for processes for shelf-stable milk and extended shelf-life milk that may help prevent the loss or waste of milk that leads to increases in the greenhouse gas (GHG) emissions for fluid milk. The models were for UHT processing, crossflow microfiltration (MF) without HTST pasteurization, crossflow MF followed by HTST pasteurization (MF/HTST), crossflow MF/HTST with partial homogenization, and pulsed electric field (PEF) processing, and were incorporated into the existing model for the fluid milk process. Simulation trials were conducted assuming a production rate for the plants of 113.6 million liters of milk per year to produce only whole milk (3.25%) and 40% cream. Results showed that GHG emissions in the form of process-related CO₂ emissions, defined as CO₂ equivalents (e)/kg of raw milk processed (RMP), and specific energy consumptions (SEC) for electricity and natural gas use for the HTST process alone were 37.6 g of CO₂e/kg of RMP, 0.14 MJ/kg of RMP, and 0.13 MJ/kg of RMP, respectively. Emissions of CO₂ and SEC for electricity and natural gas use were highest for the PEF process, with values of 99.1 g of CO₂e/kg of RMP, 0.44 MJ/kg of RMP, and 0.10 MJ/kg of RMP, respectively, and lowest for the UHT process at 31.4 g of CO₂e/kg of RMP, 0.10 MJ/kg of RMP, and 0.17 MJ/kg of RMP. Estimated unit production costs associated with the various processes were lowest for the HTST process and MF/HTST with partial homogenization at $0.507/L and highest for the UHT process at $0.60/L. The increase in shelf life associated with the UHT and MF processes may eliminate some of the supply chain product and consumer losses and waste of milk and compensate for the small increases in GHG
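
    To make the link between the reported specific energy consumptions and the GHG figures concrete, here is a back-of-envelope sketch; the emission factors are assumed placeholders rather than the factors used in the study, so the result only roughly approaches the reported HTST value.

    ```python
    # Assemble process CO2e from specific energy consumptions (SEC).
    # The emission factors below are illustrative assumptions, not the
    # ones used in the study.

    EF_ELEC = 0.18   # kg CO2e per MJ of electricity (assumed grid factor)
    EF_GAS  = 0.056  # kg CO2e per MJ of natural gas (assumed)

    def process_co2e(sec_elec_mj_per_kg, sec_gas_mj_per_kg):
        """Return g CO2e per kg of raw milk processed (RMP)."""
        kg_co2e = sec_elec_mj_per_kg * EF_ELEC + sec_gas_mj_per_kg * EF_GAS
        return 1000.0 * kg_co2e

    # HTST figures from the abstract: 0.14 MJ/kg electricity, 0.13 MJ/kg gas.
    print(f"{process_co2e(0.14, 0.13):.1f} g CO2e/kg RMP")
    ```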

  17. Computer-Aided Surgical Simulation in Head and Neck Reconstruction: A Cost Comparison among Traditional, In-House, and Commercial Options.

    Science.gov (United States)

    Li, Sean S; Copeland-Halperin, Libby R; Kaminsky, Alexander J; Li, Jihui; Lodhi, Fahad K; Miraliakbari, Reza

    2018-06-01

    Computer-aided surgical simulation (CASS) has redefined surgery, improved precision and reduced the reliance on intraoperative trial-and-error manipulations. CASS is provided by third-party services; however, it may be cost-effective for some hospitals to develop in-house programs. This study provides the first cost analysis comparison among traditional (no CASS), commercial CASS, and in-house CASS for head and neck reconstruction. The costs of three-dimensional (3D) pre-operative planning for mandibular and maxillary reconstructions were obtained from an in-house CASS program at our large tertiary care hospital in Northern Virginia, as well as from a commercial provider (Synthes, Paoli, PA). A cost comparison was performed among these modalities and extrapolated in-house CASS costs were derived. The calculations were based on estimated CASS use with cost structures similar to our institution, and sunk costs were amortized over 10 years. Average operating room time was estimated at 10 hours, with an average of 2 hours saved with CASS. The hourly cost to the hospital for the operating room (including anesthesia and other ancillary costs) was estimated at $4,614/hour. Per case, traditional cases were $46,140, commercial CASS cases were $40,951, and in-house CASS cases were $38,212. Annual in-house CASS costs were $39,590. CASS reduced operating room time, likely due to improved efficiency and accuracy. Our data demonstrate that hospitals with a cost structure similar to ours, performing more than 27 cases of 3D head and neck reconstruction per year, can see a financial benefit from developing an in-house CASS program.
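
    The break-even logic behind the 27-case threshold can be sketched as follows. The annual fixed cost is taken from the abstract, but the per-case saving of in-house over commercial CASS is an assumed placeholder (the study's internal fixed/variable cost split is not given here), chosen only to illustrate the arithmetic.

    ```python
    import math

    # In-house CASS pays off once its annual fixed cost is spread over
    # enough cases. SAVING_PER_CASE is an assumed illustrative value.

    ANNUAL_FIXED_COST = 39_590.0   # $/year for the in-house CASS program
    SAVING_PER_CASE = 1_470.0      # assumed $ saved per case vs. commercial

    def breakeven_cases(fixed_cost, saving_per_case):
        """Smallest annual case volume at which in-house CASS pays off."""
        return math.ceil(fixed_cost / saving_per_case)

    print(breakeven_cases(ANNUAL_FIXED_COST, SAVING_PER_CASE))  # -> 27
    ```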

  18. Is there any way out there? Environmental and cultural influences in computing least-cost paths with GIS

    Directory of Open Access Journals (Sweden)

    Fairén Jiménez, Sara

    2004-12-01

    One of the most interesting subjects in post-structural landscape studies is the analysis of the relationships between its natural and cultural components: the structuring of the landscape, with the identification of the social practices and patterns of movement that took place around them. These patterns depend both on the natural form of the terrain and on socio-cultural decisions. In relation to the settlement pattern and the distribution of rock art sites in the central Mediterranean coastal area of Spain, a method is proposed to evaluate the role of cultural aspects of the landscape in computing least-cost paths.

    Among recent studies in landscape archaeology, one of the aspects with the greatest interpretive potential is the analysis of the relationship between the landscape's different natural and cultural components: the articulation of the landscape, with the identification of the social practices carried out around these elements and of the patterns of movement between them. This movement would depend both on the natural characteristics of the terrain and on practical decisions of a socio-cultural character. Based on a study of the distribution of Neolithic settlement and rock art shelters in the south-central Valencian lands, a method is proposed for introducing and weighing the role of the cultural components of the landscape in the computation of least-cost paths using Geographic Information Systems.
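
    A minimal sketch of the least-cost-path idea described above: a Dijkstra search over a raster in which the cost of entering a cell blends a terrain term (e.g. slope) with a cultural term (e.g. proximity to rock art sites). The weighting w and the toy rasters are illustrative assumptions, not the authors' actual model.

    ```python
    import heapq

    def least_cost_path(terrain, cultural, start, goal, w=0.5):
        """Dijkstra over a grid; step cost mixes terrain and cultural layers."""
        rows, cols = len(terrain), len(terrain[0])
        dist = {start: 0.0}
        frontier = [(0.0, start)]
        while frontier:
            d, (r, c) = heapq.heappop(frontier)
            if (r, c) == goal:
                return d
            if d > dist.get((r, c), float("inf")):
                continue  # stale heap entry
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols:
                    step = (1 - w) * terrain[nr][nc] + w * cultural[nr][nc]
                    nd = d + step
                    if nd < dist.get((nr, nc), float("inf")):
                        dist[(nr, nc)] = nd
                        heapq.heappush(frontier, (nd, (nr, nc)))
        return float("inf")

    # Toy 3x3 rasters: terrain (slope) cost and a cultural "friction" layer.
    terrain = [[1, 4, 1], [1, 9, 1], [1, 1, 1]]
    cultural = [[1, 1, 1], [1, 5, 1], [1, 1, 1]]
    print(least_cost_path(terrain, cultural, (0, 0), (2, 2)))
    ```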

  19. [Changing the internal cost allocation (ICA) on DRG shares : Example of computed tomography in a university radiology setting].

    Science.gov (United States)

    Wirth, K; Zielinski, P; Trinter, T; Stahl, R; Mück, F; Reiser, M; Wirth, S

    2016-08-01

    In hospitals, the radiological services provided to non-privately insured in-house patients are mostly distributed to the requesting disciplines through internal cost allocation (ICA). In many institutions, computed tomography (CT) is the modality with the largest amount of allocation credits. The aim of this work was to compare the ICA with the respective DRG (Diagnosis Related Groups) shares for diagnostic CT services in a university hospital setting. The data from four CT scanners in a large university hospital were processed for the 2012 fiscal year. For each of the 50 DRG groups with the most case-mix points, all diagnostic CT services were documented, including their respective amount of GOÄ allocation credits and the invoiced ICA value. As the database of the German Institute for Reimbursement of Hospitals (InEK) groups the radiation disciplines (radiology, nuclear medicine and radiation therapy) together and lacks any differentiation by modality, the diagnostic CT component was determined from the existing institutional distribution of ICA allocations. Within the 24,854 cases included, 63,062,060 GOÄ-based performance credits were counted. The ICA credited these diagnostic CT services with €819,029 (single credit value of 1.30 Eurocent), whereas accounting by DRG shares would have yielded €1,127,591 (single credit value of 1.79 Eurocent). The GOÄ single credit value is 5.62 Eurocent. Diagnostic CT services were thus reimbursed at a comparatively low rate. In addition to a better financial result, changing the current ICA to DRG shares might also offer a chance of real revenues. However, the attractiveness of this step depends considerably on how the DRG shares are distributed among the different radiation disciplines of an institution.

  20. On the Account-Table Combined Method for Education Cost Calculation in Higher Vocational Colleges

    Institute of Scientific and Technical Information of China (English)

    韩征

    2012-01-01

    With the increase in the various education expenses of higher vocational colleges, education cost has gradually attracted attention. In order to calculate the education cost of higher vocational colleges scientifically, this paper analyses why the current accounting method cannot objectively reflect education cost, and clarifies the basic accounting postulates and principles of education cost calculation. Based on the accounting of the budget items of higher vocational colleges, and taking into account their financial situation and school-running characteristics, the paper puts forward an account-table combined method for calculating their education cost. The method adopts a model that combines item accounts with a cost allocation table, and can calculate the education cost of higher vocational colleges scientifically.

  1. Cost-effectiveness of computer-assisted training in cognitive-behavioral therapy as an adjunct to standard care for addiction.

    Science.gov (United States)

    Olmstead, Todd A; Ostrow, Cary D; Carroll, Kathleen M

    2010-08-01

    To determine the cost-effectiveness, from clinic and patient perspectives, of a computer-based version of cognitive-behavioral therapy (CBT4CBT) as an addition to regular clinical practice for substance dependence. PARTICIPANTS, DESIGN AND MEASUREMENTS: This cost-effectiveness study is based on a randomized clinical trial in which 77 individuals seeking treatment for substance dependence at an outpatient community setting were randomly assigned to treatment as usual (TAU) or TAU plus biweekly access to computer-based training in CBT (TAU plus CBT4CBT). The primary patient outcome measure was the total number of drug-free specimens provided during treatment. Incremental cost-effectiveness ratios (ICERs) and cost-effectiveness acceptability curves (CEACs) were used to determine the cost-effectiveness of TAU plus CBT4CBT relative to TAU alone. Results are presented from both the clinic and patient perspectives and are shown to be robust to (i) sensitivity analyses and (ii) a secondary objective patient outcome measure. The per patient cost of adding CBT4CBT to standard care was $39 ($27) from the clinic (patient) perspective. From the clinic (patient) perspective, TAU plus CBT4CBT is likely to be cost-effective when the threshold value to decision makers of an additional drug-free specimen is greater than approximately $21 ($15), and TAU alone is likely to be cost-effective when the threshold value is less than approximately $21 ($15). The ICERs for TAU plus CBT4CBT also compare favorably to ICERs reported elsewhere for other empirically validated therapies, including contingency management. TAU plus CBT4CBT appears to be a good value from both the clinic and patient perspectives.
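
    The core arithmetic behind an incremental cost-effectiveness ratio is compact enough to sketch. Only the $39 incremental clinic cost and the roughly $21 threshold come from the abstract; the absolute costs and specimen counts below are assumptions for illustration.

    ```python
    # ICER: extra cost per extra unit of outcome (here, per additional
    # drug-free specimen).

    def icer(cost_new, eff_new, cost_old, eff_old):
        return (cost_new - cost_old) / (eff_new - eff_old)

    # TAU plus CBT4CBT adds $39 per patient and, say, 1.9 extra drug-free
    # specimens (assumed), so the ICER lands near the reported ~$21 value.
    print(icer(cost_new=439.0, eff_new=7.1, cost_old=400.0, eff_old=5.2))
    ```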

  2. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  3. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  4. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort during the last period was focused on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  5. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  6. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operations have been running at a lower level as the Run 1 samples are completed and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013, with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams, with frequent deletion campaigns moving deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  7. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  8. A time and imaging cost analysis of low-risk ED observation patients: a conservative 64-section computed tomography coronary angiography "triple rule-out" compared to nuclear stress test strategy.

    Science.gov (United States)

    Takakuwa, Kevin M; Halpern, Ethan J; Shofer, Frances S

    2011-02-01

    The study aimed to examine time and imaging costs of 2 different imaging strategies for low-risk emergency department (ED) observation patients with acute chest pain or symptoms suggestive of acute coronary syndrome. We compared a "triple rule-out" (TRO) 64-section multidetector computed tomography protocol with nuclear stress testing. This was a prospective observational cohort study of consecutive ED patients who were enrolled in our chest pain observation protocol during a 16-month period. Our standard observation protocol included a minimum of 2 sets of cardiac enzymes at least 6 hours apart followed by a nuclear stress test. Once a week, observation patients were offered a TRO (to evaluate for coronary artery disease, thoracic dissection, and pulmonary embolus) multidetector computed tomography with the option of further stress testing for those patients found to have evidence of coronary artery disease. We analyzed 832 consecutive observation patients including 214 patients who underwent the TRO protocol. Mean total length of stay was 16.1 hours for TRO patients, 16.3 hours for TRO plus other imaging test, 22.6 hours for nuclear stress testing, 23.3 hours for nuclear stress testing plus other imaging tests, and 23.7 hours for nuclear stress testing plus TRO (P < .0001 for TRO and TRO + other test compared to stress test ± other test). Mean imaging times were 3.6, 4.4, 5.9, 7.5, and 6.6 hours, respectively (P < .05 for TRO and TRO + other test compared to stress test ± other test). Mean imaging costs were $1307 for TRO patients vs $945 for nuclear stress testing. Triple rule-out reduced total length of stay and imaging time but incurred higher imaging costs. A per-hospital analysis would be needed to determine if patient time savings justify the higher imaging costs.

  9. Suspected acute pulmonary emboli: cost-effectiveness of chest helical computed tomography versus a standard diagnostic algorithm incorporating ventilation-perfusion scintigraphy

    International Nuclear Information System (INIS)

    Larcos, G.; Chi, K.K.G.; Berry, G.; Westmead Hospital, Sydney, NSW; Shiell, A.

    2000-01-01

    There is controversy regarding the investigation of patients with suspected acute pulmonary embolism (PE). To compare the cost-effectiveness of alternative methods of diagnosing acute PE, chest helical computed tomography (CT), alone and in combination with venous ultrasound (US) of the legs and pulmonary angiography (PA), was compared to a conventional algorithm using ventilation-perfusion (V/Q) scintigraphy supplemented in selected cases by US and PA. A decision-analytical model was constructed to model the costs and effects of the three diagnostic strategies in a hypothetical cohort of 1000 patients each. Transition probabilities were based on published data. Life years gained by each strategy were estimated from published mortality rates. Schedule fees were used to estimate costs. The V/Q protocol is both more expensive and more effective than CT alone, resulting in 20.1 additional lives saved at a (discounted) cost of $940 per life year gained. An additional 2.5 lives can be saved if CT replaces V/Q scintigraphy in the diagnostic algorithm, but at a cost of $23,905 per life year saved. The more effective diagnostic strategies are thus also more expensive. In patients with suspected PE, the incremental cost-effectiveness of the V/Q-based strategy over CT alone is reasonable in comparison with other health interventions. The cost-effectiveness of the supplemented CT strategy is more questionable.

  10. Positron emission tomography/computed tomography surveillance in patients with Hodgkin lymphoma in first remission has a low positive predictive value and high costs.

    Science.gov (United States)

    El-Galaly, Tarec Christoffer; Mylam, Karen Juul; Brown, Peter; Specht, Lena; Christiansen, Ilse; Munksgaard, Lars; Johnsen, Hans Erik; Loft, Annika; Bukh, Anne; Iyer, Victor; Nielsen, Anne Lerberg; Hutchings, Martin

    2012-06-01

    The value of performing post-therapy routine surveillance imaging in patients with Hodgkin lymphoma is controversial. This study evaluates the utility of positron emission tomography/computed tomography using 2-[18F]fluoro-2-deoxyglucose for this purpose and in situations with suspected lymphoma relapse. We conducted a multicenter retrospective study. Patients with newly diagnosed Hodgkin lymphoma achieving at least a partial remission on first-line therapy were eligible if they received positron emission tomography/computed tomography surveillance during follow-up. Two types of imaging surveillance were analyzed: "routine" when patients showed no signs of relapse at referral to positron emission tomography/computed tomography, and "clinically indicated" when recurrence was suspected. A total of 211 routine and 88 clinically indicated positron emission tomography/computed tomography studies were performed in 161 patients. In ten of 22 patients with recurrence of Hodgkin lymphoma, routine imaging surveillance was the primary tool for the diagnosis of the relapse. Extranodal disease, interim positron emission tomography-positive lesions and positron emission tomography activity at response evaluation were all associated with a positron emission tomography/computed tomography-diagnosed preclinical relapse. The true positive rates of routine and clinically indicated imaging were 5% and 13%, respectively (P = 0.02). The overall positive predictive value and negative predictive value of positron emission tomography/computed tomography were 28% and 100%, respectively. The estimated cost per routine imaging diagnosed relapse was US$ 50,778. Negative positron emission tomography/computed tomography reliably rules out a relapse. The high false positive rate is, however, an important limitation and a confirmatory biopsy is mandatory for the diagnosis of a relapse. With no proven survival benefit for patients with a pre-clinically diagnosed relapse, the high costs and low
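
    The screening arithmetic behind these figures can be sketched as follows; the confusion-matrix counts and the per-scan cost are assumptions chosen to approximate the reported predictive values and cost per detected relapse.

    ```python
    def ppv(tp, fp):
        """Positive predictive value."""
        return tp / (tp + fp)

    def npv(tn, fn):
        """Negative predictive value."""
        return tn / (tn + fn)

    # Assumed counts, chosen to mirror the reported overall PPV of 28% and
    # NPV of 100% (the abstract does not give the full confusion matrix):
    print(ppv(tp=10, fp=26))    # ~0.28
    print(npv(tn=263, fn=0))    # 1.0: a negative scan rules out relapse

    # Cost per relapse detected by routine surveillance, with an assumed
    # per-scan cost (the abstract reports US$ 50,778 per relapse found):
    scans, relapses, cost_per_scan = 211, 10, 2_400.0
    print(scans * cost_per_scan / relapses)   # ~US$ 50,600
    ```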

  11. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, are provided. The GlideInWMS and components installation are now deployed at CERN, which is added to the GlideInWMS factory placed in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each others time zones by monitoring/debugging pilot jobs sent from the facto...

  12. Determination of the secondary energy from the electron beam with a flattening foil by computer. Percentage depth dose curve fitting using the specific higher order polynomial

    Energy Technology Data Exchange (ETDEWEB)

    Kawakami, H [Kyushu Univ., Beppu, Oita (Japan). Inst. of Balneotherapeutics

    1980-09-01

    A computer program written in FORTRAN is described for determining the secondary energy of an electron beam that has passed through a flattening foil, using a time-sharing computer service. The program first fits a specific higher-order polynomial to the measured percentage depth dose curve. Next, the practical range is evaluated as the point of intersection R of the line tangent to the fitted curve at the inflection point P with the given dose level E, as shown in Fig. 2. Finally, the secondary energy corresponding to the determined practical range is obtained from the experimental equation (2.1) relating the practical range R (g/cm²) to the electron energy T (MeV). A graph of the fitted polynomial with the inflection points and the practical range can be plotted on a teletype machine at the user's request. In order to estimate the shapes of percentage depth dose curves corresponding to electron beams of different energies, we tried to find specific functional relationships between each coefficient of the fitted seventh-degree equation and the incident electron energies. However, exact relationships could not be obtained, owing to irregularity among these coefficients.
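
    A rough modern re-creation of the described procedure, using NumPy in place of the original FORTRAN and a synthetic sigmoid in place of measured data; the residual dose level E and all numbers are assumptions for illustration.

    ```python
    import numpy as np

    # Fit a 7th-degree polynomial to a percentage depth dose curve, find the
    # inflection point P on the falling edge, and intersect the tangent at P
    # with an assumed dose level E to estimate the practical range R.

    depth = np.linspace(0.0, 6.0, 25)                   # g/cm^2
    dose = 100.0 / (1.0 + np.exp(3.0 * (depth - 3.5)))  # toy sigmoid "PDD"

    p = np.polyfit(depth, dose, 7)
    d1 = np.polyder(p)

    # The inflection point is where the descent is steepest (min of dD/dz).
    xx = np.linspace(depth[0], depth[-1], 1001)
    x_p = xx[np.argmin(np.polyval(d1, xx))]

    slope = np.polyval(d1, x_p)
    y_p = np.polyval(p, x_p)
    E = 2.0                                             # assumed dose level, %
    R = x_p + (E - y_p) / slope                         # tangent meets level E
    print(f"practical range R ~ {R:.2f} g/cm^2")
    ```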

  13. Development of a Computer Program (CASK) for the Analysis of Logistics and Transportation Cost of the Spent Fuels

    International Nuclear Information System (INIS)

    Cha, Jeong-Hun; Choi, Heui-Joo; Cho, Dong-Keun; Kim, Seong-Ki; Lee, Jong-Youl; Choi, Jong-Won

    2008-07-01

    The cost of spent fuel management includes the costs of interim storage, transportation, and permanent disposal of the spent fuel. The CASK (Cost and logistics Analysis program for Spent fuel transportation in Korea) program was developed to analyze the logistics and transportation cost of spent fuel. The program considers the total amount of PWR spent fuel stored at four nuclear plant sites, a centralized interim storage facility near the coast, and a permanent disposal facility near the interim storage facility. CASK is written in Visual Basic and coupled with an Excel sheet, which shows changes in logistics and transportation cost; the transportation unit cost is easily changed in the Excel sheet. This report explains the parameters of the CASK program and presents a preliminary calculation. We have developed CASK version 1.0 so far and will update its transportation-cost parameters and transportation scenarios. We will also incorporate it into the program used for projecting spent fuel arisings from the nuclear power plants. Finally, the CASK program is expected to become part of the cost estimation tools under development at KAERI and to be a very useful tool for establishing transportation scenarios and transportation costs under Korean conditions.
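
    A toy sketch of the kind of logistics arithmetic a tool like CASK automates: the number of cask shipments from the fuel inventory and cask capacity, then a fixed-plus-distance cost per trip. All numbers are illustrative placeholders, not Korean programme data.

    ```python
    import math

    def transport_cost(assemblies, cask_capacity, km, fixed_per_trip, per_km):
        """Total shipping cost: one fixed charge plus a per-km rate per trip."""
        trips = math.ceil(assemblies / cask_capacity)
        return trips * (fixed_per_trip + per_km * km)

    # e.g. 1,200 PWR assemblies, 21 per cask, 300 km to the interim store:
    print(transport_cost(1200, 21, 300, fixed_per_trip=60_000.0, per_km=150.0))
    ```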

  14. A Proposed Model for Improving Performance and Reducing Costs of IT Through Cloud Computing of Egyptian Business Enterprises

    OpenAIRE

    Mohamed M.El Hadi; Azza Monir Ismail

    2016-01-01

    Information technologies affect today's big business enterprises, from data processing and transactions to achieving their goals efficiently and effectively; they create new business opportunities and new competitive advantages, and services must be sufficient to match recent IT trends such as cloud computing. Cloud computing technology has provided all IT services. Therefore, cloud computing offers an adaptable alternative to the current technology model, creating reduci...

  15. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase the availability of more sites, such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4-fold increase in throughput with respect to the LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and keep exercised all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...

  16. Costs and clinical outcomes in individuals without known coronary artery disease undergoing coronary computed tomographic angiography from an analysis of Medicare category III transaction codes.

    Science.gov (United States)

    Min, James K; Shaw, Leslee J; Berman, Daniel S; Gilmore, Amanda; Kang, Ning

    2008-09-15

    Multidetector coronary computed tomographic angiography (CCTA) demonstrates high accuracy for the detection and exclusion of coronary artery disease (CAD) and predicts adverse prognosis. To date, opportunity costs relating the clinical and economic outcomes of CCTA compared with other methods of diagnosing CAD, such as myocardial perfusion single-photon emission computed tomography (SPECT), remain unknown. An observational, multicenter, patient-level analysis of patients without known CAD who underwent CCTA or SPECT was performed. Patients who underwent CCTA (n = 1,938) were matched to those who underwent SPECT (n = 7,752) on 8 demographic and clinical characteristics and 2 summary measures of cardiac medications and co-morbidities and were evaluated for 9-month expenditures and clinical outcomes. Adjusted total health care and CAD expenditures were 27% (p cost-efficient alternative to SPECT for the initial coronary evaluation of patients without known CAD.

  17. An Analysis of the RCA Price-S Cost Estimation Model as it Relates to Current Air Force Computer Software Acquisition and Management.

    Science.gov (United States)

    1979-12-01

    because of the use of complex computational algorithms (Ref 25). Another important factor affecting the cost of software is the size of the development... involved the alignment and navigational algorithm portions of the software. The second avionics system application was the development of an inertial...

  18. Smoking is Associated with Higher Disease-related Costs and Lower Health-related Quality of Life in Inflammatory Bowel Disease

    NARCIS (Netherlands)

    Severs, M.; Mangen, M.J.; Valk, M.E. van der; Fidder, H.H.; Dijkstra, G.; Have, M. van der; Bodegraven, A.A. van; Jong, D.J. de; Woude, C.J. van der; Romberg-Camps, M.J.; Clemens, C.H.; Jansen, J.M.; Meeberg, P.C. van de; Mahmmod, N.; Ponsioen, C.Y.; Vermeijden, J.R.; Jong, A E F de; Pierik, M.; Siersema, P.D.; Oldenburg, B.

    2017-01-01

    Background and Aims: Smoking affects the course of inflammatory bowel disease [IBD]. We aimed to study the impact of smoking on IBD-specific costs and health-related quality-of-life [HrQoL] among adults with Crohn's disease [CD] and ulcerative colitis [UC]. Methods: A large cohort of IBD patients

  19. ABC's of Higher Education. Getting Back to the Basics: An Activity-Based Costing Approach to Planning and Financial Decision Making. AIR 1999 Annual Forum Paper.

    Science.gov (United States)

    Cox, Kelline S.; Downey, Ronald G.; Smith, Laurinda G.

    This paper describes the activity-based costing approach used to report and capture the time spent by faculty for specified activities at one Midwestern university. For each department, four major areas (instruction, research, public service, and administration) and 14 activities were identified. During the annual goal-setting period, each faculty…

  20. Computer simulation to predict energy use, greenhouse gas emissions and costs for production of fluid milk using alternative processing methods

    Science.gov (United States)

    Computer simulation is a useful tool for benchmarking the electrical and fuel energy consumption and water use in a fluid milk plant. In this study, a computer simulation model of the fluid milk process based on high temperature short time (HTST) pasteurization was extended to include models for pr...

  1. Comparison between low-cost marker-less and high-end marker-based motion capture systems for the computer-aided assessment of working ergonomics.

    Science.gov (United States)

    Patrizi, Alfredo; Pennestrì, Ettore; Valentini, Pier Paolo

    2016-01-01

    The paper deals with the comparison between a high-end marker-based acquisition system and a low-cost marker-less methodology for the assessment of the human posture during working tasks. The low-cost methodology is based on the use of a single Microsoft Kinect V1 device. The high-end acquisition system is the BTS SMART that requires the use of reflective markers to be placed on the subject's body. Three practical working activities involving object lifting and displacement have been investigated. The operational risk has been evaluated according to the lifting equation proposed by the American National Institute for Occupational Safety and Health. The results of the study show that the risk multipliers computed from the two acquisition methodologies are very close for all the analysed activities. In agreement to this outcome, the marker-less methodology based on the Microsoft Kinect V1 device seems very promising to promote the dissemination of computer-aided assessment of ergonomics while maintaining good accuracy and affordable costs. PRACTITIONER’S SUMMARY: The study is motivated by the increasing interest for on-site working ergonomics assessment. We compared a low-cost marker-less methodology with a high-end marker-based system. We tested them on three different working tasks, assessing the working risk of lifting loads. The two methodologies showed comparable precision in all the investigations.
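
    The risk score referred to above is the lifting index of the revised NIOSH lifting equation; a compact sketch follows. The multiplier formulas are the standard metric ones, but the frequency and coupling multipliers are simplified to constants here, and the example inputs are assumed.

    ```python
    # Revised NIOSH lifting equation (metric form), simplified: FM and CM
    # are normally read from tables but are fixed to 1.0 in this sketch.

    def recommended_weight_limit(H, V, D, A, FM=1.0, CM=1.0):
        """RWL in kg. H, V, D in cm; A (asymmetry angle) in degrees."""
        LC = 23.0                       # load constant, kg
        HM = min(1.0, 25.0 / H)         # horizontal multiplier
        VM = 1.0 - 0.003 * abs(V - 75)  # vertical multiplier
        DM = 0.82 + 4.5 / D             # distance multiplier
        AM = 1.0 - 0.0032 * A           # asymmetry multiplier
        return LC * HM * VM * DM * AM * FM * CM

    load = 12.0                         # kg actually lifted (assumed)
    rwl = recommended_weight_limit(H=40, V=60, D=50, A=30)
    print(rwl, load / rwl)              # lifting index > 1 flags elevated risk
    ```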

  2. Product Costing in FMT: Comparing Deterministic and Stochastic Models Using Computer-Based Simulation for an Actual Case Study

    DEFF Research Database (Denmark)

    Nielsen, Steen

    2000-01-01

    This paper expands the traditional product costing technique by including a stochastic form in a complex production process for product costing. The stochastic phenomenon in flexible manufacturing technologies is seen as an important phenomenon that companies try to decrease or eliminate. DFM has been used for evaluating the appropriateness of the firm's production capability. In this paper a simulation model is developed to analyze the relevant cost behaviour with respect to DFM and to develop a more streamlined process in the layout of the manufacturing process.

  3. Computational electrodynamics in material media with constraint-preservation, multidimensional Riemann solvers and sub-cell resolution - Part II, higher order FVTD schemes

    Science.gov (United States)

    Balsara, Dinshaw S.; Garain, Sudip; Taflove, Allen; Montecinos, Gino

    2018-02-01

    The Finite Difference Time Domain (FDTD) scheme has served the computational electrodynamics community very well and part of its success stems from its ability to satisfy the constraints in Maxwell's equations. Even so, in the previous paper of this series we were able to present a second order accurate Godunov scheme for computational electrodynamics (CED) which satisfied all the same constraints and simultaneously retained all the traditional advantages of Godunov schemes. In this paper we extend the Finite Volume Time Domain (FVTD) schemes for CED in material media to better than second order of accuracy. From the FDTD method, we retain a somewhat modified staggering strategy of primal variables which enables a very beneficial constraint-preservation for the electric displacement and magnetic induction vector fields. This is accomplished with constraint-preserving reconstruction methods which are extended in this paper to third and fourth orders of accuracy. The idea of one-dimensional upwinding from Godunov schemes has to be significantly modified to use the multidimensionally upwinded Riemann solvers developed by the first author. In this paper, we show how they can be used within the context of a higher order scheme for CED. We also report on advances in timestepping. We show how Runge-Kutta IMEX schemes can be adapted to CED even in the presence of stiff source terms brought on by large conductivities as well as strong spatial variations in permittivity and permeability. We also formulate very efficient ADER timestepping strategies to endow our method with sub-cell resolving capabilities. As a result, our method can be stiffly-stable and resolve significant sub-cell variation in the material properties within a zone. Moreover, we present ADER schemes that are applicable to all hyperbolic PDEs with stiff source terms and at all orders of accuracy. Our new ADER formulation offers a treatment of stiff source terms that is much more efficient than previous ADER

  4. Bringing Computers into College and University Teaching. Papers Presented at a Symposium Held under the Auspices of the Higher Education Research and Development Society of Australasia (Canberra, Australia, November 19, 1980).

    Science.gov (United States)

    Miller, Allen H., Ed.; Ogilvie, John F., Ed.

    The use of computers in higher education teaching programs is discussed in 16 papers and reports. Applications of computers in teaching particular subjects including prehistory and anthropology, mathematics, Hindi, plant science, chemistry, language, medicine, drawing, statistics, and engineering are discussed in 10 of the contributions. The other…

  5. When Action-Inaction Framing Leads to Higher Escalation of Commitment: A New Inaction-Effect Perspective on the Sunk-Cost Fallacy.

    Science.gov (United States)

    Feldman, Gilad; Wong, Kin Fai Ellick

    2018-04-01

    Escalation of commitment to a failing course of action occurs in the presence of (a) sunk costs, (b) negative feedback that things are deviating from expectations, and (c) a decision between escalation and de-escalation. Most of the literature to date has focused on sunk costs, yet we offer a new perspective on the classic escalation-of-commitment phenomenon by focusing on the impact of negative feedback. On the basis of the inaction-effect bias, we theorized that negative feedback results in the tendency to take action, regardless of what that action may be. In four experiments, we demonstrated that people facing escalation-decision situations were indeed action oriented and that framing escalation as action and de-escalation as inaction resulted in a stronger tendency to escalate than framing de-escalation as action and escalation as inaction (mini-meta-analysis effect d = 0.37, 95% confidence interval = [0.21, 0.53]).

  6. Doing Very Big Calculations on Modest Size Computers: Reducing the Cost of Exact Diagonalization Using Singular Value Decomposition

    International Nuclear Information System (INIS)

    Weinstein, M.

    2012-01-01

    I will talk about a new way of implementing Lanczos and contraction algorithms to diagonalize lattice Hamiltonians that dramatically reduces the memory required to do the computation, without restricting to variational ansatzes. (author)
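
    For orientation, here is a plain Lanczos sketch, without the SVD-based memory reduction the talk describes: it builds a small tridiagonal matrix whose lowest eigenvalue approximates the ground-state energy, here of a random symmetric stand-in for a lattice Hamiltonian.

    ```python
    import numpy as np

    def lanczos_ground_state(H, v0, m=30):
        """Approximate the lowest eigenvalue of symmetric H via m Lanczos steps."""
        n = len(v0)
        alphas, betas = [], []
        v_prev = np.zeros(n)
        v = v0 / np.linalg.norm(v0)
        beta = 0.0
        for _ in range(m):
            w = H @ v - beta * v_prev      # apply H, remove previous direction
            alpha = v @ w
            w -= alpha * v
            beta = np.linalg.norm(w)
            alphas.append(alpha)
            betas.append(beta)
            if beta < 1e-12:               # invariant subspace found
                break
            v_prev, v = v, w / beta
        T = np.diag(alphas) + np.diag(betas[:-1], 1) + np.diag(betas[:-1], -1)
        return np.linalg.eigvalsh(T)[0]

    rng = np.random.default_rng(0)
    H = rng.standard_normal((200, 200))
    H = (H + H.T) / 2                      # toy symmetric "Hamiltonian"
    print(lanczos_ground_state(H, rng.standard_normal(200)))
    print(np.linalg.eigvalsh(H)[0])        # exact value, for comparison
    ```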

  7. Females With a Mutation in a Nuclear-Encoded Mitochondrial Protein Pay a Higher Cost of Survival Than Do Males in Drosophila

    OpenAIRE

    Melvin, Richard G.; Ballard, J. William O.

    2011-01-01

    Males and females age at different rates in a variety of species, but the mechanisms underlying the difference are not understood. In this study, we investigated sex-specific costs of a naturally occurring, mildly deleterious deletion (ΔTrp85, ΔVal86) in cytochrome c oxidase subunit 7A (cox7A) in Drosophila simulans. We observed that females and males homozygous for the mutation had 30% and 26% reduced Cox activity, respectively, compared with wild type. Furthermore, 4-day-old females had 34%–4...

  8. Can a Costly Intervention Be Cost-effective?

    Science.gov (United States)

    Foster, E. Michael; Jones, Damon

    2009-01-01

    Objectives: To examine the cost-effectiveness of the Fast Track intervention, a multi-year, multi-component intervention designed to reduce violence among at-risk children. A previous report documented the favorable effect of the intervention on the highest-risk group of ninth-graders diagnosed with conduct disorder, as well as on self-reported delinquency. The current report addresses the cost-effectiveness of the intervention for these measures of program impact. Design: Costs of the intervention were estimated using program budgets. Incremental cost-effectiveness ratios were computed to determine the cost per unit of improvement in the 3 outcomes measured in the 10th year of the study. Results: Examination of the total sample showed that the intervention was not cost-effective at likely levels of policymakers' willingness to pay for the key outcomes. Subsequent analysis of those most at risk, however, showed that the intervention likely was cost-effective given specified willingness-to-pay criteria. Conclusions: Results indicate that the intervention is cost-effective for the children at highest risk. From a policy standpoint, this finding is encouraging because such children are likely to generate higher costs for society over their lifetimes. However, substantial barriers to cost-effectiveness remain, such as the ability to effectively identify and recruit such higher-risk children in future implementations. PMID:17088509

  9. Can broader diffusion of value-based insurance design increase benefits from US health care without increasing costs? Evidence from a computer simulation model.

    Directory of Open Access Journals (Sweden)

    R Scott Braithwaite

    2010-02-01

    BACKGROUND: Evidence suggests that cost sharing (i.e., copayments and deductibles) decreases health expenditures but also reduces essential care. Value-based insurance design (VBID) has been proposed to encourage essential care while controlling health expenditures. Our objective was to estimate the impact of broader diffusion of VBID on US health care benefits and costs. METHODS AND FINDINGS: We used a published computer simulation of costs and life expectancy gains from US health care to estimate the impact of broader diffusion of VBID. Two scenarios were analyzed: (1) applying VBID solely to pharmacy benefits and (2) applying VBID to both pharmacy benefits and other health care services (e.g., devices). We assumed that cost sharing would be eliminated for high-value services (<$100,000 per life-year) and increased for low-value services (>$300,000 per life-year). All costs are provided in 2003 US dollars. Our simulation estimated that approximately 60% of health expenditures in the US are spent on low-value services, 20% are spent on intermediate-value services, and 20% are spent on high-value services. Correspondingly, the vast majority (80%) of health expenditures would have cost sharing that is impacted by VBID. With prevailing patterns of cost sharing, health care conferred 4.70 life-years at a per-capita annual expenditure of US$5,688. Broader diffusion of VBID to pharmaceuticals increased the benefit conferred by health care by 0.03 to 0.05 additional life-years, without increasing costs and without increasing out-of-pocket payments. Broader diffusion of VBID to other health care services could increase the benefit conferred by health care by 0.24 to 0.44 additional life-years, also without increasing costs and without increasing overall out-of-pocket payments. Among those without health insurance, using cost savings from VBID to subsidize insurance coverage would increase the benefit conferred by health care by 1.21 life-years, a 31% increase. CONCLUSION: Broader diffusion of VBID may amplify benefits from

  10. Can broader diffusion of value-based insurance design increase benefits from US health care without increasing costs? Evidence from a computer simulation model.

    Science.gov (United States)

    Braithwaite, R Scott; Omokaro, Cynthia; Justice, Amy C; Nucifora, Kimberly; Roberts, Mark S

    2010-02-16

    Evidence suggests that cost sharing (i.e., copayments and deductibles) decreases health expenditures but also reduces essential care. Value-based insurance design (VBID) has been proposed to encourage essential care while controlling health expenditures. Our objective was to estimate the impact of broader diffusion of VBID on US health care benefits and costs. We used a published computer simulation of costs and life expectancy gains from US health care to estimate the impact of broader diffusion of VBID. Two scenarios were analyzed: (1) applying VBID solely to pharmacy benefits and (2) applying VBID to both pharmacy benefits and other health care services (e.g., devices). We assumed that cost sharing would be eliminated for high-value services (<$100,000 per life-year), would remain unchanged for intermediate-value services ($100,000-$300,000 per life-year or unknown), and would be increased for low-value services (>$300,000 per life-year). All costs are provided in 2003 US dollars. Our simulation estimated that approximately 60% of health expenditures in the US are spent on low-value services, 20% are spent on intermediate-value services, and 20% are spent on high-value services. Correspondingly, the vast majority (80%) of health expenditures would have cost sharing that is impacted by VBID. With prevailing patterns of cost sharing, health care conferred 4.70 life-years at a per-capita annual expenditure of US$5,688. Broader diffusion of VBID to pharmaceuticals increased the benefit conferred by health care by 0.03 to 0.05 additional life-years, without increasing costs and without increasing out-of-pocket payments. Broader diffusion of VBID to other health care services could increase the benefit conferred by health care by 0.24 to 0.44 additional life-years, also without increasing costs and without increasing overall out-of-pocket payments. Among those without health insurance, using cost savings from VBID to subsidize insurance coverage would increase the benefit conferred by health care by 1.21 life-years, a 31% increase
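
    The tiering rule at the heart of the simulated policy can be stated in a few lines; the dollar thresholds below come from the abstract, while the action attached to each tier is paraphrased.

    ```python
    # Classify a service by its incremental cost per life-year and return
    # the VBID cost-sharing action for that tier.

    def vbid_cost_sharing(cost_per_life_year):
        if cost_per_life_year is None:          # value unknown
            return "keep current cost sharing"
        if cost_per_life_year < 100_000:        # high-value service
            return "eliminate cost sharing"
        if cost_per_life_year <= 300_000:       # intermediate value
            return "keep current cost sharing"
        return "increase cost sharing"          # low-value service

    for c in (25_000, 150_000, 500_000, None):
        print(c, "->", vbid_cost_sharing(c))
    ```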

  11. The truncated conjugate gradient (TCG), a non-iterative/fixed-cost strategy for computing polarization in molecular dynamics: Fast evaluation of analytical forces

    Science.gov (United States)

    Aviat, Félix; Lagardère, Louis; Piquemal, Jean-Philip

    2017-10-01

    In a recent paper [F. Aviat et al., J. Chem. Theory Comput. 13, 180-190 (2017)], we proposed the Truncated Conjugate Gradient (TCG) approach to compute the polarization energy and forces in polarizable molecular simulations. The method consists in truncating the conjugate gradient algorithm at a fixed predetermined order leading to a fixed computational cost and can thus be considered "non-iterative." This gives the possibility to derive analytical forces avoiding the usual energy conservation (i.e., drifts) issues occurring with iterative approaches. A key point concerns the evaluation of the analytical gradients, which is more complex than that with a usual solver. In this paper, after reviewing the present state of the art of polarization solvers, we detail a viable strategy for the efficient implementation of the TCG calculation. The complete cost of the approach is then measured as it is tested using a multi-time step scheme and compared to timings using usual iterative approaches. We show that the TCG methods are more efficient than traditional techniques, making it a method of choice for future long molecular dynamics simulations using polarizable force fields where energy conservation matters. We detail the various steps required for the implementation of the complete method by software developers.
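
    The essential idea, stopping the conjugate gradient after a fixed, predetermined number of iterations so that every dynamics step has the same cost, can be sketched as below. The matrix T here is a generic symmetric positive-definite stand-in for the polarization interaction matrix, not the authors' actual implementation.

    ```python
    import numpy as np

    def truncated_cg(T, E, k=2):
        """Solve T mu = E with exactly k conjugate-gradient iterations,
        giving a fixed ("non-iterative") computational cost per call."""
        mu = np.zeros_like(E)        # zero initial guess
        r = E - T @ mu
        p = r.copy()
        for _ in range(k):           # fixed, predetermined truncation order
            Tp = T @ p
            alpha = (r @ r) / (p @ Tp)
            mu += alpha * p
            r_new = r - alpha * Tp
            beta = (r_new @ r_new) / (r @ r)
            p = r_new + beta * p
            r = r_new
        return mu

    rng = np.random.default_rng(1)
    A = rng.standard_normal((50, 50))
    T = A @ A.T + 50 * np.eye(50)    # toy SPD "polarization" matrix
    E = rng.standard_normal(50)
    print(np.linalg.norm(truncated_cg(T, E, k=2) - np.linalg.solve(T, E)))
    ```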

  12. Design and implementation of a medium speed communications interface and protocol for a low cost, refreshed display computer

    Science.gov (United States)

    Phyne, J. R.; Nelson, M. D.

    1975-01-01

    The design and implementation of hardware and software systems involved in using a 40,000 bit/second communication line as the connecting link between an IMLAC PDS 1-D display computer and a Univac 1108 computer system were described. The IMLAC consists of two independent processors sharing a common memory. The display processor generates the deflection and beam control currents as it interprets a program contained in the memory; the minicomputer has a general instruction set and is responsible for starting and stopping the display processor and for communicating with the outside world through the keyboard, teletype, light pen, and communication line. The processing time associated with each data byte was minimized by designing the input and output processes as finite state machines which automatically sequence from each state to the next. Several tests of the communication link and the IMLAC software were made using a special low capacity computer grade cable between the IMLAC and the Univac.
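
    The byte handling described above, with input and output processes written as finite state machines that sequence automatically from state to state, can be pictured as a small table-driven receiver. The framing bytes and state names below are hypothetical; the record does not specify the actual protocol.

        # Hedged sketch of a receive FSM: each incoming byte drives exactly
        # one state transition, keeping the per-byte processing time minimal.
        SOH, EOT = 0x01, 0x04        # hypothetical start/end framing bytes

        def receive(stream):
            state, frame, frames = "IDLE", [], []
            for byte in stream:
                if state == "IDLE" and byte == SOH:
                    state, frame = "DATA", []
                elif state == "DATA":
                    if byte == EOT:
                        frames.append(bytes(frame))
                        state = "IDLE"
                    else:
                        frame.append(byte)
            return frames

        print(receive([0x01, 0x48, 0x49, 0x04]))   # -> [b'HI']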

  13. 10 CFR Appendix I to Part 504 - Procedures for the Computation of the Real Cost of Capital

    Science.gov (United States)

    2010-01-01

    ... I Appendix I to Part 504 Energy DEPARTMENT OF ENERGY (CONTINUED) ALTERNATE FUELS EXISTING... parameters specified above are not obtainable, alternate parameters that closely correspond to those above... common equity, an alternate methodology to predict the firm's real after-tax marginal cost of capital may...

  14. Energy Costs and Energy Conservation Programs in Colleges and Universities: 1972-73, 1974-75. Higher Education Panel Reports, Number 31.

    Science.gov (United States)

    Atelsek, Frank J.; Gomberg, Irene L.

    A survey was initiated at the request of the U.S. Office of Education and the Energy Task Force to: (1) measure the increase in energy expenditures since the OPEC oil embargo of 1973-74; (2) assess changes in energy consumption over a two-year period; and (3) examine some of the specific conservation practices of higher education institutions.…

  15. Behaviors of cost functions in image registration between 201Tl brain tumor single-photon emission computed tomography and magnetic resonance images

    International Nuclear Information System (INIS)

    Soma, Tsutomu; Takaki, Akihiro; Teraoka, Satomi; Ishikawa, Yasushi; Murase, Kenya; Koizumi, Kiyoshi

    2008-01-01

    We studied the behaviors of cost functions in the registration of thallium-201 (201Tl) brain tumor single-photon emission computed tomography (SPECT) and magnetic resonance (MR) images, as the similarity index of image positioning. A marker for image registration [technetium-99m (99mTc) point source] was attached at three sites on the heads of 13 patients with brain tumor, from whom 42 sets of 99mTc-201Tl SPECT (the dual-isotope acquisition) and MR images were obtained. The 201Tl SPECT and MR images were manually registered according to the markers. From the positions where the two images were registered, the position of the 201Tl SPECT was moved to examine the behaviors of the three cost functions, i.e., ratio image uniformity (RIU), mutual information (MI), and normalized MI (NMI). The cost functions MI and NMI reached the maximum at positions adjacent to those where the SPECT and MR images were manually registered. As for the accuracy of image registration in terms of the cost functions MI and NMI, on average, the images were accurately registered within 3 deg of rotation around the X-, Y-, and Z-axes, and within 1.5 mm (within 2 pixels), 3 mm (within 3 pixels), and 4 mm (within 1 slice) of translation to the X-, Y-, and Z-axes, respectively. In terms of rotation around the Z-axis, the cost function RIU reached the minimum at positions where the manual registration of the two images was substantially inadequate. The MI and NMI were suitable cost functions in the registration of 201Tl SPECT and MR images. The behavior of the RIU, in contrast, was unstable, being unsuitable as an index of image registration. (author)
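
    For reference, the mutual information cost function compared above can be estimated from the joint intensity histogram of the two images, and normalized MI divides the sum of the marginal entropies by the joint entropy. A minimal numpy sketch; the bin count is an arbitrary choice:

        import numpy as np

        def entropy(p):
            p = p[p > 0]                      # avoid log(0)
            return -np.sum(p * np.log(p))

        def mi_and_nmi(img_a, img_b, bins=32):
            """Return (MI, NMI) estimated from the joint histogram."""
            joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
            pxy = joint / joint.sum()
            px, py = pxy.sum(axis=1), pxy.sum(axis=0)
            h_x, h_y, h_xy = entropy(px), entropy(py), entropy(pxy.ravel())
            return h_x + h_y - h_xy, (h_x + h_y) / h_xy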

  16. Bi-articular Knee-Ankle-Foot Exoskeleton Produces Higher Metabolic Cost Reduction than Weight-Matched Mono-articular Exoskeleton

    Science.gov (United States)

    Malcolm, Philippe; Galle, Samuel; Derave, Wim; De Clercq, Dirk

    2018-01-01

    The bi-articular m. gastrocnemius and the mono-articular m. soleus have different and complementary functions during walking. Several groups are starting to use these biological functions as inspiration to design prostheses with bi-articular actuation components to replace the function of the m. gastrocnemius. Simulation studies indicate that a bi-articular configuration and spring that mimic the m. gastrocnemius could be beneficial for orthoses or exoskeletons. Our aim was to test the effect of a bi-articular and spring configuration that mimics the m. gastrocnemius and compare this to a no-spring and mono-articular configuration. We tested nine participants during walking with knee-ankle-foot exoskeletons with dorsally mounted pneumatic muscle actuators. In the bi-articular plus spring condition the pneumatic muscles were attached to the thigh segment with an elastic cord. In the bi-articular no-spring condition the pneumatic muscles were also attached to the thigh segment but with a non-elastic cord. In the mono-articular condition the pneumatic muscles were attached to the shank segment. We found the highest reduction in metabolic cost of 13% compared to walking with the exoskeleton powered-off in the bi-articular plus spring condition. Possible explanations for this could be that the exoskeleton delivered the highest total positive work in this condition at the ankle and the knee and provided more assistance during the isometric phase of the biological plantarflexors. As expected we found that the bi-articular conditions reduced m. gastrocnemius EMG more than the mono-articular condition but this difference was not significant. We did not find that the mono-articular condition reduces the m. soleus EMG more than the bi-articular conditions. Knowledge of specific effects of different exoskeleton configurations on metabolic cost and muscle activation could be useful for providing customized assistance for specific gait impairments. PMID:29551959

  17. Bi-articular Knee-Ankle-Foot Exoskeleton Produces Higher Metabolic Cost Reduction than Weight-Matched Mono-articular Exoskeleton

    Directory of Open Access Journals (Sweden)

    Philippe Malcolm

    2018-03-01

    Full Text Available The bi-articular m. gastrocnemius and the mono-articular m. soleus have different and complementary functions during walking. Several groups are starting to use these biological functions as inspiration to design prostheses with bi-articular actuation components to replace the function of the m. gastrocnemius. Simulation studies indicate that a bi-articular configuration and spring that mimic the m. gastrocnemius could be beneficial for orthoses or exoskeletons. Our aim was to test the effect of a bi-articular and spring configuration that mimics the m. gastrocnemius and compare this to a no-spring and mono-articular configuration. We tested nine participants during walking with knee-ankle-foot exoskeletons with dorsally mounted pneumatic muscle actuators. In the bi-articular plus spring condition the pneumatic muscles were attached to the thigh segment with an elastic cord. In the bi-articular no-spring condition the pneumatic muscles were also attached to the thigh segment but with a non-elastic cord. In the mono-articular condition the pneumatic muscles were attached to the shank segment. We found the highest reduction in metabolic cost of 13% compared to walking with the exoskeleton powered-off in the bi-articular plus spring condition. Possible explanations for this could be that the exoskeleton delivered the highest total positive work in this condition at the ankle and the knee and provided more assistance during the isometric phase of the biological plantarflexors. As expected we found that the bi-articular conditions reduced m. gastrocnemius EMG more than the mono-articular condition but this difference was not significant. We did not find that the mono-articular condition reduces the m. soleus EMG more than the bi-articular conditions. Knowledge of specific effects of different exoskeleton configurations on metabolic cost and muscle activation could be useful for providing customized assistance for specific gait impairments.

  18. Bi-articular Knee-Ankle-Foot Exoskeleton Produces Higher Metabolic Cost Reduction than Weight-Matched Mono-articular Exoskeleton.

    Science.gov (United States)

    Malcolm, Philippe; Galle, Samuel; Derave, Wim; De Clercq, Dirk

    2018-01-01

    The bi-articular m. gastrocnemius and the mono-articular m. soleus have different and complementary functions during walking. Several groups are starting to use these biological functions as inspiration to design prostheses with bi-articular actuation components to replace the function of the m. gastrocnemius. Simulation studies indicate that a bi-articular configuration and spring that mimic the m. gastrocnemius could be beneficial for orthoses or exoskeletons. Our aim was to test the effect of a bi-articular and spring configuration that mimics the m. gastrocnemius and compare this to a no-spring and mono-articular configuration. We tested nine participants during walking with knee-ankle-foot exoskeletons with dorsally mounted pneumatic muscle actuators. In the bi-articular plus spring condition the pneumatic muscles were attached to the thigh segment with an elastic cord. In the bi-articular no-spring condition the pneumatic muscles were also attached to the thigh segment but with a non-elastic cord. In the mono-articular condition the pneumatic muscles were attached to the shank segment. We found the highest reduction in metabolic cost of 13% compared to walking with the exoskeleton powered-off in the bi-articular plus spring condition. Possible explanations for this could be that the exoskeleton delivered the highest total positive work in this condition at the ankle and the knee and provided more assistance during the isometric phase of the biological plantarflexors. As expected we found that the bi-articular conditions reduced m. gastrocnemius EMG more than the mono-articular condition but this difference was not significant. We did not find that the mono-articular condition reduces the m. soleus EMG more than the bi-articular conditions. Knowledge of specific effects of different exoskeleton configurations on metabolic cost and muscle activation could be useful for providing customized assistance for specific gait impairments.

  19. 18F-Fluorodeoxyglucose Positron Emission Tomography/Computed Tomography Accuracy in the Staging of Non-Small Cell Lung Cancer: Review and Cost-Effectiveness

    International Nuclear Information System (INIS)

    Gómez León, Nieves; Escalona, Sofía; Bandrés, Beatriz; Belda, Cristobal; Callejo, Daniel; Blasco, Juan Antonio

    2014-01-01

    The aim of this clinical study was to compare the accuracy and cost-effectiveness of PET/CT in the staging of non-small cell lung cancer (NSCLC). Material and Methods. Cross-sectional and prospective study including 103 patients with histologically confirmed NSCLC. All patients were examined using PET/CT with intravenous contrast medium. Those with disease stage ≤IIB underwent surgery (n = 40). Disease stage was confirmed based on histology results, which were compared with those of PET/CT and positron emission tomography (PET) and computed tomography (CT) separately. 63 patients classified with ≥IIIA disease stage by PET/CT did not undergo surgery. The cost-effectiveness of PET/CT for disease classification was examined using a decision tree analysis. Results. Compared with histology, the accuracy of PET/CT for disease staging has a positive predictive value of 80%, a negative predictive value of 95%, a sensitivity of 94%, and a specificity of 82%. For PET alone, these values are 53%, 66%, 60%, and 50%, whereas for CT alone they are 68%, 86%, 76%, and 72%, respectively. Incremental cost-effectiveness of PET/CT over CT alone was €17,412 per quality-adjusted life-year (QALY). Conclusion. In our clinical study, PET/CT using intravenous contrast medium was an accurate and cost-effective method for the staging of patients with NSCLC.
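
    The accuracy figures quoted (PPV, NPV, sensitivity, specificity) all derive from a 2x2 confusion matrix against the histology reference, and the incremental cost-effectiveness ratio is a simple quotient. A sketch of the arithmetic with hypothetical counts and costs, not the study's raw data:

        # Hypothetical 2x2 table: rows = PET/CT result, columns = histology.
        tp, fp, fn, tn = 16, 4, 1, 19

        sensitivity = tp / (tp + fn)     # diseased correctly called positive
        specificity = tn / (tn + fp)     # healthy correctly called negative
        ppv = tp / (tp + fp)             # positive predictive value
        npv = tn / (tn + fn)             # negative predictive value
        print(f"Se={sensitivity:.0%} Sp={specificity:.0%} "
              f"PPV={ppv:.0%} NPV={npv:.0%}")

        # ICER = (cost_new - cost_old) / (QALY_new - QALY_old); values invented.
        icer = (12_000 - 9_000) / (1.50 - 1.33)
        print(f"ICER = {icer:,.0f} EUR/QALY")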

  20. 18F-Fluorodeoxyglucose Positron Emission Tomography/Computed Tomography Accuracy in the Staging of Non-Small Cell Lung Cancer: Review and Cost-Effectiveness

    International Nuclear Information System (INIS)

    León, N.G.; Bandrés, B.; Escalona, S.; Callejo, D.; Blasco, J.A.; Belda, C.

    2014-01-01

    The aim of this clinical study was to compare the accuracy and cost-effectiveness of PET/CT in the staging of non-small cell lung cancer (NSCLC). Material and Methods. Cross-sectional and prospective study including 103 patients with histologically confirmed NSCLC. All patients were examined using PET/CT with intravenous contrast medium. Those with disease stage ≤IIB underwent surgery (n = 40). Disease stage was confirmed based on histology results, which were compared with those of PET/CT and positron emission tomography (PET) and computed tomography (CT) separately. 63 patients classified with ≥IIIA disease stage by PET/CT did not undergo surgery. The cost-effectiveness of PET/CT for disease classification was examined using a decision tree analysis. Results. Compared with histology, the accuracy of PET/CT for disease staging has a positive predictive value of 80%, a negative predictive value of 95%, a sensitivity of 94%, and a specificity of 82%. For PET alone, these values are 53%, 66%, 60%, and 50%, whereas for CT alone they are 68%, 86%, 76%, and 72%, respectively. Incremental cost-effectiveness of PET/CT over CT alone was €17,412 per quality-adjusted life-year (QALY). Conclusion. In our clinical study, PET/CT using intravenous contrast medium was an accurate and cost-effective method for the staging of patients with NSCLC.

  1. Accuracy of a Low-Cost Novel Computer-Vision Dynamic Movement Assessment: Potential Limitations and Future Directions

    Science.gov (United States)

    McGroarty, M.; Giblin, S.; Meldrum, D.; Wetterling, F.

    2016-04-01

    The aim of the study was to perform a preliminary validation of a low-cost markerless motion capture system (CAPTURE) against an industry gold standard (Vicon). Measurements of knee valgus and flexion during the performance of a countermovement jump (CMJ) were compared between CAPTURE and Vicon. After correction algorithms were applied to the raw CAPTURE data, acceptable levels of accuracy and precision were achieved. The knee flexion angle measured for three trials using CAPTURE deviated by -3.8° ± 3° (left) and 1.7° ± 2.8° (right) compared to Vicon. The findings suggest that low-cost markerless motion capture has the potential to provide an objective method for assessing lower limb jump and landing mechanics in an applied sports setting. Furthermore, the outcome of the study warrants future research to examine more fully the potential implications of the use of low-cost markerless motion capture in the evaluation of dynamic movement for injury prevention.
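
    Joint angles such as knee flexion are typically computed from three landmark positions (hip, knee, ankle) as the angle between the thigh and shank vectors. A hedged numpy sketch; the landmark coordinates are invented:

        import numpy as np

        def knee_angle(hip, knee, ankle):
            """Angle at the knee, in degrees, between thigh and shank vectors."""
            u = np.asarray(hip) - np.asarray(knee)
            v = np.asarray(ankle) - np.asarray(knee)
            cosang = (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
            return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

        # ~162 deg between segments, i.e. roughly 18 deg of knee flexion:
        print(knee_angle([0, 1, 0], [0, 0, 0], [0.3, -0.9, 0]))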

  2. An estimation of the effect of 100% Compliance with Diabetes Treatment: Can we reduce cost of illness with higher compliance rates?

    Directory of Open Access Journals (Sweden)

    Güvenç Koçkaya, MD

    2011-01-01

    Full Text Available Introduction: The current study was designed to estimate the direct cost of noncompliance of diabetes patients to the US health system. Understanding these expenses can inform screening and education budget policy regarding expenditure levels that can be calculated to be cost-beneficial. Materials and Method: The study was conducted in three parts. First, a computer search of National Institutes of Health websites and professional society websites for organizations with members that treat diabetes, and a PubMed search were performed to obtain the numbers required for calculations. Second, formulas were developed to estimate the risk of non-compliance and undiagnosed diabetes. Third, risk calculations were performed using the information obtained in part one and the formulas developed in part two. Results: Direct risk reduction for diabetes-related kidney disease, stroke, heart disease, and amputation was estimated for 100% compliance with diabetes treatment. The risk, case and yearly cost reductions calculated for 100% compliance with diabetes treatment were 13.6%, 0.9 million and US$ 9.3 billion, respectively. Conclusion: Society, insurers, policy makers and other stakeholders could invest up to these amounts in screening, education and prevention efforts in an effort to reduce these costly and traumatic sequelae of noncompliant diabetes patients.
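
    The cost arithmetic sketched in the record reduces to multiplying the noncompliant population by the achievable risk reduction and the cost per averted case. A hedged illustration; all inputs below are placeholders chosen to land near the reported figures, not the study's source numbers:

        # Hypothetical inputs for the cost-of-noncompliance arithmetic.
        noncompliant_patients = 6.6e6    # assumed noncompliant population
        risk_reduction = 0.136           # 13.6% reduction at 100% compliance
        cost_per_case = 10_000           # assumed direct cost per case (USD)

        cases_averted = noncompliant_patients * risk_reduction
        savings = cases_averted * cost_per_case
        print(f"{cases_averted / 1e6:.1f} M cases averted, "
              f"${savings / 1e9:.1f} B saved per year")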

  3. Money for Research, Not for Energy Bills: Finding Energy and Cost Savings in High Performance Computer Facility Designs

    Energy Technology Data Exchange (ETDEWEB)

    Drewmark Communications; Sartor, Dale; Wilson, Mark

    2010-07-01

    High-performance computing facilities in the United States consume an enormous amount of electricity, cutting into research budgets and challenging public- and private-sector efforts to reduce energy consumption and meet environmental goals. However, these facilities can greatly reduce their energy demand through energy-efficient design of the facility itself. Using a case study of a facility under design, this article discusses strategies and technologies that can be used to help achieve energy reductions.

  4. THE DEVELOPMENT OF COMPUTER TECHNOLOGY OF EDUCATION OF STUDENTS IN HIGHER EDUCATIONAL INSTITUTIONS OF UKRAINE (THE SECOND HALF OF 50S – EARLY OF 90S XX CENTURY

    Directory of Open Access Journals (Sweden)

    Oleksii S. Voronkin

    2014-02-01

    Full Text Available The article presents the results of a synthesis study of the evolution of computer technologies to support students studying at the universities of Ukraine from the second half of the 1950s to the early 1990s. Research was conducted on the basis of a wide range of sources and materials. Four historical stages are highlighted: (1) the emergence of algorithms of programmed learning; (2) the emergence of automated technologies to support studying; (3) the birth of the first computer training systems and the development of learning environments; (4) the integrated development of computer technology, intelligent tutoring systems and virtual reality systems.

  5. Cost-effective pediatric head and body phantoms for computed tomography dosimetry and its evaluation using pencil ion chamber and CT dose profiler

    Directory of Open Access Journals (Sweden)

    A Saravanakumar

    2015-01-01

    Full Text Available In the present work, a pediatric head and body phantom was fabricated using polymethyl methacrylate (PMMA) at a low cost when compared to commercially available phantoms for the purpose of computed tomography (CT) dosimetry. The dimensions of the head and body phantoms were 10 cm diameter, 15 cm length and 16 cm diameter, 15 cm length, respectively. The dose from a 128-slice CT machine received by the head and body phantom at the center and periphery were measured using a 100 mm pencil ion chamber and 150 mm CT dose profiler (CTDP). Using these values, the weighted computed tomography dose index (CTDIw) and in turn the volumetric CTDI (CTDIv) were calculated for various combinations of tube voltage and current-time product. A similar study was carried out using a standard calibrated phantom and the results have been compared with those of the fabricated ones to ascertain that the performance of the latter is equivalent to that of the former. Finally, CTDIv measured using fabricated and standard phantoms were compared with the respective values displayed on the console. The difference between the values was well within the limits specified by the Atomic Energy Regulatory Board (AERB), India. These results indicate that the cost-effective pediatric phantom can be employed for CT dosimetry.

  6. Development and implementation of a low-cost phantom for quality control in cone beam computed tomography

    International Nuclear Information System (INIS)

    Batista, W. O.; Navarro, M. V. T.; Maia, A. F.

    2013-01-01

    A phantom for quality control in cone beam computed tomography (CBCT) scanners was designed and constructed, and a methodology for testing was developed. The phantom had a polymethyl methacrylate structure filled with water and plastic objects that allowed the assessment of parameters related to quality control. The phantom allowed the evaluation of essential parameters in CBCT as well as the evaluation of linear and angular dimensions. The plastics used in the phantom were chosen so that their density and linear attenuation coefficient were similar to those of human facial structures. Three types of CBCT equipment, with two different technological concepts, were evaluated. The results of the assessment of the accuracy of linear and angular dimensions agreed with the existing standards. However, other parameters such as computed tomography number accuracy, uniformity and high-contrast detail did not meet the tolerances established in current regulations or the manufacturer's specifications. The results demonstrate the importance of establishing specific protocols and phantoms, which meet the specificities of CBCT. The practicality of implementation, the quality control test results for the proposed phantom and the consistency of the results using different equipment demonstrate its adequacy. (authors)

  7. Development of a Computer Program for an Analysis of the Logistics and Transportation Costs of the PWR Spent Fuels in Korea

    International Nuclear Information System (INIS)

    Cha, Jeong Hun; Choi, Heui Joo; Lee, Jong Youl; Choi, Jong Won

    2009-01-01

    It is expected that a substantial amount of spent fuel will be transported from the four nuclear power plant (NPP) sites in Korea to a hypothetical centralized interim storage facility or a final repository in the near future. The cost of this transportation is proportional to the amount of spent fuel. In this paper, a cost estimation program is developed based on the conceptual design of a transportation system and a logistics analysis. Using the developed computer program, named CASK, the minimum capacity of a centralized interim storage facility (CISF) and the transportation cost for PWR spent fuels are calculated. The PWR spent fuels are transported from the 4 NPP sites to a final repository (FR) via the CISF. Since the NPP sites and the CISF are located along the coast, sea transportation is considered between them, and road transportation is considered between the CISF and the FR. The result shows that the minimum capacity of the interim storage facility is 15,000 MTU.
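
    A logistics cost model of the kind the record describes is essentially cost proportional to quantity shipped, with a sea leg from each plant site to the interim store and a road leg onward to the repository. A hedged sketch; the rates, distances and tonnages below are invented for illustration and are not CASK's data:

        # Hypothetical two-leg transport cost model (sea: site -> CISF,
        # road: CISF -> FR); rates in $ per MTU per km are assumed.
        SEA_RATE, ROAD_RATE = 1.2, 3.5

        sites = {  # (MTU shipped, sea distance in km to the CISF), assumed
            "site A": (4000, 120), "site B": (3500, 300),
            "site C": (4500, 150), "site D": (3000, 90),
        }
        road_km_cisf_to_fr = 60          # assumed road distance

        total_mtu = sum(mtu for mtu, _ in sites.values())
        sea_cost = sum(mtu * km * SEA_RATE for mtu, km in sites.values())
        road_cost = total_mtu * road_km_cisf_to_fr * ROAD_RATE
        print(f"{total_mtu} MTU, total transport cost ${sea_cost + road_cost:,.0f}")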

  8. Costs and role of ultrasound follow-up of polytrauma patients after initial computed tomography; Kosten und Stellenwert von Ultraschallverlaufskontrollen bei polytraumatisierten Patienten nach initialer Computertomografie

    Energy Technology Data Exchange (ETDEWEB)

    Maurer, M.H.; Winkler, A.; Powerski, M.J.; Elgeti, F.; Huppertz, A.; Roettgen, R.; Marnitz, T. [Charite - Universitaetsmedizin Berlin (Germany). Klinik fuer Diagnostische und Interventionelle Radiologie; Wichlas, F. [Charite - Universitaetsmedizin Berlin (Germany). Centrum fuer Muskuloskeletale Chirurgie

    2012-01-15

    Purpose: To assess the costs and diagnostic gain of abdominal ultrasound follow-up of polytrauma patients initially examined by whole-body computed tomography (CT). Materials and Methods: A total of 176 patients with suspected multiple trauma (126 men, 50 women; age 43.5 ± 17.4 years) were retrospectively analyzed with regard to supplementary and new findings obtained by ultrasound follow-up compared with the results of exploratory FAST (focused assessment with sonography for trauma) at admission and the findings of whole-body CT. A process model was used to document the staff, materials, and total costs of the ultrasound follow-up examinations. Results: FAST yielded 26 abdominal findings (organ injury and/or free intra-abdominal fluid) in 19 patients, while the abdominal scan of whole-body CT revealed 32 findings in 25 patients. FAST had 81 % sensitivity and 100 % specificity. Follow-up ultrasound examinations revealed new findings in 2 of the 25 patients with abdominal injuries detected with initial CT. In the 151 patients without abdominal injuries in the initial CT scan, ultrasound follow-up did not yield any supplementary or new findings. The total costs of an ultrasound follow-up examination were EUR 28.93. The total costs of all follow-up ultrasound examinations performed in the study population were EUR 5658.23. Conclusion: Follow-up abdominal ultrasound yields only a low overall diagnostic gain in polytrauma patients in whom initial CT fails to detect any abdominal injuries but incurs high personnel expenses for radiological departments. (orig.)

  9. Effectiveness and cost-effectiveness of computer and other electronic aids for smoking cessation: a systematic review and network meta-analysis.

    Science.gov (United States)

    Chen, Y-F; Madan, J; Welton, N; Yahaya, I; Aveyard, P; Bauld, L; Wang, D; Fry-Smith, A; Munafò, M R

    2012-01-01

    Smoking is harmful to health. On average, lifelong smokers lose 10 years of life, and about half of all lifelong smokers have their lives shortened by smoking. Stopping smoking reverses or prevents many of these harms. However, cessation services in the NHS achieve variable success rates with smokers who want to quit. Approaches to behaviour change can be supplemented with electronic aids, and this may significantly increase quit rates and prevent a proportion of cases that relapse. The primary research question we sought to answer was: What is the effectiveness and cost-effectiveness of internet, PC and other electronic aids to help people stop smoking? We addressed the following three questions: (1) What is the effectiveness of internet sites, computer programs, mobile telephone text messages and other electronic aids for smoking cessation and/or reducing relapse? (2) What is the cost-effectiveness of incorporating internet sites, computer programs, mobile telephone text messages and other electronic aids into current NHS smoking cessation programmes? and (3) What are the current gaps in research into the effectiveness of internet sites, computer programs, mobile telephone text messages and other electronic aids to help people stop smoking? For the effectiveness review, relevant primary studies were sought from The Cochrane Library [Cochrane Central Register of Controlled Trials (CENTRAL)] 2009, Issue 4, and MEDLINE (Ovid), EMBASE (Ovid), PsycINFO (Ovid), Health Management Information Consortium (HMIC) (Ovid) and Cumulative Index to Nursing and Allied Health Literature (CINAHL) (EBSCOhost) from 1980 to December 2009. In addition, NHS Economic Evaluation Database (NHS EED) and Database of Abstracts of Reviews of Effects (DARE) were searched for information on cost-effectiveness and modelling for the same period. Reference lists of included studies and of relevant systematic reviews were examined to identify further potentially relevant studies. Research registries

  10. Cost Analyses after a single intervention using a computer application (DIAGETHER) in the treatment of diabetic patients admitted to a third level hospital

    Directory of Open Access Journals (Sweden)

    César Carballo Cardona

    2018-01-01

    Full Text Available Goals: To quantify the savings that could be made by the hospital implementation of a computer application (DIAGETHER®), which advises on the treatment of hyperglycemia in diabetic patients in the emergency department when admitted to a third-level hospital. Methods: A multicenter interventional study was designed, including patients in two arms: one receiving the conventional treatment prescribed by the physician, the other receiving the treatment indicated by the computer application DIAGETHER®. The days of hospitalization were collected in the two arms of intervention. Results: A total of 183 patients were included; 86 received treatment with the computer application and 97 received conventional treatment. The mean blood glucose level on the first day of admission in the GLIKAL group was 178.56 (59.53), compared to 212.93 (62.23) in the conventional group (p < 0.001), and on the second day 173.86 (58.86) versus 196.37 (66.60) (p = 0.017). There was no difference in the frequency of hypoglycemia reported in each group (p = 0.555). A reduction in mean stay was observed in patients treated with DIAGETHER. The days of admission were 7 (2-39) for the GLIKAL group and 10 (2-53) for the PCH group (p < 0.001). Conclusions: With the volume of diabetic patients admitted to the hospital, use of the computer tool (DIAGETHER®) could reduce hospitalization by 26,147 days per year (14,134 patients with a 1.85-day reduction in stay), generating a saving of 8,811,842 euros per year based on the cost of a stay/day for a diabetic patient.
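
    The headline saving decomposes as patients x days saved x cost per bed-day; a quick check of the figures quoted in the abstract (the per-day cost is derived from them, not stated in the record):

        patients = 14_134
        days_saved_each = 1.85
        total_bed_days = patients * days_saved_each      # ~26,148 bed-days
        annual_saving_eur = 8_811_842
        implied_cost_per_bed_day = annual_saving_eur / total_bed_days
        print(f"{total_bed_days:,.0f} bed-days, "
              f"~{implied_cost_per_bed_day:.0f} EUR per bed-day")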

  11. Computer Music

    Science.gov (United States)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  12. The Role of Telematic Practices in Computer Engineering: A Low-cost Remote Power Control in a Network Lab

    Directory of Open Access Journals (Sweden)

    Tomas Mateo Sanguino

    2012-05-01

    Full Text Available The present paper describes a practical e-learning laboratory solution devoted to the study of computer networks. This laboratory has been tested with two groups of students from the University of Huelva (Spain) during two academic years. In order to achieve this objective, it was necessary to create an entire network infrastructure that includes both telematic access to the laboratory equipment and remote power control. The interest of this work lies in an economical and simple system of remote control and telematic access with a twofold objective. On the one hand, to develop distance practices with an attendance appearance by means of real, not simulated, hardware systems. On the other hand, to reduce power consumption with respect to other remote-lab proposals with permanent power connection, providing here an on-demand connection only when required. As a result, a versatile and flexible laboratory has been put into practice whose basic network topology allows transferring traditional practices to telematic practices in a natural way and without harsh changes.

  13. Identifying Benefits and risks associated with utilizing cloud computing

    OpenAIRE

    Shayan, Jafar; Azarnik, Ahmad; Chuprat, Suriayati; Karamizadeh, Sasan; Alizadeh, Mojtaba

    2014-01-01

    Cloud computing is an emerging computing model where IT and computing operations are delivered as services in a highly scalable and cost-effective manner. Recently, embarking on this new model in business has become popular. Companies in diverse sectors intend to leverage cloud computing architecture, platforms and applications in order to gain higher competitive advantages. Like other models, cloud computing brings advantages that attract business, but meanwhile fostering the cloud has led to some ...

  14. Greater accordance with the Dietary Approaches to Stop Hypertension dietary pattern is associated with lower diet-related greenhouse gas production but higher dietary costs in the United Kingdom

    Science.gov (United States)

    Monsivais, Pablo; Scarborough, Peter; Lloyd, Tina; Mizdrak, Anja; Luben, Robert; Mulligan, Angela A; Wareham, Nicholas J; Woodcock, James

    2015-01-01

    Background: The Dietary Approaches to Stop Hypertension (DASH) diet is a proven way to prevent and control hypertension and other chronic disease. Because the DASH diet emphasizes plant-based foods, including vegetables and grains, adhering to this diet might also bring about environmental benefits, including lower associated production of greenhouse gases (GHGs). Objective: The objective was to examine the interrelation between dietary accordance with the DASH diet and associated GHGs. A secondary aim was to examine the retail cost of diets by level of DASH accordance. Design: In this cross-sectional study of adults aged 39–79 y from the European Prospective Investigation into Cancer and Nutrition–Norfolk, United Kingdom cohort (n = 24,293), dietary intakes estimated from food-frequency questionnaires were analyzed for their accordance with the 8 DASH food and nutrient-based targets. Associations between DASH accordance, GHGs, and dietary costs were evaluated in regression analyses. Dietary GHGs were estimated with United Kingdom-specific data on carbon dioxide equivalents associated with commodities and foods. Dietary costs were estimated by using national food prices from a United Kingdom–based supermarket comparison website. Results: Greater accordance with the DASH dietary targets was associated with lower GHGs. Diets in the highest quintile of accordance had a GHG impact of 5.60 compared with 6.71 kg carbon dioxide equivalents/d for least-accordant diets (P < 0.0001). Greater DASH accordance was also associated with higher dietary costs, with the mean cost of diets in the top quintile of DASH scores 18% higher than that of diets in the lowest quintile (P < 0.0001). Conclusions: Promoting wider uptake of the DASH diet in the United Kingdom may improve population health and reduce diet-related GHGs. However, to make the DASH diet more accessible, food affordability, particularly for lower income groups, will have to be addressed. PMID:25926505

  15. Greater accordance with the Dietary Approaches to Stop Hypertension dietary pattern is associated with lower diet-related greenhouse gas production but higher dietary costs in the United Kingdom.

    Science.gov (United States)

    Monsivais, Pablo; Scarborough, Peter; Lloyd, Tina; Mizdrak, Anja; Luben, Robert; Mulligan, Angela A; Wareham, Nicholas J; Woodcock, James

    2015-07-01

    The Dietary Approaches to Stop Hypertension (DASH) diet is a proven way to prevent and control hypertension and other chronic disease. Because the DASH diet emphasizes plant-based foods, including vegetables and grains, adhering to this diet might also bring about environmental benefits, including lower associated production of greenhouse gases (GHGs). The objective was to examine the interrelation between dietary accordance with the DASH diet and associated GHGs. A secondary aim was to examine the retail cost of diets by level of DASH accordance. In this cross-sectional study of adults aged 39-79 y from the European Prospective Investigation into Cancer and Nutrition-Norfolk, United Kingdom cohort (n = 24,293), dietary intakes estimated from food-frequency questionnaires were analyzed for their accordance with the 8 DASH food and nutrient-based targets. Associations between DASH accordance, GHGs, and dietary costs were evaluated in regression analyses. Dietary GHGs were estimated with United Kingdom-specific data on carbon dioxide equivalents associated with commodities and foods. Dietary costs were estimated by using national food prices from a United Kingdom-based supermarket comparison website. Greater accordance with the DASH dietary targets was associated with lower GHGs. Diets in the highest quintile of accordance had a GHG impact of 5.60 compared with 6.71 kg carbon dioxide equivalents/d for least-accordant diets (P < 0.0001). Greater DASH accordance was also associated with higher dietary costs, with the mean cost of diets in the top quintile of DASH scores 18% higher than that of diets in the lowest quintile (P < 0.0001). Promoting wider uptake of the DASH diet in the United Kingdom may improve population health and reduce diet-related GHGs. However, to make the DASH diet more accessible, food affordability, particularly for lower income groups, will have to be addressed.

  16. Exploring Customization in Higher Education: An Experiment in Leveraging Computer Spreadsheet Technology to Deliver Highly Individualized Online Instruction to Undergraduate Business Students

    Science.gov (United States)

    Kunzler, Jayson S.

    2012-01-01

    This dissertation describes a research study designed to explore whether customization of online instruction results in improved learning in a college business statistics course. The study involved utilizing computer spreadsheet technology to develop an intelligent tutoring system (ITS) designed to: a) collect and monitor individual real-time…

  17. Explicating Development of Personal Professional Theories from Higher Vocational Education to Beginning a Professional Career through Computer-Supported Drawing of Concept Maps

    Science.gov (United States)

    van den Bogaart, Antoine C. M.; Hummel, Hans G. K.; Kirschner, Paul A.

    2018-01-01

    This article explores how personal professional theories (PPTs) develop. PPT development of nine junior accountants and nine novice teachers was monitored by repeated measurements over a period of 1.5 years, from the last year of vocational education until the second year of their professional careers. Computer-supported construction of PPT…

  18. Application of off-line image processing for optimization in chest computed radiography using a low cost system.

    Science.gov (United States)

    Muhogora, Wilbroad E; Msaki, Peter; Padovani, Renato

    2015-03-08

    The objective of this study was to improve the visibility of anatomical details by applying off-line postimage processing in chest computed radiography (CR). Four spatial domain-based external image processing techniques were developed by using MATLAB software version 7.0.0.19920 (R14) and image processing tools. The developed techniques were implemented to sample images and their visual appearances confirmed by two consultant radiologists to be clinically adequate. The techniques were then applied to 200 chest clinical images and randomized with other 100 images previously processed online. These 300 images were presented to three experienced radiologists for image quality assessment using standard quality criteria. The mean and ranges of the average scores for three radiologists were characterized for each of the developed techniques and imaging systems. The Mann-Whitney U-test was used to test the difference in the visibility of details between the images processed using each of the developed techniques and the corresponding images processed using default algorithms. The results show that the visibility of anatomical features improved significantly (0.005 ≤ p ≤ 0.02) with combinations of intensity values adjustment and/or spatial linear filtering techniques for images acquired using 60 ≤ kVp ≤ 70. However, there was no improvement for images acquired using 102 ≤ kVp ≤ 107 (0.127 ≤ p ≤ 0.48). In conclusion, the use of external image processing for optimization can be effective in chest CR, but should be implemented in consultation with the radiologists.
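
    The named techniques, intensity-value adjustment combined with spatial linear filtering, can be sketched as follows. The original work used MATLAB; this is a hedged numpy/scipy rendering with illustrative window percentiles and kernel size, not the authors' exact algorithms.

        import numpy as np
        from scipy import ndimage

        def enhance(img, lo_pct=2, hi_pct=98, kernel=3):
            """Contrast-stretch to the [lo_pct, hi_pct] percentile window,
            then apply a small mean (spatial linear) filter."""
            lo, hi = np.percentile(img, [lo_pct, hi_pct])
            stretched = np.clip((img - lo) / (hi - lo), 0.0, 1.0)
            return ndimage.uniform_filter(stretched, size=kernel)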

  19. The Cost of Corruption in Higher Education

    Science.gov (United States)

    Heyneman, Stephen P.; Anderson, Kathryn H.; Nuraliyeva, Nazym

    2008-01-01

    Corruption was symptomatic of business and government interactions in Russia and other countries of the former Soviet Union before and during the economic transition of the 1990s. Corruption is difficult to quantify, but the perception of corruption is quantifiable. Nations can even be arranged along a hierarchy by the degree to which they are…

  20. Role of information systems in controlling costs: the electronic medical record (EMR) and the high-performance computing and communications (HPCC) efforts

    Science.gov (United States)

    Kun, Luis G.

    1994-12-01

    On October 18, 1991, the IEEE-USA produced an entity statement which endorsed the vital importance of the High Performance Computer and Communications Act of 1991 (HPCC) and called for the rapid implementation of all its elements. Efforts are now underway to develop a Computer Based Patient Record (CBPR), the National Information Infrastructure (NII) as part of the HPCC, and the so-called 'Patient Card'. Multiple legislative initiatives which address these and related information technology issues are pending in Congress. Clearly, a national information system will greatly affect the way health care delivery is provided to the United States public. Timely and reliable information represents a critical element in any initiative to reform the health care system as well as to protect and improve the health of every person. Appropriately used, information technologies offer a vital means of improving the quality of patient care, increasing access to universal care and lowering overall costs within a national health care program. Health care reform legislation should reflect increased budgetary support and a legal mandate for the creation of a national health care information system by: (1) constructing a National Information Infrastructure; (2) building a Computer Based Patient Record System; (3) bringing the collective resources of our National Laboratories to bear in developing and implementing the NII and CBPR, as well as a security system with which to safeguard the privacy rights of patients and the physician-patient privilege; and (4) utilizing Government (e.g. DOD, DOE) capabilities (technology and human resources) to maximize resource utilization, create new jobs and accelerate technology transfer to address health care issues.

  1. An Investigation of Technology Avoidance Effect into Higher Education Environments: Some Empirical Evidence of Marketing Students' Background and Their Use of Personal Computers Outside the Academic Culture

    Science.gov (United States)

    Spais, George S.; Vasileiou, Konstantinos Z.

    2008-01-01

    The major objective of this study was to test a research hypothesis in order to explain the technology avoidance effect in higher educational environments. We addressed the core research themes of our study using a survey. Our intention was to test marketing students' perceptions in order to investigate the potent influence of a climate of…

  2. 12 CFR 219.3 - Cost reimbursement.

    Science.gov (United States)

    2010-01-01

    ... that the financial institution use programming or other higher level technical services of a computer... (private sector) set out in the Employment Cost Trends section of the National Compensation Survey (http... PROVIDING FINANCIAL RECORDS; RECORDKEEPING REQUIREMENTS FOR CERTAIN FINANCIAL RECORDS (REGULATION S...

  3. Higher Education

    African Journals Online (AJOL)

    Kunle Amuwo: Higher Education Transformation: A Paradigm Shift in South Africa? ... ty of such skills, especially at the middle management levels within the higher ... istics and virtues of differentiation and diversity. ... may be forced to close shop for lack of capacity to attract ... necessarily lead to racial and gender equity, ...

  4. An exploratory examination of the relationship between motivational factors and the degree to which the higher education faculty integrate computer-mediated communication (CMC) tools into their courses

    Science.gov (United States)

    Murage, Francis Ndwiga

    The stated research problem of this study was to examine the relationship between motivational factors and the degree to which the higher education faculty integrate CMC tools into their courses. The study population and sample involved higher education faculty teaching in science departments at one public university and three public colleges in the state of West Virginia (N = 153). A Likert-type rating scale survey was used to collect data based on the research questions. Two parts of the survey were adopted from previous studies while the other two were self-constructed. Research questions and hypothesis were analyzed using both descriptive and inferential analyses. The study results established a positive relationship between motivational factors and the degree the higher education faculty integrate CMC tools in their courses. The results in addition established that faculty are highly motivated to integrate CMC tools by intrinsic factors, moderately motivated by environmental factors and least motivated by extrinsic factors. The results also established that the most integrated CMC tools were those that support asynchronous methods of communication while the least integrated were those that support synchronous methods of communication. A major conclusion made was that members of higher education faculty are more likely to be motivated to integrate CMC tools into their courses by intrinsic factors rather than extrinsic or environmental factors. It was further concluded that intrinsic factors that supported and enhanced student learning as well as those that were altruistic in nature significantly influenced the degree of CMC integration. The study finally concluded that to larger extent, there is a relationship between motivational factors and the degree to which the higher education faculty integrate CMC tools in their courses. A major implication of this study was that institutions that wish to promote integration of CMC technologies should provide as much

  5. Higher Education

    Science.gov (United States)


  6. Self-regulated learning in higher education: strategies adopted by computer programming students when supported by the SimProgramming approach

    Directory of Open Access Journals (Sweden)

    Daniela Pedrosa

    Full Text Available The goal of the SimProgramming approach is to help students overcome their learning difficulties in the transition from entry-level to advanced computer programming, developing an appropriate set of learning strategies. We implemented it at the University of Trás-os-Montes e Alto Douro (Portugal), in two courses (PM3 and PM4) of the bachelor programmes in Informatics Engineering and ICT. We conducted semi-structured interviews with students (n=38) at the end of the courses, to identify the students' strategies for self-regulation of learning in the assignment. We found that students changed some of their strategies from one course edition to the following one and that the changes are related to the SimProgramming approach. We believe that the changes to the educational approach were appropriate to support the assignment goals. We recommend applying the SimProgramming approach in other educational contexts, to improve educational practices by including techniques to help students in their learning.

  7. Automated Data Handling And Instrument Control Using Low-Cost Desktop Computers And An IEEE 488 Compatible Version Of The ODETA V.

    Science.gov (United States)

    van Leunen, J. A. J.; Dreessen, J.

    1984-05-01

    The result of a measurement of the modulation transfer function is only useful as long as it is accompanied by a complete description of all relevant measuring conditions involved. For this reason it is necessary to file a full description of the relevant measuring conditions together with the results. In earlier times some of our results were rendered useless because some of the relevant measuring conditions were accidentally not written down and were forgotten. This was mainly due to the lack of consensus about which measuring conditions had to be filed together with the result of a measurement. One way to secure uniform and complete archiving of measuring conditions and results is to automate the data handling. An attendant advantage of automation of data handling is that it does away with the time-consuming correction of rough measuring results. The automation of the data handling was accomplished with rather cheap desktop computers, which were powerful enough, however, to allow us to automate the measurement as well. After automation of the data handling we started with automatic collection of rough measurement data. Step by step we extended the automation by letting the desktop computer control more and more of the measuring set-up. At present the desktop computer controls all the electrical and most of the mechanical measuring conditions. Further it controls and reads the MTF measuring instrument. Focussing and orientation optimization can be fully automatic, semi-automatic or completely manual. MTF measuring results can be collected automatically but they can also be typed in by hand. Due to the automation we are able to implement proper archiving of measuring results together with all necessary measuring conditions. The improved measuring efficiency made it possible to increase the number of routine measurements done in the same time period by an order of magnitude. To our surprise the measuring accuracy also improved by a factor of two. This was due to

  8. Variable-coefficient higher-order nonlinear Schroedinger model in optical fibers: Variable-coefficient bilinear form, Baecklund transformation, brightons and symbolic computation

    International Nuclear Information System (INIS)

    Tian Bo; Gao Yitian; Zhu Hongwu

    2007-01-01

    Symbolically investigated in this Letter is a variable-coefficient higher-order nonlinear Schroedinger (vcHNLS) model for ultrafast signal-routing, fiber laser systems and optical communication systems with distributed dispersion and nonlinearity management. Of physical and optical interest, with the bilinear method extended, the vcHNLS model is transformed into a variable-coefficient bilinear form, and then an auto-Baecklund transformation is constructed. Constraints on coefficient functions are analyzed. Potentially observable with future optical-fiber experiments, variable-coefficient brightons are illustrated. Relevant properties and features are discussed as well. The Baecklund transformation and other results of this Letter will be of certain value to studies on inhomogeneous fiber media, the core of dispersion-managed brightons, fiber amplifiers, laser systems and optical communication links with distributed dispersion and nonlinearity management.
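
    For orientation, variable-coefficient higher-order NLS models of this kind are commonly written in a form like the following; the coefficient naming is generic and not necessarily the authors' notation:

        i u_z + a(z) u_{tt} + b(z) |u|^2 u
            + i \left[ c(z) u_{ttt} + d(z) \left( |u|^2 u \right)_t
            + e(z) u \left( |u|^2 \right)_t \right] = i \Gamma(z) u,

    where a(z) and b(z) are the distributed group-velocity dispersion and nonlinearity, c(z) the third-order dispersion, d(z) the self-steepening, e(z) the self-frequency shift, and \Gamma(z) a gain/loss term.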

  9. Soliton-like solutions of a generalized variable-coefficient higher order nonlinear Schroedinger equation from inhomogeneous optical fibers with symbolic computation

    International Nuclear Information System (INIS)

    Li Juan; Zhang Haiqiang; Xu Tao; Zhang, Ya-Xing; Tian Bo

    2007-01-01

    For the long-distance communication and manufacturing problems in optical fibers, the propagation of subpicosecond or femtosecond optical pulses can be governed by the variable-coefficient nonlinear Schroedinger equation with higher order effects, such as the third-order dispersion, self-steepening and self-frequency shift. In this paper, we firstly determine the general conditions for this equation to be integrable by employing the Painleve analysis. Based on the obtained 3 x 3 Lax pair, we construct the Darboux transformation for such a model under the corresponding constraints, and then derive the nth-iterated potential transformation formula by the iterative process of Darboux transformation. Through the one- and two-soliton-like solutions, we graphically discuss the features of femtosecond solitons in inhomogeneous optical fibers

  10. New Parameters for Higher Accuracy in the Computation of Binding Free Energy Differences upon Alanine Scanning Mutagenesis on Protein-Protein Interfaces.

    Science.gov (United States)

    Simões, Inês C M; Costa, Inês P D; Coimbra, João T S; Ramos, Maria J; Fernandes, Pedro A

    2017-01-23

    Knowing how proteins make stable complexes enables the development of inhibitors to preclude protein-protein (P:P) binding. The identification of the specific interfacial residues that mostly contribute to protein binding, termed hot spots, is thus critical. Here, we refine an in silico alanine scanning mutagenesis protocol, based on a residue-dependent dielectric constant version of the Molecular Mechanics/Poisson-Boltzmann Surface Area method. We have used a large data set of structurally diverse P:P complexes to redefine the residue-dependent dielectric constants used in the determination of binding free energies. The accuracy of the method was validated through comparison with experimental data, considering the per-residue P:P binding free energy (ΔΔG_binding) differences upon alanine mutation. Different protocols were tested, i.e., a geometry optimization protocol and three molecular dynamics (MD) protocols: (1) one using explicit water molecules, (2) another with an implicit solvation model, and (3) a third where we have carried out an accelerated MD with explicit water molecules. Using a set of protein dielectric constants (within the range from 1 to 20) we showed that the dielectric constants of 7 for nonpolar and polar residues and 11 for charged residues (and histidine) provide optimal ΔΔG_binding predictions. An overall mean unsigned error (MUE) of 1.4 kcal/mol relative to the experiment was achieved in 210 mutations only with geometry optimization, which was further reduced with MD simulations (MUE of 1.1 kcal/mol for the MD employing explicit solvent). This recalibrated method allows for a better computational identification of hot spots, avoiding expensive and time-consuming experiments or thermodynamic integration/free energy perturbation/uBAR calculations, and will hopefully help new drug discovery campaigns in their quest to find spots of interest for binding small drug-like molecules at P:P interfaces.
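
    The benchmarked quantity is the change in binding free energy when an interfacial residue is mutated to alanine. Schematically, in a standard MM/PBSA-style decomposition (written here for illustration, not as the authors' exact formulation):

        \Delta\Delta G_{\mathrm{binding}}
            = \Delta G_{\mathrm{bind}}^{\mathrm{Ala\,mutant}}
            - \Delta G_{\mathrm{bind}}^{\mathrm{wild\,type}},
        \qquad
        \Delta G_{\mathrm{bind}} = G_{\mathrm{complex}} - G_{\mathrm{receptor}} - G_{\mathrm{ligand}},

    with each G estimated as E_MM + G_solv(eps_int) - TS, where the internal dielectric constant eps_int is residue-dependent (7 for nonpolar and polar residues, 11 for charged residues and histidine, per the recalibration above).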

  11. Reducing computational costs in large scale 3D EIT by using a sparse Jacobian matrix with block-wise CGLS reconstruction

    International Nuclear Information System (INIS)

    Yang, C L; Wei, H Y; Soleimani, M; Adler, A

    2013-01-01

    Electrical impedance tomography (EIT) is a fast and cost-effective technique to provide a tomographic conductivity image of a subject from boundary current–voltage data. This paper proposes a time and memory efficient method for solving a large scale 3D EIT inverse problem using a parallel conjugate gradient (CG) algorithm. A 3D EIT system with a large number of measurement data can produce a large Jacobian matrix; this could cause difficulties in computer storage and the inversion process. One of the challenges in 3D EIT is to decrease the reconstruction time and memory usage while at the same time retaining the image quality. Firstly, a sparse matrix reduction technique is proposed using thresholding to set very small values of the Jacobian matrix to zero. By adjusting the Jacobian matrix into a sparse format, the zero elements are eliminated, which results in a saving of memory requirement. Secondly, a block-wise CG method for parallel reconstruction has been developed. The proposed method has been tested using simulated data as well as experimental test samples. A sparse Jacobian with a block-wise CG enables the large scale EIT problem to be solved efficiently. Image quality measures are presented to quantify the effect of sparse matrix reduction on reconstruction results. (paper)
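
    The two ingredients, thresholding the Jacobian into sparse form and solving the regularized normal equations with conjugate gradients (CGLS-style), can be sketched with scipy. The threshold and regularization values are illustrative, not the paper's settings:

        import numpy as np
        from scipy import sparse
        from scipy.sparse.linalg import cg

        def sparsify(J, rel_threshold=1e-3):
            """Zero Jacobian entries below rel_threshold * max|J|; store CSR."""
            Jt = J.copy()
            Jt[np.abs(Jt) < rel_threshold * np.abs(Jt).max()] = 0.0
            return sparse.csr_matrix(Jt)

        def cgls_step(J_sparse, dv, reg=1e-4, maxiter=50):
            """Solve (J^T J + reg I) dx = J^T dv by conjugate gradients."""
            n = J_sparse.shape[1]
            A = (J_sparse.T @ J_sparse + reg * sparse.eye(n)).tocsr()
            dx, _ = cg(A, J_sparse.T @ dv, maxiter=maxiter)
            return dx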

  12. Reducing computational costs in large scale 3D EIT by using a sparse Jacobian matrix with block-wise CGLS reconstruction.

    Science.gov (United States)

    Yang, C L; Wei, H Y; Adler, A; Soleimani, M

    2013-06-01

    Electrical impedance tomography (EIT) is a fast and cost-effective technique to provide a tomographic conductivity image of a subject from boundary current-voltage data. This paper proposes a time and memory efficient method for solving a large scale 3D EIT inverse problem using a parallel conjugate gradient (CG) algorithm. A 3D EIT system with a large number of measurement data can produce a large Jacobian matrix; this could cause difficulties in computer storage and the inversion process. One of the challenges in 3D EIT is to decrease the reconstruction time and memory usage while at the same time retaining the image quality. Firstly, a sparse matrix reduction technique is proposed using thresholding to set very small values of the Jacobian matrix to zero. By adjusting the Jacobian matrix into a sparse format, the zero elements are eliminated, which results in a saving of memory requirement. Secondly, a block-wise CG method for parallel reconstruction has been developed. The proposed method has been tested using simulated data as well as experimental test samples. A sparse Jacobian with a block-wise CG enables the large scale EIT problem to be solved efficiently. Image quality measures are presented to quantify the effect of sparse matrix reduction on reconstruction results.

  13. Higher Education.

    Science.gov (United States)

    Hendrickson, Robert M.

    This chapter reports on 1982 court cases involving aspects of higher education. Cases of interest dealt with the federal government's authority to regulate state employees' retirement and raised the questions of whether Title IX covers employment, whether financial aid makes a college a program under Title IX, and whether sex segregated mortality…

  14. Costs of traffic injuries

    DEFF Research Database (Denmark)

    Kruse, Marie

    2015-01-01

    assessed using Danish national healthcare registers. Productivity costs were computed using duration analysis (Cox regression models). In a subanalysis, cost per severe traffic injury was computed for the 12 995 individuals that experienced a severe injury. RESULTS: The socioeconomic cost of a traffic...... injury was €1406 (2009 price level) in the first year, and €8950 over a 10-year period. Per 100 000 population, the 10-year cost was €6 565 668. A severe traffic injury costs €4969 per person in the first year, and €4 006 685 per 100 000 population over a 10-year period. Victims of traffic injuries...

  15. New Federal Cost Accounting Regulations

    Science.gov (United States)

    Wolff, George J.; Handzo, Joseph J.

    1973-01-01

    Discusses a new set of indirect cost accounting procedures which must be followed by school districts wishing to recover any indirect costs of administering federal grants and contracts. Also discusses the amount of indirect costs that may be recovered, computing indirect costs, classifying project costs, and restricted grants. (Author/DN)

  16. From Deposit to Point Cloud – a Study of Low-Cost Computer Vision Approaches for the Straightforward Documentation of Archaeological Excavations

    Directory of Open Access Journals (Sweden)

    M. Doneus

    2011-12-01

    Stratigraphic archaeological excavations demand high-resolution documentation techniques for 3D recording. Today, this is typically accomplished using total stations or terrestrial laser scanners. This paper demonstrates the potential of another technique that is low-cost and easy to execute. It takes advantage of software using Structure from Motion (SfM) algorithms, which are known for their ability to reconstruct camera pose and three-dimensional scene geometry (rendered as a sparse point cloud) from a series of overlapping photographs captured by a camera moving around the scene. When complemented by stereo matching algorithms, detailed 3D surface models can be built from such relatively oriented photo collections in a fully automated way. The absolute orientation of the model can be derived by the manual measurement of control points. The approach is extremely flexible and appropriate for dealing with a wide variety of imagery, because this computer vision approach can also work with imagery resulting from a randomly moving camera (i.e. uncontrolled conditions), and calibrated optics are not a prerequisite. For a few years now, these algorithms have been embedded in several free and low-cost software packages. This paper outlines how such a program can be applied to map archaeological excavations in a very fast and uncomplicated way, using imagery shot with a standard compact digital camera (even if the images were not taken for this purpose). Archived data from previous excavations of VIAS-University of Vienna has been chosen, and the derived digital surface models and orthophotos have been examined for their usefulness for archaeological applications. The absolute georeferencing of the resulting surface models was performed with the manual identification of fourteen control points. In order to express the positional accuracy of the generated 3D surface models, the NSSDA guidelines were applied. Simultaneously acquired terrestrial laser scanning data…

  17. Quality of Higher Education

    DEFF Research Database (Denmark)

    Zou, Yihuan

    Quality in higher education was not invented in recent decades – universities have always possessed mechanisms for assuring the quality of their work. The rising concern over quality is closely related to the changes in higher education and its social context. Among others, the most conspicuous changes are the massive expansion, diversification and increased cost in higher education, and new mechanisms of accountability initiated by the state. With these changes the traditional internally enacted academic quality-keeping has been given an important external dimension – quality assurance. This work is about constructing a more inclusive understanding of quality in higher education through combining the macro, meso and micro levels, i.e. from the perspectives of national policy, higher education institutions as organizations in society, and individual teaching staff and students. It covers both…

  18. Solving computationally expensive engineering problems

    CERN Document Server

    Leifsson, Leifur; Yang, Xin-She

    2014-01-01

    Computational complexity is a serious bottleneck for the design process in virtually any engineering area. While migration from prototyping and experiment-based design validation to verification using computer simulation models is inevitable and has a number of advantages, the high computational cost of accurate, high-fidelity simulations can be a major issue that slows down the development of computer-aided design methodologies, particularly those exploiting automated design improvement procedures, e.g., numerical optimization. The continuous increase of available computational resources does not always translate into a shorter design cycle because of the growing demand for higher accuracy and the necessity to simulate larger and more complex systems. Accurate simulation of a single design of a given system may take as long as several hours, days or even weeks, which often makes design automation using conventional methods impractical or even prohibitive. Additional problems include numerical noise often pr...

  19. Accomplish the Application Area in Cloud Computing

    OpenAIRE

    Bansal, Nidhi; Awasthi, Amit

    2012-01-01

    In surveying the application areas of cloud computing, we find that the breadth of areas it covers is its main asset. At a top level, it is an approach to IT where many users, some even from different companies, get access to shared IT resources such as servers, routers and various file extensions, instead of each having their own dedicated servers. This offers many advantages, like lower costs and higher efficiency. Unfortunately there have been some high profile incidents whe...

  20. Aufwandsanalyse für computerunterstützte Multiple-Choice Papierklausuren [Cost analysis for computer supported multiple-choice paper examinations

    Directory of Open Access Journals (Sweden)

    Mandel, Alexander

    2011-11-01

    Introduction: Multiple-choice examinations are still fundamental for assessment in medical degree programs. In addition to content-related research, the optimization of the technical procedure is an important question. Medical examiners face three options: paper-based examinations with or without computer support, or completely electronic examinations. Critical aspects are the effort for formatting, the logistic effort during the actual examination, the quality, promptness and effort of the correction, the time for making the documents available for inspection by the students, and the statistical analysis of the examination results. Methods: For the past three semesters, a computer program for the input and formatting of MC questions in medical and other paper-based examinations has been used and continuously improved at Wuerzburg University. In the winter semester (WS) 2009/10 eleven, in the summer semester (SS) 2010 twelve, and in WS 2010/11 thirteen medical examinations were conducted with the program and automatically evaluated. For the last two semesters the remaining manual workload was recorded. Results: The effort for formatting and subsequent analysis, including adjustments, of an average examination with about 140 participants and about 35 questions was 5-7 hours for exams without complications in WS 2009/10, about 2 hours in SS 2010, and about 1.5 hours in WS 2010/11. Including exams with complications, the average time was about 3 hours per exam in SS 2010 and 2.67 hours in WS 10/11. Discussion: For conventional multiple-choice exams, the computer-based formatting and evaluation of paper-based exams offers a significant time reduction for lecturers in comparison with the manual correction of paper-based exams, and compared to purely electronically conducted exams it needs a much simpler technological infrastructure and fewer staff during the exam.

  1. Cloud Computing Adoption and Usage in Community Colleges

    Science.gov (United States)

    Behrend, Tara S.; Wiebe, Eric N.; London, Jennifer E.; Johnson, Emily C.

    2011-01-01

    Cloud computing is gaining popularity in higher education settings, but the costs and benefits of this tool have gone largely unexplored. The purpose of this study was to examine the factors that lead to technology adoption in a higher education setting. Specifically, we examined a range of predictors and outcomes relating to the acceptance of a…

  2. Controlling Health Care Costs

    Science.gov (United States)

    Dessoff, Alan

    2009-01-01

    This article examines issues on health care costs and describes measures taken by public districts to reduce spending. As in most companies in America, health plan designs in public districts are being changed to reflect higher out-of-pocket costs, such as higher deductibles on visits to providers, hospital stays, and prescription drugs. District…

  3. Controlling costs without compromising quality: paying hospitals for total knee replacement.

    Science.gov (United States)

    Pine, Michael; Fry, Donald E; Jones, Barbara L; Meimban, Roger J; Pine, Gregory J

    2010-10-01

    Unit costs of health services are substantially higher in the United States than in any other developed country in the world, without a correspondingly healthier population. An alternative payment structure, especially for high volume, high cost episodes of care (eg, total knee replacement), is needed to reward high quality care and reduce costs. The National Inpatient Sample of administrative claims data was used to measure risk-adjusted mortality, postoperative length-of-stay, costs of routine care, adverse outcome rates, and excess costs of adverse outcomes for total knee replacements performed between 2002 and 2005. Empirically identified inefficient and ineffective hospitals were then removed to create a reference group of high-performance hospitals. Predictive models for outcomes and costs were recalibrated to the reference hospitals and used to compute risk-adjusted outcomes and costs for all hospitals. Per case predicted costs were computed and compared with observed costs. Of the 688 hospitals with acceptable data, 62 failed to meet effectiveness criteria and 210 were identified as inefficient. The remaining 416 high-performance hospitals had 13.4% fewer risk-adjusted adverse outcomes (4.56% vs. 3.95%) and lower risk-adjusted costs ($12,773 vs. $11,512) than the full cohort. A payment system based on the demonstrated performance of effective, efficient hospitals can produce sizable cost savings without jeopardizing quality. In this study, 96% of total excess hospital costs resulted from higher routine costs at inefficient hospitals, whereas only 4% was associated with ineffective care.

  4. Applied cost allocation

    DEFF Research Database (Denmark)

    Bogetoft, Peter; Hougaard, Jens Leth; Smilgins, Aleksandrs

    2016-01-01

    This paper deals with empirical computation of Aumann–Shapley cost shares for joint production. We show that if one uses a mathematical programing approach with its non-parametric estimation of the cost function there may be observations in the data set for which we have multiple Aumann–Shapley p...

  5. Can Food Stamps Do More to Improve Food Choices? An Economic Perspective--Higher Cost of Food in Some Areas May Affect Food Stamp Households' Ability To Make Healthy Food Choices

    OpenAIRE

    Nord, Mark; Hopwood, Heather

    2007-01-01

    The cost of “enough food,” estimated from the amount that low- and medium-income households in a geographic area report needing to spend to just meet their food needs, differs substantially across States and among metropolitan areas. In areas with high food costs, many food-stamp recipients are likely to have inadequate food resources to support healthy food choices.

  6. Applied cost allocation

    DEFF Research Database (Denmark)

    Bogetoft, Peter; Hougaard, Jens Leth; Smilgins, Aleksandrs

    2016-01-01

    This paper deals with empirical computation of Aumann–Shapley cost shares for joint production. We show that if one uses a mathematical programing approach with its non-parametric estimation of the cost function there may be observations in the data set for which we have multiple Aumann–Shapley p...... of assumptions concerning firm behavior. These assumptions enable us to connect inefficient with efficient production and thereby provide consistent ways of allocating the costs arising from inefficiency....

  7. National Variation in Urethroplasty Cost and Predictors of Extreme Cost: A Cost Analysis with Policy Implications

    OpenAIRE

    Harris, Catherine R.; Osterberg, E. Charles; Sanford, Thomas; Alwaal, Amjad; Gaither, Thomas W.; McAninch, Jack W.; McCulloch, Charles E.; Breyer, Benjamin N.

    2016-01-01

    To determine which factors are associated with higher costs of the urethroplasty procedure and whether these factors have been increasing over time. Identification of determinants of extreme costs may help reduce cost while maintaining quality. We conducted a retrospective analysis using the 2001-2010 Healthcare Cost and Utilization Project-Nationwide Inpatient Sample (HCUP-NIS). The HCUP-NIS captures hospital charges, which we converted to cost using the HCUP cost-to-charge ratio. Log cost linear ...

  8. Design, Construction, and Use of a Single Board Computer Beowulf Cluster: Application of the Small-Footprint, Low-Cost, InSignal 5420 Octa Board

    OpenAIRE

    Cusick, James J.; Miller, William; Laurita, Nicholas; Pitt, Tasha

    2014-01-01

    In recent years, development in the area of single board computing has been advancing rapidly. At Wolters Kluwer's Corporate Legal Services Division, a prototyping effort was undertaken to establish the utility of such devices for practical and general computing needs. This paper presents the background of this work, the design and construction of a 64 core 96 GHz cluster, and the possibility of yielding approximately 400 GFLOPs from a set of small footprint InSignal boards created for just o...

  9. Specialized computer architectures for computational aerodynamics

    Science.gov (United States)

    Stevenson, D. K.

    1978-01-01

    In recent years, computational fluid dynamics has made significant progress in modelling aerodynamic phenomena. Currently, one of the major barriers to future development lies in the compute-intensive nature of the numerical formulations and the relatively high cost of performing these computations on commercially available general purpose computers, a cost that is high with respect to dollar expenditure and/or elapsed time. Today's computing technology will support a program designed to create specialized computing facilities dedicated to the important problems of computational aerodynamics. One of the still unresolved questions is the organization of the computing components in such a facility. The characteristics of fluid dynamic problems which will have significant impact on the choice of computer architecture for a specialized facility are reviewed.

  10. [Cost-benefit analysis of cranial computed tomography in mild traumatic brain injury--appropriate depiction within the G-DRG system?].

    Science.gov (United States)

    Garving, C; Weber, C D; Poßelt, S; Pishnamaz, M; Pape, H C; Dienstknecht, T

    2014-06-01

    The treatment of patients with mild head injury is associated with a persistent funding shortfall. The current investigation summarises the radiological costs of patients from a level I trauma centre and discusses the indication for CT scanning within the G-DRG system. The study includes all patients who underwent a CCT scan in 2011. Diagnosis, length of stay and cost data were recorded for every patient. Finally, frequent diagnosis groups were summarised into clusters (Basis-DRG/MDC 21A). A total of 380 patients were treated. Within the largest group (G-DRG B80Z), the costs for a CCT already took up one quarter of the total proceeds. In combination with the high cost of monitoring patients with mild head injuries, this causes an ongoing funding deficit. Although a CCT investigation is often necessary in mild head injuries, the earnings do not cover the costs of these patients. To improve the situation, clear guidelines for CCT scanning should be provided, and the reimbursement, in particular in the diagnosis group G-DRG B80Z, has to be improved. Georg Thieme Verlag KG Stuttgart · New York.

  11. Middleware enabling computational self-reflection: exploring the need for and some costs of selfreflecting networks with application to homeland defense

    Science.gov (United States)

    Kramer, Michael J.; Bellman, Kirstie L.; Landauer, Christopher

    2002-07-01

    This paper will review and examine the definitions of Self-Reflection and Active Middleware. It will then illustrate a conceptual framework for understanding and enumerating the costs of Self-Reflection and Active Middleware at increasing levels of application, and review some applications of Self-Reflection and Active Middleware to simulations. Finally, it will consider the application, and the additional kinds of costs, of applying Self-Reflection and Active Middleware to sharing information among the organizations expected to participate in Homeland Defense.

  12. What Does It Cost a University to Educate One Student

    Directory of Open Access Journals (Sweden)

    Maria Andrea Lotho Santiago

    2007-02-01

    A dilemma administrators continually face is whether to continue offering degree programs despite low student uptake, especially because producing reliable cost data to aid decision making can prove difficult. Often, a university determines a standard cost per credit or unit and uses this figure as a basis for computing the total cost of running a degree program. This is then compared to a revenue stream, and the difference, whether positive or negative, is used in decision making. However, this method of computing costs, although appealing for its simplicity, may fail to capture the effects of economies that may arise as one school or college services another. In this paper, we use a basic cost accounting methodology applied to the higher education system of the Philippines to compute a cost per degree per student for a sample of public and private universities. Although the methodology is more time consuming, the computed figures are deemed closer to actual costs and, thus, we argue, are more reliable as inputs to financial decision making.
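
    A toy contrast between the standard cost-per-credit method and a costing that charges service teaching to the school actually delivering it might look like this. All names and numbers are hypothetical, not figures from the paper.

```python
# Naive per-credit costing vs. a costing that captures cross-college service
# teaching. Unit costs and the credit plan below are invented for illustration.

def naive_cost_per_degree(cost_per_credit: float, credits_required: int) -> float:
    """Standard method: one blended unit cost times the program's credits."""
    return cost_per_credit * credits_required

def serviced_cost_per_degree(credit_plan: dict, unit_costs: dict) -> float:
    """Charge each block of credits at the unit cost of the school that
    actually teaches it, capturing cross-college service teaching."""
    return sum(credits * unit_costs[school] for school, credits in credit_plan.items())

# Example: an engineering degree taking its math credits from the science school.
plan = {"engineering": 90, "science": 30}
unit_costs = {"engineering": 120.0, "science": 70.0}   # cost per credit
print(naive_cost_per_degree(110.0, 120))               # 13200.0 (blended rate)
print(serviced_cost_per_degree(plan, unit_costs))      # 12900.0
```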

  13. Dissecting Costs of CT Study: Application of TDABC (Time-driven Activity-based Costing) in a Tertiary Academic Center.

    Science.gov (United States)

    Anzai, Yoshimi; Heilbrun, Marta E; Haas, Derek; Boi, Luca; Moshre, Kirk; Minoshima, Satoshi; Kaplan, Robert; Lee, Vivian S

    2017-02-01

    The lack of understanding of the real costs (not charges) of delivering healthcare services poses tremendous challenges for the containment of healthcare costs. In this study, we applied an established cost accounting method, time-driven activity-based costing (TDABC), to assess the costs of performing an abdomen and pelvis computed tomography (AP CT) in an academic radiology department and identified opportunities for improved efficiency in the delivery of this service. The study was exempt from institutional review board approval. TDABC utilizes process mapping tools from industrial engineering and activity-based costing. The process map outlines every step of discrete activity and the duration of use of clinical resources, personnel, and equipment. By multiplying the cost per unit of capacity by the required task time for each step, and summing each component cost, the overall cost of an AP CT is determined for patients in three settings: inpatient (IP), outpatient (OP), and emergency department (ED). The component costs to deliver an AP CT study were as follows: radiologist interpretation, 40.1%; other personnel (scheduler, technologist, nurse, pharmacist, and transporter), 39.6%; materials, 13.9%; and space and equipment, 6.4%. The cost of performing CT was 13% higher for ED patients and 31% higher for inpatients, as compared with outpatients. The difference in cost was mostly due to non-radiologist personnel costs. Approximately 80% of the direct costs of AP CT to the academic medical center are related to labor. Potential opportunities to reduce the costs include increasing the efficiency of CT utilization, substituting lower cost resources when appropriate, and streamlining the ordering system to clarify medical necessity and clinical indications. Copyright © 2017 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
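
    The TDABC recipe in this abstract, multiplying each step's capacity cost rate by the task time from the process map and summing, reduces to a few lines. The step names and rates below are invented for illustration, not the study's data.

```python
# Minimal TDABC sketch: cost of a study is the sum over process-map steps of
# (capacity cost rate per minute) x (minutes the step consumes).

def tdabc_cost(steps):
    """steps: iterable of (cost_per_minute, minutes) pairs from the process map."""
    return sum(rate * minutes for rate, minutes in steps)

ap_ct_outpatient = [
    (4.00, 5),    # scheduler
    (1.50, 10),   # transporter
    (3.00, 15),   # CT technologist
    (9.00, 8),    # radiologist interpretation
    (2.00, 12),   # scanner room (space and equipment)
]
print(f"AP CT cost: ${tdabc_cost(ap_ct_outpatient):.2f}")  # $176.00 here
```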

  14. Software para estimativa do custo operacional de máquinas agrícolas - MAQCONTROL Development of software to compute operational costs of farm machinery - MAQCONTROL

    Directory of Open Access Journals (Sweden)

    Liane Piacentini

    2012-06-01

    Selection and optimization of mechanized systems are the main objectives of rational mechanization. An adequate purchase of agricultural machinery is not sufficient if its use is not controlled in operational and financial terms. This work describes the development of software to estimate the operational costs of agricultural machinery (MAQCONTROL), using the Borland Delphi development environment and the Firebird database. The operational costs were divided into fixed and variable. For fixed costs, the expenses for depreciation, interest, storage and insurance were estimated. For variable costs, the emphasis was on maintenance expenses: lubricating oils, filters, tires, grease, fuel, small repairs, and parts replacement. The results demonstrated the software's efficiency for the proposed objectives. Thus, MAQCONTROL can be an important tool in farm management, as it reduces information costs and speeds up the precise determination of the operational costs of agricultural machinery.
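
    A minimal sketch of the fixed/variable split described above is given below, using the standard straight-line depreciation and average-investment interest conventions. These are textbook formulas and may differ in detail from MAQCONTROL's own; all numbers are placeholders.

```python
# Fixed costs (depreciation, interest, housing, insurance) plus variable costs
# per operating hour; conventions are the usual farm-machinery ones, assumed.

def fixed_costs_per_year(purchase_price, salvage_value, life_years,
                         interest_rate, housing_insurance_rate):
    depreciation = (purchase_price - salvage_value) / life_years
    # interest is charged on the average investment over the machine's life
    interest = interest_rate * (purchase_price + salvage_value) / 2
    housing_insurance = housing_insurance_rate * purchase_price
    return depreciation + interest + housing_insurance

def operating_cost(hours, fixed_per_year, variable_per_hour):
    """Total yearly cost: fixed costs plus variable costs (fuel, oil, repairs...)."""
    return fixed_per_year + variable_per_hour * hours

yearly_fixed = fixed_costs_per_year(100_000, 20_000, 10, 0.06, 0.02)
print(operating_cost(800, yearly_fixed, 45.0))  # hypothetical tractor numbers
```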

  15. Development of analysis method of material f low cost accounting using lean technique in food production: A case study of Universal Food Public (UFC Co.,Ltd.

    Directory of Open Access Journals (Sweden)

    Wichai Chattinnawat

    2015-06-01

    This research aims to apply the Lean technique in conjunction with Material Flow Cost Accounting (MFCA) analysis to the production process of canned sweet corn in order to increase process efficiency, eliminate waste, and reduce production cost. The research develops and presents a new type of MFCA analysis by incorporating value-added and non-value-added activities into the MFCA cost allocation process. According to the simulation-based measurement of process efficiency, integrated cost allocation based on activity types results in a higher proportion of negative product cost in comparison to that computed from conventional MFCA cost allocation. Thus, the types of activities and the process efficiency have great impact on the cost structure, especially on the negative product cost. The research leads to solutions that improve work procedures, eliminate waste and reduce production cost. The overall cost per unit decreases with a higher proportion of positive product cost.
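
    A hedged sketch of the integrated allocation idea, splitting input cost into positive product cost (material that ends up in product, value-added work) and negative product cost (losses and non-value-added work), could look like the following. The multiplicative weighting is an illustrative assumption, not the paper's exact scheme.

```python
# Split input costs into positive and negative product cost, weighting by both
# material yield and the share of value-added activity time (assumed scheme).

def mfca_split(input_cost, material_yield, value_added_share):
    """material_yield: fraction of material mass reaching the product.
    value_added_share: fraction of activity time classified as value-added."""
    weight = material_yield * value_added_share
    positive = input_cost * weight
    negative = input_cost - positive
    return positive, negative

pos, neg = mfca_split(input_cost=1_000_000, material_yield=0.85, value_added_share=0.9)
print(f"positive: {pos:.0f}, negative: {neg:.0f}")  # 765000 vs 235000
```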

  16. Testing the Limits of the Price Elasticity of Potential Students at Colleges and Universities: Has the Increased Direct Cost to the Student Begun to Drive down Higher Education Enrolment?

    Science.gov (United States)

    Fincher, Mark; Katsinas, Stephen

    2017-01-01

    Higher education enrolment has long been known to rise and fall counter to the current economic situation. This counter-cyclical enrolment response represents an economic principle whereby a price-elastic consumer is more likely to make a consumption choice when another valuable use of resources is not available. Higher unemployment has historically…

  17. The disarmament cost

    International Nuclear Information System (INIS)

    Cattaneo, M.

    1996-01-01

    War is costly, but the cost of peace is even higher. The destruction of weapons (mines, nuclear weapons, chemical weapons) is much more expensive than their manufacture. The cost of demobilizing soldiers is enormous: for instance, in Angola, Mozambique, Nicaragua and Zimbabwe the demobilization of 270000 soldiers cost 2.5×10⁹ francs. The measures intended to reduce the risk of war are also expensive; that is why the arsenal of the former USSR is still intact. Today no international agency is entirely dedicated to peace building. The question is: how much would such an agency cost? (O.L.). 5 refs., 2 figs

  18. Curriculum Design Concept and Ideas for the Course of Engineering Hydrology and Water Conservancy Computation in Higher Vocational Education

    Institute of Scientific and Technical Information of China (English)

    黄泽钧

    2012-01-01

    Curriculum reform is the core of education and teaching reform, and also its difficult point. This paper discusses the curriculum design of Engineering Hydrology and Water Conservancy Computation in higher vocational education from several aspects: the teaching objective, the design concept, the design ideas, and the main content and principles of the design. It proposes that curriculum design should form curriculum objectives based on the requirements of the working position, curriculum content based on the cultivation of students' professional ability, curriculum organization based on teaching rules, and curriculum resources adaptable to students' autonomous learning.

  19. Computational Literacy

    DEFF Research Database (Denmark)

    Chongtay, Rocio; Robering, Klaus

    2016-01-01

    In recent years, there has been a growing interest in and recognition of the importance of Computational Literacy, a skill generally considered to be necessary for success in the 21st century. While much research has concentrated on requirements, tools, and teaching methodologies for the acquisition of Computational Literacy at basic educational levels, focus on higher levels of education has been much less prominent. The present paper considers the case of courses for higher education programs within the Humanities. A model is proposed which conceives of Computational Literacy as a layered…

  20. Cloud computing for radiologists

    OpenAIRE

    Amit T Kharat; Amjad Safvi; S S Thind; Amarjit Singh

    2012-01-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software, hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as...

  1. Cost accounting in ECN

    International Nuclear Information System (INIS)

    Wout, E.L.; Bever Donker, J.M. van.

    1979-01-01

    A five-year plan is made in which the available money is distributed to the expected programmes. This five-year plan is used as the basis for the working plan and budget for the next year. In the working plan all financial means are divided into kinds of costs, cost centres and cost units. Based on this working plan and the relevant budgets, the tariffs are calculated per working centre (cost centre). The tariffs are fixed for a whole year. Up till now these tariffs have also been the basis for the cost unit accounting at the end of the year, together with the results of the time registration. The estimated workshop services for the working centres are included in the tariffs. For the allocation of overhead costs ECN uses dynamic keys. Depreciation costs with respect to instruments, investments etc. are determined per working centre by a computer programme. The cost-unit-related costs are charged directly to the cost unit. This implies that project-related instruments are looked upon as running costs. In the future we will try to refine the present cost accounting system further, in such a way that a cost centre is looked upon as a profit centre. Furthermore, we will try to analyse the tariff and calculation deviations and under/over-occupation deviations afterwards (post-calculation). The information provided to the management has a hierarchic structure: project information goes to the project leader, programme (compound projects) information to the programme coordinator, cost centre summaries to department heads, attention area (compound programme) information to the programme coordinator and managing director, ECN research (compound attention areas) information to general management, and information on kinds of costs to the relevant persons, e.g. surveys of expenditure for part-time personnel to the personnel bureau. The information is provided by the department of Finance and Administrative Organisation. The entire scope of cost accounting is the responsibility of the head of the department

  2. Cost benefit analysis vs. referenda

    OpenAIRE

    Martin J. Osborne; Matthew A. Turner

    2007-01-01

    We consider a planner who chooses between two possible public policies and ask whether a referendum or a cost benefit analysis leads to higher welfare. We find that a referendum leads to higher welfare than a cost benefit analysis in "common value" environments. Cost benefit analysis is better in "private value" environments.

  3. A computer simulation model of the cost-effectiveness of routine Staphylococcus aureus screening and decolonization among lung and heart-lung transplant recipients.

    Science.gov (United States)

    Clancy, C J; Bartsch, S M; Nguyen, M H; Stuckey, D R; Shields, R K; Lee, B Y

    2014-06-01

    Our objective was to model the cost-effectiveness and economic value of routine peri-operative Staphylococcus aureus screening and decolonization of lung and heart-lung transplant recipients from hospital and third-party payer perspectives. We used clinical data from 596 lung and heart-lung transplant recipients to develop a model in TreeAge Pro 2009 (Williamsport, MA, USA). Sensitivity analyses varied S. aureus colonization rate (5-15 %), probability of infection if colonized (10-30 %), and decolonization efficacy (25-90 %). Data were collected from the Cardiothoracic Transplant Program at the University of Pittsburgh Medical Center. Consecutive lung and heart-lung transplant recipients from January 2006 to December 2010 were enrolled retrospectively. Baseline rates of S. aureus colonization, infection and decolonization efficacy were 9.6 %, 36.7 %, and 31.9 %, respectively. Screening and decolonization was economically dominant for all scenarios tested, providing more cost savings and health benefits than no screening. Savings per case averted (2012 $US) ranged from $73,567 to $133,157 (hospital perspective) and $10,748 to $16,723 (third party payer perspective), varying with the probability of colonization, infection, and decolonization efficacy. Using our clinical data, screening and decolonization led to cost savings per case averted of $240,602 (hospital perspective) and averted 6.7 S. aureus infections (4.3 MRSA and 2.4 MSSA); 89 patients needed to be screened to prevent one S. aureus infection. Our data support routine S. aureus screening and decolonization of lung and heart-lung transplant patients. The economic value of screening and decolonization was greater than in previous models of other surgical populations.
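
    The decision-tree comparison underlying such a model can be sketched in a few lines. The colonization rate, infection probability, and decolonization efficacy below are the baseline rates quoted in the abstract (9.6%, 36.7%, 31.9%); the dollar figures are placeholders, not the study's cost data.

```python
# Back-of-the-envelope expected-cost comparison of "screen and decolonize"
# versus "no screening", mirroring the tree structure described in the abstract.

def expected_cost(colonization_rate, p_infection_if_colonized,
                  decolonization_efficacy, cost_infection,
                  cost_screening, screen=True):
    p_infection = colonization_rate * p_infection_if_colonized
    if screen:
        # decolonization averts a fraction of the infections that would occur
        p_infection *= (1.0 - decolonization_efficacy)
        return cost_screening + p_infection * cost_infection
    return p_infection * cost_infection

no_screen = expected_cost(0.096, 0.367, 0.319, 100_000, 50, screen=False)
screened = expected_cost(0.096, 0.367, 0.319, 100_000, 50, screen=True)
print(f"per-patient savings: ${no_screen - screened:,.2f}")
```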

  4. Can Broader Diffusion of Value-Based Insurance Design Increase Benefits from US Health Care without Increasing Costs? Evidence from a Computer Simulation Model

    OpenAIRE

    Scott Braithwaite, R.; Omokaro, Cynthia; Justice, Amy C.; Nucifora, Kimberly; Roberts, Mark S.

    2010-01-01

    Editors' Summary. Background: More money is spent per person on health care in the US than in any other country. US health care expenditure accounts for 16.2% of the gross domestic product, and this figure is rising. Indeed, the increase in health care costs is outstripping the economy's growth rate. Consequently, US policy makers and providers of health insurance (health care in the US is largely provided by the private sector and is paid for through private health insurance or through governmen...

  5. Combining computational modelling with radioisotope technology for a more cost- effective and time-efficient method of solving industrial and medical diagnostic problems

    International Nuclear Information System (INIS)

    Tu, J.Y.; Easey, J.F.; Burch, W.M.

    1997-01-01

    In this paper, some work on computational modelling for industrial operations and processes will be presented, for example, the modelling of fly-ash flow and the associated prediction of erosion in power utility boilers. The introduction and use of new formulations of encapsulated radioisotopes, currently being researched at ANSTO, will open up further possibilities for the utilisation of radiotracer applications for a wider range of validation work, not only in industrial but also in medical investigations. Applications of the developed models to solving industrial problems will also be discussed in the paper

  6. Cloud Computing for radiologists.

    Science.gov (United States)

    Kharat, Amit T; Safvi, Amjad; Thind, Ss; Singh, Amarjit

    2012-07-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software, hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets free radiology from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future.

  7. Cloud Computing for radiologists

    International Nuclear Information System (INIS)

    Kharat, Amit T; Safvi, Amjad; Thind, SS; Singh, Amarjit

    2012-01-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software, hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets free radiology from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future

  8. Cloud computing for radiologists

    Directory of Open Access Journals (Sweden)

    Amit T Kharat

    2012-01-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software, hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets free radiology from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future.

  9. What does an MRI scan cost?

    Science.gov (United States)

    Young, David W

    2015-11-01

    Historically, hospital departments have computed the costs of individual tests or procedures using the ratio of cost to charges (RCC) method, which can produce inaccurate results. To determine a more accurate cost of a test or procedure, the activity-based costing (ABC) method must be used. Accurate cost calculations will ensure reliable information about the profitability of a hospital's DRGs.
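
    The contrast between the two costing methods can be made concrete with a small sketch; the charge, ratio, and activity figures below are hypothetical.

```python
# RCC scales a charge by a department-wide cost-to-charge ratio; ABC sums the
# resources a specific scan actually consumes. Numbers here are made up.

def rcc_cost(charge: float, cost_to_charge_ratio: float) -> float:
    return charge * cost_to_charge_ratio

def abc_cost(activities) -> float:
    """activities: (resource cost per minute, minutes used) for one MRI scan."""
    return sum(rate * minutes for rate, minutes in activities)

print(rcc_cost(charge=2_500.0, cost_to_charge_ratio=0.35))   # 875.0
print(abc_cost([(5.0, 45), (8.0, 20), (1.2, 60)]))           # 457.0
```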

  10. 24 CFR 908.108 - Cost.

    Science.gov (United States)

    2010-04-01

    ... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Cost. 908.108 Section 908.108..., RENTAL VOUCHER, AND MODERATE REHABILITATION PROGRAMS § 908.108 Cost. (a) General. The costs of the... computer hardware or software, or both, the cost of contracting for those services, or the cost of...

  11. Cloud Computing Bible

    CERN Document Server

    Sosinsky, Barrie

    2010-01-01

    The complete reference guide to the hot technology of cloud computing. Its potential for lowering IT costs makes cloud computing a major force for both IT vendors and users; it is expected to gain momentum rapidly with the launch of Office Web Apps later this year. Because cloud computing involves various technologies, protocols, platforms, and infrastructure elements, this comprehensive reference is just what you need if you'll be using or implementing cloud computing. Cloud computing offers significant cost savings by eliminating upfront expenses for hardware and software; its growing popularit

  12. The Challenge of Computers.

    Science.gov (United States)

    Leger, Guy

    Computers may change teachers' lifestyles, teaching styles, and perhaps even their personal values. A brief survey of the history of computers demonstrates the incredible pace at which computer technology is moving ahead. The cost and size of microchips will continue to decline dramatically over the next 20 years, while the capability and variety…

  13. Volunteer Computing for Science Gateways

    OpenAIRE

    Anderson, David

    2017-01-01

    This poster offers information about volunteer computing for science gateways that offer high-throughput computing services. Volunteer computing can be used to obtain computing power; this increases the visibility of the gateway to the general public as well as increasing computing capacity at little cost.

  14. Brief: Managing computing technology

    International Nuclear Information System (INIS)

    Startzman, R.A.

    1994-01-01

    While computing is applied widely in the production segment of the petroleum industry, its effective application is the primary goal of computing management. Computing technology has changed significantly since the 1950's, when computers first began to influence petroleum technology. The ability to accomplish traditional tasks faster and more economically probably is the most important effect that computing has had on the industry. While speed and lower cost are important, are they enough? Can computing change the basic functions of the industry? When new computing technology is introduced improperly, it can clash with traditional petroleum technology. This paper examines the role of management in merging these technologies

  15. The cost of electrocoagulation

    Energy Technology Data Exchange (ETDEWEB)

    Donini, J.C.; Kan, J.; Szynkarczuk, J.; Hassan, T.A.; Kar, K.L.

    1993-01-01

    Electrocoagulation could be an attractive and suitable method for separating solids from waste water. The electrocoagulation of kaolinite and bentonite suspensions was studied in a pilot electrocoagulation unit to assess the cost and efficiency of the process. Factors affecting cost, such as the formation of passivation layers on the electrode plates and the recirculation and concentration of sodium chloride, were examined. Colorimetry was used to analyze the aluminium content of the suspension. The results were used to calculate the cost due to consumption of electrode material (aluminium) during the process. Total cost was assumed to comprise the energy cost and the cost of electrode material. Comparison was based on the settling properties of the treated product: turbidity, settling rate, and cake height. In most cases, aluminium efficiency averaged around 200% and material cost accounted for 80% of total cost. Although higher concentrations of sodium chloride could only slightly increase aluminium efficiency and electrode efficiency, the higher concentrations resulted in much greater total cost, due to the greater current generated by the increased suspension conductivity, which in turn dissolved a larger amount of aluminium. The recirculation loop increased the flow rate by 3-10 times, enhancing mass transport between the electrodes and resulting in lower cost and better settling properties. Over the course of two months the electrode coatings became thicker while efficiency decreased. The electrode efficiency was found to be as high as 94% for virgin electrodes and as low as 10% after two months. 8 refs., 25 figs., 9 tabs.
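
    Under the cost split stated above (total cost = energy plus consumed aluminium electrodes), a back-of-the-envelope model follows directly from Faraday's law, with a 200% "aluminium efficiency" meaning twice the Faradaic amount dissolves. The prices in the example are placeholders.

```python
# Total electrocoagulation cost = electrical energy + dissolved aluminium,
# with the theoretical aluminium dissolution given by Faraday's law.

F = 96_485.0        # C/mol, Faraday constant
M_AL = 26.98        # g/mol, molar mass of aluminium
Z_AL = 3            # electrons per dissolved Al atom

def aluminium_consumed_g(current_a, time_s, efficiency=2.0):
    """efficiency=2.0 encodes the ~200% aluminium efficiency reported above."""
    return efficiency * current_a * time_s * M_AL / (Z_AL * F)

def total_cost(current_a, voltage_v, time_s, price_kwh, price_al_per_kg,
               efficiency=2.0):
    energy_cost = current_a * voltage_v * time_s / 3.6e6 * price_kwh  # J -> kWh
    material_cost = (aluminium_consumed_g(current_a, time_s, efficiency)
                     / 1000 * price_al_per_kg)
    return energy_cost + material_cost

print(total_cost(current_a=50, voltage_v=10, time_s=3600,
                 price_kwh=0.08, price_al_per_kg=2.5))
```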

  16. Enabling Earth Science Through Cloud Computing

    Science.gov (United States)

    Hardman, Sean; Riofrio, Andres; Shams, Khawaja; Freeborn, Dana; Springer, Paul; Chafin, Brian

    2012-01-01

    Cloud Computing holds tremendous potential for missions across the National Aeronautics and Space Administration. Several flight missions are already benefiting from an investment in cloud computing for mission critical pipelines and services through faster processing time, higher availability, and drastically lower costs available on cloud systems. However, these processes do not currently extend to general scientific algorithms relevant to earth science missions. The members of the Airborne Cloud Computing Environment task at the Jet Propulsion Laboratory have worked closely with the Carbon in Arctic Reservoirs Vulnerability Experiment (CARVE) mission to integrate cloud computing into their science data processing pipeline. This paper details the efforts involved in deploying a science data system for the CARVE mission, evaluating and integrating cloud computing solutions with the system and porting their science algorithms for execution in a cloud environment.

  17. Research on cloud computing solutions

    OpenAIRE

    Liudvikas Kaklauskas; Vaida Zdanytė

    2015-01-01

    Cloud computing can be defined as a new style of computing in which dynamically scalable and often virtualized resources are provided as services over the Internet. Advantages of the cloud computing technology include cost savings, high availability, and easy scalability. Voas and Zhang adapted six phases of computing paradigms, from dummy terminals/mainframes, to PCs, to networking computing, to grid and cloud computing. There are four types of cloud computing: public cloud, private cloud, ...

  18. A new model predictive control algorithm by reducing the computing time of cost function minimization for NPC inverter in three-phase power grids.

    Science.gov (United States)

    Taheri, Asghar; Zhalebaghi, Mohammad Hadi

    2017-11-01

    This paper presents a new control strategy based on finite-control-set model predictive control (FCS-MPC) for neutral-point-clamped (NPC) three-level converters. Advantages such as fast dynamic response, easy inclusion of constraints, and a simple control loop make FCS-MPC attractive as a switching strategy for converters. However, the large amount of required computation hinders the widespread use of this method. To resolve this problem, this paper presents a modified method that effectively reduces the computational load compared with the conventional FCS-MPC method, while not affecting control performance. The proposed method can be used for exchanging power between the electrical grid and DC resources by providing active and reactive power compensation. Experiments on a three-level converter in three modes, power factor correction (PFC), inductive compensation, and capacitive compensation, verify the good and comparable performance. The results have been simulated using MATLAB/SIMULINK software. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
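
    For orientation, a generic FCS-MPC step (evaluate the cost function for every candidate switching state, apply the minimizer) can be sketched as below. The first-order current prediction is a textbook stand-in, not the paper's NPC-specific model, and the paper's contribution is precisely to shrink this exhaustive enumeration.

```python
# Conventional FCS-MPC step: exhaustively score each admissible converter
# voltage against the current reference and pick the cheapest one.
import numpy as np

def predict_current(i_now, v_candidate, v_grid, R, L, Ts):
    """Euler discretization of L di/dt = v_conv - v_grid - R i."""
    return i_now + (Ts / L) * (v_candidate - v_grid - R * i_now)

def fcs_mpc_step(i_now, i_ref, v_grid, candidate_voltages, R, L, Ts):
    costs = [
        np.sum(np.abs(i_ref - predict_current(i_now, v, v_grid, R, L, Ts)))
        for v in candidate_voltages
    ]
    return int(np.argmin(costs))  # index of the switching state to apply
```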

  19. Designer's unified cost model

    Science.gov (United States)

    Freeman, William T.; Ilcewicz, L. B.; Swanson, G. D.; Gutowski, T.

    1992-01-01

    A conceptual and preliminary designers' cost prediction model has been initiated. The model will provide a technically sound method for evaluating the relative cost of different composite structural designs, fabrication processes, and assembly methods that can be compared to equivalent metallic parts or assemblies. The feasibility of developing cost prediction software in a modular form for interfacing with state of the art preliminary design tools and computer aided design programs is being evaluated. The goal of this task is to establish theoretical cost functions that relate geometric design features to summed material cost and labor content in terms of process mechanics and physics. The output of the designers' present analytical tools will be input for the designers' cost prediction model to provide the designer with a data base and deterministic cost methodology that allows one to trade and synthesize designs with both cost and weight as objective functions for optimization. The approach, goals, plans, and progress are presented for development of COSTADE (Cost Optimization Software for Transport Aircraft Design Evaluation).

  20. Designers' unified cost model

    Science.gov (United States)

    Freeman, W.; Ilcewicz, L.; Swanson, G.; Gutowski, T.

    1992-01-01

    The Structures Technology Program Office (STPO) at NASA LaRC has initiated development of a conceptual and preliminary designers' cost prediction model. The model will provide a technically sound method for evaluating the relative cost of different composite structural designs, fabrication processes, and assembly methods that can be compared to equivalent metallic parts or assemblies. The feasibility of developing cost prediction software in a modular form for interfacing with state-of-the-art preliminary design tools and computer aided design programs is being evaluated. The goal of this task is to establish theoretical cost functions that relate geometric design features to summed material cost and labor content in terms of process mechanics and physics. The output of the designers' present analytical tools will be input for the designers' cost prediction model to provide the designer with a database and deterministic cost methodology that allows one to trade and synthesize designs with both cost and weight as objective functions for optimization. This paper presents the team members, approach, goals, plans, and progress to date for development of COSTADE (Cost Optimization Software for Transport Aircraft Design Evaluation).

  1. Computational Platform About Amazon Web Services (Aws Distributed Rendering

    Directory of Open Access Journals (Sweden)

    Gabriel Rojas-Albarracín

    2017-09-01

    Today, a dynamic has been created in which people demand higher image quality in different media formats (games, movies, animations). Higher definition usually requires processing larger images, which brings the need for increased computing power. This paper presents a case study of the implementation of a low-cost platform on the Amazon cloud for the parallel processing of images and animation.

  2. Today's Higher Education IT Workforce

    Science.gov (United States)

    Bichsel, Jacqueline

    2014-01-01

    The professionals making up the current higher education IT workforce have been asked to adjust to a culture of increased IT consumerization, more sourcing options, broader interest in IT's transformative potential, and decreased resources. Disruptions that include the bring-your-own-everything era, cloud computing, new management practices,…

  3. Cost Behavior

    DEFF Research Database (Denmark)

    Hoffmann, Kira

    The objective of this dissertation is to investigate determinants and consequences of asymmetric cost behavior. Asymmetric cost behavior arises if the change in costs is different for increases in activity compared to equivalent decreases in activity. In this case, costs are termed “sticky” if the change is less when activity falls than when activity rises, whereas costs are termed “anti-sticky” if the change is more when activity falls than when activity rises. Understanding such cost behavior is especially relevant for decision-makers and financial analysts that rely on accurate cost information to facilitate resource planning and earnings forecasting. As such, this dissertation relates to the topic of firm profitability and the interpretation of cost variability. The dissertation consists of three parts that are written in the form of separate academic papers. The following section briefly summarizes…

  4. Matching Cost Filtering for Dense Stereo Correspondence

    Directory of Open Access Journals (Sweden)

    Yimin Lin

    2013-01-01

    Dense stereo correspondence, enabling reconstruction of depth information in a scene, is of great importance in the field of computer vision. Recently, some local solutions based on matching cost filtering with an edge-preserving filter have proved capable of achieving more accuracy than global approaches. Unfortunately, the computational complexity of these algorithms is quadratically related to the window size used to aggregate the matching costs. The recent trend has been to pursue higher accuracy with greater efficiency in execution. Therefore, this paper proposes a new cost-aggregation module that computes the matching responses for all the image pixels at a set of sampling points generated by a hierarchical clustering algorithm. The complexity of this implementation is linear both in the number of image pixels and the number of clusters. Experimental results demonstrate that the proposed algorithm outperforms state-of-the-art local methods in terms of both accuracy and speed. Moreover, performance tests indicate that parameters such as the height of the hierarchical binary tree and the spatial and range standard deviations have a significant influence on time consumption and the accuracy of disparity maps.
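
    A toy version of the cost-volume pipeline (matching cost, aggregation, winner-takes-all) is sketched below, with a plain box filter standing in for the edge-preserving filter and without the paper's cluster-based sampling.

```python
# Toy dense stereo: absolute-difference cost volume, box-filter aggregation,
# winner-takes-all disparity. A sketch, not the paper's method.
import numpy as np
from scipy.ndimage import uniform_filter

def disparity_map(left, right, max_disp, win=9):
    """left, right: float grayscale images of equal shape (rectified pair)."""
    h, w = left.shape
    cost_volume = np.full((max_disp, h, w), np.inf)
    for d in range(max_disp):
        # absolute-difference matching cost at candidate disparity d
        diff = np.abs(left[:, d:] - right[:, : w - d])
        cost_volume[d, :, d:] = uniform_filter(diff, size=win)  # aggregation
    return np.argmin(cost_volume, axis=0)  # winner-takes-all disparity
```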

  5. How to Bill Your Computer Services.

    Science.gov (United States)

    Dooskin, Herbert P.

    1981-01-01

    A computer facility billing procedure should be designed so that the full costs of a computer center operation are equitably charged to the users. Design criteria, costing methods, and management's role are discussed. (Author/MLF)

  6. Higher Franz-Reidemeister torsion

    CERN Document Server

    Igusa, Kiyoshi

    2002-01-01

    The book is devoted to the theory of topological higher Franz-Reidemeister torsion in K-theory. The author defines the higher Franz-Reidemeister torsion based on Volodin's K-theory and Borel's regulator map. He describes its properties and generalizations and studies the relation between the higher Franz-Reidemeister torsion and other torsions used in K-theory: Whitehead torsion and Ray-Singer torsion. He also presents methods of computing higher Franz-Reidemeister torsion, illustrates them with numerous examples, and describes various applications of higher Franz-Reidemeister torsion, particularly for the study of homology of mapping class groups. Packed with up-to-date information, the book provides a unique research and reference tool for specialists working in algebraic topology and K-theory.

  7. Nuclear Energy Research Initiative Project No. 02 103 Innovative Low Cost Approaches to Automating QA/QC of Fuel Particle Production Using On Line Nondestructive Methods for Higher Reliability Final Project Report

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed, Salahuddin; Batishko, Charles R.; Flake, Matthew; Good, Morris S.; Mathews, Royce; Morra, Marino; Panetta, Paul D.; Pardini, Allan F.; Sandness, Gerald A.; Tucker, Brian J.; Weier, Dennis R.; Hockey, Ronald L.; Gray, Joseph N.; Saurwein, John J.; Bond, Leonard J.; Lowden, Richard A.; Miller, James H.

    2006-02-28

    This Nuclear Energy Research Initiative (NERI) project was tasked with exploring, adapting, developing and demonstrating innovative nondestructive test methods to automate nuclear coated particle fuel inspection, so as to provide the United States (US) with the improved and economical quality assurance and quality control (QA/QC) needed for the fuels of several reactor concepts being proposed for both near-term deployment [DOE NE & NERAC, 2001] and Generation IV nuclear systems. Replacing present-day QA/QC methods, performed manually and in many cases destructively, with higher speed automated nondestructive methods will make fuel production for advanced reactors economically feasible. For successful deployment of next generation reactors that employ particle fuels, or fuels in the form of pebbles based on particles, extremely large numbers of fuel particles will require inspection at throughput rates that do not significantly impact the proposed manufacturing processes. The focus of the project is nondestructive examination (NDE) technologies that can be automated for production speeds and make either (I) on-process measurements or (II) in-line measurements. The inspection technologies selected will enable particle “quality” qualification as a particle or group of particles passes a sensor. A multiple-attribute-dependent signature will be measured and used for qualification or process control decisions. A primary task for achieving this objective is to establish standard signatures for both good/acceptable particles and the most problematic types of defects using several nondestructive methods.

  8. Attrition Cost Model Instruction Manual

    Science.gov (United States)

    Yanagiura, Takeshi

    2012-01-01

    This instruction manual explains in detail how to use the Attrition Cost Model program, which estimates the cost of student attrition for a state's higher education system. Programmed in SAS, the model allows users to instantly calculate the cost of attrition and the cumulative attrition rate based on the most recent retention and…
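
    The manual itself is SAS-based; purely for illustration, the Python sketch below shows the kind of calculation involved, deriving a cumulative attrition rate from year-to-year retention rates and costing the student-years consumed by leavers. The retention rates, cohort size, and per-student cost are invented, and the real model's formulas may differ.

```python
# Illustrative attrition-cost calculation (not the SAS model itself).
def attrition_cost(cohort_size, retention_rates, cost_per_student_year):
    enrolled = float(cohort_size)
    lost_student_years = 0.0
    for year, rate in enumerate(retention_rates, start=1):
        leavers = enrolled * (1 - rate)
        lost_student_years += leavers * year   # years consumed before leaving
        enrolled *= rate
    cumulative_attrition = 1 - enrolled / cohort_size
    return lost_student_years * cost_per_student_year, cumulative_attrition

cost, rate = attrition_cost(1000, [0.75, 0.85, 0.90], 12_000)
print(f"Attrition cost: ${cost:,.0f}; cumulative attrition rate: {rate:.1%}")
```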

  9. 10 March 2008 - Swedish Minister for Higher Education and Research L. Leijonborg signing the guest book with CERN Chief Scientific Officer J. Engelen, followed by the signature of the Swedish Computing Memorandum of Understanding by the Director General of the Swedish Research Council P. Ömling.

    CERN Multimedia

    Maximilien Brice

    2008-01-01

    10 March 2008 - Swedish Minister for Higher Education and Research L. Leijonborg signing the guest book with CERN Chief Scientific Officer J. Engelen, followed by the signature of the Swedish Computing Memorandum of Understanding by the Director General of the Swedish Research Council P. Ömling.

  10. 49 CFR 19.27 - Allowable costs.

    Science.gov (United States)

    2010-10-01

    ... applicable to the entity incurring the costs. Thus, allowability of costs incurred by State, local or... Circular A-87, “Cost Principles for State and Local Governments.” The allowability of costs incurred by non... Principles for Non-Profit Organizations.” The allowability of costs incurred by institutions of higher...

  11. 29 CFR 95.27 - Allowable costs.

    Science.gov (United States)

    2010-07-01

    ... cost principles applicable to the entity incurring the costs. Thus, allowability of costs incurred by... Governments.” The allowability of costs incurred by non-profit organizations is determined in accordance with... Organizations.” The allowability of costs incurred by institutions of higher education is determined in...

  12. 24 CFR 84.27 - Allowable costs.

    Science.gov (United States)

    2010-04-01

    ... to the entity incurring the costs. Thus, allowability of costs incurred by State, local or federally..., “Cost Principles for State and Local Governments.” The allowability of costs incurred by non-profit...-Profit Organizations.” The allowability of costs incurred by institutions of higher education is...

  13. 7 CFR 550.25 - Allowable costs.

    Science.gov (United States)

    2010-01-01

    ... cost principles applicable to the entity incurring the costs. Thus, allowability of costs incurred by... at 2 CFR part 225. The allowability of costs incurred by non-profit organizations is determined in... at 2 CFR part 230. The allowability of costs incurred by institutions of higher education is...

  14. 36 CFR 1210.27 - Allowable costs.

    Science.gov (United States)

    2010-07-01

    ... applicable to the entity incurring the costs. Thus, allowability of costs incurred by State, local or... Circular A-87, “Cost Principles for State and Local Governments.” The allowability of costs incurred by non... Principles for Non-Profit Organizations.” The allowability of costs incurred by institutions of higher...

  15. 7 CFR 3019.27 - Allowable costs.

    Science.gov (United States)

    2010-01-01

    ... applicable to the entity incurring the costs. Thus, allowability of costs incurred by State, local or... Circular A-87, “Cost Principles for State and Local Governments.” The allowability of costs incurred by non... Principles for Non-Profit Organizations.” The allowability of costs incurred by institutions of higher...

  16. 34 CFR 74.27 - Allowable costs.

    Science.gov (United States)

    2010-07-01

    ... Procedures or uniform cost accounting standards that comply with cost principles acceptable to ED. (b) The... OF HIGHER EDUCATION, HOSPITALS, AND OTHER NON-PROFIT ORGANIZATIONS Post-Award Requirements Financial... principles for determining allowable costs. Allowability of costs are determined in accordance with the cost...

  17. Teacher Costs

    OpenAIRE

    Dinis Mota da Costa, Patricia; de Sousa Lobo Borges de Araujo, Luisa

    2015-01-01

    The purpose of this technical brief is to assess current methodologies for the collection and calculation of teacher costs in European Union (EU) Member States in view of improving data series and indicators related to teacher salaries and teacher costs. To this end, CRELL compares the Eurydice collection on teacher salaries with the similar Organisation for Economic Co-operation and Development (OECD) data collection and calculates teacher costs based on the methodology established by Statis...

  18. Execution spaces for simple higher dimensional automata

    DEFF Research Database (Denmark)

    Raussen, Martin

    2012-01-01

    Higher dimensional automata (HDA) are highly expressive models for concurrency in Computer Science, cf. van Glabbeek (Theor Comput Sci 368(1–2):168–194, 2006). For a topologist, they are attractive since they can be modeled as cubical complexes—with an inbuilt restriction for directions of allowable…

  19. Batteries: Lower cost than gasoline?

    International Nuclear Information System (INIS)

    Werber, Mathew; Fischer, Michael; Schwartz, Peter V.

    2009-01-01

    We compare the lifecycle costs of an electric car and a similar gasoline-powered vehicle under different scenarios of required driving range and cost of gasoline. The electric car is cost-competitive in a significant portion of the scenarios: for cars of lower range and for higher gasoline prices. Electric cars with ∼150 km range are a technologically viable, cost-competitive, high-performance, high-efficiency alternative that can presently suit the vast majority of consumers' needs.
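
    To make the comparison concrete, here is a hedged sketch of a lifecycle cost calculation of the kind the paper performs; the purchase prices, per-kilometre energy costs, annual distance, horizon, and discount rate below are all invented for illustration, not the authors' inputs.

```python
# Toy lifecycle-cost comparison; every number is an assumed placeholder.
def lifecycle_cost(purchase_price, energy_cost_per_km, km_per_year,
                   years, discount_rate=0.05):
    cost = purchase_price
    for year in range(1, years + 1):
        cost += energy_cost_per_km * km_per_year / (1 + discount_rate) ** year
    return cost

ev = lifecycle_cost(35_000, energy_cost_per_km=0.03, km_per_year=15_000, years=10)
gas = lifecycle_cost(25_000, energy_cost_per_km=0.10, km_per_year=15_000, years=10)
print(f"EV: ${ev:,.0f} vs gasoline: ${gas:,.0f}")  # higher fuel prices favour the EV
```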

  20. The provider cost of treating tuberculosis in Bauchi State, Nigeria

    Directory of Open Access Journals (Sweden)

    Nisser Ali Umar

    2011-09-01

    The study aimed to assess the economic cost shouldered by the government, as provider, of free tuberculosis (TB) diagnosis and treatment services in Bauchi State, northern Nigeria. A cost-analysis study was designed, and questionnaires were administered by the principal investigators to the officers in charge of 27 randomly sampled government TB service providers across Bauchi State: 17 primary care centres, 9 secondary care providers, and one tertiary care provider. Data were also collected from personnel and project records in the State Ministry of Health, the Ministry of Works, and the Ministry of Budget and Planning. The costs of buildings, staff and equipment replacement, laboratory, radiology, and drugs in the facilities were assessed, and the costs attributable to tuberculosis inpatient, outpatient, and directly observed therapy (DOT) services were estimated from the total cost based on the proportion of TB cases in the total patient pool accessing those services. The average proportion of TB patients in the facilities was 3.4% overall, 3.3% among inpatients, and 3.1% in the outpatient population. The average cost of treating a patient with TB was estimated at US $227.14; inpatient care averaged $16.95 per patient, DOT and outpatient services $133.34 per patient, and overhead $30.89 per patient. The overall cost and all computed cost elements except DOT services were highest in the tertiary centre and lowest in the infectious diseases hospital. This is partly due to higher administrative and other recurrent overhead spending in the tertiary health facility, while the lower overhead cost observed in the infectious diseases hospital may reflect economies of scale, given the relatively higher number of TB cases seen in that facility with roughly the same level of resources as other facilities in the state.
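
    The apportionment logic in the study is simple enough to state in code: facility costs are attributed to TB services according to TB's share of the patient pool. The sketch below reuses the shares reported in the abstract, but the facility cost figure is an invented placeholder.

```python
# Proportional apportionment as described in the abstract; the facility
# cost below is a made-up placeholder, the shares are the reported averages.
def tb_attributable_cost(total_service_cost, tb_share):
    return total_service_cost * tb_share

print(tb_attributable_cost(1_000_000, 0.033))   # inpatient share, 3.3%
print(tb_attributable_cost(1_000_000, 0.031))   # outpatient share, 3.1%
```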

  1. Financial Resource Allocation in Higher Education

    Science.gov (United States)

    Ušpuriene, Ana; Sakalauskas, Leonidas; Dumskis, Valerijonas

    2017-01-01

    The paper considers a problem of financial resource allocation in a higher education institution. It describes the basic financial-management instruments and a multi-stage cost-minimization model in which financial instruments enter the constraints. Both societal and institutional factors that determine the costs of educating students are…
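
    For readers who want a feel for such a model, here is a deliberately simplified single-stage stand-in: a linear program that minimizes spending subject to a budget cap and minimum service levels. The activities, unit costs, and bounds are assumptions; the paper's model is multi-stage and richer.

```python
# Toy single-stage cost-minimization allocation (the paper's model is
# multi-stage); activities, unit costs, and minimum levels are invented.
from scipy.optimize import linprog

unit_cost = [1.0, 1.2, 0.8]        # cost per unit of teaching, facilities, support
minimums = [400, 150, 100]         # minimum service level for each activity
budget = 900

res = linprog(c=unit_cost,
              A_ub=[unit_cost], b_ub=[budget],     # stay within budget
              bounds=[(m, None) for m in minimums])
print(res.x, res.fun)   # optimal allocation and its total cost
```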

  2. Rehabilitation costs

    Energy Technology Data Exchange (ETDEWEB)

    Kubo, Arthur S [BDM Corp., VA (United States); [Bikini Atoll Rehabilitation Committee, Berkeley, CA (United States)

    1986-07-01

    The costs of radioactivity contamination control and other matters relating to the resettlement of Bikini Atoll were reviewed for the Bikini Atoll Rehabilitation Committee by a panel of engineers that met in Berkeley, California, on January 22-24, 1986. This Appendix presents the cost estimates.

  3. Rehabilitation costs

    International Nuclear Information System (INIS)

    Kubo, Arthur S.

    1986-01-01

    The costs of radioactivity contamination control and other matters relating to the resettlement of Bikini Atoll were reviewed for the Bikini Atoll Rehabilitation Committee by a panel of engineers that met in Berkeley, California, on January 22-24, 1986. This Appendix presents the cost estimates.

  4. Cost considerations

    NARCIS (Netherlands)

    Michiel Ras; Debbie Verbeek-Oudijk; Evelien Eggink

    2013-01-01

    Original title: Lasten onder de loep ("Costs under the magnifying glass"). The Dutch government spends almost 7 billion euros each year on care for people with intellectual disabilities, and these costs are rising steadily. This report analyses what underlies the increase in costs that occurred between 2007 and 2011. Was…

  5. Secure cloud computing

    CERN Document Server

    Jajodia, Sushil; Samarati, Pierangela; Singhal, Anoop; Swarup, Vipin; Wang, Cliff

    2014-01-01

    This book presents a range of cloud computing security challenges and promising solution paths. The first two chapters focus on practical considerations of cloud computing. In Chapter 1, Chandramouli, Iorga, and Chokani describe the evolution of cloud computing and the current state of practice, followed by the challenges of cryptographic key management in the cloud. In Chapter 2, Chen and Sion present a dollar cost model of cloud computing and explore the economic viability of cloud computing with and without security mechanisms involving cryptographic mechanisms. The next two chapters address…

  6. Troubleshooting Costs

    Science.gov (United States)

    Kornacki, Jeffrey L.

    Seventy-six million cases of foodborne disease occur each year in the United States alone. Medical and lost-productivity costs of the most common pathogens are estimated to be $5.6-9.4 billion. Product recalls, whether from foodborne illness or spoilage, result in added costs to manufacturers in a variety of ways. These may include expenses associated with lawsuits from real or allegedly stricken individuals and lawsuits from shorted customers. Other costs include those associated with finding the source of the contamination and eliminating it, including time when lines are shut down and therefore non-productive, additional non-routine testing, consultant fees, the time and personnel required to overhaul the entire food safety system, lost market share to competitors, and the cost associated with redesign of the factory and redesign or acquisition of more hygienic equipment. The cost associated with an effective quality assurance plan is well worth the effort to prevent the situations described.

  7. Cost comparisons

    CERN Multimedia

    CERN Bulletin

    2010-01-01

    How much does the LHC cost? And how much does this represent in other currencies? Below we present a table showing some comparisons with the cost of other projects. Looking at the figures, you will see that the cost of the LHC can be likened to that of three skyscrapers, or two seasons of Formula 1 racing! One year's budget of a single large F1 team is comparable to the entire materials cost of the ATLAS or CMS experiments. Please note that all the figures are rounded for ease of reading.

                                         CHF           €             $
    LHC                                  4.6 billion   3 billion     4 billion
    Space Shuttle Endeavour (NASA)       1.9 billion   1.3 billion   1.7 billion
    Hubble Space Telescope (cost at launch – NASA/…)

  8. Benefit-cost assessment programs: Costa Rica case study

    International Nuclear Information System (INIS)

    Clark, A.L.; Trocki, L.K.

    1991-01-01

    An assessment of mineral potential, in terms of types and numbers of deposits, approximate location, and associated tonnage and grades, is a valuable input to a nation's economic planning and mineral policy development. This study provides a methodology for applying benefit-cost analysis to mineral resource assessment programs, both to determine the cost-effectiveness of resource assessments and to ascertain future benefits to the nation. In a case study of Costa Rica, the benefit-cost ratio of a resource assessment program was computed to be at least 4:1 ($10.6 million to $2.5 million), not including the economic benefits accruing from the creation of 800 mining-sector and 1,200 support-services jobs. The benefit-cost ratio would be considerably higher if presently proposed revisions of mineral policy were implemented and benefits could be defined for Costa Rica.

  9. [Operating cost analysis of anaesthesia: activity based costing (ABC analysis)].

    Science.gov (United States)

    Majstorović, Branislava M; Kastratović, Dragana A; Vučović, Dragan S; Milaković, Branko D; Miličić, Biljana R

    2011-01-01

    Costs of anaesthesiology represent defined measures for building a precise profile of the expenditure of surgical treatment, which is important for planning healthcare activities, prices, and budgets. In order to determine the actual value of anaesthesiological services, we carried out an activity-based costing (ABC) analysis. Retrospectively, for 2005 and 2006, we estimated the direct costs of anaesthesiological services (salaries, drugs, supply materials, and other costs such as analyses and equipment) at the Institute of Anaesthesia and Resuscitation of the Clinical Centre of Serbia. The group included all anaesthetized patients of both sexes and all ages. We compared direct costs with the direct expenditure for each cost object (service or unit) of the Republican Healthcare Insurance. Summary data from the Departments of Anaesthesia were documented in the database of the Clinical Centre of Serbia, and the numerical data were analyzed with Microsoft Office Excel 2003 and SPSS for Windows. Using a linear model, we compared direct costs with the unit costs of anaesthesiological services from the cost list of the Republican Healthcare Insurance. Of the direct costs, 40% were spent on salaries, 32% on drugs and supplies, and 28% on other costs such as analyses and equipment. The direct costs of anaesthesiological services showed a linear correlation with the unit costs of the Republican Healthcare Insurance. During surgery, the costs of anaesthesia increase the cost of a patient's surgical treatment by about 10%. Given the actual costs of drugs and supplies, we do not see any possibility of cost reduction; the fixed elements of direct costs provide the possibility of rationalizing resource use in anaesthesia.
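
    The reported cost structure can be reproduced with a one-line allocation. The sketch below uses the shares from the abstract (40% salaries, 32% drugs and supplies, 28% other); the total direct cost is a placeholder, since the abstract does not give one.

```python
# Activity-based split of direct anaesthesia costs using the reported shares;
# the total is an invented placeholder.
def abc_breakdown(total_direct_cost, shares):
    return {activity: total_direct_cost * share
            for activity, share in shares.items()}

print(abc_breakdown(1_000_000,
                    {"salaries": 0.40, "drugs_supplies": 0.32, "other": 0.28}))
```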

  10. Total life-cycle cost analysis of conventional and alternative fueled vehicles

    International Nuclear Information System (INIS)

    Cardullo, M.W.

    1993-01-01

    Total Life-Cycle Cost (TLCC) analysis can indicate whether paying higher capital costs for advanced technology with low operating and/or environmental costs is advantageous over paying lower capital costs for conventional technology with higher operating and/or environmental costs. While minimizing total life-cycle cost is an important consideration, the consumer often identifies non-cost-related benefits or drawbacks that make more expensive options appear more attractive. The consumer is also likely to weigh initial capital costs heavily while giving limited consideration to operating and/or societal costs, whereas policy-makers considering external costs, such as those resulting from environmental impacts, may reach significantly different conclusions about which technologies are most advantageous to society. This paper summarizes a TLCC model that was developed to facilitate consideration of the various factors involved in both individual and societal policy decision-making. The model was developed as part of a US Department of Energy contract and has been revised to make it more realistic. It considers capital, operating, salvage, and environmental costs for cars, vans, and buses using conventional and alternative fuels. The model runs on an IBM or compatible personal computer using the commercial spreadsheet program Microsoft Excel® Version 4 for Windows®, and can easily be kept current because its modular structure allows straightforward access to the embedded data sets for review and update.
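
    A minimal version of such a TLCC calculation, assuming simple annual discounting of operating and environmental costs and a discounted salvage credit, is sketched below; the cost categories follow the text, but every figure and the discount rate are invented.

```python
# Sketch of a total life-cycle cost (TLCC) calculation; categories follow
# the text, all numbers and the discount rate are illustrative assumptions.
def tlcc(capital, annual_operating, annual_environmental, salvage,
         years, discount_rate=0.06):
    npv = capital
    for t in range(1, years + 1):
        npv += (annual_operating + annual_environmental) / (1 + discount_rate) ** t
    return npv - salvage / (1 + discount_rate) ** years

conventional = tlcc(22_000, 2_500, 400, salvage=2_000, years=12)
alternative = tlcc(28_000, 1_600, 150, salvage=2_500, years=12)
print(f"Conventional: ${conventional:,.0f}  Alternative: ${alternative:,.0f}")
```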

  11. Tracking and computing

    International Nuclear Information System (INIS)

    Niederer, J.

    1983-01-01

    This note outlines several ways in which large-scale simulation computing and programming support may be provided to the SSC design community. One aspect of the problem is obtaining supercomputer power without the high cost and long lead times of large-scale institutional computing. Another is blending modern programming practices with more conventional accelerator-design programs in ways that do not swamp designers with the details of complicated computer technology.

  12. Low-cost positron computed tomography

    International Nuclear Information System (INIS)

    Ott, R.J.; Batty, V.; Bateman, T.E.; Clack, R.; Flower, M.A.; Leach, M.O.; Marsden, P.; Webb, S.; McCready, V.R.

    1986-01-01

    After briefly describing the technique of positron emission tomography (PET) and the types of detectors used, the paper discusses operational experience with a recently developed multi-wire proportional chamber positron camera, which can provide images using radionuclides such as 68Ga, 124I, 82Rb, 55Co, 18F and 11C. Clinical applications included PET imaging of the thyroid and the brain; possible future applications include PET imaging of the liver and tumour localization using antigen-specific monoclonal antibodies. Future developments to improve the sensitivity and spatial resolution of the detectors used in PET are discussed. (U.K.)

  13. Essential numerical computer methods

    CERN Document Server

    Johnson, Michael L

    2010-01-01

    The use of computers and computational methods has become ubiquitous in biological and biomedical research. During the last two decades most basic algorithms have not changed; what has changed is the huge increase in computer speed and ease of use, along with the corresponding orders-of-magnitude decrease in cost. A general perception exists that the only applications of computers and computational methods in biological and biomedical research are basic statistical analysis or the searching of DNA sequence databases. While these are important applications, they only scratch the surface of the current and potential applications of computers and computational methods in biomedical research. The various chapters within this volume include a wide variety of applications that extend far beyond this limited perception. As part of the Reliable Lab Solutions series, Essential Numerical Computer Methods brings together chapters from volumes 210, 240, 321, 383, 384, 454, and 467 of Methods in Enzymology. These chapters provide ...

  14. Operating dedicated data centers – is it cost-effective?

    International Nuclear Information System (INIS)

    Ernst, M; Hogue, R; Hollowell, C; Strecker-Kellog, W; Wong, A; Zaytsev, A

    2014-01-01

    The advent of cloud computing centres such as Amazon's EC2 and Google's Computing Engine has elicited comparisons with dedicated computing clusters. Discussions on appropriate usage of cloud resources (both academic and commercial) and costs have ensued. This presentation discusses a detailed analysis of the costs of operating and maintaining the RACF (RHIC and ATLAS Computing Facility) compute cluster at Brookhaven National Lab and compares them with the cost of cloud computing resources under various usage scenarios. An extrapolation of likely future cost effectiveness of dedicated computing resources is also presented.

  15. Operating Dedicated Data Centers - Is It Cost-Effective?

    Science.gov (United States)

    Ernst, M.; Hogue, R.; Hollowell, C.; Strecker-Kellog, W.; Wong, A.; Zaytsev, A.

    2014-06-01

    The advent of cloud computing centres such as Amazon's EC2 and Google's Computing Engine has elicited comparisons with dedicated computing clusters. Discussions on appropriate usage of cloud resources (both academic and commercial) and costs have ensued. This presentation discusses a detailed analysis of the costs of operating and maintaining the RACF (RHIC and ATLAS Computing Facility) compute cluster at Brookhaven National Lab and compares them with the cost of cloud computing resources under various usage scenarios. An extrapolation of likely future cost effectiveness of dedicated computing resources is also presented.
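
    A back-of-envelope version of the comparison both records describe might look like the following; the capital cost, amortization period, core count, operating expense, utilization, and cloud price are all assumed placeholders, not RACF or provider figures.

```python
# Rough dedicated-vs-cloud cost comparison; every figure is an assumption.
def dedicated_cost_per_core_hour(capex, amortization_years, cores,
                                 annual_opex, utilization=0.85):
    usable_core_hours = 8760 * utilization * cores
    return (capex / amortization_years + annual_opex) / usable_core_hours

dedicated = dedicated_cost_per_core_hour(capex=2_000_000, amortization_years=4,
                                         cores=10_000, annual_opex=600_000)
cloud = 0.045   # assumed on-demand price per core-hour
print(f"Dedicated: ${dedicated:.3f}/core-hour vs cloud: ${cloud:.3f}/core-hour")
```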

  16. 20 CFR 404.270 - Cost-of-living increases.

    Science.gov (United States)

    2010-04-01

    ... INSURANCE (1950- ) Computing Primary Insurance Amounts Cost-Of-Living Increases § 404.270 Cost-of-living... rises in the cost of living. These automatic increases also apply to other benefit amounts, as described...

  17. A higher protein intake is not associated with 5-year change in mid-thigh muscle cross-sectional area by computed tomography in older adults : the health, aging, and body composition (Health ABC) study

    NARCIS (Netherlands)

    Verreijen, A.M.; Engberink, M.F.; Brouwer, I.A.; Cawthon, P.M.; Newman, A.B.; Tylavsky, F.A.; Harris, T.B.; Weijs, P.J.M.; Visser, M.

    2017-01-01

    Rationale: A higher protein intake is suggested to preserve muscle mass during aging, and may therefore reduce the risk for sarcopenia. We explored whether the amount, type (animal/vegetable), and essential amino acid (EAA) composition of protein intake were associated with 5-year change in mid-thigh muscle cross-sectional area…

  18. Higher performance and lower cost optical DPSK receiver

    Data.gov (United States)

    National Aeronautics and Space Administration — To demonstrate (benchtop experiment) a DPSK receiver with a free-space interferometer, showing that fiber-optic coupling, associated adaptive optics, and optical...

  19. CECP, Decommissioning Costs for PWR and BWR

    International Nuclear Information System (INIS)

    Bierschbach, M.C.

    1997-01-01

    1 - Description of program or function: The Cost Estimating Computer Program (CECP), designed for use on an IBM personal computer or equivalent, was developed for estimating the cost of decommissioning boiling water reactor (BWR) and pressurized water reactor (PWR) power stations to the point of license termination. 2 - Method of solution: Cost estimates include component, piping, and equipment removal costs; packaging costs; decontamination costs; transportation costs; burial volume and costs; and manpower staffing costs. Using equipment and consumables costs and inventory data supplied by the user, CECP calculates unit cost factors and then combines these factors with transportation and burial cost algorithms to produce a complete report of decommissioning costs. In addition to costs, CECP also calculates person-hours, crew-hours, and exposure person-hours associated with decommissioning. 3 - Restrictions on the complexity of the problem: The program is designed for a specific waste-charge structure; the waste-cost data structure cannot handle intermediate waste handlers or changes in the charge-rate structures. The decommissioning of a reactor can be divided into 5 periods. Up to 200 different items for special equipment costs are possible, with a maximum amount of $99,999,999 for each special equipment item. Data can be supplied for 10 buildings, with 100 components each. ESTS1071/01: 65 components across 28 systems are available for specifying contaminated-systems costs (BWR); ESTS1071/02: 75 components across 25 systems are available for specifying contaminated-systems costs (PWR).
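
    The roll-up of unit cost factors that the description mentions can be illustrated with a toy inventory; the component names and factor values below are placeholders, not CECP's data, and a single factor here stands in for the separate removal, packaging, transport, and burial costs the program tracks.

```python
# Toy roll-up of unit cost factors over a component inventory; names and
# dollar values are invented, and one factor stands in for CECP's separate
# removal, packaging, transportation, and burial costs.
def removal_cost(inventory, unit_cost_factors):
    return sum(qty * unit_cost_factors[item] for item, qty in inventory.items())

inventory = {"piping_m": 1200, "pumps": 14, "tanks": 3}
factors = {"piping_m": 85.0, "pumps": 4_200.0, "tanks": 18_000.0}
print(f"Removal cost: ${removal_cost(inventory, factors):,.0f}")
```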

  20. Costs of hospital malnutrition.

    Science.gov (United States)

    Curtis, Lori Jane; Bernier, Paule; Jeejeebhoy, Khursheed; Allard, Johane; Duerksen, Donald; Gramlich, Leah; Laporte, Manon; Keller, Heather H

    2017-10-01

    Hospital malnutrition has been established as a critical, prevalent, and costly problem in many countries. Many cost studies are limited due to study population or cost data used. The aims of this study were to determine: the relationship between malnutrition and hospital costs; the influence of confounders on, and the drivers (medical or surgical patients or degree of malnutrition) of the relationship; and whether hospital reported cost data provide similar information to administrative data. To our knowledge, the last two goals have not been studied elsewhere. Univariate and multivariate analyses were performed on data from the Canadian Malnutrition Task Force prospective cohort study combined with administrative data from the Canadian Institute for Health Information. Subjective Global Assessment was used to assess the relationship between nutritional status and length of stay and hospital costs, controlling for health and demographic characteristics, for 956 patients admitted to medical and surgical wards in 18 hospitals across Canada. After controlling for patient and hospital characteristics, moderately malnourished patients' (34% of surveyed patients) hospital stays were 18% (p = 0.014) longer on average than well-nourished patients. Medical stays increased by 23% (p = 0.014), and surgical stays by 32% (p = 0.015). Costs were, on average, between 31% and 34% (p-values < 0.05) higher than for well-nourished patients with similar characteristics. Severely malnourished patients (11% of surveyed patients) stayed 34% (p = 0.000) longer and had 38% (p = 0.003) higher total costs than well-nourished patients. They stayed 53% (p = 0.001) longer in medical beds and had 55% (p = 0.003) higher medical costs, on average. Trends were similar no matter the type of costing data used. Over 40% of patients were found to be malnourished (1/3 moderately and 1/10 severely). Malnourished patients had longer hospital stays and as a result cost more than well