WorldWideScience

Sample records for current computational resources

  1. Current Resource Imagery Projects

    Data.gov (United States)

    Farm Service Agency, Department of Agriculture — Map showing coverage of current Resource imagery projects. High resolution/large scale Resource imagery is typically acquired for the U.S. Forest Service and other...

  2. Aggregated Computational Toxicology Resource (ACTOR)

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Aggregated Computational Toxicology Resource (ACTOR) is a database on environmental chemicals that is searchable by chemical name and other identifiers, and by...

  3. Aggregated Computational Toxicology Online Resource

    Data.gov (United States)

U.S. Environmental Protection Agency — The Aggregated Computational Toxicology Online Resource (ACToR) is EPA's online aggregator of all the public sources of chemical toxicity data. ACToR aggregates data...

  4. Framework Resources Multiply Computing Power

    Science.gov (United States)

    2010-01-01

As an early proponent of grid computing, Ames Research Center awarded Small Business Innovation Research (SBIR) funding to 3DGeo Development Inc., of Santa Clara, California (now FusionGeo Inc., of The Woodlands, Texas), to demonstrate a virtual computer environment that linked geographically dispersed computer systems over the Internet to help solve large computational problems. By adding to an existing product, FusionGeo enabled access to resources for calculation- or data-intensive applications whenever and wherever they were needed. Commercially available as Accelerated Imaging and Modeling, the product is used by oil companies and seismic service companies, which require large processing and data storage capacities.

  5. Computer Resources | College of Engineering & Applied Science

    Science.gov (United States)

  6. Turning Video Resource Management into Cloud Computing

    Directory of Open Access Journals (Sweden)

    Weili Kou

    2016-07-01

Big data makes cloud computing more and more popular in various fields. Video resources are very useful and important in education, security monitoring, and so on. However, their huge volume, complex data types, inefficient processing performance, weak security, and long loading times pose challenges for video resource management. The Hadoop Distributed File System (HDFS) is an open-source framework that can provide cloud-based platforms and presents an opportunity for solving these problems. This paper presents a video resource management architecture based on HDFS to provide a uniform framework and a five-layer model for standardizing the current various algorithms and applications. The architecture, basic model, and key algorithms are designed for moving video resources into a cloud computing environment. The design was tested by establishing a simulation system prototype.
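
    As a hedged illustration of the storage layer such an architecture rests on, the sketch below writes a video file into HDFS and streams it back through the pyarrow bindings. This is not the paper's code: the namenode host, port, and paths are hypothetical, and a reachable HDFS cluster with the libhdfs native library is assumed.

```python
# Illustrative only: hypothetical namenode host/port and paths;
# assumes an HDFS cluster and the libhdfs native library are available.
from pyarrow import fs

hdfs = fs.HadoopFileSystem("namenode-host", port=8020)

# Store a local video file in HDFS.
with open("lecture01.mp4", "rb") as local, \
        hdfs.open_output_stream("/videos/lecture01.mp4") as remote:
    remote.write(local.read())

# Stream it back in chunks, as a video-serving layer on top of HDFS might.
with hdfs.open_input_stream("/videos/lecture01.mp4") as remote:
    first_chunk = remote.read(4 * 1024 * 1024)  # read the first 4 MiB
```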

  7. LHCb Computing Resources: 2019 requests and reassessment of 2018 requests

    CERN Document Server

    Bozzi, Concezio

    2017-01-01

This document presents the computing resources needed by LHCb in 2019 and a reassessment of the 2018 requests, resulting from the current experience of Run 2 data taking and minor changes in the LHCb computing model parameters.

  8. SOCR: Statistics Online Computational Resource

    Directory of Open Access Journals (Sweden)

    Ivo D. Dinov

    2006-10-01

The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result, a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for: interactive distribution modeling, virtual online probability experimentation, statistical data analysis, visualization and integration. Following years of experience in statistical teaching at all college levels using established licensed statistical software packages, like STATA, S-PLUS, R, SPSS, SAS, Systat, etc., we have attempted to engineer a new statistics education environment, the Statistics Online Computational Resource (SOCR). This resource performs many of the standard types of statistical analysis, much like other classical tools. In addition, it is designed in a plug-in object-oriented architecture and is completely platform independent, web-based, interactive, extensible and secure. Over the past 4 years we have tested, fine-tuned and reanalyzed the SOCR framework in many of our undergraduate and graduate probability and statistics courses and have evidence that SOCR resources build students' intuition and enhance their learning.

  9. Overview of water resource assessment in South Africa: Current ...

    African Journals Online (AJOL)

    Overview of water resource assessment in South Africa: Current state and future challenges. ... These studies illustrate how the exponential growth in computer power and the concomitant development of highly sophisticated tools have changed the manner in which our water resources have been appraised, allowing us to ...

  10. Resource Management in Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Andrei IONESCU

    2015-01-01

Mobile cloud computing is a major research topic in Information Technology & Communications. It integrates cloud computing, mobile computing and wireless networks. While mainly built on cloud computing, it has to operate using more heterogeneous resources, with implications for how these resources are managed and used. Managing the resources of a mobile cloud is not a trivial task, involving vastly different architectures, and is beyond the scope of manual handling by human users. Using these resources from applications at both the platform and software tiers comes with its own challenges. This paper presents different approaches in use for managing cloud resources at the infrastructure and platform levels.

  11. Statistics Online Computational Resource for Education

    Science.gov (United States)

    Dinov, Ivo D.; Christou, Nicolas

    2009-01-01

    The Statistics Online Computational Resource (http://www.SOCR.ucla.edu) provides one of the largest collections of free Internet-based resources for probability and statistics education. SOCR develops, validates and disseminates two core types of materials--instructional resources and computational libraries. (Contains 2 figures.)

  12. Resource allocation in grid computing

    NARCIS (Netherlands)

    Koole, Ger; Righter, Rhonda

    2007-01-01

Grid computing, in which a network of computers is integrated to create a very fast virtual computer, is becoming ever more prevalent. Examples include the TeraGrid and Planet-lab.org, as well as applications on the existing Internet that take advantage of unused computing and storage capacity of...

  13. Enabling opportunistic resources for CMS Computing Operations

    Energy Technology Data Exchange (ETDEWEB)

    Hufnagel, Dick [Fermilab

    2015-11-19

With the increased pressure on computing brought by the higher energy and luminosity from the LHC in Run 2, CMS Computing Operations expects to require the ability to utilize “opportunistic” resources — resources not owned by, or a priori configured for, CMS — to meet peak demands. In addition to our dedicated resources we look to add computing resources from non-CMS grids, cloud resources, and national supercomputing centers. CMS uses the HTCondor/glideinWMS job submission infrastructure for all its batch processing, so such resources will need to be transparently integrated into its glideinWMS pool. Bosco and Parrot wrappers are used to enable access and bring the CMS environment into these non-CMS resources. Here we describe our strategy to supplement our native capabilities with opportunistic resources and our experience so far using them.
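
    The glidein-style "late binding" mentioned above can be illustrated with a toy sketch: a pilot process claims an opportunistic slot first and only then pulls real payloads from a central queue. This is a self-contained illustration of the pattern, not glideinWMS code; the queue contents are invented.

```python
# Toy illustration of glidein-style late binding (not glideinWMS itself).
import queue
import subprocess

central_queue: queue.Queue = queue.Queue()
central_queue.put(["echo", "process event batch 1"])   # hypothetical payloads
central_queue.put(["echo", "process event batch 2"])

def pilot() -> None:
    """Runs on the remote resource: claims the slot, then fetches work."""
    while True:
        try:
            payload = central_queue.get_nowait()
        except queue.Empty:
            return                 # no more work: pilot exits, freeing the slot
        subprocess.run(payload, check=True)

pilot()
```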

  14. ACToR - Aggregated Computational Toxicology Resource

    International Nuclear Information System (INIS)

    Judson, Richard; Richard, Ann; Dix, David; Houck, Keith; Elloumi, Fathi; Martin, Matthew; Cathey, Tommy; Transue, Thomas R.; Spencer, Richard; Wolf, Maritja

    2008-01-01

ACToR (Aggregated Computational Toxicology Resource) is a database and set of software applications that bring into one central location many types and sources of data on environmental chemicals. Currently, the ACToR chemical database contains information on chemical structure, in vitro bioassays and in vivo toxicology assays derived from more than 150 sources including the U.S. Environmental Protection Agency (EPA), Centers for Disease Control (CDC), U.S. Food and Drug Administration (FDA), National Institutes of Health (NIH), state agencies, corresponding government agencies in Canada, Europe and Japan, universities, the World Health Organization (WHO) and non-governmental organizations (NGOs). At the EPA National Center for Computational Toxicology, ACToR helps manage large data sets being used in a high-throughput environmental chemical screening and prioritization program called ToxCast™.

  15. Efficient Resource Management in Cloud Computing

    OpenAIRE

    Rushikesh Shingade; Amit Patil; Shivam Suryawanshi; M. Venkatesan

    2015-01-01

Cloud computing is one of the most widely used technologies for providing services to users, who are charged for the services they receive. With the large number of resources involved, evaluating and efficiently optimizing the performance of cloud resource management policies is difficult. Different simulation toolkits are available for simulating and modelling the cloud computing environment, such as GridSim, CloudAnalyst, CloudSim, GreenCloud, CloudAuction, etc. In the proposed Efficient Resource Manage...

  16. Exploitation of heterogeneous resources for ATLAS Computing

    CERN Document Server

    Chudoba, Jiri; The ATLAS collaboration

    2018-01-01

LHC experiments require significant computational resources for Monte Carlo simulations and real data processing, and the ATLAS experiment is no exception. In 2017, ATLAS steadily used almost 3M HS06 units, which corresponds to about 300 000 standard CPU cores. The total disk and tape capacity managed by the Rucio data management system exceeded 350 PB. Resources are provided mostly by Grid computing centers distributed in geographically separated locations and connected by the Grid middleware. The ATLAS collaboration developed several systems to manage computational jobs, data files and network transfers. ATLAS solutions for job and data management (PanDA and Rucio) were generalized and are now used also by other collaborations. More components are needed to include new resources such as private and public clouds, volunteers' desktop computers and, primarily, supercomputers in major HPC centers. Workflows and data flows differ significantly for these less traditional resources and extensive software re...

  17. SOCR: Statistics Online Computational Resource

    OpenAIRE

    Dinov, Ivo D.

    2006-01-01

    The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for: interactive distribution modeling, virtual online probability experimentation, statistical data analysis...

  18. Framework of Resource Management for Intercloud Computing

    Directory of Open Access Journals (Sweden)

    Mohammad Aazam

    2014-01-01

There has been a very rapid increase in digital media content, due to which the media cloud is gaining importance. The cloud computing paradigm provides management of resources and helps create an extended portfolio of services. Through cloud computing, not only are services managed more efficiently, but service discovery is also made possible. To handle the rapid increase in content, the media cloud plays a very vital role. But it is not possible for standalone clouds to handle everything with the increasing user demands. For scalability and better service provisioning, clouds at times have to communicate with other clouds and share their resources. This scenario is called Intercloud computing or cloud federation. The study of Intercloud computing is still in its infancy. Resource management is one of the key concerns to be addressed in Intercloud computing. Existing studies discuss this issue only in a trivial and simplistic way. In this study, we present a resource management model, keeping in view different types of services, different customer types, customer characteristics, pricing, and refunding. The presented framework was implemented using Java and NetBeans 8.0 and evaluated using the CloudSim 3.0.3 toolkit. The presented results and their discussion validate our model and its efficiency.

  19. Current and future resources for functional metagenomics

    Directory of Open Access Journals (Sweden)

    Kathy Nguyen Lam

    2015-10-01

Functional metagenomics is a powerful experimental approach for studying gene function, starting from the extracted DNA of mixed microbial populations. A functional approach relies on the construction and screening of metagenomic libraries – physical libraries that contain DNA cloned from environmental metagenomes. The information obtained from functional metagenomics can help in future annotations of gene function and serve as a complement to sequence-based metagenomics. In this Perspective, we begin by summarizing the technical challenges of constructing metagenomic libraries and emphasize their value as resources. We then discuss libraries constructed using the popular cloning vector, pCC1FOS, and highlight the strengths and shortcomings of this system, alongside possible strategies to maximize existing pCC1FOS-based libraries by screening in diverse hosts. Finally, we discuss the known bias of libraries constructed from human gut and marine water samples, present results that suggest bias may also occur for soil libraries, and consider factors that bias metagenomic libraries in general. We anticipate that discussion of current resources and limitations will advance tools and technologies for functional metagenomics research.

  20. Integration of cloud resources in the LHCb distributed computing

    International Nuclear Information System (INIS)

    García, Mario Úbeda; Stagni, Federico; Cabarrou, Baptiste; Rauschmayr, Nathalie; Charpentier, Philippe; Closier, Joel; Muñoz, Víctor Méndez

    2014-01-01

This contribution describes how Cloud resources have been integrated in the LHCb Distributed Computing. LHCb is using its specific Dirac extension (LHCbDirac) as an interware for its Distributed Computing. So far, it was seamlessly integrating Grid resources and Computer clusters. The cloud extension of DIRAC (VMDIRAC) allows the integration of Cloud computing infrastructures. It is able to interact with multiple types of infrastructures in commercial and institutional clouds, supported by multiple interfaces (Amazon EC2, OpenNebula, OpenStack and CloudStack) – it instantiates, monitors and manages Virtual Machines running on this aggregation of Cloud resources. Moreover, specifications for institutional Cloud resources proposed by the Worldwide LHC Computing Grid (WLCG), mainly by the High Energy Physics Unix Information Exchange (HEPiX) group, have been taken into account. Several initiatives and computing resource providers in the eScience environment have already deployed IaaS in production during 2013. Keeping this in mind, the pros and cons of a cloud-based infrastructure have been studied in contrast with the current setup. As a result, this work addresses four different use cases which represent a major improvement on several levels of our infrastructure. We describe the solution implemented by LHCb for the contextualisation of the VMs based on the idea of Cloud Site. We report on operational experience of using in production several institutional Cloud resources that are thus becoming an integral part of the LHCb Distributed Computing resources. Furthermore, we also describe the gradual migration of our Service Infrastructure towards a fully distributed architecture following the Service as a Service (SaaS) model.

  21. Integration of Cloud resources in the LHCb Distributed Computing

    Science.gov (United States)

    Úbeda García, Mario; Méndez Muñoz, Víctor; Stagni, Federico; Cabarrou, Baptiste; Rauschmayr, Nathalie; Charpentier, Philippe; Closier, Joel

    2014-06-01

This contribution describes how Cloud resources have been integrated in the LHCb Distributed Computing. LHCb is using its specific Dirac extension (LHCbDirac) as an interware for its Distributed Computing. So far, it was seamlessly integrating Grid resources and Computer clusters. The cloud extension of DIRAC (VMDIRAC) allows the integration of Cloud computing infrastructures. It is able to interact with multiple types of infrastructures in commercial and institutional clouds, supported by multiple interfaces (Amazon EC2, OpenNebula, OpenStack and CloudStack) - it instantiates, monitors and manages Virtual Machines running on this aggregation of Cloud resources. Moreover, specifications for institutional Cloud resources proposed by the Worldwide LHC Computing Grid (WLCG), mainly by the High Energy Physics Unix Information Exchange (HEPiX) group, have been taken into account. Several initiatives and computing resource providers in the eScience environment have already deployed IaaS in production during 2013. Keeping this in mind, the pros and cons of a cloud-based infrastructure have been studied in contrast with the current setup. As a result, this work addresses four different use cases which represent a major improvement on several levels of our infrastructure. We describe the solution implemented by LHCb for the contextualisation of the VMs based on the idea of Cloud Site. We report on operational experience of using in production several institutional Cloud resources that are thus becoming an integral part of the LHCb Distributed Computing resources. Furthermore, we also describe the gradual migration of our Service Infrastructure towards a fully distributed architecture following the Service as a Service (SaaS) model.

  22. Optimised resource construction for verifiable quantum computation

    International Nuclear Information System (INIS)

    Kashefi, Elham; Wallden, Petros

    2017-01-01

    Recent developments have brought the possibility of achieving scalable quantum networks and quantum devices closer. From the computational point of view these emerging technologies become relevant when they are no longer classically simulatable. Hence a pressing challenge is the construction of practical methods to verify the correctness of the outcome produced by universal or non-universal quantum devices. A promising approach that has been extensively explored is the scheme of verification via encryption through blind quantum computation. We present here a new construction that simplifies the required resources for any such verifiable protocol. We obtain an overhead that is linear in the size of the input (computation), while the security parameter remains independent of the size of the computation and can be made exponentially small (with a small extra cost). Furthermore our construction is generic and could be applied to any universal or non-universal scheme with a given underlying graph. (paper)

  23. 24th Current Trends in Computational Chemistry

    Science.gov (United States)

    2017-05-17

Report on the Army Research Office Conference on Current Trends in Computational Chemistry 2016 (the 24th in the series), held November 11-12, 2016, in Jackson, MS.

  24. VECTR: Virtual Environment Computational Training Resource

    Science.gov (United States)

    Little, William L.

    2018-01-01

The Westridge Middle School Curriculum and Community Night is an annual event designed to introduce students and parents to potential employers in the Central Florida area. NASA participated in the event in 2017, and has been asked to come back for the 2018 event on January 25. We will be demonstrating our Microsoft HoloLens Virtual Rovers project, and the Virtual Environment Computational Training Resource (VECTR) virtual reality tool.

  25. LHCb Computing Resource usage in 2017

    CERN Document Server

    Bozzi, Concezio

    2018-01-01

    This document reports the usage of computing resources by the LHCb collaboration during the period January 1st – December 31st 2017. The data in the following sections have been compiled from the EGI Accounting portal: https://accounting.egi.eu. For LHCb specific information, the data is taken from the DIRAC Accounting at the LHCb DIRAC Web portal: http://lhcb-portal-dirac.cern.ch.

  26. Function Package for Computing Quantum Resource Measures

    Science.gov (United States)

    Huang, Zhiming

    2018-05-01

In this paper, we present a function package to calculate quantum resource measures and the dynamics of open systems. Our package includes common operators and operator lists, and frequently used functions for computing quantum entanglement, quantum correlation, quantum coherence, quantum Fisher information and dynamics in noisy environments. We briefly explain the functions of the package and illustrate how to use it with several typical examples. We expect this package to be a useful tool for future research and education.
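
    The abstract does not show the package's API, but two standard coherence measures that such a package would typically provide are easy to compute with numpy: the l1-norm of coherence, C_l1(rho) = sum over i != j of |rho_ij|, and the relative entropy of coherence, C_r(rho) = S(rho_diag) - S(rho). A minimal sketch (not the package's own code):

```python
# Two standard coherence measures, written independently of the package.
import numpy as np

def l1_coherence(rho: np.ndarray) -> float:
    """C_l1(rho): sum of absolute values of the off-diagonal elements."""
    return float(np.abs(rho).sum() - np.abs(np.diag(rho)).sum())

def rel_entropy_coherence(rho: np.ndarray) -> float:
    """C_r(rho) = S(rho_diag) - S(rho), with S the von Neumann entropy."""
    def S(m: np.ndarray) -> float:
        ev = np.linalg.eigvalsh(m)
        ev = ev[ev > 1e-12]                 # drop numerical zeros
        return float(-(ev * np.log2(ev)).sum())
    return S(np.diag(np.diag(rho))) - S(rho)

plus = np.array([[0.5, 0.5], [0.5, 0.5]])   # |+><+|, maximally coherent qubit
print(l1_coherence(plus), rel_entropy_coherence(plus))   # -> 1.0 1.0
```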

  27. Exploiting volatile opportunistic computing resources with Lobster

    Science.gov (United States)

    Woodard, Anna; Wolf, Matthias; Mueller, Charles; Tovar, Ben; Donnelly, Patrick; Hurtado Anampa, Kenyi; Brenner, Paul; Lannon, Kevin; Hildreth, Mike; Thain, Douglas

    2015-12-01

Analysis of high energy physics experiments using the Compact Muon Solenoid (CMS) at the Large Hadron Collider (LHC) can be limited by availability of computing resources. As a joint effort involving computer scientists and CMS physicists at Notre Dame, we have developed an opportunistic workflow management tool, Lobster, to harvest available cycles from university campus computing pools. Lobster consists of a management server, file server, and worker processes which can be submitted to any available computing resource without requiring root access. Lobster makes use of the Work Queue system to perform task management, while the CMS specific software environment is provided via CVMFS and Parrot. Data is handled via Chirp and Hadoop for local data storage and XrootD for access to the CMS wide-area data federation. An extensive set of monitoring and diagnostic tools has been developed to facilitate system optimisation. We have tested Lobster using the 20 000-core cluster at Notre Dame, achieving approximately 8-10k tasks running simultaneously, sustaining approximately 9 Gbit/s of input data and 340 Mbit/s of output data.
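
    The Work Queue pattern at the heart of Lobster can be sketched with the cctools Python bindings. This is a generic illustration rather than Lobster's code: the port and file names are hypothetical, and workers are assumed to be started separately (e.g. work_queue_worker <master-host> 9123).

```python
# Generic Work Queue master sketch; assumes the cctools bindings are installed
# and that workers join from outside. File names and port are hypothetical.
import work_queue as wq

q = wq.WorkQueue(port=9123)

t = wq.Task("./analyze.sh events.dat > out.txt")
t.specify_input_file("analyze.sh")
t.specify_input_file("events.dat")
t.specify_output_file("out.txt")
q.submit(t)

while not q.empty():
    done = q.wait(5)           # returns a finished task, or None on timeout
    if done is not None:
        print("task", done.id, "finished with exit code", done.return_status)
```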

  28. Computer modelling of superconductive fault current limiters

    Energy Technology Data Exchange (ETDEWEB)

    Weller, R.A.; Campbell, A.M.; Coombs, T.A.; Cardwell, D.A.; Storey, R.J. [Cambridge Univ. (United Kingdom). Interdisciplinary Research Centre in Superconductivity (IRC); Hancox, J. [Rolls Royce, Applied Science Division, Derby (United Kingdom)

    1998-05-01

Investigations are being carried out on the use of superconductors for fault current limiting applications. A number of computer programs are being developed to predict the behavior of different 'resistive' fault current limiter designs under a variety of fault conditions. The programs obtain solutions by iterative methods based on real measured data rather than theoretical models, in order to achieve accuracy at high current densities. (orig.) 5 refs.

  29. Parallel visualization on leadership computing resources

    Energy Technology Data Exchange (ETDEWEB)

    Peterka, T; Ross, R B [Mathematics and Computer Science Division, Argonne National Laboratory, Argonne, IL 60439 (United States); Shen, H-W [Department of Computer Science and Engineering, Ohio State University, Columbus, OH 43210 (United States); Ma, K-L [Department of Computer Science, University of California at Davis, Davis, CA 95616 (United States); Kendall, W [Department of Electrical Engineering and Computer Science, University of Tennessee at Knoxville, Knoxville, TN 37996 (United States); Yu, H, E-mail: tpeterka@mcs.anl.go [Sandia National Laboratories, California, Livermore, CA 94551 (United States)

    2009-07-01

    Changes are needed in the way that visualization is performed, if we expect the analysis of scientific data to be effective at the petascale and beyond. By using similar techniques as those used to parallelize simulations, such as parallel I/O, load balancing, and effective use of interprocess communication, the supercomputers that compute these datasets can also serve as analysis and visualization engines for them. Our team is assessing the feasibility of performing parallel scientific visualization on some of the most powerful computational resources of the U.S. Department of Energy's National Laboratories in order to pave the way for analyzing the next generation of computational results. This paper highlights some of the conclusions of that research.

  30. Parallel visualization on leadership computing resources

    International Nuclear Information System (INIS)

    Peterka, T; Ross, R B; Shen, H-W; Ma, K-L; Kendall, W; Yu, H

    2009-01-01

    Changes are needed in the way that visualization is performed, if we expect the analysis of scientific data to be effective at the petascale and beyond. By using similar techniques as those used to parallelize simulations, such as parallel I/O, load balancing, and effective use of interprocess communication, the supercomputers that compute these datasets can also serve as analysis and visualization engines for them. Our team is assessing the feasibility of performing parallel scientific visualization on some of the most powerful computational resources of the U.S. Department of Energy's National Laboratories in order to pave the way for analyzing the next generation of computational results. This paper highlights some of the conclusions of that research.

  31. Automating usability of ATLAS distributed computing resources

    International Nuclear Information System (INIS)

    Tupputi, S A; Girolamo, A Di; Kouba, T; Schovancová, J

    2014-01-01

The automation of ATLAS Distributed Computing (ADC) operations is essential to reduce manpower costs and allow performance-enhancing actions, which improve the reliability of the system. In this perspective a crucial case is the automatic handling of outages of ATLAS computing sites' storage resources, which are continuously exploited at the edge of their capabilities. It is challenging to adopt unambiguous decision criteria for storage resources of non-homogeneous types, sizes and roles. The recently developed Storage Area Automatic Blacklisting (SAAB) tool has provided a suitable solution, by employing an inference algorithm which processes the history of storage monitoring test outcomes. SAAB accomplishes both the task of providing global monitoring and that of performing automatic operations on single sites. The implementation of the SAAB tool has been the first step in a comprehensive review of the storage area monitoring and central management at all levels. This review has involved the reordering and optimization of SAM test deployment and the inclusion of SAAB results in the ATLAS Site Status Board with both dedicated metrics and views. The resulting structure allows the status of storage resources to be monitored with fine time granularity and automatic actions to be taken in foreseen cases, like automatic outage handling and notifications to sites. Hence, human actions are restricted to reporting and following up on problems, where and when needed. In this work we show SAAB's working principles and features. We also present the decrease in human interventions achieved within the ATLAS Computing Operation team. The automation results in a prompt reaction to failures, which leads to the optimization of resource exploitation.
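
    A drastically simplified stand-in for the SAAB idea is sketched below: infer a storage area's state from the recent history of monitoring-test outcomes, blacklisting and recovering automatically. The window size and thresholds are invented; the real tool's inference algorithm is more elaborate.

```python
# Simplified blacklisting-by-inference sketch (thresholds invented).
from collections import deque

WINDOW, BLACKLIST_AT, RECOVER_AT = 20, 0.5, 0.9

class StorageArea:
    def __init__(self, name: str):
        self.name = name
        self.history: deque = deque(maxlen=WINDOW)   # recent pass/fail results
        self.blacklisted = False

    def record_test(self, passed: bool) -> None:
        self.history.append(passed)
        rate = sum(self.history) / len(self.history)
        if not self.blacklisted and rate < BLACKLIST_AT:
            self.blacklisted = True       # automatic outage handling
        elif self.blacklisted and rate > RECOVER_AT:
            self.blacklisted = False      # automatic recovery

se = StorageArea("SITE_DATADISK")         # hypothetical storage area name
for ok in [True] * 5 + [False] * 8 + [True] * 30:
    se.record_test(ok)
print(se.blacklisted)                     # recovered after sustained passes
```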

  32. A multipurpose computing center with distributed resources

    Science.gov (United States)

    Chudoba, J.; Adam, M.; Adamová, D.; Kouba, T.; Mikula, A.; Říkal, V.; Švec, J.; Uhlířová, J.; Vokáč, P.; Svatoš, M.

    2017-10-01

The Computing Center of the Institute of Physics (CC IoP) of the Czech Academy of Sciences serves a broad spectrum of users with various computing needs. It runs a WLCG Tier-2 center for the ALICE and ATLAS experiments; the same group of services is used by the astroparticle physics projects the Pierre Auger Observatory (PAO) and the Cherenkov Telescope Array (CTA). The OSG stack is installed for the NOvA experiment. Other groups of users directly use the local batch system. Storage capacity is distributed over several locations. The DPM servers used by ATLAS and the PAO are all in the same server room, but several xrootd servers for the ALICE experiment are operated in the Nuclear Physics Institute in Řež, about 10 km away. The storage capacity for ATLAS and the PAO is extended by resources of CESNET, the Czech National Grid Initiative representative. Those resources are in Plzen and Jihlava, more than 100 km away from the CC IoP. Both distant sites use a hierarchical storage solution based on disks and tapes. They installed one common dCache instance, which is published in the CC IoP BDII. ATLAS users can use these resources with the standard ATLAS tools in the same way as local storage, without noticing the geographical distribution. The computing clusters LUNA and EXMAG, dedicated mostly to users from the Solid State Physics departments, offer resources for parallel computing. They are part of the Czech NGI infrastructure MetaCentrum, with a distributed batch system based on Torque with a custom scheduler. Clusters are installed remotely by the MetaCentrum team and a local contact helps only when needed. Users from the IoP have exclusive access to only a part of these two clusters and take advantage of higher priorities on the rest (1500 cores in total), which can also be used by any user of MetaCentrum. IoP researchers can also use distant resources located in several towns of the Czech Republic, with a capacity of more than 12000 cores in total.

  33. A Semi-Preemptive Computational Service System with Limited Resources and Dynamic Resource Ranking

    Directory of Open Access Journals (Sweden)

    Fang-Yie Leu

    2012-03-01

In this paper, we integrate a grid system and a wireless network to present a convenient computational service system, called the Semi-Preemptive Computational Service system (SePCS for short), which provides users with a wireless access environment and through which a user can share his/her resources with others. In the SePCS, each node is dynamically given a score based on its CPU level, available memory size, current length of waiting queue, CPU utilization and bandwidth. With the scores, resource nodes are classified into three levels. User requests, based on their time constraints, are also classified into three types. Resources of higher levels are allocated to more tightly constrained requests so as to increase the total performance of the system. To achieve this, a resource broker with the Semi-Preemptive Algorithm (SPA) is also proposed. When the resource broker cannot find suitable resources for a request of a higher type, it preempts a resource that is executing a lower-type request so that the higher-type request can be executed immediately. The SePCS can be applied to a Vehicular Ad Hoc Network (VANET), whose users can then exploit convenient mobile network services and wireless distributed computing. As a result, the performance of the system is higher than that of the tested schemes.
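
    The scoring and preemption logic described above can be sketched compactly; the weights, level thresholds, and normalization below are invented for illustration and are not the paper's actual parameters.

```python
# Invented weights/thresholds: a sketch of score-based levels plus preemption.
from dataclasses import dataclass
from typing import List, Optional

WEIGHTS = {"cpu": 0.3, "mem": 0.2, "queue": 0.2, "util": 0.15, "bw": 0.15}

@dataclass
class Node:
    cpu: float    # normalized CPU level (0..1, higher is better)
    mem: float    # normalized available memory
    queue: float  # 1 - normalized waiting-queue length
    util: float   # 1 - normalized CPU utilization
    bw: float     # normalized bandwidth
    running_type: Optional[int] = None   # 1 (tightest deadline)..3, None if idle

    def score(self) -> float:
        return sum(w * getattr(self, k) for k, w in WEIGHTS.items())

    def level(self) -> int:              # map score to one of three levels
        s = self.score()
        return 1 if s > 0.66 else (2 if s > 0.33 else 3)

def allocate(nodes: List[Node], request_type: int) -> Optional[Node]:
    """Prefer an idle node whose level matches the request; else preempt."""
    idle = [n for n in nodes
            if n.running_type is None and n.level() <= request_type]
    if idle:
        chosen = max(idle, key=Node.score)
    else:
        lower = [n for n in nodes if n.running_type is not None
                 and n.running_type > request_type]
        if not lower:
            return None                   # nothing suitable to preempt
        chosen = max(lower, key=Node.score)  # preempt a lower-type request
    chosen.running_type = request_type
    return chosen
```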

  34. NMRbox: A Resource for Biomolecular NMR Computation.

    Science.gov (United States)

    Maciejewski, Mark W; Schuyler, Adam D; Gryk, Michael R; Moraru, Ion I; Romero, Pedro R; Ulrich, Eldon L; Eghbalnia, Hamid R; Livny, Miron; Delaglio, Frank; Hoch, Jeffrey C

    2017-04-25

Advances in computation have been enabling many recent advances in biomolecular applications of NMR. Due to the wide diversity of applications of NMR, the number and variety of software packages for processing and analyzing NMR data is quite large, with labs relying on dozens, if not hundreds, of software packages. Discovery, acquisition, installation, and maintenance of all these packages is a burdensome task. Because the majority of software packages originate in academic labs, persistence of the software is compromised when developers graduate, funding ceases, or investigators turn to other projects. To simplify access to and use of biomolecular NMR software, foster persistence, and enhance reproducibility of computational workflows, we have developed NMRbox, a shared resource for NMR software and computation. NMRbox employs virtualization to provide a comprehensive software environment preconfigured with hundreds of software packages, available as a downloadable virtual machine or as a Platform-as-a-Service supported by a dedicated compute cloud. Ongoing development includes a metadata harvester to regularize, annotate, and preserve workflows and facilitate and enhance data depositions to BioMagResBank, and tools for Bayesian inference to enhance the robustness and extensibility of computational analyses. In addition to facilitating use and preservation of the rich and dynamic software environment for biomolecular NMR, NMRbox fosters the development and deployment of a new class of metasoftware packages. NMRbox is freely available to not-for-profit users.

  35. Computational chemistry reviews of current trends v.4

    CERN Document Server

    1999-01-01

This volume presents a balanced blend of methodological and applied contributions. It supplements the first three volumes of the series well, presenting results of current research in computational chemistry. It also reviews the topographical features of several molecular scalar fields. A brief discussion of topographical concepts is followed by examples of their application to several branches of chemistry. The size of the basis set applied in a calculation determines the amount of computer resources necessary for a particular task. The details of a common strategy - the ab initio model potential...
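
    As a concrete instance of the basis-set remark (a textbook scaling fact, not a claim from this volume): the cost of conventional Hartree-Fock is dominated by the two-electron integrals, which grow quartically with the number of basis functions N.

```latex
% Formal scaling of conventional Hartree-Fock with basis-set size N:
N_{\text{2e-integrals}} \sim \mathcal{O}(N^{4}),
\qquad
t_{\text{HF}} \sim \mathcal{O}(N^{4})
```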

  36. Contract on using computer resources of another

    Directory of Open Access Journals (Sweden)

    Cvetković Mihajlo

    2016-01-01

Contractual relations involving the use of another's property are quite common. Yet, the use of another's computer resources over the Internet, and the legal transactions arising thereof, certainly diverge from the traditional framework embodied in the special part of contract law dealing with this issue. Modern performance concepts (such as infrastructure, software or platform as high-tech services) are highly unlikely to be described by terminology derived from Roman law. The overwhelming novelty of high-tech services obscures the disadvantageous position of the contracting parties. In most cases, service providers are global multinational companies which tend to secure their own unjustified privileges and gain by providing lengthy and intricate contracts, often comprising a number of legal documents. General terms and conditions in these service provision contracts are further complicated by the 'service level agreement', rules of conduct and (non)confidentiality guarantees. Without giving the issue a second thought, users easily accept the pre-fabricated offer without reservations, unaware that such a pseudo-gratuitous contract actually conceals a highly lucrative and mutually binding agreement. The author examines the extent to which the legal provisions governing the sale of goods and services, lease, loan and commodatum may apply to 'cloud computing' contracts, and analyses the scope and advantages of contractual consumer protection, as a relatively new area in contract law. The termination of a service contract between the provider and the user features specific post-contractual obligations which are inherent to an online environment.

  37. Computing Bounds on Resource Levels for Flexible Plans

    Science.gov (United States)

Muscettola, Nicola; Rijsman, David

    2009-01-01

A new algorithm efficiently computes the tightest exact bound on the levels of resources induced by a flexible activity plan. Tightness of bounds is extremely important for computations involved in planning because tight bounds can save potentially exponential amounts of search (through early backtracking and detection of solutions), relative to looser bounds. The bound computed by the new algorithm, denoted the resource-level envelope, constitutes the measure of maximum and minimum consumption of resources at any time for all fixed-time schedules in the flexible plan. At each time, the envelope guarantees that there are two fixed-time instantiations: one that produces the minimum level and one that produces the maximum level. Therefore, the resource-level envelope is the tightest possible resource-level bound for a flexible plan, because any tighter bound would exclude the contribution of at least one fixed-time schedule. If the resource-level envelope can be computed efficiently, one could replace the looser bounds that are currently used in the inner cores of constraint-posting scheduling algorithms, with the potential for great improvements in performance. What is needed to reduce the cost of computation is an algorithm whose measure of complexity is no greater than a low-degree polynomial in N (where N is the number of activities). The new algorithm satisfies this need. In this algorithm, the computation of resource-level envelopes is based on a novel combination of (1) the theory of shortest paths in the temporal-constraint network for the flexible plan and (2) the theory of maximum flows for a flow network derived from the temporal and resource constraints. The measure of asymptotic complexity of the algorithm is O(N·O(maxflow(N))), where O(x) denotes an amount of computing time or a number of arithmetic operations proportional to a number of the order of x, and O(maxflow(N)) is the measure of complexity (and thus of cost) of a maximum-flow...
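
    The maximum-flow primitive that the envelope computation leans on can be exercised directly with networkx; the toy network below (structure and capacities invented) simply shows the call that the algorithm would issue on its derived flow network.

```python
# Toy max-flow computation; the network here is invented and is not
# derived from any actual plan's temporal and resource constraints.
import networkx as nx

G = nx.DiGraph()
G.add_edge("s", "a", capacity=3.0)   # e.g. producer-event side
G.add_edge("s", "b", capacity=1.0)
G.add_edge("a", "c", capacity=2.0)
G.add_edge("b", "c", capacity=2.0)
G.add_edge("c", "t", capacity=4.0)   # e.g. consumer-event side

flow_value, flow_dict = nx.maximum_flow(G, "s", "t")
print(flow_value)                    # -> 3.0
```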

  38. Current Issues for Higher Education Information Resources Management.

    Science.gov (United States)

    CAUSE/EFFECT, 1996

    1996-01-01

    Issues identified as important to the future of information resources management and use in higher education include information policy in a networked environment, distributed computing, integrating information resources and college planning, benchmarking information technology, integrated digital libraries, technology integration in teaching,…

  39. Computational approaches to analogical reasoning: current trends

    CERN Document Server

    Richard, Gilles

    2014-01-01

    Analogical reasoning is known as a powerful mode for drawing plausible conclusions and solving problems. It has been the topic of a huge number of works by philosophers, anthropologists, linguists, psychologists, and computer scientists. As such, it has been early studied in artificial intelligence, with a particular renewal of interest in the last decade. The present volume provides a structured view of current research trends on computational approaches to analogical reasoning. It starts with an overview of the field, with an extensive bibliography. The 14 collected contributions cover a large scope of issues. First, the use of analogical proportions and analogies is explained and discussed in various natural language processing problems, as well as in automated deduction. Then, different formal frameworks for handling analogies are presented, dealing with case-based reasoning, heuristic-driven theory projection, commonsense reasoning about incomplete rule bases, logical proportions induced by similarity an...

  40. Computer modelling of eddy current probes

    International Nuclear Information System (INIS)

    Sullivan, S.P.

    1992-01-01

    Computer programs have been developed for modelling impedance and transmit-receive eddy current probes in two-dimensional axis-symmetric configurations. These programs, which are based on analytic equations, simulate bobbin probes in infinitely long tubes and surface probes on plates. They calculate probe signal due to uniform variations in conductor thickness, resistivity and permeability. These signals depend on probe design and frequency. A finite element numerical program has been procured to calculate magnetic permeability in non-linear ferromagnetic materials. Permeability values from these calculations can be incorporated into the above analytic programs to predict signals from eddy current probes with permanent magnets in ferromagnetic tubes. These programs were used to test various probe designs for new testing applications. Measurements of magnetic permeability in magnetically biased ferromagnetic materials have been performed by superimposing experimental signals, from special laboratory ET probes, on impedance plane diagrams calculated using these programs. (author). 3 refs., 2 figs

  41. Mobile devices and computing cloud resources allocation for interactive applications

    Directory of Open Access Journals (Sweden)

    Krawczyk Henryk

    2017-06-01

Using mobile devices such as smartphones or iPads for various interactive applications is currently very common. In the case of complex applications, e.g. chess games, the capabilities of these devices are insufficient to run the application in real time. One of the solutions is to use cloud computing. However, there is an optimization problem in allocating mobile device and cloud resources. An iterative heuristic algorithm for application distribution is proposed. The algorithm minimizes the energy cost of application execution under an execution-time constraint.
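
    A common first-order model behind such energy-minimizing offloading decisions (a generic formulation, not necessarily the paper's) compares the local compute energy against the radio and idle energy of remote execution:

```latex
% Offload a task of C cycles and D bits when remote execution is cheaper:
% P_m = mobile compute power, s_m / s_c = mobile / cloud speed (cycles/s),
% P_tr = transmit power, R = uplink rate, P_i = idle power while waiting.
E_{\mathrm{local}} = P_{m}\,\frac{C}{s_{m}},
\qquad
E_{\mathrm{offload}} = P_{tr}\,\frac{D}{R} + P_{i}\,\frac{C}{s_{c}},
\qquad
\text{offload} \iff E_{\mathrm{offload}} < E_{\mathrm{local}}
```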

  42. NASA Center for Computational Sciences: History and Resources

    Science.gov (United States)

    2000-01-01

The NASA Center for Computational Sciences (NCCS) has been a leading capacity computing facility, providing a production environment and support resources to address the challenges facing the Earth and space sciences research community.

  43. Computational methods in calculating superconducting current problems

    Science.gov (United States)

    Brown, David John, II

Various computational problems in treating superconducting currents are examined. First, field inversion in spatial Fourier transform space is reviewed to obtain both one-dimensional transport currents flowing down a long thin tape, and a localized two-dimensional current. The problems associated with spatial high-frequency noise, created by finite resolution and experimental equipment, are presented, and resolved with a smooth Gaussian cutoff in spatial frequency space. Convergence of the Green's functions for the one-dimensional transport current densities is discussed, and particular attention is devoted to the negative effects of performing discrete Fourier transforms alone on fields asymptotically dropping like 1/r. Results of imaging simulated current densities are favorably compared to the original distributions after the resulting magnetic fields undergo the imaging procedure. The behavior of high-frequency spatial noise, and the behavior of the fields with a 1/r asymptote in the imaging procedure in our simulations is analyzed, and compared to the treatment of these phenomena in the published literature. Next, we examine calculation of Mathieu and spheroidal wave functions, solutions to the wave equation in elliptical cylindrical and oblate and prolate spheroidal coordinates, respectively. These functions are also solutions to Schrödinger's equations with certain potential wells, and are useful in solving time-varying superconducting problems. The Mathieu functions are Fourier expanded, and the spheroidal functions expanded in associated Legendre polynomials to convert the defining differential equations to recursion relations. The infinite number of linear recursion equations is converted to an infinite matrix, multiplied by a vector of expansion coefficients, thus becoming an eigenvalue problem. The eigenvalue problem is solved with root solvers, and the eigenvector problem is solved using a Jacobi-type iteration method, after preconditioning the
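
    The recursion-to-eigenvalue step can be reproduced for the even pi-periodic Mathieu functions: truncating the cosine-series recurrence a A_0 = q A_2, (a - 4) A_2 = q(2 A_0 + A_4), (a - 4k^2) A_{2k} = q(A_{2k-2} + A_{2k+2}) yields a finite matrix whose eigenvalues approximate the characteristic values. A minimal numpy sketch (truncation size chosen arbitrarily):

```python
# Truncated-matrix approximation to the Mathieu characteristic values a_{2n}
# (even pi-periodic solutions); compare with scipy.special.mathieu_a(2n, q).
import numpy as np

def mathieu_char_values(q: float, size: int = 40) -> np.ndarray:
    M = np.diag([(2.0 * k) ** 2 for k in range(size)])  # 0, 4, 16, 36, ...
    for k in range(1, size):
        M[k - 1, k] = q       # coupling to A_{2k}
        M[k, k - 1] = q       # coupling to A_{2k-2}
    M[1, 0] = 2.0 * q         # the k = 1 row couples twice as strongly to A_0
    return np.sort(np.linalg.eigvals(M).real)

print(mathieu_char_values(1.0)[:3])   # ~ [-0.4551, 4.3713, 16.0338]
```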

  44. A study of computer graphics technology in application of communication resource management

    Science.gov (United States)

    Li, Jing; Zhou, Liang; Yang, Fei

    2017-08-01

With the development of computer technology, computer graphics technology has come into wide use. In particular, the success of object-oriented technology and multimedia technology has promoted the development of graphics technology in computer software systems. Computer graphics theory and applications have therefore become an important topic in computing, and graphics technology is being applied in more and more fields. In recent years, with the development of the social economy and especially the rapid development of information technology, traditional approaches to communication resource management cannot effectively meet the needs of resource management. Communication resource management still relies on the original tools and methods for equipment management and maintenance, which has brought many problems: it is very difficult for non-professionals to understand the equipment and its state, resource utilization is relatively low, and managers cannot quickly and accurately assess resource conditions. To address these problems, this paper proposes introducing computer graphics technology into communication resource management. The introduction of computer graphics not only makes communication resource management more vivid, but also reduces the cost of resource management and improves work efficiency.

  45. Some issues of creation of Belarusian language computer resources

    OpenAIRE

    Rubashko, N.; Nevmerjitskaia, G.

    2003-01-01

The main reason for creating computer resources for a natural language is the need to bring the means of language normalization into accord with the form of the language's existence: the computer form of language usage should correspond to a computer form of fixing language standards. This paper discusses various aspects of the creation of Belarusian language computer resources. It also briefly gives an overview of the objectives of the project involved.

  46. Integration of Cloud resources in the LHCb Distributed Computing

    CERN Document Server

    Ubeda Garcia, Mario; Stagni, Federico; Cabarrou, Baptiste; Rauschmayr, Nathalie; Charpentier, Philippe; Closier, Joel

    2014-01-01

This contribution describes how Cloud resources have been integrated in the LHCb Distributed Computing. LHCb is using its specific Dirac extension (LHCbDirac) as an interware for its Distributed Computing. So far, it was seamlessly integrating Grid resources and Computer clusters. The cloud extension of DIRAC (VMDIRAC) allows the integration of Cloud computing infrastructures. It is able to interact with multiple types of infrastructures in commercial and institutional clouds, supported by multiple interfaces (Amazon EC2, OpenNebula, OpenStack and CloudStack) – it instantiates, monitors and manages Virtual Machines running on this aggregation of Cloud resources. Moreover, specifications for institutional Cloud resources proposed by the Worldwide LHC Computing Grid (WLCG), mainly by the High Energy Physics Unix Information Exchange (HEPiX) group, have been taken into account. Several initiatives and computing resource providers in the eScience environment have already deployed IaaS in production during 2013. Keepin...

  47. Resource management in utility and cloud computing

    CERN Document Server

    Zhao, Han

    2013-01-01

This SpringerBrief reviews the existing market-oriented strategies for economically managing resource allocation in distributed systems. It describes three new schemes that address cost-efficiency, user incentives, and allocation fairness with regard to different scheduling contexts. The first scheme, taking the Amazon EC2 market as a case study, investigates optimal resource rental planning models based on linear integer programming and stochastic optimization techniques. This model is useful for exploring the interaction between the cloud infrastructure provider and the cloud resource c...

  48. Discovery of resources using MADM approaches for parallel and distributed computing

    Directory of Open Access Journals (Sweden)

    Mandeep Kaur

    2017-06-01

Grid computing, a form of parallel and distributed computing, allows the sharing of data and computational resources among its users from various geographical locations. Grid resources are diverse in terms of their underlying attributes. The majority of state-of-the-art resource discovery techniques rely on static resource attributes during resource selection. However, resources matched on static attributes may not be the most appropriate for the execution of user applications, because they may have heavy job loads, less storage space or less working memory (RAM). Hence, there is a need to consider the current state of the resources in order to find the most suitable ones. In this paper, we propose a two-phased multi-attribute decision making (MADM) approach for the discovery of grid resources using a P2P formalism. The proposed approach considers multiple resource attributes in the resource selection decision and provides the best suitable resource(s) to grid users. The first phase describes a mechanism to discover all matching resources and applies the SAW method to shortlist the top-ranked resources, which are communicated to the requesting super-peer. The second phase applies an integrated MADM approach (AHP-enriched PROMETHEE-II) to the list of selected resources received from different super-peers. A pairwise comparison of the resources with respect to their attributes is made and the rank of each resource is determined. The top-ranked resource is then communicated to the grid user by the grid scheduler. Our proposed methodology enables the grid scheduler to allocate the most suitable resource to the user application and also reduces the search complexity by filtering out less suitable resources during resource discovery.
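
    The first-phase SAW step is simple enough to show concretely. In the sketch below the attributes, weights, and values are invented; benefit attributes are normalized by their column maximum and cost attributes by the column minimum, then weighted and summed.

```python
# Minimal SAW (simple additive weighting) ranking; all data invented.
import numpy as np

resources = ["R1", "R2", "R3"]
# columns: free CPU cores, free RAM (GB), current load (lower is better)
X = np.array([[16, 32, 0.7],
              [8, 64, 0.2],
              [32, 16, 0.9]], dtype=float)
weights = np.array([0.4, 0.3, 0.3])
benefit = np.array([True, True, False])   # load is a cost attribute

# Normalize: benefit -> x / column max, cost -> column min / x.
norm = np.where(benefit, X / X.max(axis=0), X.min(axis=0) / X)
scores = norm @ weights
ranking = [resources[i] for i in np.argsort(-scores)]
print(dict(zip(resources, scores.round(3))), ranking)   # R2 ranks first
```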

  49. Improving ATLAS computing resource utilization with HammerCloud

    CERN Document Server

    Schovancova, Jaroslava; The ATLAS collaboration

    2018-01-01

HammerCloud is a framework to commission, test, and benchmark ATLAS computing resources and components of various distributed systems with realistic full-chain experiment workflows. HammerCloud contributes to ATLAS Distributed Computing (ADC) operations and automation efforts, providing automated resource exclusion and recovery tools that help re-focus operational manpower on areas which have yet to be automated, and improve utilization of available computing resources. We present the recent evolution of the auto-exclusion/recovery tools: faster inclusion of new resources in the testing machinery, machine learning algorithms for anomaly detection, resources categorized as master vs. slave for the purpose of blacklisting, and a tool for auto-exclusion/recovery of resources triggered by Event Service job failures that is being extended to other workflows besides the Event Service. We describe how HammerCloud helped commission various concepts and components of distributed systems: simplified configuration of qu...

  50. A Matchmaking Strategy Of Mixed Resource On Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Wisam Elshareef

    2015-08-01

Today, cloud computing has become a key technology for the online allotment of computing resources and the online storage of user data at lower cost, where computing resources are available all the time over the Internet on a pay-per-use basis. Recently, there has been a growing need for resource management strategies in cloud computing environments that encompass both end-user satisfaction and high job-submission throughput with appropriate scheduling. One of the major and essential issues in resource management is allocating incoming tasks to suitable virtual machines (matchmaking). The main objective of this paper is to propose a matchmaking strategy between incoming requests and the various resources in the cloud environment, to satisfy user requirements and to balance the workload on resources. Load balancing is an important aspect of resource management in a cloud computing environment. This paper therefore proposes a dynamic weight active monitor (DWAM) load-balancing algorithm, which allocates incoming requests on the fly to all available virtual machines in an efficient manner in order to achieve better performance parameters such as response time, processing time and resource utilization. The feasibility of the proposed algorithm is analyzed using the CloudSim simulator, which demonstrates the superiority of the proposed DWAM algorithm over its counterparts in the literature. Simulation results show that the proposed algorithm dramatically improves response time and data processing time and makes better use of resources compared with the Active Monitor and VM-assign algorithms.
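
    The published DWAM algorithm is more involved than this, but the general shape of a dynamic-weight dispatcher can be sketched in a few lines; the VM names, loads, and task costs are invented.

```python
# Toy dynamic-weight dispatcher in the spirit of DWAM (details differ from
# the published algorithm): each VM carries a live load weight, and every
# incoming request goes to the currently lightest-loaded VM.
vms = {"vm1": 0.30, "vm2": 0.55, "vm3": 0.10}   # fraction of capacity in use

def dispatch(task_cost: float) -> str:
    target = min(vms, key=vms.get)   # lightest VM wins
    vms[target] += task_cost         # update its weight on the fly
    return target

for cost in (0.2, 0.2, 0.2):
    print(dispatch(cost), vms)
```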

  51. Current NASA Plans for Mars In Situ Resource Utilization

    Science.gov (United States)

    Sanders, Gerald

    2018-01-01

The presentation is intended to provide relevant information to the NASA-funded Center for the Utilization of Biological Engineering in Space (CUBES) Institute. The presentation covers the following: 1) What is In Situ Resource Utilization (ISRU), 2) What are the resources of interest at the Moon and Mars, 3) ISRU-related mission requirements and ISRU economics, 4) Challenges and risks for ISRU, 5) Concept of operations for Mars ISRU systems, 6) Current state of the art (SOA) in ISRU, and 7) Current ISRU development and mission status.

  52. The current crisis in human resources for health in Africa

    African Journals Online (AJOL)

The current crisis in human resources for health in Africa has reached a serious level in many countries. A complex set of reasons has contributed to this problem, some exogenous, such as the severe economic measures introduced by structural adjustment, which often result in cutbacks in the number of health ...

  53. Tidal current energy resource assessment in Ireland: Current status and future update

    International Nuclear Information System (INIS)

    O'Rourke, Fergal; Boyle, Fergal; Reynolds, Anthony

    2010-01-01

    Interest in renewable energy in Ireland has increased continually over the past decade. This interest is due primarily to security of supply issues and the effects of climate change. Ireland imports over 90% of its primary energy consumption, mostly in the form of fossil fuels. The exploitation of Ireland's vast indigenous renewable energy resources is required in order to reduce this over-dependence on fossil fuel imports to meet energy demand. Various targets have been set by the Irish government to incorporate renewable energy technologies into Ireland's energy market. As a result of these targets, the development in wind energy has increased substantially over the past decade; however this method of energy extraction is intermittent and unpredictable. Ireland has an excellent tidal current energy resource and the use of this resource will assist in the development of a sustainable energy future. Energy extraction using tidal current energy technologies offers a vast and predictable energy resource. This paper reviews the currently accepted tidal current energy resource assessment for Ireland. This assessment was compiled by Sustainable Energy Ireland in a report in 2004. The assessment employed a 2-dimensional numerical model of the tidal current velocities around Ireland, and from this numerical model the theoretical tidal current energy resource was identified. With the introduction of constraints and limitations, the technical, practical, accessible and viable tidal current energy resources were obtained. The paper discusses why the assessment needs updating including the effect on the assessment of the current stage of development of tidal current turbines and their deployment technology. (author)
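
    The starting point for assessments of this kind is the kinetic energy flux of a current of speed v through a cross-section A; the relation below is standard, and the example numbers are illustrative only.

```latex
% Kinetic power in a current of speed v through cross-section A:
P = \tfrac{1}{2}\,\rho\,A\,v^{3}
% e.g. seawater (rho ~ 1025 kg/m^3) at v = 2 m/s:
% P/A = 0.5 * 1025 * 2^3 = 4100 W/m^2, about 4.1 kW/m^2, before any
% device efficiency, spacing, or accessibility constraints are applied.
```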

  15. Decentralized Resource Management in Distributed Computer Systems.

    Science.gov (United States)

    1982-02-01

    ... directly exchanging user state information. Eventcounts and sequencers correspond to semaphores in the sense that synchronization primitives are used to ... and techniques are required to achieve synchronization in distributed computers without reliance on any centralized entity such as a semaphore ... known solutions to the access synchronization problem was Dijkstra's semaphore [12]. The importance of the semaphore is that it correctly addresses the ...
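
    The fragments above contrast eventcounts and sequencers with semaphores. As a rough illustration of how these primitives are typically realized (a sketch in the classic Reed-Kanodia style, assumed rather than taken from the report), the following Python code builds an eventcount (advance/await) and a sequencer (ticket) and combines them for mutual exclusion without a semaphore.

      # Sketch of eventcount/sequencer primitives. Built on a condition
      # variable purely for illustration; names are assumptions.
      import threading

      class EventCount:
          def __init__(self):
              self._count = 0
              self._cond = threading.Condition()

          def advance(self):
              with self._cond:
                  self._count += 1
                  self._cond.notify_all()

          def await_value(self, value):  # "await" is a Python keyword
              # Block until the count reaches `value`.
              with self._cond:
                  while self._count < value:
                      self._cond.wait()

      class Sequencer:
          def __init__(self):
              self._next = 0
              self._lock = threading.Lock()

          def ticket(self):
              # Hand out strictly increasing ticket numbers.
              with self._lock:
                  t = self._next
                  self._next += 1
                  return t

      # Mutual exclusion without a semaphore: take a ticket, wait your turn.
      ec, seq = EventCount(), Sequencer()

      def critical_section(work):
          t = seq.ticket()
          ec.await_value(t)   # wait until all earlier tickets have finished
          work()
          ec.advance()        # let the next ticket holder proceed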

  16. Physical-resource requirements and the power of quantum computation

    International Nuclear Information System (INIS)

    Caves, Carlton M; Deutsch, Ivan H; Blume-Kohout, Robin

    2004-01-01

    The primary resource for quantum computation is the Hilbert-space dimension. Whereas Hilbert space itself is an abstract construction, the number of dimensions available to a system is a physical quantity that requires physical resources. Avoiding a demand for an exponential amount of these resources places a fundamental constraint on the systems that are suitable for scalable quantum computation. To be scalable, the number of degrees of freedom in the computer must grow nearly linearly with the number of qubits in an equivalent qubit-based quantum computer. These considerations rule out quantum computers based on a single particle, a single atom, or a single molecule consisting of a fixed number of atoms or on classical waves manipulated using the transformations of linear optics

  17. Computational thermodynamics in electric current metallurgy

    DEFF Research Database (Denmark)

    Bhowmik, Arghya; Qin, R.S.

    2015-01-01

    A priori derivation for the extra free energy caused by the passing electric current in metal is presented. The analytical expression and its discrete format in support of the numerical calculation of thermodynamics in electric current metallurgy have been developed. This enables the calculation of electric current distribution, current-induced temperature distribution and the free energy sequence of various phase transitions in multiphase materials. The method has been validated against the analytical solution of current distribution and experimental observation of microstructure evolution. It provides a basis for the design, prediction and implementation of the electric current metallurgy. The applicability of the theory is discussed in the derivations. The work is particularly suitable for the study of magnetic materials that contain various magnetic phases. The latter has not been considered in literature...

  18. Development of Computer-Based Resources for Textile Education.

    Science.gov (United States)

    Hopkins, Teresa; Thomas, Andrew; Bailey, Mike

    1998-01-01

    Describes the production of computer-based resources for students of textiles and engineering in the United Kingdom. Highlights include funding by the Teaching and Learning Technology Programme (TLTP), courseware author/subject expert interaction, usage test and evaluation, authoring software, graphics, computer-aided design simulation, self-test…

  19. Argonne Laboratory Computing Resource Center - FY2004 Report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R.

    2005-04-14

    In the spring of 2002, Argonne National Laboratory founded the Laboratory Computing Resource Center, and in April 2003 LCRC began full operations with Argonne's first teraflops computing cluster. The LCRC's driving mission is to enable and promote computational science and engineering across the Laboratory, primarily by operating computing facilities and supporting application use and development. This report describes the scientific activities, computing facilities, and usage in the first eighteen months of LCRC operation. In this short time LCRC has had broad impact on programs across the Laboratory. The LCRC computing facility, Jazz, is available to the entire Laboratory community. In addition, the LCRC staff provides training in high-performance computing and guidance on application usage, code porting, and algorithm development. All Argonne personnel and collaborators are encouraged to take advantage of this computing resource and to provide input into the vision and plans for computing and computational analysis at Argonne. Steering for LCRC comes from the Computational Science Advisory Committee, composed of computing experts from many Laboratory divisions. The CSAC Allocations Committee makes decisions on individual project allocations for Jazz.

  20. ResourceGate: A New Solution for Cloud Computing Resource Allocation

    OpenAIRE

    Abdullah A. Sheikh

    2012-01-01

    Cloud computing has become a focus of educational and business communities. Their concerns include the need to improve the Quality of Service (QoS) provided, as well as qualities such as reliability and performance, and the reduction of costs. Cloud computing provides many benefits in terms of low cost and accessibility of data. Ensuring these benefits is considered to be the major factor in the cloud computing environment. This paper surveys recent research related to cloud computing resource al...

  1. Performance Evaluation of Resource Management in Cloud Computing Environments.

    Science.gov (United States)

    Batista, Bruno Guazzelli; Estrella, Julio Cezar; Ferreira, Carlos Henrique Gomes; Filho, Dionisio Machado Leite; Nakamura, Luis Hideo Vasconcelos; Reiff-Marganiec, Stephan; Santana, Marcos José; Santana, Regina Helena Carlucci

    2015-01-01

    Cloud computing is a computational model in which resource providers can offer on-demand services to clients in a transparent way. However, to be able to guarantee quality of service without limiting the number of accepted requests, providers must be able to dynamically manage the available resources so that they can be optimized. This dynamic resource management is not a trivial task, since it involves meeting several challenges related to workload modeling, virtualization, performance modeling, deployment and monitoring of applications on virtualized resources. This paper carries out a performance evaluation of a module for resource management in a cloud environment that includes handling available resources during execution time and ensuring the quality of service defined in the service level agreement. An analysis was conducted of different resource configurations to define which dimension of resource scaling has a real influence on client requests. The results were used to model and implement a simulated cloud system, in which the allocated resource can be changed on-the-fly, with a corresponding change in price. In this way, the proposed module seeks to satisfy both the client by ensuring quality of service, and the provider by ensuring the best use of resources at a fair price.

  2. Performance Evaluation of Resource Management in Cloud Computing Environments.

    Directory of Open Access Journals (Sweden)

    Bruno Guazzelli Batista

    Full Text Available Cloud computing is a computational model in which resource providers can offer on-demand services to clients in a transparent way. However, to be able to guarantee quality of service without limiting the number of accepted requests, providers must be able to dynamically manage the available resources so that they can be optimized. This dynamic resource management is not a trivial task, since it involves meeting several challenges related to workload modeling, virtualization, performance modeling, deployment and monitoring of applications on virtualized resources. This paper carries out a performance evaluation of a module for resource management in a cloud environment that includes handling available resources during execution time and ensuring the quality of service defined in the service level agreement. An analysis was conducted of different resource configurations to define which dimension of resource scaling has a real influence on client requests. The results were used to model and implement a simulated cloud system, in which the allocated resource can be changed on-the-fly, with a corresponding change in price. In this way, the proposed module seeks to satisfy both the client by ensuring quality of service, and the provider by ensuring the best use of resources at a fair price.
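
    A toy sketch of the paper's on-the-fly resource and price adjustment may help; the SLA threshold, unit price and scaling policy below are invented for illustration and are not the evaluated module.

      # Illustrative sketch: change allocated resources on the fly,
      # with a corresponding change in price. Numbers are invented.

      PRICE_PER_CPU_HOUR = 0.05  # assumed unit price

      class ManagedService:
          def __init__(self, cpus=2, sla_response_ms=200):
              self.cpus = cpus
              self.sla_response_ms = sla_response_ms

          def hourly_price(self):
              return self.cpus * PRICE_PER_CPU_HOUR

          def adjust(self, observed_response_ms):
              # Scale up when the SLA is violated, down when well within it.
              if observed_response_ms > self.sla_response_ms:
                  self.cpus += 1
              elif observed_response_ms < 0.5 * self.sla_response_ms and self.cpus > 1:
                  self.cpus -= 1
              return self.cpus, self.hourly_price()

      svc = ManagedService()
      print(svc.adjust(observed_response_ms=250))  # SLA violated -> (3, 0.15)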

  3. Distributed Problem Solving: Adaptive Networks with a Computer Intermediary Resource. Intelligent Executive Computer Communication

    Science.gov (United States)

    1991-06-01

    Proceedings of The National Conference on Artificial Intelligence, pages 181-184, The American Association for Artificial Intelligence, Pittsburgh ... Intermediary Resource: Intelligent Executive Computer Communication, John Lyman and Carla J. Conaway, University of California at Los Angeles, for Contracting ... Include Security Classification) Interim Report: Distributed Problem Solving: Adaptive Networks With a Computer Intermediary Resource: Intelligent

  4. Processing Optimization of Typed Resources with Synchronized Storage and Computation Adaptation in Fog Computing

    Directory of Open Access Journals (Sweden)

    Zhengyang Song

    2018-01-01

    Full Text Available Wide application of the Internet of Things (IoT) system has been increasingly demanding more hardware facilities for processing various resources including data, information, and knowledge. With the rapid growth of generated resource quantity, it is difficult to adapt to this situation by using traditional cloud computing models. Fog computing enables storage and computing services to perform at the edge of the network to extend cloud computing. However, there are some problems such as restricted computation, limited storage, and expensive network bandwidth in Fog computing applications. It is a challenge to balance the distribution of network resources. We propose a processing optimization mechanism of typed resources with synchronized storage and computation adaptation in Fog computing. In this mechanism, we process typed resources in a wireless-network-based three-tier architecture consisting of Data Graph, Information Graph, and Knowledge Graph. The proposed mechanism aims to minimize processing cost over network, computation, and storage while maximizing the performance of processing in a business-value-driven manner. Simulation results show that the proposed approach improves the ratio of performance over user investment. Meanwhile, conversions between resource types deliver support for dynamically allocating network resources.

  5. Current status and phenotypic characteristics of Bulgarian poultry genetic resources

    International Nuclear Information System (INIS)

    Teneva, A.; Gerzilov, V.; Lalev, M.; Lukanov, H.; Mincheva, N.; Oblakova, M.; Petrov, P.; Hristakieva, P.; Dimitrova, I.; Periasamy, K.

    2016-01-01

    Full text: Poultry biodiversity conservation is a great challenge for many countries. Within the last several years, the number of endangered local breeds has increased, leading to a considerable loss of genetic resources. A similar trend has been observed among the poultry breeds, including the chicken, local turkey and goose breeds/lines established in Bulgaria, some of which have already been lost. Currently these breeds/lines are at risk and/or threatened with extinction. The information obtained by phenotypic characterization of these breeds is the first step in planning the management of poultry genetic resources through setting up improved selection schemes and conservation strategies. In this paper, we review the current state of knowledge regarding the morphological and phenotypic diversity of local poultry breeds and some old productive poultry lines in Bulgaria. (author)

  6. Using Amazon's Elastic Compute Cloud to dynamically scale CMS computational resources

    International Nuclear Information System (INIS)

    Evans, D; Fisk, I; Holzman, B; Pordes, R; Tiradani, A; Melo, A; Sheldon, P; Metson, S

    2011-01-01

    Large international scientific collaborations such as the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider have traditionally addressed their data reduction and analysis needs by building and maintaining dedicated computational infrastructure. Emerging cloud computing services such as Amazon's Elastic Compute Cloud (EC2) offer short-term CPU and storage resources with costs based on usage. These services allow experiments to purchase computing resources as needed, without significant prior planning and without long-term investments in facilities and their management. We have demonstrated that services such as EC2 can successfully be integrated into the production-computing model of CMS, and find that they work very well as worker nodes. The cost structure and transient nature of EC2 services make them inappropriate for some CMS production services and functions. We also found that the resources are not truly 'on-demand', as limits and caps on usage are imposed. Our trial workflows allow us to make a cost comparison between EC2 resources and dedicated CMS resources at a university, and conclude that it is most cost-effective to purchase dedicated resources for the 'base-line' needs of experiments such as CMS. However, if the ability to use cloud computing resources is built into an experiment's software framework before demand requires their use, cloud computing resources make sense for bursting during times when spikes in usage are required.
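
    The underlying comparison is simple amortization arithmetic. In the sketch below, all figures are placeholders rather than the paper's numbers: dedicated hardware is amortized over its lifetime and average utilization, and the resulting cost per core-hour is compared with an on-demand price.

      # Back-of-the-envelope dedicated-vs-on-demand cost comparison,
      # in the spirit of the paper's conclusion. Numbers are invented.

      def dedicated_cost_per_core_hour(capex, lifetime_years, cores,
                                       utilization, opex_per_year):
          hours = lifetime_years * 365 * 24 * utilization
          return (capex + opex_per_year * lifetime_years) / (cores * hours)

      def cheaper_to_buy(baseline_utilization, ec2_price, **dedicated):
          return dedicated_cost_per_core_hour(
              utilization=baseline_utilization, **dedicated) < ec2_price

      # High steady utilization favours owned hardware; bursty load favours cloud.
      print(cheaper_to_buy(0.9, ec2_price=0.10, capex=200_000,
                           lifetime_years=4, cores=512, opex_per_year=30_000))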

  7. US computer research networks: Current and future

    Science.gov (United States)

    Kratochvil, D.; Sood, D.; Verostko, A.

    1989-01-01

    During the last decade, NASA LeRC's Communication Program has conducted a series of telecommunications forecasting studies to project trends and requirements and to identify critical telecommunications technologies that must be developed to meet future requirements. The Government Networks Division of Contel Federal Systems has assisted NASA in these studies, and the current study builds upon these earlier efforts. The current major thrust of the NASA Communications Program is aimed at developing the high risk, advanced, communications satellite and terminal technologies required to significantly increase the capacity of future communications systems. Also, major new technological, economic, and social-political events and trends are now shaping the communications industry of the future. Therefore, a re-examination of future telecommunications needs and requirements is necessary to enable NASA to make management decisions in its Communications Program and to ensure the proper technologies and systems are addressed. This study, through a series of Task Orders, is helping NASA define the likely communication service needs and requirements of the future and thereby ensuring that the most appropriate technology developments are pursued.

  8. Shared-resource computing for small research labs.

    Science.gov (United States)

    Ackerman, M J

    1982-04-01

    A real time laboratory computer network is described. This network is composed of four real-time laboratory minicomputers located in each of four division laboratories and a larger minicomputer in a centrally located computer room. Off the shelf hardware and software were used with no customization. The network is configured for resource sharing using DECnet communications software and the RSX-11-M multi-user real-time operating system. The cost effectiveness of the shared resource network and multiple real-time processing using priority scheduling is discussed. Examples of utilization within a medical research department are given.

  9. Using OSG Computing Resources with (iLC)Dirac

    CERN Document Server

    AUTHOR|(SzGeCERN)683529; Petric, Marko

    2017-01-01

    CPU cycles for small experiments and projects can be scarce, thus making use of all available resources, whether dedicated or opportunistic, is mandatory. While enabling uniform access to the LCG computing elements (ARC, CREAM), the DIRAC grid interware was not able to use OSG computing elements (GlobusCE, HTCondor-CE) without dedicated support at the grid site through so-called 'SiteDirectors', which directly submit to the local batch system. This in turn requires additional dedicated effort for small experiments on the grid site. Adding interfaces to the OSG CEs through the respective grid middleware therefore allows accessing them within the DIRAC software without additional site-specific infrastructure. This enables greater use of opportunistic resources for experiments and projects without dedicated clusters or an established computing infrastructure with the DIRAC software. To allow sending jobs to HTCondor-CE and legacy Globus computing elements inside DIRAC, the required wrapper classes were develo...

  10. Integration of Openstack cloud resources in BES III computing cluster

    Science.gov (United States)

    Li, Haibo; Cheng, Yaodong; Huang, Qiulan; Cheng, Zhenjing; Shi, Jingyan

    2017-10-01

    Cloud computing provides a new technical means for the data processing of high energy physics experiments. However, in traditional job management systems the resources of each queue are fixed and resource usage is static. In order to make it simple and transparent for physicists to use, we developed a virtual cluster system (vpmanager) to integrate IHEPCloud and different batch systems such as Torque and HTCondor. Vpmanager provides dynamic virtual machine scheduling according to the job queue. The BES III use case results show that resource efficiency is greatly improved.
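
    A minimal sketch of queue-driven VM scheduling in this spirit follows; the jobs-per-VM ratio, the cap and the policy are illustrative assumptions, not vpmanager's implementation.

      # Sketch: start or stop virtual machines according to the depth
      # of the batch queue. Constants and policy are invented.

      JOBS_PER_VM = 8    # assumed jobs one VM can serve concurrently
      MAX_VMS = 100

      def target_vm_count(queued_jobs):
          wanted = -(-queued_jobs // JOBS_PER_VM)   # ceiling division
          return max(0, min(MAX_VMS, wanted))

      def reconcile(queued_jobs, running_vms, start_vm, stop_vm):
          target = target_vm_count(queued_jobs)
          for _ in range(target - running_vms):
              start_vm()     # e.g. boot another cloud instance
          for _ in range(running_vms - target):
              stop_vm()      # reclaim idle resources

      # With 35 queued jobs and 2 running VMs, this would start 3 more.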

  11. Computer-aided resource planning and scheduling for radiological services

    Science.gov (United States)

    Garcia, Hong-Mei C.; Yun, David Y.; Ge, Yiqun; Khan, Javed I.

    1996-05-01

    There exists tremendous opportunity in hospital-wide resource optimization based on system integration. This paper defines the resource planning and scheduling requirements integral to PACS, RIS and HIS integration. A multi-site case study is conducted to define the requirements. A well-tested planning and scheduling methodology, called the Constrained Resource Planning model, has been applied to the chosen problem of radiological service optimization. This investigation focuses on resource optimization issues for minimizing the turnaround time to increase clinical efficiency and customer satisfaction, particularly in cases where the scheduling of multiple exams is required for a patient. How best to combine information system efficiency and human intelligence in improving radiological services is described. Finally, an architecture for interfacing a computer-aided resource planning and scheduling tool with the existing PACS, HIS and RIS implementation is presented.

  12. [Social and health resources in Catalonia. Current situation].

    Science.gov (United States)

    Bullich-Marín, Ingrid; Sánchez-Ferrín, Pau; Cabanes-Duran, Concepció; Salvà-Casanovas, Antoni

    The network of social and health care has advanced since its inception. Furthermore, new services have been created and some resources have been adapted within the framework of the respective health plans. This article presents the current situation of the different social and health resources in Catalonia, as well as the main changes that have occurred in recent years, more specifically in the period of the Health Plan 2011-2015. This period is characterised by an adaptation of the social and health network to the context of chronic care, in which the development of intermediate care resources has become the most relevant aspect. There is also a need to create a single long-term care sector in which health care quality is guaranteed. Moreover, in this period, integrated care across levels is promoted in the health system through greater coordination between the different levels of care. The social and health network, due to its trajectory and expertise, plays a key role in the quality of care for people with social and medical needs. Copyright © 2017 SEGG. Publicado por Elsevier España, S.L.U. All rights reserved.

  13. Active resources concept of computation for enterprise software

    Directory of Open Access Journals (Sweden)

    Koryl Maciej

    2017-06-01

    Full Text Available Traditional computational models for enterprise software are still to a great extent centralized. However, the rapid growth of modern computation techniques and frameworks means that contemporary software is becoming more and more distributed. Towards the development of a new complete and coherent solution for distributed enterprise software construction, a synthesis of three well-grounded concepts is proposed: the Domain-Driven Design technique of software engineering, the REST architectural style and the actor model of computation. As a result a new resources-based framework arises, which after the first cases of use seems to be useful and worthy of further research.
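
    As an illustration of the 'active resource' idea, the toy below (a conceptual sketch, not the paper's framework) combines two of the three ingredients: each resource carries a REST-style URI and verbs, and behaves as an actor that processes one message at a time from its mailbox.

      # Sketch of an "active resource": a REST-addressable entity that
      # is also an actor with a mailbox. Names are illustrative.
      import queue, threading

      class ActiveResource:
          def __init__(self, uri, state):
              self.uri = uri
              self.state = state
              self.mailbox = queue.Queue()
              threading.Thread(target=self._run, daemon=True).start()

          def send(self, method, payload, reply):
              self.mailbox.put((method, payload, reply))

          def _run(self):
              while True:   # process one message at a time, in order
                  method, payload, reply = self.mailbox.get()
                  if method == "GET":       # read the resource state
                      reply.put(self.state)
                  elif method == "PUT":     # replace the resource state
                      self.state = payload
                      reply.put(self.state)

      order = ActiveResource("/orders/42", {"status": "new"})
      reply = queue.Queue()
      order.send("PUT", {"status": "paid"}, reply)
      print(reply.get())   # {'status': 'paid'}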

  14. GridFactory - Distributed computing on ephemeral resources

    DEFF Research Database (Denmark)

    Orellana, Frederik; Niinimaki, Marko

    2011-01-01

    A novel batch system for high throughput computing is presented. The system is specifically designed to leverage virtualization and web technology to facilitate deployment on cloud and other ephemeral resources. In particular, it implements a security model suited for forming collaborations...

  15. Can the Teachers' Creativity Overcome Limited Computer Resources?

    Science.gov (United States)

    Nikolov, Rumen; Sendova, Evgenia

    1988-01-01

    Describes experiences of the Research Group on Education (RGE) at the Bulgarian Academy of Sciences and the Ministry of Education in using limited computer resources when teaching informatics. Topics discussed include group projects; the use of Logo; ability grouping; and out-of-class activities, including publishing a pupils' magazine. (13…

  16. Recent development of computational resources for new antibiotics discovery

    DEFF Research Database (Denmark)

    Kim, Hyun Uk; Blin, Kai; Lee, Sang Yup

    2017-01-01

    Understanding a complex working mechanism of biosynthetic gene clusters (BGCs) encoding secondary metabolites is a key to discovery of new antibiotics. Computational resources continue to be developed in order to better process increasing volumes of genome and chemistry data, and thereby better...

  17. Computing Resource And Work Allocations Using Social Profiles

    Directory of Open Access Journals (Sweden)

    Peter Lavin

    2013-01-01

    Full Text Available If several distributed and disparate computer resources exist, many of which have been created for different and diverse reasons, and several large scale computing challenges also exist with similar diversity in their backgrounds, then one problem which arises in trying to assemble enough of these resources to address such challenges is the need to align and accommodate the different motivations and objectives which may lie behind the existence of both the resources and the challenges. Software agents are offered as a mainstream technology for modelling the types of collaborations and relationships needed to do this. As an initial step towards forming such relationships, agents need a mechanism to consider social and economic backgrounds. This paper explores addressing social and economic differences using a combination of textual descriptions known as social profiles and search engine technology, both of which are integrated into an agent technology.

  18. Quantum Computing: Selected Internet Resources for Librarians, Researchers, and the Casually Curious

    OpenAIRE

    Cirasella, Jill

    2009-01-01

    This article is an annotated selection of the most important and informative Internet resources for learning about quantum computing, finding quantum computing literature, and tracking quantum computing news.

  19. Photonic entanglement as a resource in quantum computation and quantum communication

    OpenAIRE

    Prevedel, Robert; Aspelmeyer, Markus; Brukner, Caslav; Jennewein, Thomas; Zeilinger, Anton

    2008-01-01

    Entanglement is an essential resource in current experimental implementations for quantum information processing. We review a class of experiments exploiting photonic entanglement, ranging from one-way quantum computing through quantum communication complexity to long-distance quantum communication. We then propose a set of feasible experiments that will underline the advantages of photonic entanglement for quantum information processing.

  20. Towards minimal resources of measurement-based quantum computation

    International Nuclear Information System (INIS)

    Perdrix, Simon

    2007-01-01

    We improve the upper bound on the minimal resources required for measurement-only quantum computation (M A Nielsen 2003 Phys. Lett. A 308 96-100; D W Leung 2004 Int. J. Quantum Inform. 2 33; S Perdrix 2005 Int. J. Quantum Inform. 3 219-23). Minimizing the resources required for this model is a key issue for the experimental realization of a quantum computer based on projective measurements. This new upper bound also allows one to reply in the negative to the open question presented by Perdrix (2004 Proc. Quantum Communication Measurement and Computing) about the existence of a trade-off between observable and ancillary qubits in measurement-only QC

  1. Computers in nuclear medicine - current trends and future directions

    International Nuclear Information System (INIS)

    Anon.

    1994-01-01

    Previously, a decision to purchase computing equipment for nuclear medicine usually required evaluation of the 'local' needs. With the advent of PACS and state-of-the-art computer techniques for image acquisition and manipulation, purchase and subsequent application are becoming much more complex. Some of the current trends and future possibilities which may influence the choice and operation of computers within and outside the nuclear medicine environment are discussed. (author)

  2. Using Puppet to contextualize computing resources for ATLAS analysis on Google Compute Engine

    International Nuclear Information System (INIS)

    Öhman, Henrik; Panitkin, Sergey; Hendrix, Valerie

    2014-01-01

    With the advent of commercial as well as institutional and national clouds, new opportunities for on-demand computing resources become available to the HEP community. The new cloud technologies also come with new challenges, and one such challenge is the contextualization of computing resources with regard to the requirements of the user and their experiment. In particular, on Google's new cloud platform Google Compute Engine (GCE) the upload of users' virtual machine images is not possible. This precludes the application of ready-to-use technologies like CernVM and forces users to build and contextualize their own VM images from scratch. We investigate the use of Puppet to facilitate the contextualization of cloud resources on GCE, with particular regard to ease of configuration and dynamic resource scaling.

  3. Dynamic integration of remote cloud resources into local computing clusters

    Energy Technology Data Exchange (ETDEWEB)

    Fleig, Georg; Erli, Guenther; Giffels, Manuel; Hauth, Thomas; Quast, Guenter; Schnepf, Matthias [Institut fuer Experimentelle Kernphysik, Karlsruher Institut fuer Technologie (Germany)

    2016-07-01

    In modern high-energy physics (HEP) experiments enormous amounts of data are analyzed and simulated. Traditionally dedicated HEP computing centers are built or extended to meet this steadily increasing demand for computing resources. Nowadays it is more reasonable and more flexible to utilize computing power at remote data centers providing regular cloud services to users as they can be operated in a more efficient manner. This approach uses virtualization and allows the HEP community to run virtual machines containing a dedicated operating system and transparent access to the required software stack on almost any cloud site. The dynamic management of virtual machines depending on the demand for computing power is essential for cost efficient operation and sharing of resources with other communities. For this purpose the EKP developed the on-demand cloud manager ROCED for dynamic instantiation and integration of virtualized worker nodes into the institute's computing cluster. This contribution will report on the concept of our cloud manager and the implementation utilizing a remote OpenStack cloud site and a shared HPC center (bwForCluster located in Freiburg).

  4. Common accounting system for monitoring the ATLAS distributed computing resources

    International Nuclear Information System (INIS)

    Karavakis, E; Andreeva, J; Campana, S; Saiz, P; Gayazov, S; Jezequel, S; Sargsyan, L; Schovancova, J; Ueda, I

    2014-01-01

    This paper covers in detail a variety of accounting tools used to monitor the utilisation of the available computational and storage resources within the ATLAS Distributed Computing during the first three years of Large Hadron Collider data taking. The Experiment Dashboard provides a set of common accounting tools that combine monitoring information originating from many different information sources; either generic or ATLAS specific. This set of tools provides quality and scalable solutions that are flexible enough to support the constantly evolving requirements of the ATLAS user community.

  5. Adaptive Management of Computing and Network Resources for Spacecraft Systems

    Science.gov (United States)

    Pfarr, Barbara; Welch, Lonnie R.; Detter, Ryan; Tjaden, Brett; Huh, Eui-Nam; Szczur, Martha R. (Technical Monitor)

    2000-01-01

    It is likely that NASA's future spacecraft systems will consist of distributed processes which will handle dynamically varying workloads in response to perceived scientific events, the spacecraft environment, spacecraft anomalies and user commands. Since all situations and possible uses of sensors cannot be anticipated during pre-deployment phases, an approach for dynamically adapting the allocation of distributed computational and communication resources is needed. To address this, we are evolving the DeSiDeRaTa adaptive resource management approach to enable reconfigurable ground and space information systems. The DeSiDeRaTa approach embodies a set of middleware mechanisms for adapting resource allocations, and a framework for reasoning about the real-time performance of distributed application systems. The framework and middleware will be extended to accommodate (1) the dynamic aspects of intra-constellation network topologies, and (2) the complete real-time path from the instrument to the user. We are developing a ground-based testbed that will enable NASA to perform early evaluation of adaptive resource management techniques without the expense of first deploying them in space. The benefits of the proposed effort are numerous, including the ability to use sensors in new ways not anticipated at design time; the production of information technology that ties the sensor web together; the accommodation of greater numbers of missions with fewer resources; and the opportunity to leverage the DeSiDeRaTa project's expertise, infrastructure and models for adaptive resource management for distributed real-time systems.

  6. Emotor control: computations underlying bodily resource allocation, emotions, and confidence.

    Science.gov (United States)

    Kepecs, Adam; Mensh, Brett D

    2015-12-01

    Emotional processes are central to behavior, yet their deeply subjective nature has been a challenge for neuroscientific study as well as for psychiatric diagnosis. Here we explore the relationships between subjective feelings and their underlying brain circuits from a computational perspective. We apply recent insights from systems neuroscience, which approaches subjective behavior as the result of mental computations instantiated in the brain, to the study of emotions. We develop the hypothesis that emotions are the product of neural computations whose motor role is to reallocate bodily resources mostly gated by smooth muscles. This "emotor" control system is analogous to the more familiar motor control computations that coordinate skeletal muscle movements. To illustrate this framework, we review recent research on "confidence." Although familiar as a feeling, confidence is also an objective statistical quantity: an estimate of the probability that a hypothesis is correct. This model-based approach helped reveal the neural basis of decision confidence in mammals and provides a bridge to the subjective feeling of confidence in humans. These results have important implications for psychiatry, since disorders of confidence computations appear to contribute to a number of psychopathologies. More broadly, this computational approach to emotions resonates with the emerging view that psychiatric nosology may be best parameterized in terms of disorders of the cognitive computations underlying complex behavior.
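
    The definition of confidence as 'an estimate of the probability that a hypothesis is correct' can be made concrete with a toy calculation; this illustrates the statistical definition only and is not the authors' model. Evidence for one hypothesis over another is accumulated as log-likelihood ratios, and the posterior probability of the chosen option is reported as confidence.

      # Toy illustration: confidence as the posterior probability
      # of the chosen hypothesis, assuming equal priors.
      import math

      def decision_with_confidence(log_likelihood_ratios):
          # Accumulate evidence for H1 over H0 as log-odds.
          total = sum(log_likelihood_ratios)
          p_h1 = 1.0 / (1.0 + math.exp(-total))
          choice = "H1" if p_h1 >= 0.5 else "H0"
          confidence = max(p_h1, 1.0 - p_h1)
          return choice, confidence

      print(decision_with_confidence([0.4, 0.7, -0.1, 0.5]))  # ('H1', ~0.82)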

  7. Investigating the computer analysis of eddy current NDT data

    International Nuclear Information System (INIS)

    Brown, R.L.

    1979-01-01

    The objective of this activity was to investigate and develop techniques for computer analysis of eddy current nondestructive testing (NDT) data. A single frequency commercial eddy current tester and a precision mechanical scanner were interfaced with a PDP-11/34 computer to obtain and analyze eddy current data from samples of 316 stainless steel tubing containing known discontinuities. Among the data analysis techniques investigated were: correlation, Fast Fourier Transforms (FFT), clustering, and Adaptive Learning Networks (ALN). The results were considered encouraging. ALN, for example, correctly identified 88% of the defects and non-defects from a group of 153 signal indications
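
    To give a flavor of one technique named above, here is a hedged sketch of FFT-based feature extraction from an eddy current scan; the segment length, window and threshold are invented for illustration, and this is not the original PDP-11/34 analysis code.

      # Sketch: spectral features from an eddy current scan, with
      # simple anomaly flagging per segment. Thresholds are invented.
      import numpy as np

      def spectral_feature(signal, fs):
          spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
          freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
          # Energy away from DC is a crude indicator of a discontinuity.
          return spectrum[1:].sum(), freqs[np.argmax(spectrum[1:]) + 1]

      def classify_segments(scan, fs, seg_len=256, threshold=5.0):
          flags = []
          for i in range(0, len(scan) - seg_len + 1, seg_len):
              energy, peak_freq = spectral_feature(scan[i:i + seg_len], fs)
              flags.append(energy > threshold)   # possible defect indication
          return flags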

  8. Multicriteria Resource Brokering in Cloud Computing for Streaming Service

    Directory of Open Access Journals (Sweden)

    Chih-Lun Chou

    2015-01-01

    Full Text Available By leveraging cloud computing such as Infrastructure as a Service (IaaS), the outsourcing of computing resources used to support operations, including servers, storage, and networking components, is quite beneficial for various providers of Internet applications. With this increasing trend, resource allocation that both assures QoS via Service Level Agreements (SLA) and avoids overprovisioning in order to reduce cost becomes a crucial priority and challenge in the design and operation of complex service-based platforms such as streaming services. On the other hand, providers of IaaS also have concerns about their profit performance and energy consumption while offering these virtualized resources. In this paper, considering both service-oriented and infrastructure-oriented criteria, we regard this resource allocation problem as a Multicriteria Decision Making problem and propose an effective trade-off approach based on a goal programming model. To validate its effectiveness, a cloud architecture for streaming applications is addressed and extensive analysis is performed for the related criteria. The results of numerical simulations show that the proposed approach strikes a balance between these conflicting criteria commendably and achieves high cost efficiency.
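
    Goal programming expresses the conflicting criteria as weighted deviation variables and minimizes their sum. The toy model below trades a QoS goal against a budget goal using scipy; the goals, weights and prices are invented, and the paper's model is considerably richer.

      # Toy goal-programming trade-off solved as a linear program.
      from scipy.optimize import linprog

      G_SERVERS, BUDGET, PRICE = 10, 80.0, 10.0   # want 10 servers, spend <= 80
      W_QOS, W_COST = 3.0, 1.0                    # QoS shortfall hurts more

      # Variables: [x, d_qos_minus, d_qos_plus, d_cost_minus, d_cost_plus]
      c = [0, W_QOS, 0, 0, W_COST]                # penalize shortfall and overspend
      A_eq = [[1, 1, -1, 0, 0],                   # x + d_q- - d_q+ = G_SERVERS
              [PRICE, 0, 0, 1, -1]]               # PRICE*x + d_c- - d_c+ = BUDGET
      b_eq = [G_SERVERS, BUDGET]
      res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 5)
      x = res.x[0]
      # With these weights the solver stays within budget and accepts
      # a QoS shortfall of 2: x = 8.0.
      print(f"servers allocated: {x:.1f}")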

  9. The current state of water resources of Transcarpathia

    Directory of Open Access Journals (Sweden)

    V. І. Nikolaichuk

    2015-07-01

    Full Text Available Throughout their existence, humans have used the water of rivers, lakes and underground sources not only for water supply but also for dumping polluted waters and wastes. The significant development of urbanization, the concentration of urban industrial enterprises, transport, the increase in mining, the expansion of drainage and irrigation reclamation, the plowing of river channels and the creation of a large number of landfills have resulted in significant, and in some regions critical, depletion and contamination of the surface and ground waters. Because of this disastrous situation, society is becoming more and more concerned about the state of the environment. The public has become increasingly interested in the state of the soil cover, air, water resources, and biotic diversity. The Transcarpathian region (Zakarpattya) is situated in the heart of Europe, bordered by four Central European countries (Poland, Slovakia, Hungary and Romania) and two regions of Ukraine (Lviv and Ivano-Frankivsk regions). The Transcarpathian region is one of the richest regions of Ukraine in terms of water resources. The territory is permeated by a dense network of rivers. There are in total 9,429 rivers of 19,866 km length flowing in the region. Among them, the rivers Tysa, Borzhava, Latoryca and Uzh have a length of over 100 km each. 25 cities and urban settlements of the area are substantially provided with centralized intake of underground drinking water. The rural areas have virtually no centralized water supply; it is provided mainly by domestic wells or water boreholes. Predicted resources of underground drinking water in the region are equal to 1,109,300 m³/day. The use of fresh water in 2014 per capita amounted to 23,769 m³, 15% less than in 2009. The main pollutants of surface water bodies are the facilities of utility companies in the region. Analysis of studies of surface water quality in the Transcarpathian region in 2014 shows that water quality meets the

  10. [The current state of the brain-computer interface problem].

    Science.gov (United States)

    Shurkhay, V A; Aleksandrova, E V; Potapov, A A; Goryainov, S A

    2015-01-01

    It was only 40 years ago that the first PC appeared. Over this period, rather short in historical terms, we have witnessed revolutionary changes in the lives of individuals and of society as a whole. Computer technologies are tightly connected, directly or indirectly, with every field. We can currently claim that computers are many times superior to the human mind in a number of parameters; however, machines lack the key feature: they are incapable of independent thinking (like a human). The key to the successful development of humankind is therefore collaboration between the brain and the computer rather than competition. Such collaboration, in which a computer broadens, supplements, or replaces some brain functions, is known as the brain-computer interface. Our review focuses on real-life implementations of this collaboration.

  11. About the inclusion of eddy currents in micromagnetic computations

    International Nuclear Information System (INIS)

    Torres, L.; Martinez, E.; Lopez-Diaz, L.; Alejos, O.

    2004-01-01

    A three-dimensional dynamic micromagnetic model including the effect of eddy currents, and its application to magnetization reversal processes in permalloy nanostructures, is presented. The model assumptions are tangential current on the nanostructure surface, electrical neutrality and negligible displacement current. The method for solving the Landau-Lifshitz-Gilbert equation coupled to the Maxwell equations incorporating Faraday's law is discussed in detail. The results presented for permalloy nanocubes of 40 nm side show how the effect of eddy currents can anticipate the magnetization switching. The dependence of the calculations on computational cell size is also reported

  12. Cost-Benefit Analysis of Computer Resources for Machine Learning

    Science.gov (United States)

    Champion, Richard A.

    2007-01-01

    Machine learning describes pattern-recognition algorithms - in this case, probabilistic neural networks (PNNs). These can be computationally intensive, in part because of the nonlinear optimizer, a numerical process that calibrates the PNN by minimizing a sum of squared errors. This report suggests efficiencies that are expressed as cost and benefit. The cost is computer time needed to calibrate the PNN, and the benefit is goodness-of-fit, how well the PNN learns the pattern in the data. There may be a point of diminishing returns where a further expenditure of computer resources does not produce additional benefits. Sampling is suggested as a cost-reduction strategy. One consideration is how many points to select for calibration and another is the geometric distribution of the points. The data points may be nonuniformly distributed across space, so that sampling at some locations provides additional benefit while sampling at other locations does not. A stratified sampling strategy can be designed to select more points in regions where they reduce the calibration error and fewer points in regions where they do not. Goodness-of-fit tests ensure that the sampling does not introduce bias. This approach is illustrated by statistical experiments for computing correlations between measures of roadless area and population density for the San Francisco Bay Area. The alternative to training efficiencies is to rely on high-performance computer systems. These may require specialized programming and algorithms that are optimized for parallel performance.
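
    A small sketch of the stratified sampling idea follows; the strata boundaries and per-stratum counts are illustrative assumptions, and the data are synthetic stand-ins rather than the report's Bay Area dataset.

      # Sketch: stratified sampling that spends more calibration points
      # where they reduce error. Strata and counts are invented.
      import numpy as np

      rng = np.random.default_rng(0)

      def stratified_sample(x, y, strata_edges, per_stratum):
          """Pick per_stratum[i] points from each stratum of x."""
          idx = []
          for i, n in enumerate(per_stratum):
              lo, hi = strata_edges[i], strata_edges[i + 1]
              pool = np.where((x >= lo) & (x < hi))[0]
              idx.extend(rng.choice(pool, size=min(n, len(pool)), replace=False))
          return x[idx], y[idx]

      # Dense sampling where the response varies fast, sparse elsewhere.
      x = rng.uniform(0, 100, 10_000)
      y = np.exp(-x / 30) + rng.normal(0, 0.05, x.size)
      xs, ys = stratified_sample(x, y, strata_edges=[0, 20, 60, 100],
                                 per_stratum=[300, 150, 50])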

  13. Computer Simulation and Digital Resources for Plastic Surgery Psychomotor Education.

    Science.gov (United States)

    Diaz-Siso, J Rodrigo; Plana, Natalie M; Stranix, John T; Cutting, Court B; McCarthy, Joseph G; Flores, Roberto L

    2016-10-01

    Contemporary plastic surgery residents are increasingly challenged to learn a greater number of complex surgical techniques within a limited period. Surgical simulation and digital education resources have the potential to address some limitations of the traditional training model, and have been shown to accelerate knowledge and skills acquisition. Although animal, cadaver, and bench models are widely used for skills and procedure-specific training, digital simulation has not been fully embraced within plastic surgery. Digital educational resources may play a future role in a multistage strategy for skills and procedures training. The authors present two virtual surgical simulators addressing procedural cognition for cleft repair and craniofacial surgery. Furthermore, the authors describe how partnerships among surgical educators, industry, and philanthropy can be a successful strategy for the development and maintenance of digital simulators and educational resources relevant to plastic surgery training. It is our responsibility as surgical educators not only to create these resources, but to demonstrate their utility for enhanced trainee knowledge and technical skills development. Currently available digital resources should be evaluated in partnership with plastic surgery educational societies to guide trainees and practitioners toward effective digital content.

  14. Space shuttle general purpose computers (GPCs) (current and future versions)

    Science.gov (United States)

    1988-01-01

    Current and future versions of general purpose computers (GPCs) for space shuttle orbiters are represented in this frame. The two boxes on the left (AP101B) represent the current GPC configuration, with the input-output processor at far left and the central processing unit (CPU) at its side. The upgraded version combines both elements in a single unit (far right, AP101S).

  15. The Current State Of Secondary Resource Usage In Ukraine

    OpenAIRE

    Julia Makovetska

    2011-01-01

    The state and the perspectives of the development of secondary resource usage in Ukraine are analyzed in the article. The recycling levels of the main types of recyclable materials, such as paper and cardboard, glass, plastics and waste tires, are considered. Priority directions for the development of secondary resource usage are defined.

  16. Negative quasi-probability as a resource for quantum computation

    International Nuclear Information System (INIS)

    Veitch, Victor; Ferrie, Christopher; Emerson, Joseph; Gross, David

    2012-01-01

    A central problem in quantum information is to determine the minimal physical resources that are required for quantum computational speed-up and, in particular, for fault-tolerant quantum computation. We establish a remarkable connection between the potential for quantum speed-up and the onset of negative values in a distinguished quasi-probability representation, a discrete analogue of the Wigner function for quantum systems of odd dimension. This connection allows us to resolve an open question on the existence of bound states for magic state distillation: we prove that there exist mixed states outside the convex hull of stabilizer states that cannot be distilled to non-stabilizer target states using stabilizer operations. We also provide an efficient simulation protocol for Clifford circuits that extends to a large class of mixed states, including bound universal states. (paper)
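
    For orientation, a discrete Wigner representation of the kind discussed assigns to a state ρ of odd Hilbert-space dimension d a real, normalized quasi-probability distribution over the d x d discrete phase space. In a commonly used convention (quoted here for illustration; the paper's own definitions are authoritative):

      W_\rho(\mathbf{u}) \;=\; \frac{1}{d}\,\mathrm{Tr}\!\left[ A_{\mathbf{u}}\,\rho \right],
      \qquad
      \sum_{\mathbf{u} \in \mathbb{Z}_d \times \mathbb{Z}_d} W_\rho(\mathbf{u}) \;=\; 1,

    where the A_u are Hermitian phase-point operators. The distribution is non-negative on every stabilizer state, and hence on their convex hull, so any negative value certifies a state outside that hull; this is the sense in which negativity serves as a resource for the speed-up discussed in the paper.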

  17. Enabling Grid Computing resources within the KM3NeT computing model

    Directory of Open Access Journals (Sweden)

    Filippidis Christos

    2016-01-01

    Full Text Available KM3NeT is a future European deep-sea research infrastructure hosting a new generation of neutrino detectors that – located at the bottom of the Mediterranean Sea – will open a new window on the universe and answer fundamental questions both in particle physics and astrophysics. International collaborative scientific experiments, like KM3NeT, are generating datasets which are increasing exponentially in both complexity and volume, making their analysis, archival, and sharing one of the grand challenges of the 21st century. These experiments, in their majority, adopt computing models consisting of different Tiers with several computing centres, providing a specific set of services for the different steps of data processing such as detector calibration, simulation and data filtering, reconstruction and analysis. The computing requirements are extremely demanding and usually span from serial to multi-parallel or GPU-optimized jobs. The collaborative nature of these experiments demands very frequent WAN data transfers and data sharing among individuals and groups. In order to support the aforementioned demanding computing requirements, we enabled Grid Computing resources, operated by EGI, within the KM3NeT computing model. In this study we describe our first advances in this field and the method for KM3NeT users to utilize the EGI computing resources in a simulation-driven use-case.

  18. Next Generation Computer Resources: Reference Model for Project Support Environments (Version 2.0)

    National Research Council Canada - National Science Library

    Brown, Alan

    1993-01-01

    The objective of the Next Generation Computer Resources (NGCR) program is to restructure the Navy's approach to acquisition of standard computing resources to take better advantage of commercial advances and investments...

  19. Current Trends in Cloud Computing A Survey of Cloud Computing Systems

    OpenAIRE

    Harjit Singh

    2012-01-01

    Cloud computing, which has become an increasingly important trend, is a virtualization technology that uses the internet and central remote servers to offer the sharing of resources, including infrastructures, software, applications and business processes, to the market environment to fulfill elastic demand. In today's competitive environment, the service vitality, elasticity, choice and flexibility offered by this scalable technology are so attractive that they make cloud computing to i...

  20. A Survey of Current Computer Information Science (CIS) Students.

    Science.gov (United States)

    Los Rios Community Coll. District, Sacramento, CA. Office of Institutional Research.

    This document is a survey designed to be completed by current students of Computer Information Science (CIS) in the Los Rios Community College District (LRCCD), which consists of three community colleges: American River College, Cosumnes River College, and Sacramento City College. The students are asked about their educational goals and how…

  1. Resource characterization and variability studies for marine current power

    OpenAIRE

    Carpman, Nicole

    2017-01-01

    Producing electricity from marine renewable resources is a research area that develops continuously. The field of tidal energy is on the edge to progress from the prototype stage to the commercial stage. However, tidal resource characterization, and the effect of tidal turbines on the flow, is still an ongoing research area in which this thesis aims to contribute. In this thesis, measurements of flow velocities have been performed at three kinds of sites. Firstly, a tidal site has been invest...

  2. Tools for Analyzing Computing Resource Management Strategies and Algorithms for SDR Clouds

    Science.gov (United States)

    Marojevic, Vuk; Gomez-Miguelez, Ismael; Gelonch, Antoni

    2012-09-01

    Software defined radio (SDR) clouds centralize the computing resources of base stations. The computing resource pool is shared between radio operators and dynamically loads and unloads digital signal processing chains for providing wireless communications services on demand. In particular, each new user session request requires the allocation of computing resources for executing the corresponding SDR transceivers. The huge amount of computing resources of SDR cloud data centers and the numerous session requests at certain hours of the day require efficient computing resource management. We propose a hierarchical approach, where the data center is divided into clusters that are managed in a distributed way. This paper presents a set of computing resource management tools for analyzing computing resource management strategies and algorithms for SDR clouds. We use the tools for evaluating different strategies and algorithms. The results show that more sophisticated algorithms can achieve higher resource occupation and that a tradeoff exists between cluster size and algorithm complexity.

  3. Operating the worldwide LHC computing grid: current and future challenges

    International Nuclear Information System (INIS)

    Molina, J Flix; Forti, A; Girone, M; Sciaba, A

    2014-01-01

    The Worldwide LHC Computing Grid project (WLCG) provides the computing and storage resources required by the LHC collaborations to store, process and analyse their data. It includes almost 200,000 CPU cores, 200 PB of disk storage and 200 PB of tape storage distributed among more than 150 sites. The WLCG operations team is responsible for several essential tasks, such as the coordination of testing and deployment of Grid middleware and services, communication with the experiments and the sites, follow-up and resolution of operational issues and medium/long-term planning. In 2012 WLCG critically reviewed all operational procedures and restructured the organisation of the operations team into a more coherent effort in order to improve its efficiency. In this paper we describe how the new organisation works, its recent successes and the changes to be implemented during the long LHC shutdown in preparation for LHC Run 2.

  4. Quantum Computing: Selected Internet Resources for Librarians, Researchers, and the Casually Curious

    Science.gov (United States)

    Cirasella, Jill

    2009-01-01

    This article presents an annotated selection of the most important and informative Internet resources for learning about quantum computing, finding quantum computing literature, and tracking quantum computing news. All of the quantum computing resources described in this article are freely available, English-language web sites that fall into one…

  5. Perspectives on Sharing Models and Related Resources in Computational Biomechanics Research.

    Science.gov (United States)

    Erdemir, Ahmet; Hunter, Peter J; Holzapfel, Gerhard A; Loew, Leslie M; Middleton, John; Jacobs, Christopher R; Nithiarasu, Perumal; Löhner, Rainald; Wei, Guowei; Winkelstein, Beth A; Barocas, Victor H; Guilak, Farshid; Ku, Joy P; Hicks, Jennifer L; Delp, Scott L; Sacks, Michael; Weiss, Jeffrey A; Ateshian, Gerard A; Maas, Steve A; McCulloch, Andrew D; Peng, Grace C Y

    2018-02-01

    The role of computational modeling for biomechanics research and related clinical care will be increasingly prominent. The biomechanics community has been developing computational models routinely for exploration of the mechanics and mechanobiology of diverse biological structures. As a result, a large array of models, data, and discipline-specific simulation software has emerged to support endeavors in computational biomechanics. Sharing computational models and related data and simulation software has first become a utilitarian interest, and now, it is a necessity. Exchange of models, in support of knowledge exchange provided by scholarly publishing, has important implications. Specifically, model sharing can facilitate assessment of reproducibility in computational biomechanics and can provide an opportunity for repurposing and reuse, and a venue for medical training. The community's desire to investigate biological and biomechanical phenomena crossing multiple systems, scales, and physical domains, also motivates sharing of modeling resources as blending of models developed by domain experts will be a required step for comprehensive simulation studies as well as the enhancement of their rigor and reproducibility. The goal of this paper is to understand current perspectives in the biomechanics community for the sharing of computational models and related resources. Opinions on opportunities, challenges, and pathways to model sharing, particularly as part of the scholarly publishing workflow, were sought. A group of journal editors and a handful of investigators active in computational biomechanics were approached to collect short opinion pieces as a part of a larger effort of the IEEE EMBS Computational Biology and the Physiome Technical Committee to address model reproducibility through publications. A synthesis of these opinion pieces indicates that the community recognizes the necessity and usefulness of model sharing. There is a strong will to facilitate

  6. Current Solutions: Recent Experience in Interconnecting Distributed Energy Resources

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, M.

    2003-09-01

    This report catalogues selected real-world technical experiences of utilities and customers that have interconnected distributed energy assets with the electric grid. This study was initiated to assess the actual technical practices for interconnecting distributed generation and had a particular focus on the technical issues covered under the Institute of Electrical and Electronics Engineers (IEEE) 1547(TM) Standard for Interconnecting Distributed Resources With Electric Power Systems.

  7. Big Data in Cloud Computing: A Resource Management Perspective

    Directory of Open Access Journals (Sweden)

    Saeed Ullah

    2018-01-01

    Full Text Available Modern-day advancements are increasingly digitizing our lives, which has led to a rapid growth of data. Such multidimensional datasets are precious due to the potential of unearthing new knowledge and developing decision-making insights from them. Analyzing this huge amount of data from multiple sources can help organizations to plan for the future and anticipate changing market trends and customer requirements. While the Hadoop framework is a popular platform for processing large datasets, there are a number of other computing infrastructures available for use in various application domains. The primary focus of the study is how to classify major big data resource management systems in the context of the cloud computing environment. We identify some key features which characterize big data frameworks as well as their associated challenges and issues. We use various evaluation metrics from different aspects to identify usage scenarios of these platforms. The study came up with some interesting findings which are in contradiction with the available literature on the Internet.

  8. Current status and future prospects of uranium resources

    International Nuclear Information System (INIS)

    Kuronuma, Chosuke

    1997-01-01

    Uranium is contained in various things in the natural world, for example at 3 ppm in granite and 3×10⁻³ ppm in seawater. Uranium exists in tetra-, penta- and hexavalent states in nature; in an oxidizing environment, it exists as the hexavalent uranyl radical, forms soluble complexes, and easily moves with water. In a reducing environment, it becomes the insoluble tetravalent state and precipitates. This property of uranium is deeply related to the way deposits form, and it is explained. The uranium resources with a recovery cost of 80 dollars per kg U or less amount to 2,120,000 t, and 60% of the total exists in Australia, Kazakhstan and Canada. The cumulative production of uranium in the world from 1945 to 1995 was 1,810,000 t. Of the total production, 875,000 t was used for civil purposes, and 750,000 t was used for military purposes. The uranium deposits in Canada are of very high grade, and produce 1/3 of the world uranium production. There are inventories of 150,000-200,000 t U. The diversion of military highly enriched uranium to civil purposes is reported. The state of the uranium market, the prospects for demand and supply of uranium, and the exploration and development of uranium resources are described. (K.I.)

  9. A Review of Computer Science Resources for Learning and Teaching with K-12 Computing Curricula: An Australian Case Study

    Science.gov (United States)

    Falkner, Katrina; Vivian, Rebecca

    2015-01-01

    To support teachers to implement Computer Science curricula into classrooms from the very first year of school, teachers, schools and organisations seek quality curriculum resources to support implementation and teacher professional development. Until now, many Computer Science resources and outreach initiatives have targeted K-12 school-age…

  10. Teaching Social Policy: Integration of Current Legislation and Media Resources

    Science.gov (United States)

    DeRigne, LeaAnne

    2011-01-01

    Social work students enter the field of social work for many reasons--from wanting to become clinicians to wanting to advocate for a more socially just world. Social policy classes can be the ideal courses to provide instruction on conducting research on current policy issues. Teaching students about policy advocacy can lead to a class rich with…

  11. Campus Grids: Bringing Additional Computational Resources to HEP Researchers

    International Nuclear Information System (INIS)

    Weitzel, Derek; Fraser, Dan; Bockelman, Brian; Swanson, David

    2012-01-01

    It is common at research institutions to maintain multiple clusters that represent different owners or generations of hardware, or that fulfill different needs and policies. Many of these clusters are consistently underutilized while researchers on campus could greatly benefit from these unused capabilities. By leveraging principles from the Open Science Grid it is now possible to utilize these resources by forming a lightweight campus grid. The campus grids framework enables jobs that are submitted to one cluster to overflow, when necessary, to other clusters within the campus using whatever authentication mechanisms are available on campus. This framework is currently being used on several campuses to run HEP and other science jobs. Further, the framework has in some cases been expanded beyond the campus boundary by bridging campus grids into a regional grid, and can even be used to integrate resources from a national cyberinfrastructure such as the Open Science Grid. This paper will highlight 18 months of operational experiences creating campus grids in the US, and the different campus configurations that have successfully utilized the campus grid infrastructure.

  12. Monitoring of computing resource utilization of the ATLAS experiment

    International Nuclear Information System (INIS)

    Rousseau, David; Vukotic, Ilija; Schaffer, RD; Dimitrov, Gancho; Aidel, Osman; Albrand, Solveig

    2012-01-01

    Due to the good performance of the LHC accelerator, the ATLAS experiment has seen higher than anticipated levels for both the event rate and the average number of interactions per bunch crossing. In order to respond to these changing requirements, the current and future usage of CPU, memory and disk resources has to be monitored, understood and acted upon. This requires data collection at a fairly fine level of granularity: data on the performance of each object written and each algorithm run, as well as a dozen per-job variables, are gathered for the different processing steps of Monte Carlo generation and simulation and for the reconstruction of both data and Monte Carlo. We present a system to collect and visualize the data from both the online Tier-0 system and distributed grid production jobs. Around 40 GB of performance data are expected from up to 200k jobs per day, making performance optimization of the underlying Oracle database of utmost importance.

  13. DrugSig: A resource for computational drug repositioning utilizing gene expression signatures.

    Directory of Open Access Journals (Sweden)

    Hongyu Wu

    Full Text Available Computational drug repositioning has proved to be an effective approach to developing new uses for drugs. However, currently existing strategies rely strongly on drug response gene signatures that are scattered across separate or individual experimental datasets, resulting in inefficient outputs. A comprehensive database of drug response gene signatures would therefore be very helpful to these methods. We collected drug response microarray data and annotated related drug and target information from public databases and the scientific literature. By selecting the top 500 up-regulated and down-regulated genes as drug signatures, we manually established the DrugSig database. Currently DrugSig contains more than 1300 drugs, 7000 microarrays and 800 targets. Moreover, we developed signature-based and target-based functions to aid drug repositioning. The constructed database can serve as a resource to quicken computational drug repositioning. Database URL: http://biotechlab.fudan.edu.cn/database/drugsig/.
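
    As an illustration of signature-based matching in the spirit of this record, the sketch below scores how well a drug signature reverses a disease signature. The simple reversal score and the gene symbols are invented for illustration; this is not DrugSig's actual scoring method.

      # Hypothetical reversal score between drug and disease signatures;
      # a sketch, not DrugSig's actual method.
      def reversal_score(drug_up, drug_down, disease_up, disease_down):
          drug_up, drug_down = set(drug_up), set(drug_down)
          disease_up, disease_down = set(disease_up), set(disease_down)
          # A repositioning candidate up-regulates what the disease
          # down-regulates, and vice versa.
          reversal = len(drug_up & disease_down) + len(drug_down & disease_up)
          agreement = len(drug_up & disease_up) + len(drug_down & disease_down)
          total = len(drug_up) + len(drug_down)
          return (reversal - agreement) / total if total else 0.0

      # Toy example with invented gene symbols.
      print(reversal_score({"TP53", "EGFR"}, {"MYC"}, {"MYC"}, {"TP53"}))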

  14. Computation of the current density in nonlinear materials subjected to large current pulses

    International Nuclear Information System (INIS)

    Hodgdon, M.L.; Hixson, R.S.; Parsons, W.M.

    1991-01-01

    This paper reports that the finite element method and the finite difference method are used to calculate the current distribution in two nonlinear conductors. The first conductor is a small ferromagnetic wire subjected to a current pulse that rises to 10,000 Amperes in 10 microseconds. Results from the transient thermal and transient magnetic solvers of the finite element code FLUX2D are used to compute the current density in the wire. The second conductor is a metal oxide varistor. Maxwell's equations, Ohm's law and the varistor relation between resistivity and current density, ρ = αj^(-β), are used to derive a nonlinear differential equation. The solutions of the differential equation are obtained by a finite difference approximation and a shooting method. The behavior predicted by these calculations is in agreement with experiments
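
    The shooting method named here pairs an initial-value integrator with a root finder on the unknown initial slope. The sketch below applies it to a stand-in nonlinear two-point boundary-value problem (u'' = u^3 - 1 with u(0) = 0, u(1) = 1), chosen only for illustration; it is not the varistor equation derived in the paper.

      # Shooting method sketch for a model nonlinear boundary-value problem.
      from scipy.integrate import solve_ivp
      from scipy.optimize import brentq

      def rhs(x, y):
          # y = [u, u']; model nonlinearity u'' = u**3 - 1.
          return [y[1], y[0] ** 3 - 1.0]

      def mismatch(slope):
          # Integrate from x = 0 with u(0) = 0 and trial slope u'(0),
          # then report how far u(1) lands from the target value 1.
          sol = solve_ivp(rhs, (0.0, 1.0), [0.0, slope], rtol=1e-8)
          return sol.y[0, -1] - 1.0

      # Root-find on the trial slope so the far boundary condition holds.
      slope = brentq(mismatch, 0.0, 2.0)
      print("matched initial slope: %.6f" % slope)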

  15. Book Review: Current Issues in International Human Resource Management and Strategy Research

    DEFF Research Database (Denmark)

    Gretzinger, Susanne

    2009-01-01

    The article reviews the book "Current Issues in International Human Resource Management and Strategy Research," edited by Marion Festing and Susanne Royer.

  16. Client/server models for transparent, distributed computational resources

    International Nuclear Information System (INIS)

    Hammer, K.E.; Gilman, T.L.

    1991-01-01

    Client/server models are proposed to address issues of shared resources in a distributed, heterogeneous UNIX environment. The recent development of an automated Remote Procedure Call (RPC) interface generator has simplified the development of client/server models; previously, implementation of the models was only possible at the UNIX socket level. An overview of RPCs and the interface generator will be presented, including a discussion of the generation and installation of remote services, the RPC paradigm, and the three levels of RPC programming. Two applications, the Nuclear Plant Analyzer (NPA) and a fluids simulation using molecular modelling, will be presented to demonstrate how client/server models using RPCs and External Data Representations (XDR) have been used in production/computation situations. The NPA incorporates a client/server interface for the transfer/translation of TRAC or RELAP results from the UNICOS Cray to a UNIX workstation. The fluids simulation program utilizes the client/server model to access the Cray via a single function, allowing the Cray to become a shared co-processor to the workstation application. 5 refs., 6 figs
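
    For readers unfamiliar with the pattern, a minimal client/server sketch using Python's built-in XML-RPC module conveys the same remote-procedure idea. The service name and port are hypothetical, and the original work used ONC RPC with XDR rather than XML-RPC.

      # Server side: expose a hypothetical result-translation service.
      from xmlrpc.server import SimpleXMLRPCServer

      def translate_results(run_id):
          # Stand-in for translating simulation output for a remote client.
          return "translated results for run %s" % run_id

      server = SimpleXMLRPCServer(("localhost", 8000), allow_none=True)
      server.register_function(translate_results)
      server.serve_forever()

      # Client side (separate process):
      #   import xmlrpc.client
      #   proxy = xmlrpc.client.ServerProxy("http://localhost:8000")
      #   print(proxy.translate_results(42))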

  17. Application of Selective Algorithm for Effective Resource Provisioning in Cloud Computing Environment

    OpenAIRE

    Katyal, Mayanka; Mishra, Atul

    2014-01-01

    Modern-day continued demand for resource-hungry services and applications in the IT sector has led to the development of cloud computing. The cloud computing environment involves high-cost infrastructure on the one hand and needs large-scale computational resources on the other. These resources need to be provisioned (allocated and scheduled) to end users in the most efficient manner so that the tremendous capabilities of the cloud are utilized effectively and efficiently. In this paper we discuss a selecti...

  18. Research on elastic resource management for multi-queue under cloud computing environment

    Science.gov (United States)

    CHENG, Zhenjing; LI, Haibo; HUANG, Qiulan; Cheng, Yaodong; CHEN, Gang

    2017-10-01

    As a new approach to managing computing resources, virtualization technology is more and more widely applied in the high-energy physics field. A virtual computing cluster based on OpenStack was built at IHEP, using HTCondor as the job queue management system. In a traditional static cluster, a fixed number of virtual machines are pre-allocated to the job queues of different experiments. However, this method cannot adapt well to the volatility of computing resource requirements. To solve this problem, an elastic computing resource management system under a cloud computing environment has been designed. This system performs unified management of virtual computing nodes on the basis of the job queues in HTCondor, based on dual resource thresholds as well as a quota service. A two-stage pool is designed to improve the efficiency of resource pool expansion. This paper presents several use cases of the elastic resource management system in IHEPCloud. Practical runs show that virtual computing resources dynamically expand or shrink as computing requirements change. Additionally, the CPU utilization ratio of computing resources increased significantly compared with traditional resource management. The system also performs well when there are multiple Condor schedulers and multiple job queues.
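
    A dual-threshold policy of the kind described can be sketched in a few lines. The thresholds, quota handling, and inputs below are illustrative assumptions, not IHEP's actual implementation.

      # Hypothetical dual-threshold elastic scaling decision (a sketch).
      def rebalance(idle_jobs, running_nodes, quota, upper=0.8, lower=0.2):
          # Queue pressure: idle jobs per running virtual node.
          load = idle_jobs / max(running_nodes, 1)
          if load > upper and running_nodes < quota:
              return min(idle_jobs, quota - running_nodes)   # boot VMs
          if load < lower and running_nodes > 1:
              return -1                                      # retire one VM
          return 0                                           # steady state

      print(rebalance(idle_jobs=40, running_nodes=20, quota=50))  # -> 30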

  19. SYSTEMATIC LITERATURE REVIEW ON RESOURCE ALLOCATION AND RESOURCE SCHEDULING IN CLOUD COMPUTING

    OpenAIRE

    B. Muni Lavanya; C. Shoba Bindu

    2016-01-01

    The objective of the work is to highlight the key features and identify the finest future directions in the research community of Resource Allocation, Resource Scheduling and Resource Management from 2009 to 2016. It exemplifies how research on Resource Allocation, Resource Scheduling and Resource Management has progressively increased in the past decade by inspecting articles and papers from scientific and standard publications. The survey materialized in a three-fold process. Firstly, investigate on t...

  20. Cloud Computing and Information Technology Resource Cost Management for SMEs

    DEFF Research Database (Denmark)

    Kuada, Eric; Adanu, Kwame; Olesen, Henning

    2013-01-01

    This paper analyzes the decision-making problem confronting SMEs considering the adoption of cloud computing as an alternative to in-house computing services provision. The economics of choosing between in-house computing and a cloud alternative is analyzed by comparing the total economic costs … in determining the relative value of cloud computing.

  1. A review of Computer Science resources for learning and teaching with K-12 computing curricula: an Australian case study

    Science.gov (United States)

    Falkner, Katrina; Vivian, Rebecca

    2015-10-01

    To support teachers to implement Computer Science curricula into classrooms from the very first year of school, teachers, schools and organisations seek quality curriculum resources to support implementation and teacher professional development. Until now, many Computer Science resources and outreach initiatives have targeted K-12 school-age children, with the intention to engage children and increase interest, rather than to formally teach concepts and skills. What is the educational quality of existing Computer Science resources and to what extent are they suitable for classroom learning and teaching? In this paper, an assessment framework is presented to evaluate the quality of online Computer Science resources. Further, a semi-systematic review of available online Computer Science resources was conducted to evaluate resources available for classroom learning and teaching and to identify gaps in resource availability, using the Australian curriculum as a case study analysis. The findings reveal a predominance of quality resources; however, a number of critical gaps were identified. This paper provides recommendations and guidance for the development of new and supplementary resources and future research.

  2. Computing Education in Korea--Current Issues and Endeavors

    Science.gov (United States)

    Choi, Jeongwon; An, Sangjin; Lee, Youngjun

    2015-01-01

    Computer education has been provided for a long period of time in Korea. Starting as a vocational program, the content of computer education for students evolved to include computer literacy, Information Communication Technology (ICT) literacy, and brand-new computer science. While a new curriculum related to computer science was…

  3. Study on Cloud Computing Resource Scheduling Strategy Based on the Ant Colony Optimization Algorithm

    OpenAIRE

    Lingna He; Qingshui Li; Linan Zhu

    2012-01-01

    In order to replace traditional Internet software usage patterns and enterprise management modes, this paper proposes a new business computation mode: cloud computing. Resource scheduling strategy is the key technology in cloud computing. Based on a study of the cloud computing system structure and mode of operation, the paper focuses on the work scheduling and resource allocation problems in cloud computing, using the ant colony algorithm. Detailed analysis and design of the...
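
    As a sketch of the approach, the following toy ant-colony scheduler assigns tasks to virtual machines to minimise makespan. The pheromone update rule, parameters, and problem sizes are illustrative assumptions, not those of the paper.

      # Toy ant-colony task-to-VM scheduling (a sketch, invented parameters).
      import random

      def aco_schedule(task_len, vm_speed, ants=20, iters=50,
                       alpha=1.0, beta=2.0, rho=0.1):
          n, m = len(task_len), len(vm_speed)
          tau = [[1.0] * m for _ in range(n)]            # pheromone trails
          eta = [[vm_speed[j] / task_len[i] for j in range(m)]
                 for i in range(n)]                      # heuristic visibility
          best, best_cost = None, float("inf")
          for _ in range(iters):
              for _ in range(ants):
                  # Each ant assigns every task to a VM probabilistically.
                  assign = []
                  for i in range(n):
                      w = [tau[i][j] ** alpha * eta[i][j] ** beta
                           for j in range(m)]
                      assign.append(random.choices(range(m), weights=w)[0])
                  loads = [0.0] * m
                  for i, j in enumerate(assign):
                      loads[j] += task_len[i] / vm_speed[j]
                  cost = max(loads)                      # makespan
                  if cost < best_cost:
                      best, best_cost = assign, cost
              # Evaporate everywhere, then reinforce the best-so-far schedule.
              for i in range(n):
                  for j in range(m):
                      tau[i][j] *= 1.0 - rho
                  tau[i][best[i]] += 1.0 / best_cost
          return best, best_cost

      print(aco_schedule([8, 4, 6, 2, 9], [1.0, 2.0]))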

  4. Computer programs for eddy-current defect studies

    International Nuclear Information System (INIS)

    Pate, J.R.; Dodd, C.V.

    1990-06-01

    Several computer programs to aid in the design of eddy-current tests and probes have been written. The programs, written in Fortran, deal in various ways with the response to defects exhibited by four types of probes: the pancake probe, the reflection probe, the circumferential boreside probe, and the circumferential encircling probe. Programs are included which calculate the impedance or voltage change in a coil due to a defect, which calculate and plot the defect sensitivity factor of a coil, and which invert calculated or experimental readings to obtain the size of a defect. The theory upon which the programs are based is the Burrows point defect theory, and thus the calculations of the programs will be more accurate for small defects. 6 refs., 21 figs

  5. Computer programs for eddy-current defect studies

    Energy Technology Data Exchange (ETDEWEB)

    Pate, J. R.; Dodd, C. V. [Oak Ridge National Lab., TN (USA)

    1990-06-01

    Several computer programs to aid in the design of eddy-current tests and probes have been written. The programs, written in Fortran, deal in various ways with the response to defects exhibited by four types of probes: the pancake probe, the reflection probe, the circumferential boreside probe, and the circumferential encircling probe. Programs are included which calculate the impedance or voltage change in a coil due to a defect, which calculate and plot the defect sensitivity factor of a coil, and which invert calculated or experimental readings to obtain the size of a defect. The theory upon which the programs are based is the Burrows point defect theory, and thus the calculations of the programs will be more accurate for small defects. 6 refs., 21 figs.

  6. Considerations on an automatic computed tomography tube current modulation system

    International Nuclear Information System (INIS)

    Moro, L.; Panizza, D.; D'Ambrosio, D.; Carne, I.

    2013-01-01

    The scope of this study was to evaluate the effects on radiation output and image noise of varying the acquisition parameters with an automatic tube current modulation (ATCM) system in computed tomography (CT). Chest CT examinations of an anthropomorphic phantom were acquired using a GE LightSpeed VCT 64-slice tomograph. Acquisitions were performed using different pitch, slice thickness and noise index (NI) values and varying the orientation of the scanned projection radiograph (SPR). The radiation output was determined by the CT dose index (CTDIvol). Image noise was evaluated by measuring the standard deviation of CT numbers in several regions of interest. The radiation output was lower if the SPR was acquired in the anterior-posterior projection. The radiation dose with the posterior-anterior SPR was higher because the divergence of the X-ray beam magnifies the anatomical structures closest to the tube, especially the spinal column, and this leads the ATCM system to estimate higher patient attenuation values and, therefore, to select higher tube current values. The NI was inversely proportional to the square root of the CTDIvol and, with fixed NI, the CTDIvol increased as the slice thickness decreased. This study suggests some important issues for using the GE ATCM system efficiently. (authors)
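
    Restated as proportionalities (a sketch inferred from the abstract, not a formula given in the paper), the reported relation implies that noise reduction is expensive in dose:

      % NI vs. CTDI_vol, as reported:
      \mathrm{NI} \propto \frac{1}{\sqrt{\mathrm{CTDI}_{\mathrm{vol}}}}
      \quad\Longrightarrow\quad
      \mathrm{CTDI}_{\mathrm{vol}} \propto \frac{1}{\mathrm{NI}^{2}}

    so, for example, halving the selected NI roughly quadruples the delivered CTDIvol.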

  7. An interactive computer approach to performing resource analysis for a multi-resource/multi-project problem. [Spacelab inventory procurement planning

    Science.gov (United States)

    Schlagheck, R. A.

    1977-01-01

    New planning techniques and supporting computer tools are needed for the optimization of resources and costs for space transportation and payload systems. Heavy emphasis on cost-effective utilization of resources has caused NASA program planners to look at the impact of various independent variables that affect procurement buying. A description is presented of a category of resource planning which deals with Spacelab inventory procurement analysis. Spacelab is a joint payload project between NASA and the European Space Agency and will be flown aboard the Space Shuttle starting in 1980. In order to respond rapidly to the various procurement planning exercises, a system was built that could perform resource analysis in a quick and efficient manner. This system is known as the Interactive Resource Utilization Program (IRUP). Attention is given to aspects of problem definition, an IRUP system description, questions of data base entry, the approach used for project scheduling, and problems of resource allocation.

  8. Resource-Aware Load Balancing Scheme using Multi-objective Optimization in Cloud Computing

    OpenAIRE

    Kavita Rana; Vikas Zandu

    2016-01-01

    Cloud computing is a service-based, on-demand, pay-per-use model consisting of interconnected and virtualized resources delivered over the Internet. In cloud computing, there are usually a number of jobs that need to be executed with the available resources to achieve optimal performance, the least possible total time for completion, the shortest response time, and efficient utilization of resources. Hence, job scheduling is the most important concern, aiming to ensure that users' requirements are ...

  9. Impact of changing computer technology on hydrologic and water resource modeling

    OpenAIRE

    Loucks, D.P.; Fedra, K.

    1987-01-01

    The increasing availability of substantial computer power at relatively low cost, together with the increasing ease of using computer graphics, of communicating with other computers and databases, and of programming using high-level problem-oriented computer languages, is providing new opportunities and challenges for those developing and using hydrologic and water resources models. This paper reviews some of the progress made towards the development and application of computer support systems designe...

  10. LHCb Computing Resources: 2011 re-assessment, 2012 request and 2013 forecast

    CERN Document Server

    Graciani, R

    2011-01-01

    This note covers the following aspects: re-assessment of computing resource usage estimates for the 2011 data-taking period, request of computing resource needs for the 2012 data-taking period, and a first forecast of the 2013 needs, when no data taking is foreseen. Estimates are based on 2010 experience and the latest updates to the LHC schedule, as well as on a new implementation of the computing model simulation tool. Differences in the model and deviations in the estimates from previously presented results are stressed.

  11. LHCb Computing Resources: 2012 re-assessment, 2013 request and 2014 forecast

    CERN Document Server

    Graciani Diaz, Ricardo

    2012-01-01

    This note covers the following aspects: re-assessment of computing resource usage estimates for the 2012 data-taking period, request of computing resource needs for 2013, and a first forecast of the 2014 needs, when the restart of data taking is foreseen. Estimates are based on 2011 experience, as well as on the results of a simulation of the computing model described in the document. Differences in the model and deviations in the estimates from previously presented results are stressed.

  12. Computed tomography: acquisition process, technology and current state

    Directory of Open Access Journals (Sweden)

    Óscar Javier Espitia Mendoza

    2016-02-01

    Full Text Available Computed tomography is a noninvasive scanning technique widely applied in areas such as medicine, industry, and geology. This technique allows the three-dimensional reconstruction of the internal structure of an object which is illuminated with an X-ray source. The reconstruction is formed from two-dimensional cross-sectional images of the object. Each cross-section is obtained from measurements of physical phenomena, such as attenuation, dispersion, and diffraction of X-rays, resulting from their interaction with the object. In general, measurement acquisition is performed with methods based on any of these phenomena and according to various architectures classified in generations. Furthermore, in response to the need to simulate acquisition systems for CT, software dedicated to this task has been developed. The objective of this research is to determine the current state of CT techniques; to this end, a review of methods, the different architectures used for acquisition, and some of their applications is presented. Additionally, simulation results are presented. The main contributions of this work are the detailed description of acquisition methods and the presentation of possible trends of the technique.

  13. P300 brain computer interface: current challenges and emerging trends

    Science.gov (United States)

    Fazel-Rezai, Reza; Allison, Brendan Z.; Guger, Christoph; Sellers, Eric W.; Kleih, Sonja C.; Kübler, Andrea

    2012-01-01

    A brain-computer interface (BCI) enables communication without movement based on brain signals measured with electroencephalography (EEG). BCIs usually rely on one of three types of signals: the P300 and other components of the event-related potential (ERP), steady state visual evoked potential (SSVEP), or event related desynchronization (ERD). Although P300 BCIs were introduced over twenty years ago, the past few years have seen a strong increase in P300 BCI research. This closed-loop BCI approach relies on the P300 and other components of the ERP, based on an oddball paradigm presented to the subject. In this paper, we overview the current status of P300 BCI technology, and then discuss new directions: paradigms for eliciting P300s; signal processing methods; applications; and hybrid BCIs. We conclude that P300 BCIs are quite promising, as several emerging directions have not yet been fully explored and could lead to improvements in bit rate, reliability, usability, and flexibility. PMID:22822397

  14. Computational Identification of Novel Genes: Current and Future Perspectives.

    Science.gov (United States)

    Klasberg, Steffen; Bitard-Feildel, Tristan; Mallet, Ludovic

    2016-01-01

    While it has long been thought that all genomic novelties are derived from existing material, many genes lacking homology to known genes were found in recent genome projects. Some of these novel genes were proposed to have evolved de novo, i.e., out of noncoding sequences, whereas some have been shown to follow a duplication and divergence process. Their discovery called for an extension of the historical hypotheses about gene origination. Besides the theoretical breakthrough, increasing evidence has accumulated that novel genes play important roles in evolutionary processes, including adaptation and speciation events. Different techniques are available to identify genes and classify them as novel. Their classification as novel is usually based on their similarity to known genes, or lack thereof, detected by comparative genomics or searches against databases. Computational approaches are further prime methods, which can be based on existing models or leverage biological evidence from experiments. Identification of novel genes remains, however, a challenging task. With constant software and technology updates, no gold standard, and no available benchmark, the evaluation and characterization of genomic novelty is a vibrant field. In this review, the classical and state-of-the-art tools for gene prediction are introduced. The current methods for novel gene detection are presented; the methodological strategies and their limits are discussed along with perspective approaches for further studies.

  15. Science and Technology Resources on the Internet: Computer Security.

    Science.gov (United States)

    Kinkus, Jane F.

    2002-01-01

    Discusses issues related to computer security, including confidentiality, integrity, and authentication or availability; and presents a selected list of Web sites that cover the basic issues of computer security under subject headings that include ethics, privacy, kids, antivirus, policies, cryptography, operating system security, and biometrics.…

  16. Current Role of Computer Navigation in Total Knee Arthroplasty.

    Science.gov (United States)

    Jones, Christopher W; Jerabek, Seth A

    2018-01-31

    Computer-assisted surgical (CAS) navigation has been developed with the aim of improving the accuracy and precision of total knee arthroplasty (TKA) component positioning and therefore overall limb alignment. The historical goal of knee arthroplasty has been to restore the mechanical alignment of the lower limb by aligning the femoral and tibial components perpendicular to the mechanical axis of the femur and tibia. Despite over 4 decades of TKA component development and nearly 2 decades of interest in CAS, the fundamental question remains: does the alignment goal and/or the method of achieving that goal affect the outcome of the TKA in terms of patient-reported outcome measures and/or overall survivorship? The quest for reliable and reproducible achievement of the intraoperative alignment goal has been the primary motivator for the introduction, development, and refinement of CAS navigation. Numerous proprietary systems now exist, and rapid technological advancements in computer processing power are stimulating further development of robotic surgical systems. Three categories of CAS can be defined: image-based large-console navigation, imageless large-console navigation, and, more recently, accelerometer-based handheld navigation systems. A review of the current literature demonstrates that there are enough well-designed studies to conclude that both large-console CAS and handheld navigation systems improve the accuracy and precision of component alignment in TKA. However, missing from the evidence base, other than the subgroup analysis provided by the Australian Orthopaedic Association National Joint Replacement Registry, is any conclusive demonstration of clinical superiority in terms of improved patient-reported outcome measures and/or decreased cumulative revision rates in the long term. Few authors would argue that accuracy of alignment is a goal to ignore; therefore, in the absence of clinical evidence, many of the arguments against

  17. ``Carbon Credits'' for Resource-Bounded Computations Using Amortised Analysis

    Science.gov (United States)

    Jost, Steffen; Loidl, Hans-Wolfgang; Hammond, Kevin; Scaife, Norman; Hofmann, Martin

    Bounding resource usage is important for a number of areas, notably real-time embedded systems and safety-critical systems. In this paper, we present a fully automatic static type-based analysis for inferring upper bounds on resource usage for programs involving general algebraic datatypes and full recursion. Our method can easily be used to bound any countable resource, without needing to revisit proofs. We apply the analysis to the important metrics of worst-case execution time, stack- and heap-space usage. Our results from several realistic embedded control applications demonstrate good matches between our inferred bounds and measured worst-case costs for heap and stack usage. For time usage we infer good bounds for one application. Where we obtain less tight bounds, this is due to the use of software floating-point libraries.
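
    For orientation, the classical potential-method calculation below (a textbook dynamic-array example, not the paper's type-based inference) shows how amortised bounds absorb occasional expensive operations:

      % Amortised cost with potential \Phi, for a doubling array of
      % size n and capacity c, assuming unit cost per element write:
      \hat{c}_i = c_i + \Phi(D_i) - \Phi(D_{i-1}), \qquad \Phi(D) = 2n - c
      % Ordinary push:            \hat{c} = 1 + 2 = 3
      % Resizing push at n = c:   \hat{c} = (n + 1) + (2 - n) = 3

    so every push is amortised O(1) even though the worst single push is O(n).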

  18. Quantum computing with incoherent resources and quantum jumps.

    Science.gov (United States)

    Santos, M F; Cunha, M Terra; Chaves, R; Carvalho, A R R

    2012-04-27

    Spontaneous emission and the inelastic scattering of photons are two natural processes usually associated with decoherence and the reduction in the capacity to process quantum information. Here we show that, when suitably detected, these photons are sufficient to build all the fundamental blocks needed to perform quantum computation in the emitting qubits while protecting them from deleterious dissipative effects. We exemplify this by showing how to efficiently prepare graph states for the implementation of measurement-based quantum computation.

  19. RNAontheBENCH: computational and empirical resources for benchmarking RNAseq quantification and differential expression methods

    KAUST Repository

    Germain, Pierre-Luc

    2016-06-20

    RNA sequencing (RNAseq) has become the method of choice for transcriptome analysis, yet no consensus exists as to the most appropriate pipeline for its analysis, with current benchmarks suffering important limitations. Here, we address these challenges through a rich benchmarking resource harnessing (i) two RNAseq datasets including ERCC ExFold spike-ins; (ii) Nanostring measurements of a panel of 150 genes on the same samples; (iii) a set of internal, genetically-determined controls; (iv) a reanalysis of the SEQC dataset; and (v) a focus on relative quantification (i.e. across-samples). We use this resource to compare different approaches to each step of RNAseq analysis, from alignment to differential expression testing. We show that methods providing the best absolute quantification do not necessarily provide good relative quantification across samples, that count-based methods are superior for gene-level relative quantification, and that the new generation of pseudo-alignment-based software performs as well as established methods, at a fraction of the computing time. We also assess the impact of library type and size on quantification and differential expression analysis. Finally, we have created a R package and a web platform to enable the simple and streamlined application of this resource to the benchmarking of future methods.

  20. RNAontheBENCH: computational and empirical resources for benchmarking RNAseq quantification and differential expression methods

    KAUST Repository

    Germain, Pierre-Luc; Vitriolo, Alessandro; Adamo, Antonio; Laise, Pasquale; Das, Vivek; Testa, Giuseppe

    2016-01-01

    RNA sequencing (RNAseq) has become the method of choice for transcriptome analysis, yet no consensus exists as to the most appropriate pipeline for its analysis, with current benchmarks suffering important limitations. Here, we address these challenges through a rich benchmarking resource harnessing (i) two RNAseq datasets including ERCC ExFold spike-ins; (ii) Nanostring measurements of a panel of 150 genes on the same samples; (iii) a set of internal, genetically-determined controls; (iv) a reanalysis of the SEQC dataset; and (v) a focus on relative quantification (i.e. across-samples). We use this resource to compare different approaches to each step of RNAseq analysis, from alignment to differential expression testing. We show that methods providing the best absolute quantification do not necessarily provide good relative quantification across samples, that count-based methods are superior for gene-level relative quantification, and that the new generation of pseudo-alignment-based software performs as well as established methods, at a fraction of the computing time. We also assess the impact of library type and size on quantification and differential expression analysis. Finally, we have created a R package and a web platform to enable the simple and streamlined application of this resource to the benchmarking of future methods.

  1. Crowd-Funding: A New Resource Cooperation Mode for Mobile Cloud Computing.

    Directory of Open Access Journals (Sweden)

    Nan Zhang

    Full Text Available Mobile cloud computing, which integrates cloud computing techniques into the mobile environment, is regarded as one of the enabler technologies for 5G mobile wireless networks. There are many sporadic spare resources distributed across various devices in the networks, which can be used to support mobile cloud applications. However, these devices, each with only a few spare resources, cannot support some resource-intensive mobile applications alone. If some of them cooperate with each other and share their resources, then they can support many applications. In this paper, we propose a resource cooperative provision mode referred to as "Crowd-funding", which is designed to aggregate distributed devices together as the resource provider of mobile applications. Moreover, to facilitate high-efficiency resource management via dynamic resource allocation, different resource providers should be selected to form a stable resource coalition for different requirements. Thus, considering different requirements, we propose two different resource aggregation models for coalition formation. Finally, we allocate the revenues based on each cooperator's contribution, following the concept of the "Shapley value", to enable a more impartial revenue share among the cooperators. It is shown that a dynamic and flexible resource-management method can be developed based on the proposed Crowd-funding model, relying on the spare resources in the network.
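
    The Shapley value named here has a direct computational reading: each cooperator receives its average marginal contribution over all orders in which the coalition could form. The sketch below uses a toy characteristic function in which revenue is earned only once pooled capacity reaches a threshold; the capacities and threshold are invented.

      # Shapley-value revenue sharing for a resource coalition (a sketch).
      from itertools import permutations

      def shapley(players, v):
          # Average each player's marginal contribution over all orderings.
          phi = {p: 0.0 for p in players}
          orders = list(permutations(players))
          for order in orders:
              coalition = set()
              for p in order:
                  before = v(coalition)
                  coalition = coalition | {p}
                  phi[p] += v(coalition) - before
          return {p: phi[p] / len(orders) for p in players}

      # Toy model: revenue is earned only once pooled capacity reaches 10.
      capacity = {"A": 6, "B": 5, "C": 4}
      v = lambda S: 100.0 if sum(capacity[p] for p in S) >= 10 else 0.0
      print(shapley(list(capacity), v))   # A carries the largest share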

  2. Crowd-Funding: A New Resource Cooperation Mode for Mobile Cloud Computing.

    Science.gov (United States)

    Zhang, Nan; Yang, Xiaolong; Zhang, Min; Sun, Yan

    2016-01-01

    Mobile cloud computing, which integrates cloud computing techniques into the mobile environment, is regarded as one of the enabler technologies for 5G mobile wireless networks. There are many sporadic spare resources distributed across various devices in the networks, which can be used to support mobile cloud applications. However, these devices, each with only a few spare resources, cannot support some resource-intensive mobile applications alone. If some of them cooperate with each other and share their resources, then they can support many applications. In this paper, we propose a resource cooperative provision mode referred to as "Crowd-funding", which is designed to aggregate distributed devices together as the resource provider of mobile applications. Moreover, to facilitate high-efficiency resource management via dynamic resource allocation, different resource providers should be selected to form a stable resource coalition for different requirements. Thus, considering different requirements, we propose two different resource aggregation models for coalition formation. Finally, we allocate the revenues based on each cooperator's contribution, following the concept of the "Shapley value", to enable a more impartial revenue share among the cooperators. It is shown that a dynamic and flexible resource-management method can be developed based on the proposed Crowd-funding model, relying on the spare resources in the network.

  3. Tidal Current Energy Resource Assessment Around Buton Island, Southeast Sulawesi, Indonesia

    OpenAIRE

    Ribal, Agustinus; Amir, Amir Kamal; Toaha, Syamsuddin; Kusuma, Jeffry; Khaeruddin

    2017-01-01

    An early-stage assessment of tidal current energy resources is carried out in the present work. Tidal current power is estimated around Buton Island, Southeast Sulawesi province, Indonesia. The two-dimensional, depth-integrated Advanced Circulation (ADCIRC) model has been used to simulate tidal elevation and barotropic tidal currents around the island. Green's function approach has been used to improve eight tidal constituents on the open boundary condition...

  4. Surgical resource utilization in urban terrorist bombing: a computer simulation.

    Science.gov (United States)

    Hirshberg, A; Stein, M; Walden, R

    1999-09-01

    The objective of this study was to analyze the utilization of surgical staff and facilities during an urban terrorist bombing incident. A discrete-event computer model of the emergency room and related hospital facilities was constructed and implemented, based on pooled data from 12 urban terrorist bombing incidents in Israel. The simulation predicts that the admitting capacity of the hospital depends primarily on the number of available surgeons and defines an optimal staff profile for surgeons, residents, and trauma nurses. The major bottlenecks in the flow of critical casualties are the shock rooms and the computed tomographic scanner, but not the operating rooms. The simulation also defines the number of reinforcement staff needed to treat noncritical casualties and shows that radiology is the major obstacle to the flow of these patients. Computer simulation is an important new tool for the optimization of surgical service elements for a multiple-casualty situation.
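
    Discrete-event models of this kind are straightforward to express with the third-party SimPy library. The sketch below is a toy version with invented staffing levels and service times, not the authors' validated model.

      # Toy discrete-event casualty-flow model (a sketch using SimPy).
      import random
      import simpy

      def casualty(env, name, surgeons, ct_scanner):
          with surgeons.request() as req:     # wait for a free surgeon
              yield req
              yield env.timeout(random.expovariate(1 / 20))  # resuscitation
          with ct_scanner.request() as req:   # CT as a potential bottleneck
              yield req
              yield env.timeout(random.expovariate(1 / 15))
          print("%6.1f  %s cleared imaging" % (env.now, name))

      env = simpy.Environment()
      surgeons = simpy.Resource(env, capacity=4)   # invented staffing
      ct_scanner = simpy.Resource(env, capacity=1)
      for k in range(12):
          env.process(casualty(env, "casualty-%d" % k, surgeons, ct_scanner))
      env.run()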

  5. iTools: a framework for classification, categorization and integration of computational biology resources.

    Directory of Open Access Journals (Sweden)

    Ivo D Dinov

    2008-05-01

    Full Text Available The advancement of the computational biology field hinges on progress in three fundamental directions--the development of new computational algorithms, the availability of informatics resource management infrastructures and the capability of tools to interoperate and synergize. There is an explosion in algorithms and tools for computational biology, which makes it difficult for biologists to find, compare and integrate such resources. We describe a new infrastructure, iTools, for managing the query, traversal and comparison of diverse computational biology resources. Specifically, iTools stores information about three types of resources--data, software tools and web-services. The iTools design, implementation and resource meta-data content reflect the broad research, computational, applied and scientific expertise available at the seven National Centers for Biomedical Computing. iTools provides a system for classification, categorization and integration of different computational biology resources across space-and-time scales, biomedical problems, computational infrastructures and mathematical foundations. A large number of resources are already iTools-accessible to the community and this infrastructure is rapidly growing. iTools includes human and machine interfaces to its resource meta-data repository. Investigators or computer programs may utilize these interfaces to search, compare, expand, revise and mine meta-data descriptions of existent computational biology resources. We propose two ways to browse and display the iTools dynamic collection of resources. The first is based on an ontology of computational biology resources, and the second is derived from hyperbolic projections of manifolds or complex structures onto planar discs. iTools is an open-source project in terms of both its source code development and its meta-data content. iTools employs a decentralized, portable, scalable and lightweight framework for long

  6. Optimal Computing Resource Management Based on Utility Maximization in Mobile Crowdsourcing

    Directory of Open Access Journals (Sweden)

    Haoyu Meng

    2017-01-01

    Full Text Available Mobile crowdsourcing, as an emerging service paradigm, enables the computing resource requestor (CRR) to outsource computation tasks to each computing resource provider (CRP). Considering the importance of pricing as an essential incentive to coordinate the real-time interaction between the CRR and CRPs, in this paper we propose an optimal real-time pricing strategy for computing resource management in mobile crowdsourcing. Firstly, we analytically model the behaviors of the CRR and CRPs in the form of carefully selected utility and cost functions, based on concepts from microeconomics. Secondly, we propose a distributed algorithm based on the exchange of control messages, which contain information on computing resource demand/supply and real-time prices. We show that there exist real-time prices that can align individual optimality with systematic optimality. Finally, we also take account of the interaction among CRPs and formulate the computing resource management as a game whose Nash equilibrium is achievable via best response. Simulation results demonstrate that the proposed distributed algorithm can potentially benefit both the CRR and the CRPs. The coordinator in mobile crowdsourcing can thus use the optimal real-time pricing strategy to manage computing resources for the benefit of the overall system.
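
    The price-coordination idea can be sketched as a simple adjustment loop in which a coordinator raises the price under excess demand and lowers it under excess supply. The demand and supply responses below are toy assumptions, not the paper's utility model.

      # Hypothetical real-time price adjustment toward market clearing.
      def coordinate(price=1.0, step=0.05, iters=200):
          demand = supply = 0.0
          for _ in range(iters):
              demand = 10.0 / price      # CRR requests less at higher prices
              supply = 4.0 * price       # CRPs offer more at higher prices
              price += step * (demand - supply)
              price = max(price, 1e-6)   # keep the price positive
          return price, demand, supply

      p, d, s = coordinate()
      print("clearing price ~ %.3f (demand %.2f, supply %.2f)" % (p, d, s))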

  7. A survey of current trends in computational drug repositioning.

    Science.gov (United States)

    Li, Jiao; Zheng, Si; Chen, Bin; Butte, Atul J; Swamidass, S Joshua; Lu, Zhiyong

    2016-01-01

    Computational drug repositioning or repurposing is a promising and efficient tool for discovering new uses for existing drugs and holds great potential for precision medicine in the age of big data. The explosive growth of large-scale genomic and phenotypic data, as well as data on small molecular compounds with granted regulatory approval, is enabling new developments in computational repositioning. To achieve the shortest path toward new drug indications, advanced data processing and analysis strategies are critical for making sense of these heterogeneous molecular measurements. In this review, we show recent advancements in the critical areas of computational drug repositioning from multiple aspects. First, we summarize available data sources and the corresponding computational repositioning strategies. Second, we characterize the commonly used computational techniques. Third, we discuss validation strategies for repositioning studies, including both computational and experimental methods. Finally, we highlight potential opportunities and use-cases, including a few target areas such as cancers. We conclude with a brief discussion of the remaining challenges in computational drug repositioning. Published by Oxford University Press 2015. This work is written by US Government employees and is in the public domain in the US.

  8. Assessing attitudes toward computers and the use of Internet resources among undergraduate microbiology students

    Science.gov (United States)

    Anderson, Delia Marie Castro

    Computer literacy and use have become commonplace in our colleges and universities. In an environment that demands the use of technology, educators should be knowledgeable of the components that make up the overall computer attitude of students and be willing to investigate the processes and techniques of effective teaching and learning that can take place with computer technology. The purpose of this study is twofold. First, it investigates the relationship between computer attitudes and gender, ethnicity, and computer experience. Second, it addresses the question of whether, and to what extent, students' attitudes toward computers change over a 16-week period in an undergraduate microbiology course that supplements the traditional lecture with computer-driven assignments. Multiple regression analyses, using data from the Computer Attitudes Scale (Loyd & Loyd, 1985), showed that, in the experimental group, no significant relationships were found between computer anxiety and gender or ethnicity or between computer confidence and gender or ethnicity. However, students who used computers the longest (p = .001) and who were self-taught (p = .046) had the lowest computer anxiety levels. Likewise, students who used computers the longest (p = .001) and who were self-taught (p = .041) had the highest confidence levels. No significant relationships between computer liking, usefulness, or the use of Internet resources and gender, ethnicity, or computer experience were found. Dependent t-tests were performed to determine whether computer attitude scores (pretest and posttest) increased over a 16-week period for students who had been exposed to computer-driven assignments and other Internet resources. Results showed that students in the experimental group were less anxious about working with computers and considered computers to be more useful. In the control group, no significant changes in computer anxiety, confidence, liking, or usefulness were noted. Overall, students in

  9. Several problems of algorithmization in integrated computation programs on third generation computers for short circuit currents in complex power networks

    Energy Technology Data Exchange (ETDEWEB)

    Krylov, V.A.; Pisarenko, V.P.

    1982-01-01

    Methods of modeling complex power networks with short circuits in the networks are described. The methods are implemented in integrated computation programs for short-circuit currents and equivalents in electrical networks with a large number of branch points (up to 1000) on a computer with a limited online memory capacity (M = 4030 for the computer).

  10. Energy-efficient cloud computing : autonomic resource provisioning for datacenters

    OpenAIRE

    Tesfatsion, Selome Kostentinos

    2018-01-01

    Energy efficiency has become an increasingly important concern in data centers because of issues associated with energy consumption, such as capital costs, operating expenses, and environmental impact. While energy loss due to suboptimal use of facilities and non-IT equipment has largely been reduced through the use of best-practice technologies, addressing energy wastage in IT equipment still requires the design and implementation of energy-aware resource management systems. This thesis focu...

  11. TOWARDS NEW COMPUTATIONAL ARCHITECTURES FOR MASS-COLLABORATIVE OPENEDUCATIONAL RESOURCES

    OpenAIRE

    Ismar Frango Silveira; Xavier Ochoa; Antonio Silva Sprock; Pollyana Notargiacomo Mustaro; Yosly C. Hernandez Bieluskas

    2011-01-01

    Open Educational Resources offer several benefits, mostly in education and training. Being potentially reusable, their use can reduce the time and cost of developing educational programs, so that these savings could be transferred directly to students through the production of a large range of open, freely available content, varying from hypermedia to digital textbooks. This paper discusses this issue and presents a project and a research network that, in spite of being directed to Latin America'...

  12. PNNL supercomputer to become largest computing resource on the Grid

    CERN Multimedia

    2002-01-01

    Hewlett Packard announced that the US DOE Pacific Northwest National Laboratory will connect a 9.3-teraflop HP supercomputer to the DOE Science Grid. This will be the largest supercomputer attached to a computer grid anywhere in the world (1 page).

  13. Computer System Resource Requirements of Novice Programming Students.

    Science.gov (United States)

    Nutt, Gary J.

    The characteristics of jobs that constitute the mix for lower division FORTRAN classes in a university were investigated. Samples of these programs were also benchmarked on a larger central site computer and two minicomputer systems. It was concluded that a carefully chosen minicomputer system could offer service at least the equivalent of the…

  14. A computational model for lower hybrid current drive

    International Nuclear Information System (INIS)

    Englade, R.C.; Bonoli, P.T.; Porkolab, M.

    1983-01-01

    A detailed simulation model for lower hybrid (LH) current drive in toroidal devices is discussed. This model accounts reasonably well for the magnitude of radio frequency (RF) current observed in the PLT and Alcator C devices. It also reproduces the experimental dependencies of RF current generation on toroidal magnetic field and has provided insights about mechanisms which may underlie the observed density limit of current drive. (author)

  15. The current state of the creation and modernization of national geodetic and cartographic resources in Poland

    Directory of Open Access Journals (Sweden)

    Doskocz Adam

    2016-01-01

    Full Text Available All official data are currently integrated and harmonized in a spatial reference system. This paper outlines the national geodetic and cartographic resources in Poland. The national geodetic and cartographic resources are an important part of the spatial information infrastructure in the European Community. They also provide reference data for other resources of the Spatial Data Infrastructure (SDI), including main and detailed geodetic control networks, base maps, land and buildings registries, geodetic registries of utilities and topographic maps. This paper presents methods of producing digital map data and technical standards for field surveys; in addition, the paper also presents some aspects of building Global and Regional SDI.

  16. A Novel Resource Management Method of Providing Operating System as a Service for Mobile Transparent Computing

    Directory of Open Access Journals (Sweden)

    Yonghua Xiong

    2014-01-01

    Full Text Available This paper presents a framework for mobile transparent computing. It extends the PC transparent computing to mobile terminals. Since resources contain different kinds of operating systems and user data that are stored in a remote server, how to manage the network resources is essential. In this paper, we apply the technologies of quick emulator (QEMU virtualization and mobile agent for mobile transparent computing (MTC to devise a method of managing shared resources and services management (SRSM. It has three layers: a user layer, a manage layer, and a resource layer. A mobile virtual terminal in the user layer and virtual resource management in the manage layer cooperate to maintain the SRSM function accurately according to the user’s requirements. An example of SRSM is used to validate this method. Experiment results show that the strategy is effective and stable.

  17. A novel resource management method of providing operating system as a service for mobile transparent computing.

    Science.gov (United States)

    Xiong, Yonghua; Huang, Suzhen; Wu, Min; Zhang, Yaoxue; She, Jinhua

    2014-01-01

    This paper presents a framework for mobile transparent computing. It extends the PC transparent computing to mobile terminals. Since resources contain different kinds of operating systems and user data that are stored in a remote server, how to manage the network resources is essential. In this paper, we apply the technologies of quick emulator (QEMU) virtualization and mobile agent for mobile transparent computing (MTC) to devise a method of managing shared resources and services management (SRSM). It has three layers: a user layer, a manage layer, and a resource layer. A mobile virtual terminal in the user layer and virtual resource management in the manage layer cooperate to maintain the SRSM function accurately according to the user's requirements. An example of SRSM is used to validate this method. Experiment results show that the strategy is effective and stable.

  18. Positron Computed Tomography: Current State, Clinical Results and Future Trends

    Science.gov (United States)

    Schelbert, H. R.; Phelps, M. E.; Kuhl, D. E.

    1980-09-01

    An overview is presented of positron computed tomography: its advantages over single photon emission tomography, its use in metabolic studies of the heart and chemical investigation of the brain, and future trends. (ACR)

  19. Positron computed tomography: current state, clinical results and future trends

    Energy Technology Data Exchange (ETDEWEB)

    Schelbert, H.R.; Phelps, M.E.; Kuhl, D.E.

    1980-09-01

    An overview is presented of positron computed tomography: its advantages over single photon emission tomography, its use in metabolic studies of the heart and chemical investigation of the brain, and future trends. (ACR)

  20. Positron computed tomography: current state, clinical results and future trends

    International Nuclear Information System (INIS)

    Schelbert, H.R.; Phelps, M.E.; Kuhl, D.E.

    1980-01-01

    An overview is presented of positron computed tomography: its advantages over single photon emission tomography, its use in metabolic studies of the heart and chemical investigation of the brain, and future trends

  1. Logical and physical resource management in the common node of a distributed function laboratory computer network

    International Nuclear Information System (INIS)

    Stubblefield, F.W.

    1976-01-01

    A scheme for managing resources required for transaction processing in the common node of a distributed function computer system has been given. The scheme has been found to be satisfactory for all common node services provided so far.

  2. Current status and issues of nuclear human resource development/General activities of Japan nuclear human resource development network

    International Nuclear Information System (INIS)

    Murakami, Hiroyuki; Hino, Sadami; Tsuru, Hisanori

    2013-01-01

    The Japan Nuclear Human Resource Development Network (JN-HRD Net) was established in November 2010 with the aim of developing a framework for mutual cooperation and information sharing among nuclear-related organizations. Although the tasks and goals of developing human resources in the nuclear field have shifted since the accident at the Tokyo Electric Power Company (TEPCO) Fukushima Daiichi Nuclear Power Plant, the necessity of fostering capable personnel in this field remains unchanged, and the importance of our network activities has been further emphasized. The meeting of JN-HRD Net was held on the 5th of February 2013, where its activities in each field were reported and views and opinions were actively exchanged among more than 90 participants. This paper briefly describes the current status and issues of JN-HRD Net and the general activities conducted by the JN-HRD Net secretariat. (J.P.N.)

  3. Research on the digital education resources of sharing pattern in independent colleges based on cloud computing environment

    Science.gov (United States)

    Xiong, Ting; He, Zhiwen

    2017-06-01

    Cloud computing was first proposed by Google in the United States as an Internet-data-center-based approach providing a standard, open way of sharing services over the network. With the rapid development of higher education in China, there is a great gap between the educational resources that colleges and universities provide and actual teaching needs. Cloud computing, which uses Internet technology to provide sharing methods, has therefore come like timely rain and become an important means of sharing digital education resources in current higher education. Based on the cloud computing environment, this paper analyzes the existing problems in the sharing of digital educational resources among independent colleges in Jiangxi Province. According to the sharing characteristics of cloud computing (mass storage, efficient operation and low input), the author explores and studies the design of a sharing model for the digital educational resources of higher education in independent colleges. Finally, the design of the sharing model is put into practical application.

  4. Regional research exploitation of the LHC a case-study of the required computing resources

    CERN Document Server

    Almehed, S; Eerola, Paule Anna Mari; Mjörnmark, U; Smirnova, O G; Zacharatou-Jarlskog, C; Åkesson, T

    2002-01-01

    A simulation study to evaluate the required computing resources for research exploitation of the Large Hadron Collider (LHC) has been performed. The evaluation was done as a case study, assuming the existence of a Nordic regional centre and using the requirements for performing a specific physics analysis as a yardstick. Other input parameters were: assumptions for the distribution of researchers at the institutions involved, an analysis model, and two different functional structures of the computing resources.

  5. Economic models for management of resources in peer-to-peer and grid computing

    Science.gov (United States)

    Buyya, Rajkumar; Stockinger, Heinz; Giddy, Jonathan; Abramson, David

    2001-07-01

    The accelerated development in Peer-to-Peer (P2P) and Grid computing has positioned them as promising next-generation computing platforms. They enable the creation of Virtual Enterprises (VE) for sharing resources distributed across the world. However, resource management, application development and usage models in these environments are a complex undertaking. This is due to the geographic distribution of resources that are owned by different organizations or peers. The owners of these resources have different usage or access policies and cost models, and varying loads and availability. In order to address complex resource management issues, we have proposed a computational economy framework for resource allocation and for regulating supply and demand in Grid computing environments. The framework provides mechanisms for optimizing resource provider and consumer objective functions through trading and brokering services. In a real-world market, there exist various economic models for setting the price of goods based on supply and demand and their value to the user. They include commodity markets, posted prices, tenders and auctions. In this paper, we discuss the use of these models for interaction between Grid components in deciding resource value and the necessary infrastructure to realize them. In addition to the normal services offered by Grid computing systems, we need an infrastructure to support interaction protocols, allocation mechanisms, currency, secure banking, and enforcement services. Furthermore, we demonstrate the usage of some of these economic models in resource brokering through Nimrod/G deadline- and cost-based scheduling for two different optimization strategies on the World Wide Grid (WWG) testbed, which contains peer-to-peer resources located on five continents: Asia, Australia, Europe, North America, and South America.
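
    As an illustration of the deadline- and cost-based scheduling mentioned above, the sketch below greedily fills the cheapest resources first while respecting a wall-clock deadline. It is a toy instance of the cost-optimization strategy, not the Nimrod/G implementation; all resource names, prices and speeds are invented.

    ```python
    # Illustrative sketch of deadline- and budget-constrained ("cost
    # optimization") scheduling in the spirit of Nimrod/G. All resource
    # data are invented. Each resource is assumed to run jobs
    # sequentially on a single CPU.

    def schedule_cost_optimized(jobs, resources, deadline):
        """Assign identical jobs to the cheapest resources that can
        still finish them before the deadline.

        jobs      -- number of identical jobs
        resources -- list of dicts with 'name', 'price' ($/CPU-second)
                     and 'speed' (relative speed factor)
        deadline  -- wall-clock seconds available
        """
        job_len = 100.0  # CPU-seconds per job on a speed-1.0 machine (assumed)
        plan, cost = [], 0.0
        # Cheapest resources first: the classic cost-optimization strategy.
        for r in sorted(resources, key=lambda r: r['price']):
            if jobs == 0:
                break
            per_job = job_len / r['speed']        # wall-clock time per job
            capacity = int(deadline // per_job)   # jobs this resource can finish
            n = min(jobs, capacity)
            if n > 0:
                plan.append((r['name'], n))
                cost += n * job_len * r['price']
                jobs -= n
        if jobs > 0:
            raise RuntimeError('deadline cannot be met with the given resources')
        return plan, cost

    resources = [
        {'name': 'peer-A', 'price': 0.01, 'speed': 1.0},
        {'name': 'peer-B', 'price': 0.03, 'speed': 2.0},
    ]
    # -> ([('peer-A', 3), ('peer-B', 5)], 18.0)
    print(schedule_cost_optimized(8, resources, deadline=300.0))
    ```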

  6. MCPLOTS: a particle physics resource based on volunteer computing

    CERN Document Server

    Karneyeu, A; Prestel, S; Skands, P Z

    2014-01-01

    The mcplots.cern.ch web site (MCPLOTS) provides a simple online repository of plots made with high-energy-physics event generators, comparing them to a wide variety of experimental data. The repository is based on the HEPDATA online database of experimental results and on the RIVET Monte Carlo analysis tool. The repository is continually updated and relies on computing power donated by volunteers, via the LHC@HOME platform.

  7. MCPLOTS. A particle physics resource based on volunteer computing

    Energy Technology Data Exchange (ETDEWEB)

    Karneyeu, A. [Joint Inst. for Nuclear Research, Moscow (Russian Federation); Mijovic, L. [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Irfu/SPP, CEA-Saclay, Gif-sur-Yvette (France); Prestel, S. [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Lund Univ. (Sweden). Dept. of Astronomy and Theoretical Physics; Skands, P.Z. [European Organization for Nuclear Research (CERN), Geneva (Switzerland)

    2013-07-15

    The mcplots.cern.ch web site (MCPLOTS) provides a simple online repository of plots made with high-energy-physics event generators, comparing them to a wide variety of experimental data. The repository is based on the HEPDATA online database of experimental results and on the RIVET Monte Carlo analysis tool. The repository is continually updated and relies on computing power donated by volunteers, via the LHC@HOME 2.0 platform.

  8. MCPLOTS: a particle physics resource based on volunteer computing

    International Nuclear Information System (INIS)

    Karneyeu, A.; Mijovic, L.; Prestel, S.; Skands, P.Z.

    2014-01-01

    The mcplots.cern.ch web site (MCPLOTS) provides a simple online repository of plots made with high-energy-physics event generators, comparing them to a wide variety of experimental data. The repository is based on the HEPDATA online database of experimental results and on the RIVET Monte Carlo analysis tool. The repository is continually updated and relies on computing power donated by volunteers, via the LHC@HOME 2.0 platform. (orig.)

  9. MCPLOTS. A particle physics resource based on volunteer computing

    International Nuclear Information System (INIS)

    Karneyeu, A.; Mijovic, L.; Prestel, S.

    2013-07-01

    The mcplots.cern.ch web site (MCPLOTS) provides a simple online repository of plots made with high-energy-physics event generators, comparing them to a wide variety of experimental data. The repository is based on the HEPDATA online database of experimental results and on the RIVET Monte Carlo analysis tool. The repository is continually updated and relies on computing power donated by volunteers, via the LHC@HOME 2.0 platform.

  10. Computer simulation of transport driven current in tokamaks

    International Nuclear Information System (INIS)

    Nunan, W.J.; Dawson, J.M.

    1993-01-01

    Plasma transport phenomena can drive large currents parallel to an externally applied magnetic field. The Bootstrap Current Theory accounts for the effect of banana diffusion on toroidal current, but the effect is not confined to that transport regime. The authors' 2 1/2-D, electromagnetic, particle simulations have demonstrated that Maxwellian plasmas in static toroidal and vertical fields spontaneously develop significant toroidal current, even in the absence of the "seed current" which the Bootstrap Theory requires. Other simulations, in both toroidal and straight cylindrical geometries, and without any externally imposed electric field, show that if the plasma column is centrally fueled, and if the particle diffusion coefficient exceeds the magnetic diffusion coefficient (as is true in most tokamaks), then the toroidal current grows steadily. The simulations indicate that such fueling, coupled with central heating due to fusion reactions, may drive all of the tokamak's toroidal current. The Bootstrap and dynamo mechanisms do not drive toroidal current where the poloidal magnetic field is zero. The simulations, as well as initial theoretical work, indicate that in tokamak plasmas various processes naturally transport current from the outer regions of the plasma to the magnetic axis. The mechanisms which cause this effective electron viscosity include conventional binary collisions, wave emission and reabsorption, and also convection associated with E x B vortex motion. The simulations also exhibit preferential loss of particles carrying current opposing the bulk plasma current. This preferential loss generates current even at the magnetic axis. If these self-seeding mechanisms function in experiments as they do in the simulations, then transport-driven current would eliminate the need for any external current drive in tokamaks, except simple ohmic heating for initial generation of the plasma.

  11. General-purpose computer networks and resource sharing in ERDA. Volume 3. Remote resource-sharing experience and findings

    Energy Technology Data Exchange (ETDEWEB)

    1977-07-15

    The investigation focused on heterogeneous networks in which a variety of dissimilar computers and operating systems were interconnected nationwide. Homogeneous networks, such as MFE net and SACNET, were not considered since they could not be used for general purpose resource sharing. Issues of privacy and security are of concern in any network activity. However, consideration of privacy and security of sensitive data arise to a much lesser degree in unclassified scientific research than in areas involving personal or proprietary information. Therefore, the existing mechanisms at individual sites for protecting sensitive data were relied on, and no new protection mechanisms to prevent infringement of privacy and security were attempted. Further development of ERDA networking will need to incorporate additional mechanisms to prevent infringement of privacy. The investigation itself furnishes an excellent example of computational resource sharing through a heterogeneous network. More than twenty persons, representing seven ERDA computing sites, made extensive use of both ERDA and non-ERDA computers in coordinating, compiling, and formatting the data which constitute the bulk of this report. Volume 3 analyzes the benefits and barriers encountered in actual resource sharing experience, and provides case histories of typical applications.

  12. Computational prediction of chemical reactions: current status and outlook.

    Science.gov (United States)

    Engkvist, Ola; Norrby, Per-Ola; Selmi, Nidhal; Lam, Yu-Hong; Peng, Zhengwei; Sherer, Edward C; Amberg, Willi; Erhard, Thomas; Smyth, Lynette A

    2018-06-01

    Over the past few decades, various computational methods have become increasingly important for discovering and developing novel drugs. Computational prediction of chemical reactions is a key part of an efficient drug discovery process. In this review, we discuss important parts of this field, with a focus on utilizing reaction data to build predictive models, the existing programs for synthesis prediction, and usage of quantum mechanics and molecular mechanics (QM/MM) to explore chemical reactions. We also outline potential future developments with an emphasis on pre-competitive collaboration opportunities. Copyright © 2018 Elsevier Ltd. All rights reserved.

  13. Using High Performance Computing to Support Water Resource Planning

    Energy Technology Data Exchange (ETDEWEB)

    Groves, David G. [RAND Corporation, Santa Monica, CA (United States); Lembert, Robert J. [RAND Corporation, Santa Monica, CA (United States); May, Deborah W. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Leek, James R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Syme, James [RAND Corporation, Santa Monica, CA (United States)

    2015-10-22

    In recent years, decision support modeling has embraced deliberation-with-analysis, an iterative process in which decisionmakers come together with experts to evaluate a complex problem and alternative solutions in a scientifically rigorous and transparent manner. Simulation modeling supports decisionmaking throughout this process; visualizations enable decisionmakers to assess how proposed strategies stand up over time in uncertain conditions. But running these simulation models on standard computers can be slow. This, in turn, can slow the entire decisionmaking process, interrupting valuable interaction between decisionmakers and analytics.

  14. BelleII@home: Integrate volunteer computing resources into DIRAC in a secure way

    Science.gov (United States)

    Wu, Wenjing; Hara, Takanori; Miyake, Hideki; Ueda, Ikuo; Kan, Wenxiao; Urquijo, Phillip

    2017-10-01

    The exploitation of volunteer computing resources has become a popular practice in the HEP computing community because of the huge amount of potential computing power it provides. In recent HEP experiments, grid middleware has been used to organize the services and the resources; however, it relies heavily on X.509 authentication, which is at odds with the untrusted nature of volunteer computing resources. One big challenge in utilizing volunteer computing resources is therefore how to integrate them into the grid middleware in a secure way. The DIRAC interware, commonly used as the major component of the grid computing infrastructure for several HEP experiments, poses an even bigger challenge, as its pilot is more closely coupled with operations requiring X.509 authentication than the pilot implementations of its peer grid interware. The Belle II experiment is a B-factory experiment at KEK, and it uses DIRAC for its distributed computing. In the BelleII@home project, in order to integrate volunteer computing resources into the Belle II distributed computing platform in a secure way, we adopted a new approach that detaches the payload from the Belle II DIRAC pilot (a customized pilot that pulls and processes jobs from the Belle II distributed computing platform), so that the payload can run on volunteer computers without requiring any X.509 authentication. In this approach we developed a gateway service running on a trusted server, which handles all the operations requiring X.509 authentication. So far, we have developed and deployed the prototype of BelleII@home and tested its full workflow, which proves the feasibility of this approach. The approach can also be applied to HPC systems whose worker nodes do not have outbound connectivity to interact with the DIRAC system.
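
    The gateway idea above can be sketched as follows: the volunteer side authenticates with a lightweight token over HTTPS, and only the trusted gateway talks to DIRAC with X.509 credentials. The endpoint URL, token scheme and field names below are hypothetical, not taken from the BelleII@home code.

    ```python
    # Minimal sketch of the gateway pattern described above: the volunteer
    # client talks plain HTTPS to a trusted gateway; only the gateway holds
    # the X.509 credentials needed to talk to DIRAC. Endpoint URL, token
    # scheme and field names are hypothetical.
    import requests

    GATEWAY = 'https://gateway.example.org'   # trusted server (hypothetical)
    TOKEN = 'volunteer-host-token'            # lightweight, non-X.509 credential

    def run_locally(payload):
        # Stand-in for running the real simulation payload.
        return f'processed {payload}'

    def fetch_and_run_payload():
        # 1. Ask the gateway for a payload; the gateway fetched it from
        #    DIRAC on our behalf using its own X.509 proxy.
        job = requests.get(f'{GATEWAY}/job', params={'token': TOKEN}).json()
        # 2. Run the payload locally, with no grid credentials involved.
        result = run_locally(job['payload'])
        # 3. Upload the result; again, only the gateway authenticates to DIRAC.
        requests.post(f'{GATEWAY}/result/{job["id"]}',
                      params={'token': TOKEN}, json={'output': result})

    if __name__ == '__main__':
        fetch_and_run_payload()
    ```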

  15. Decision making in water resource planning: Models and computer graphics

    Energy Technology Data Exchange (ETDEWEB)

    Fedra, K; Carlsen, A J [ed.

    1987-01-01

    This paper describes some basic concepts of simulation-based decision support systems for water resources management and the role of symbolic, graphics-based user interfaces. Designed to allow direct and easy access to advanced methods of analysis and decision support for a broad and heterogeneous group of users, these systems combine data base management, system simulation, operations research techniques such as optimization, interactive data analysis, elements of advanced decision technology, and artificial intelligence with a friendly and conversational, symbolic-display-oriented user interface. Important features of the interface are the use of several parallel or alternative styles of interaction and display, including colour graphics and natural language. Combining quantitative numerical methods with qualitative and heuristic approaches, and giving the user direct and interactive control over the system's functions, human knowledge, experience and judgement are integrated with formal approaches into a tightly coupled man-machine system through an intelligent and easily accessible user interface. 4 drawings, 42 references.

  16. Mobile Cloud Computing: Resource Discovery, Session Connectivity and Other Open Issues

    NARCIS (Netherlands)

    Schüring, Markus; Karagiannis, Georgios

    2011-01-01

    Cloud computing can be considered as a model that provides network access to a shared pool of resources, such as storage and computing power, which can be rapidly provisioned and released with minimal management effort. This paper describes a research activity in the area of mobile cloud computing.

  17. 24 CFR 990.175 - Utilities expense level: Computation of the current consumption level.

    Science.gov (United States)

    2010-04-01

    § 990.175 Utilities expense level: Computation of the current consumption level (Housing and Urban Development; Calculating Formula Expenses). The current consumption level shall be the actual amount of each utility consumed during the 12-month...
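
    The rule reduces to a simple aggregation: the current consumption level for each utility is the actual amount consumed over the trailing 12-month period. A minimal sketch with invented meter readings:

    ```python
    # Minimal sketch of the 12-month consumption aggregation described in
    # 24 CFR 990.175. The monthly readings are invented example data.
    monthly_kwh = [4200, 3900, 3600, 3100, 2800, 2600,
                   2700, 2900, 3200, 3500, 3800, 4100]  # trailing 12 months

    assert len(monthly_kwh) == 12, 'the rule is defined over a 12-month period'
    current_consumption_level = sum(monthly_kwh)
    print(f'current consumption level: {current_consumption_level} kWh')
    ```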

  18. Sensitivity analysis of eddy current sensors using computational simulation

    OpenAIRE

    Neugebauer, Reimund; Drossel, W.-G.; Mainda, P.; Roscher, H.-J.; Wolf, K.; Kroschk, M.

    2011-01-01

    Eddy current sensors can detect the position and movement of metal parts without direct contact. The magnetic fields of these sensors can penetrate protective metal enclosures when designed and applied appropriately. Thus particularly robust solutions for industrial applications are possible, e.g. electrically tracking conductive or ferromagnetic workpieces during a treatment process under difficult production conditions (a device is currently being tested). The disadvantage of a tes...

  19. Getting the Most from Distributed Resources With an Analytics Platform for ATLAS Computing Services

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00225336; The ATLAS collaboration; Gardner, Robert; Bryant, Lincoln

    2016-01-01

    To meet a sharply increasing demand for computing resources for LHC Run 2, ATLAS distributed computing systems reach far and wide to gather CPU resources and storage capacity to execute an evolving ecosystem of production and analysis workflow tools. Indeed more than a hundred computing sites from the Worldwide LHC Computing Grid, plus many “opportunistic” facilities at HPC centers, universities, national laboratories, and public clouds, combine to meet these requirements. These resources have characteristics (such as local queuing availability, proximity to data sources and target destinations, network latency and bandwidth capacity, etc.) affecting the overall processing efficiency and throughput. To quantitatively understand and in some instances predict behavior, we have developed a platform to aggregate, index (for user queries), and analyze the more important information streams affecting performance. These data streams come from the ATLAS production system (PanDA), the distributed data management s...

  20. Argonne's Laboratory Computing Resource Center 2009 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R. B. (CLS-CI)

    2011-05-13

    Now in its seventh year of operation, the Laboratory Computing Resource Center (LCRC) continues to be an integral component of science and engineering research at Argonne, supporting a diverse portfolio of projects for the U.S. Department of Energy and other sponsors. The LCRC's ongoing mission is to enable and promote computational science and engineering across the Laboratory, primarily by operating computing facilities and supporting high-performance computing application use and development. This report describes scientific activities carried out with LCRC resources in 2009 and the broad impact on programs across the Laboratory. The LCRC computing facility, Jazz, is available to the entire Laboratory community. In addition, the LCRC staff provides training in high-performance computing and guidance on application usage, code porting, and algorithm development. All Argonne personnel and collaborators are encouraged to take advantage of this computing resource and to provide input into the vision and plans for computing and computational analysis at Argonne. The LCRC Allocations Committee makes decisions on individual project allocations for Jazz. Committee members are appointed by the Associate Laboratory Directors and span a range of computational disciplines. The 350-node LCRC cluster, Jazz, began production service in April 2003 and has been a research workhorse ever since. Hosting a wealth of software tools and applications and achieving high availability year after year, Jazz can be counted on to help researchers achieve project milestones and enable breakthroughs. Over the years, many projects have achieved results that would have been unobtainable without such a computing resource. In fiscal year 2009, there were 49 active projects representing a wide cross-section of Laboratory research and almost all research divisions.

  1. Current topics in pure and computational complex analysis

    CERN Document Server

    Dorff, Michael; Lahiri, Indrajit

    2014-01-01

    The book contains 13 articles, some of which are survey articles and others research papers. Written by eminent mathematicians, these articles were presented at the International Workshop on Complex Analysis and Its Applications held at Walchand College of Engineering, Sangli. All the contributing authors are actively engaged in research fields related to the topic of the book. The workshop offered a comprehensive exposition of the recent developments in geometric function theory, planar harmonic mappings, entire and meromorphic functions and their applications, both theoretical and computational. These recent developments in complex analysis and its applications play a crucial role in research in many disciplines.

  2. Sensor and computing resource management for a small satellite

    Science.gov (United States)

    Bhatia, Abhilasha; Goehner, Kyle; Sand, John; Straub, Jeremy; Mohammad, Atif; Korvald, Christoffer; Nervold, Anders Kose

    A small satellite in a low-Earth orbit (e.g., approximately a 300 to 400 km altitude) has an orbital velocity in the range of 8.5 km/s and completes an orbit approximately every 90 minutes. For a satellite with minimal attitude control, this presents a significant challenge in obtaining multiple images of a target region. Presuming an inclination in the range of 50 to 65 degrees, a limited number of opportunities to image a given target or communicate with a given ground station are available over the course of a 24-hour period. For imaging needs (where solar illumination is required), the number of opportunities is further reduced. Given these short windows of opportunity for imaging, data transfer, and sending commands, scheduling must be optimized. In addition to the high-level scheduling performed for spacecraft operations, payload-level scheduling is also required. The mission requires that images be post-processed to maximize spatial resolution and minimize data transfer (by removing overlapping regions). The payload unit includes GPS and inertial measurement unit (IMU) hardware to aid in the image alignment required for this post-processing. The payload scheduler must, thus, split its energy and computing-cycle budgets between determining an imaging sequence (required to capture the highly overlapping data needed for super-resolution and the adjacent areas needed for mosaicking), processing the imagery (to perform the super-resolution and mosaicking), and preparing the data for transmission (compressing it, etc.). This paper presents an approach for satellite control, scheduling and operations that allows the cameras, GPS and IMU to be used in conjunction to acquire higher-resolution imagery of a target region.

  3. Application of Computational Methods in Planaria Research: A Current Update

    Directory of Open Access Journals (Sweden)

    Ghosh Shyamasree

    2017-07-01

    Planaria is a member of the phylum Platyhelminthes, the flatworms. Planarians possess the unique ability to regenerate from adult stem cells, or neoblasts, and are important as a model organism for regeneration and developmental studies. Although research is actively carried out globally through conventional methods to understand regeneration from neoblasts and the developmental biology, neurobiology and immunology of Planaria, there are many thought-provoking questions related to stem cell plasticity and the uniqueness of the regenerative potential of planarians among the members of the phylum Platyhelminthes. The complexity of receptors and signalling mechanisms, the immune system network, the biology of repair and responses to injury are yet to be understood in Planaria. Genomic and transcriptomic studies have generated a vast repository of data, but their availability and analysis is a challenging task. Data mining, computational approaches to gene curation, bioinformatics tools for the analysis of transcriptomic data, the design of databases, the application of algorithms in deciphering changes of morphology in RNA interference (RNAi) experiments, and the understanding of regeneration experiments are a new venture in Planaria research that is helping researchers across the globe understand the biology. We highlight the applications of Hidden Markov Models (HMMs) in the design of computational tools and their applications in Planaria for decoding their complex biology.
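
    HMM-based annotation of the kind mentioned above reduces, at its core, to decoding the most likely hidden-state path for an observed sequence. Below is a minimal, self-contained Viterbi decoder over a toy two-state model; all probabilities are invented for illustration.

    ```python
    import numpy as np

    # Toy two-state HMM (states: 'coding', 'noncoding'; symbols: A,C,G,T).
    # All probabilities are invented for illustration.
    states = ['coding', 'noncoding']
    start = np.log([0.5, 0.5])
    trans = np.log([[0.9, 0.1],
                    [0.2, 0.8]])
    emit = np.log([[0.3, 0.3, 0.3, 0.1],   # toy emission probabilities over A,C,G,T
                   [0.3, 0.2, 0.2, 0.3]])
    sym = {c: i for i, c in enumerate('ACGT')}

    def viterbi(seq):
        """Return the most likely state path for an observed sequence."""
        obs = [sym[c] for c in seq]
        n, k = len(obs), len(states)
        score = np.full((n, k), -np.inf)   # best log-probability so far
        back = np.zeros((n, k), dtype=int) # backpointers for the path
        score[0] = start + emit[:, obs[0]]
        for t in range(1, n):
            for j in range(k):
                cand = score[t - 1] + trans[:, j]
                back[t, j] = np.argmax(cand)
                score[t, j] = cand[back[t, j]] + emit[j, obs[t]]
        path = [int(np.argmax(score[-1]))]
        for t in range(n - 1, 0, -1):      # trace the backpointers
            path.append(back[t, path[-1]])
        return [states[s] for s in reversed(path)]

    print(viterbi('ACGTGGCCAT'))
    ```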

  4. Software Defined Resource Orchestration System for Multitask Application in Heterogeneous Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Qi Qi

    2016-01-01

    The mobile cloud computing (MCC) paradigm, which combines mobile computing and the cloud concept, takes the wireless access network as the transmission medium and uses mobile devices as the client. When offloading a complicated multitask application to the MCC environment, each task executes individually in terms of its own computation, storage, and bandwidth requirements. Due to the user's mobility, the provided resources have different performance metrics that may affect the destination choice. Nevertheless, these heterogeneous MCC resources lack integrated management and can hardly cooperate with each other. Thus, how to choose the appropriate offload destination and orchestrate the resources for multitask applications is a challenging problem. This paper realizes programmable resource provision for heterogeneous energy-constrained computing environments, where a software-defined controller is responsible for resource orchestration, offload, and migration. The resource orchestration is formulated as a multiobjective optimization problem over the metrics of energy consumption, cost, and availability. Finally, a particle swarm algorithm is used to obtain approximate optimal solutions. Simulation results show that the solutions for almost all of the studied cases can hit the Pareto optimum and surpass the comparative algorithm in approximation, coverage, and execution time.
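
    To illustrate the multiobjective orchestration step, the sketch below collapses the three metrics named in the abstract (energy consumption, cost, availability) into one weighted fitness and searches task-to-resource assignments with a basic particle swarm. This is generic PSO under invented data, not the paper's algorithm.

    ```python
    import random

    # Generic particle swarm search over a weighted sum of the three
    # metrics named in the abstract. All weights and resource data are
    # invented illustration values.
    RESOURCES = [                      # candidate offload destinations
        {'energy': 5.0, 'cost': 1.0, 'availability': 0.90},
        {'energy': 3.0, 'cost': 2.5, 'availability': 0.99},
        {'energy': 8.0, 'cost': 0.5, 'availability': 0.80},
    ]
    N_TASKS, W_E, W_C, W_A = 4, 0.4, 0.4, 0.2

    def fitness(assign):               # lower is better
        f = 0.0
        for r_idx in assign:
            r = RESOURCES[r_idx]
            f += W_E * r['energy'] + W_C * r['cost'] + W_A * (1 - r['availability'])
        return f

    def pso(n_particles=20, iters=50):
        dim, hi = N_TASKS, len(RESOURCES) - 1
        pos = [[random.uniform(0, hi) for _ in range(dim)] for _ in range(n_particles)]
        vel = [[0.0] * dim for _ in range(n_particles)]
        pbest = [p[:] for p in pos]
        decode = lambda p: [min(hi, max(0, round(x))) for x in p]
        gbest = min(pbest, key=lambda p: fitness(decode(p)))[:]
        for _ in range(iters):
            for i in range(n_particles):
                for d in range(dim):
                    # Standard velocity update: inertia + pull toward the
                    # particle's own best and the global best position.
                    vel[i][d] = (0.7 * vel[i][d]
                                 + 1.5 * random.random() * (pbest[i][d] - pos[i][d])
                                 + 1.5 * random.random() * (gbest[d] - pos[i][d]))
                    pos[i][d] = min(hi, max(0.0, pos[i][d] + vel[i][d]))
                if fitness(decode(pos[i])) < fitness(decode(pbest[i])):
                    pbest[i] = pos[i][:]
                if fitness(decode(pbest[i])) < fitness(decode(gbest)):
                    gbest = pbest[i][:]
        return decode(gbest), fitness(decode(gbest))

    print(pso())   # (task-to-resource assignment, weighted fitness)
    ```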

  5. On-demand provisioning of HEP compute resources on cloud sites and shared HPC centers

    Science.gov (United States)

    Erli, G.; Fischer, F.; Fleig, G.; Giffels, M.; Hauth, T.; Quast, G.; Schnepf, M.; Heese, J.; Leppert, K.; Arnaez de Pedro, J.; Sträter, R.

    2017-10-01

    This contribution reports on solutions, experiences and recent developments with the dynamic, on-demand provisioning of remote computing resources for analysis and simulation workflows. Local resources of a physics institute are extended by private and commercial cloud sites, ranging from the inclusion of desktop clusters over institute clusters to HPC centers. Rather than relying on dedicated HEP computing centers, it is nowadays more reasonable and flexible to utilize remote computing capacity via virtualization techniques or container concepts. We report on recent experience from incorporating a remote HPC center (NEMO Cluster, Freiburg University) and resources dynamically requested from the commercial provider 1&1 Internet SE into our institute's computing infrastructure. The Freiburg HPC resources are requested via the standard batch system, allowing HPC and HEP applications to be executed simultaneously, such that regular batch jobs run side by side with virtual machines managed via OpenStack [1]. For the inclusion of the 1&1 commercial resources, a Python API and SDK as well as the possibility to upload images were available. Large-scale tests prove the capability to serve the scientific use case in the European 1&1 datacenters. The described environment at the Institute of Experimental Nuclear Physics (IEKP) at KIT serves the needs of researchers participating in the CMS and Belle II experiments. In total, resources exceeding half a million CPU hours have been provided by remote sites.

  6. Computation of transient 3-D eddy current in nonmagnetic conductor

    International Nuclear Information System (INIS)

    Yeh, H.T.

    1978-01-01

    A numerical procedure was developed to solve transient three-dimensional (3-D) eddy current problems for nonmagnetic conductors. An integral equation formulation in terms of the vector potential is used to simplify the matching of boundary conditions. The resulting equations and their numerical approximation were shown to be singular and to require special handling. Several types of symmetries were introduced. They not only reduce the number of algebraic equations to be solved, but also modify the nature of the equations and render them nonsingular. Temporal behavior was obtained with the Runge-Kutta method. The program is tested in several examples of eddy currents for its spatial and temporal profiles, shielding, boundary surface effects, and the application of various symmetry options.
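
    After spatial discretization, a transient eddy-current model of this kind reduces to a system of ordinary differential equations that can be advanced with the Runge-Kutta method the abstract mentions. The sketch below steps a two-cell stand-in system with classical RK4; the system matrix and source term are invented.

    ```python
    import numpy as np

    # After discretization, transient eddy currents obey a linear system
    # dI/dt = A I + b(t). The matrix below is an invented stand-in for
    # the one a real integral-equation discretization would produce.
    A = np.array([[-2.0, 0.5],
                  [0.5, -1.0]])
    b = lambda t: np.array([np.exp(-t), 0.0])   # decaying source term

    def rk4_step(I, t, dt):
        """One classical Runge-Kutta step for dI/dt = A I + b(t)."""
        f = lambda I, t: A @ I + b(t)
        k1 = f(I, t)
        k2 = f(I + 0.5 * dt * k1, t + 0.5 * dt)
        k3 = f(I + 0.5 * dt * k2, t + 0.5 * dt)
        k4 = f(I + dt * k3, t + dt)
        return I + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

    I, t, dt = np.zeros(2), 0.0, 0.01
    for _ in range(100):                 # advance the currents to t = 1.0
        I = rk4_step(I, t, dt)
        t += dt
    print(t, I)
    ```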

  7. Current strategies for dosage reduction in computed tomography

    International Nuclear Information System (INIS)

    May, M.S.; Wuest, W.; Lell, M.M.; Uder, M.; Kalender, W.A.; Schmidt, B.

    2012-01-01

    The potential risks of radiation exposure associated with computed tomography (CT) imaging are a reason for ongoing concern for both medical staff and patients. Radiation dose reduction is, according to the ALARA (as low as reasonably achievable) principle, an important issue in clinical routine, research and development. The complex interaction of preparation, examination and post-processing provides a high potential for optimization on the one hand but a high risk for errors on the other. The radiologist is responsible for the quality of the CT examination, which requires specialized and up-to-date knowledge. Most of the techniques for radiation dose reduction are independent of the system and manufacturer. The basic principle should be radiation dose optimization without loss of diagnostic image quality, rather than mere reduction. (orig.) [de]

  8. Brain-computer interfaces current trends and applications

    CERN Document Server

    Azar, Ahmad

    2015-01-01

    The success of a BCI system depends as much on the system itself as on the user's ability to produce distinctive EEG activity. BCI systems can be divided into two groups according to the placement of the electrodes used to detect and measure neurons firing in the brain: invasive systems, in which electrodes inserted directly into the cortex are used for single-cell or multi-unit recording, or placed on the surface of the cortex (or dura) for electrocorticography (ECoG); and noninvasive systems, in which electrodes placed on the scalp are used with electroencephalography (EEG) or magnetoencephalography (MEG) to detect neuron activity. The book is divided into three parts. The first part covers the basic concepts and an overview of brain-computer interfaces. The second part describes new theoretical developments in BCI systems. The third part covers views on real applications of BCI systems.

  9. Computer-assisted Orthopaedic Surgery: Current State and Future Perspective

    Directory of Open Access Journals (Sweden)

    Guoyan eZheng

    2015-12-01

    Introduced about two decades ago, computer-assisted orthopaedic surgery (CAOS) has emerged as a new and independent area, due to the importance of the treatment of musculoskeletal diseases in orthopaedics and traumatology, the increasing availability of different imaging modalities, and advances in analytics and navigation tools. The aim of this paper is to present the basic elements of CAOS devices and to review state-of-the-art examples of the different imaging modalities used to create the virtual representations, the different position tracking devices for navigation systems, the different surgical robots, the different methods for registration and referencing, and the CAOS modules that have been realized for different surgical procedures. Future perspectives will also be outlined.

  10. Argonne's Laboratory Computing Resource Center : 2005 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R. B.; Coghlan, S. C; Kaushik, D. K.; Riley, K. R.; Valdes, J. V.; Pieper, G. P.

    2007-06-30

    Argonne National Laboratory founded the Laboratory Computing Resource Center in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. The first goal of the LCRC was to deploy a mid-range supercomputing facility to support the unmet computational needs of the Laboratory. To this end, in September 2002, the Laboratory purchased a 350-node computing cluster from Linux NetworX. This cluster, named 'Jazz', achieved over a teraflop of computing power (10{sup 12} floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the fifty fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2005, there were 62 active projects on Jazz involving over 320 scientists and engineers. These projects represent a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to improve the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure

  11. Computational resources for ribosome profiling: from database to Web server and software.

    Science.gov (United States)

    Wang, Hongwei; Wang, Yan; Xie, Zhi

    2017-08-14

    Ribosome profiling is emerging as a powerful technique that enables genome-wide investigation of in vivo translation at sub-codon resolution. The increasing application of ribosome profiling in recent years has achieved remarkable progress toward understanding the composition, regulation and mechanism of translation. This benefits not only from the power of ribosome profiling itself but also from the extensive range of computational resources available for it. At present, however, a comprehensive review of these resources is still lacking. Here, we survey the recent computational advances guided by ribosome profiling, with a focus on databases, Web servers and software tools for storing, visualizing and analyzing ribosome profiling data. This review is intended to provide experimental and computational biologists with a reference to make appropriate choices among existing resources for the question at hand. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  12. ATLAS Tier-2 at the Compute Resource Center GoeGrid in Göttingen

    Science.gov (United States)

    Meyer, Jörg; Quadt, Arnulf; Weber, Pavel; ATLAS Collaboration

    2011-12-01

    GoeGrid is a grid resource center located in Göttingen, Germany. The resources are commonly used, funded, and maintained by communities doing research in the fields of grid development, computer science, biomedicine, high energy physics, theoretical physics, astrophysics, and the humanities. For the high energy physics community, GoeGrid serves as a Tier-2 center for the ATLAS experiment as part of the world-wide LHC computing grid (WLCG). The status and performance of the Tier-2 center are presented, with a focus on the interdisciplinary setup and administration of the cluster. Given the various requirements of the different communities on the hardware and software setup, the challenge of the common operation of the cluster is detailed. The benefits are an efficient use of computing and personnel resources.

  13. EGI-EUDAT integration activity - Pair data and high-throughput computing resources together

    Science.gov (United States)

    Scardaci, Diego; Viljoen, Matthew; Vitlacil, Dejan; Fiameni, Giuseppe; Chen, Yin; sipos, Gergely; Ferrari, Tiziana

    2016-04-01

    EGI (www.egi.eu) is a publicly funded e-infrastructure put together to give scientists access to more than 530,000 logical CPUs, 200 PB of disk capacity and 300 PB of tape storage to drive research and innovation in Europe. The infrastructure provides both high throughput computing and cloud compute/storage capabilities. Resources are provided by about 350 resource centres which are distributed across 56 countries in Europe, the Asia-Pacific region, Canada and Latin America. EUDAT (www.eudat.eu) is a collaborative Pan-European infrastructure providing research data services, training and consultancy for researchers, research communities, research infrastructures and data centres. EUDAT's vision is to enable European researchers and practitioners from any research discipline to preserve, find, access, and process data in a trusted environment, as part of a Collaborative Data Infrastructure (CDI) conceived as a network of collaborating, cooperating centres, combining the richness of numerous community-specific data repositories with the permanence and persistence of some of Europe's largest scientific data centres. EGI and EUDAT, in the context of their flagship projects, EGI-Engage and EUDAT2020, started in March 2015 a collaboration to harmonise the two infrastructures, including technical interoperability, authentication, authorisation and identity management, policy and operations. The main objective of this work is to provide end-users with seamless access to an integrated infrastructure offering both EGI and EUDAT services, thereby pairing data and high-throughput computing resources together. To define the roadmap of this collaboration, EGI and EUDAT selected a set of relevant user communities, already collaborating with both infrastructures, which could bring requirements and help to assign the right priorities to each of them. In this way, the activity has been driven by the end users from the very beginning. The identified user communities are

  14. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office: since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources such as the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. Figure 3: Number of events per month (data). In LS1, our emphasis is on increasing the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and on the full implementation of the xrootd federation ...

  15. Current algorithms for computed electron beam dose planning

    International Nuclear Information System (INIS)

    Brahme, A.

    1985-01-01

    Two- and sometimes three-dimensional computer algorithms for electron beam irradiation are capable of taking all irregularities of the body cross-section and the properties of the various tissues into account. This is achieved by dividing the incoming broad beams into a number of narrow pencil beams, the penetration of which can be described by essentially one-dimensional formalisms. The constituent pencil beams are most often described by Gaussian, experimentally or theoretically derived distributions. The accuracy of different dose planning algorithms is discussed in some detail based on their ability to take the different physical interaction processes of high energy electrons into account. It is shown that those programs that take the deviations from the simple Gaussian model into account give the best agreement with experimental results. With such programs a dosimetric relative accuracy of about 5% is generally achieved except in the most complex inhomogeneity configurations. Finally, the present limitations and possible future developments of electron dose planning are discussed. (orig.)
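
    The pencil-beam decomposition described above can be illustrated in one transverse dimension: the broad-beam dose profile at a given depth is the superposition of Gaussian pencil kernels across the open field. A toy sketch, with field size and Gaussian width invented:

    ```python
    import numpy as np

    # Toy 1-D pencil-beam superposition: the broad-beam dose profile at
    # one depth is the sum of Gaussian pencil beams across the open field.
    # Field width and pencil-beam sigma are invented illustration values.
    x = np.linspace(-10, 10, 401)        # transverse position (cm)
    field = (-3.0, 3.0)                  # open field edges (cm)
    sigma = 0.8                          # pencil-beam spread at this depth (cm)

    pencil_positions = np.linspace(field[0], field[1], 200)
    dx = pencil_positions[1] - pencil_positions[0]

    dose = np.zeros_like(x)
    for x0 in pencil_positions:          # superpose one Gaussian per pencil
        dose += dx * np.exp(-0.5 * ((x - x0) / sigma) ** 2)
    dose /= dose.max()                   # normalize to the central-axis maximum

    # The profile is flat inside the field and falls off smoothly in the
    # penumbra, with a width governed by sigma.
    print(f'dose at x=0: {dose[200]:.2f}, '
          f'at field edge: {dose[np.argmin(abs(x - 3.0))]:.2f}')
    ```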

  16. Computer aided surgery. Current status and future directions

    International Nuclear Information System (INIS)

    Sato, Yoshinobu

    2006-01-01

    This review describes the topics in the title in the following order: 3D model reconstruction and therapeutic planning based on images acquired before surgery; registration of the actual images of the patient under surgery, in virtual physical space, to the preoperative ones with the use of 3D position sensors, ultrasonics, endoscopy and X-diaphanoscopy; and their accuracy analysis. Images acquired before surgery, usually with CT and MR, are reconstructed in 3D for therapeutic planning by segmentation of the target organ/site, surrounding organs, bones and blood vessels. The navigation system used at surgery integrates the images obtained before and during the operation for registration and display. Usually, an optical marker and camera, both mounted on the endoscope, and a position sensor (tracker) are used for integration in the operation coordinate system. Actual pictures from a liver operation are given as an example. For accuracy analysis there is a theory of target registration error, which has been established in response to FDA demands. In the future, the development of related technologies such as robotics, biodynamics, biomaterials, sensors and high-performance computing, together with 4D documentation of surgery for feedback into the technology, is desirable for the systematic growth of this surgical technology. (T.I.)

  17. Current configuration and performance of the TFTR computer system

    International Nuclear Information System (INIS)

    Sauthoff, N.R.; Barnes, D.J.; Daniels, R.; Davis, S.; Reid, A.; Snyder, T.; Oliaro, G.; Stark, W.; Thompson, J.R. Jr.

    1986-01-01

    Developments in the TFTR (Tokamak Fusion Test Reactor) computer support system since its startup phases are described. The early emphasis on tokamak process control has been augmented by improved physics data handling, both on-line and off-line. Data acquisition volume and rate have been increased, and data are transmitted automatically to a new VAX-based off-line data reduction system. The number of interface points has increased dramatically, as has the number of man-machine interfaces. The graphics system performance has been accelerated by the introduction of parallelism, and new features such as shadowing and device independence have been added. To support multicycle operation for neutral beam conditioning and independence, the program control system has been generalized. A status and alarm system, including calculated variables, is in the installation phase. System reliability has been enhanced both by the redesign of weaker components and by the installation of a system status monitor. Development productivity has been enhanced by the addition of tools.

  18. Argonne's Laboratory computing resource center : 2006 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R. B.; Kaushik, D. K.; Riley, K. R.; Valdes, J. V.; Drugan, C. D.; Pieper, G. P.

    2007-05-31

    Argonne National Laboratory founded the Laboratory Computing Resource Center (LCRC) in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. In September 2002 the LCRC deployed a 350-node computing cluster from Linux NetworX to address Laboratory needs for mid-range supercomputing. This cluster, named 'Jazz', achieved over a teraflop of computing power (10{sup 12} floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the 50 fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2006, there were 76 active projects on Jazz involving over 380 scientists and engineers. These projects represent a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to foster growth in the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure providers to offer more scientific data management capabilities, expanding Argonne staff

  19. Computational electromagnetics and model-based inversion a modern paradigm for eddy-current nondestructive evaluation

    CERN Document Server

    Sabbagh, Harold A; Sabbagh, Elias H; Aldrin, John C; Knopp, Jeremy S

    2013-01-01

    Computational Electromagnetics and Model-Based Inversion: A Modern Paradigm for Eddy Current Nondestructive Evaluation describes the natural marriage of the computer to eddy-current NDE. Three distinct topics are emphasized in the book: (a) fundamental mathematical principles of volume-integral equations as a subset of computational electromagnetics, (b) mathematical algorithms applied to signal-processing and inverse scattering problems, and (c) applications of these two topics to problems in which real and model data are used. By showing how mathematics and the computer can solve problems more effectively than current analog practices, this book defines the modern technology of eddy-current NDE. This book will be useful to advanced students and practitioners in the fields of computational electromagnetics, electromagnetic inverse-scattering theory, nondestructive evaluation, materials evaluation and biomedical imaging. Users of eddy-current NDE technology in industries as varied as nuclear power, aerospace,...

  20. The domestic resource gap and current transaction deficit in Indonesia in 2010-2014

    OpenAIRE

    Anhulaila M. Palampanga; Bakri Hasanuddin

    2017-01-01

    The purpose of this study is to determine the relationship between domestic financial resource gaps and the current account balance in Indonesia, using data from 2010 to 2014. Gaps in the domestic economy are classified into three types: 1) the gap between domestic absorption and national income (GNP); 2) the gap between gross national savings and investment; 3) the private sector gap (private saving minus private investment) and the public sector gap (taxes minus government spending). By using a concept of ope...
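
    The decomposition in the abstract follows the standard open-economy accounting identity: the current account balance equals the private sector gap plus the public sector gap. A toy computation with invented figures:

    ```python
    # Open-economy accounting identity behind the abstract:
    #   current account = (private saving - private investment)
    #                   + (taxes - government spending)
    # All figures are invented, in trillions of rupiah.
    private_saving, private_investment = 900.0, 1000.0
    taxes, government_spending = 700.0, 750.0

    private_gap = private_saving - private_investment     # -100.0
    public_gap = taxes - government_spending              # -50.0
    current_account = private_gap + public_gap
    print(f'current account balance: {current_account}')  # -150.0 (deficit)
    ```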

  1. Open Educational Resources: The Role of OCW, Blogs and Videos in Computer Networks Classroom

    Directory of Open Access Journals (Sweden)

    Pablo Gil

    2012-09-01

    This paper analyzes the learning experiences and opinions obtained from a group of undergraduate students in their interaction with several on-line multimedia resources included in a free on-line course about Computer Networks. These new educational resources are based on the Web 2.0 approach, such as blogs, videos and virtual labs, which have been added to a web site for distance self-learning.

  2. AN ENHANCED METHOD FOR EXTENDING COMPUTATION AND RESOURCES BY MINIMIZING SERVICE DELAY IN EDGE CLOUD COMPUTING

    OpenAIRE

    Bavishna, B.; Agalya, M.; Kavitha, G.

    2018-01-01

    A great deal of research has been done in the field of cloud computing. A variety of algorithms has been proposed to make it perform effectively. The role of virtualization is significant, and its performance depends on VM migration and allocation. Much energy is consumed in the cloud; therefore, suitable algorithms are required for saving energy and enhancing efficiency. In the proposed work, a green algorithm has been considered with ...

  3. Load/resource matching for period-of-record computer simulation

    International Nuclear Information System (INIS)

    Lindsey, E.D. Jr.; Robbins, G.E. III

    1991-01-01

    The Southwestern Power Administration (Southwestern), an agency of the Department of Energy, is responsible for marketing the power and energy produced at Federal hydroelectric power projects developed by the U.S. Army Corps of Engineers in the southwestern United States. In order to maximize benefits from limited resources, to evaluate proposed changes in the operation of existing projects, and to determine the feasibility and marketability of proposed new projects, Southwestern utilizes a period-of-record computer simulation model created in the 1960s. Southwestern is constructing a new computer simulation model to take advantage of changes in computers, policy, and procedures. Within all hydroelectric power reservoir systems, the ability of the resources to match the load demand is critical and presents complex problems. Therefore, the method used to compare available energy resources to energy load demands is a very important aspect of the new model. Southwestern has developed an innovative method which compares a resource duration curve with a load duration curve, adjusting the resource duration curve to make the most efficient use of the available resources.
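
    Duration-curve matching of the kind described above amounts to sorting hourly values in descending order and comparing the two curves point by point. A minimal sketch with invented hourly data:

    ```python
    import numpy as np

    # Duration-curve comparison as described in the abstract: sort hourly
    # load and hourly available energy in descending order and check where
    # the resource curve falls below the load curve. Data are invented.
    rng = np.random.default_rng(0)
    hours = 8760
    load = rng.uniform(200, 1000, hours)       # MW demand per hour
    resource = rng.uniform(150, 1100, hours)   # MW available per hour

    load_curve = np.sort(load)[::-1]           # load duration curve
    resource_curve = np.sort(resource)[::-1]   # resource duration curve

    shortfall = np.maximum(load_curve - resource_curve, 0.0)
    print(f'hours with shortfall: {(shortfall > 0).sum()}')
    print(f'unserved energy (MWh): {shortfall.sum():.0f}')
    ```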

  4. An integrated system for land resources supervision based on the IoT and cloud computing

    Science.gov (United States)

    Fang, Shifeng; Zhu, Yunqiang; Xu, Lida; Zhang, Jinqu; Zhou, Peiji; Luo, Kan; Yang, Jie

    2017-01-01

    Integrated information systems are important safeguards for the utilisation and development of land resources. Information technologies, including the Internet of Things (IoT) and cloud computing, are inevitable requirements for the quality and efficiency of land resources supervision tasks. In this study, an economical and highly efficient supervision system for land resources has been established based on IoT and cloud computing technologies; a novel online and offline integrated system with synchronised internal and field data that includes the entire process of 'discovering breaches, analysing problems, verifying fieldwork and investigating cases' was constructed. The system integrates key technologies, such as the automatic extraction of high-precision information based on remote sensing, semantic ontology-based technology to excavate and discriminate public sentiment on the Internet that is related to illegal incidents, high-performance parallel computing based on MapReduce, uniform storing and compressing (bitwise) technology, global positioning system data communication and data synchronisation mode, intelligent recognition and four-level ('device, transfer, system and data') safety control technology. The integrated system based on a 'One Map' platform has been officially implemented by the Department of Land and Resources of Guizhou Province, China, and was found to significantly increase the efficiency and level of land resources supervision. The system promoted the overall development of informatisation in fields related to land resource management.

  5. Rational use of cognitive resources: levels of analysis between the computational and the algorithmic.

    Science.gov (United States)

    Griffiths, Thomas L; Lieder, Falk; Goodman, Noah D

    2015-04-01

    Marr's levels of analysis (computational, algorithmic, and implementation) have served cognitive science well over the last 30 years. But the recent increase in the popularity of the computational level raises a new challenge: How do we begin to relate models at different levels of analysis? We propose that it is possible to define levels of analysis that lie between the computational and the algorithmic, providing a way to build a bridge between computational- and algorithmic-level models. The key idea is to push the notion of rationality, often used in defining computational-level models, deeper toward the algorithmic level. We offer a simple recipe for reverse-engineering the mind's cognitive strategies by deriving optimal algorithms for a series of increasingly more realistic abstract computational architectures, which we call "resource-rational analysis." Copyright © 2015 Cognitive Science Society, Inc.
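
    One way to make the resource-rational recipe concrete is to derive the optimal number of samples for a sampling-based strategy by trading estimation error against the time cost of sampling. The sketch below is a toy instance of this idea, not the authors' model; all constants are invented.

    ```python
    import numpy as np

    # Toy resource-rational analysis: pick the number of samples k that
    # minimizes expected loss = estimation error + time cost of sampling.
    # The error of a k-sample mean estimate scales as sigma^2 / k; each
    # sample costs a fixed amount of time. All constants are invented.
    sigma2 = 4.0          # variance of the quantity being estimated
    cost_per_sample = 0.05

    ks = np.arange(1, 101)
    expected_loss = sigma2 / ks + cost_per_sample * ks
    k_star = ks[np.argmin(expected_loss)]

    # Analytically, d/dk (sigma^2/k + c*k) = 0 gives k* = sqrt(sigma^2 / c).
    print(f'optimal sample count: {k_star} '
          f'(analytic: {np.sqrt(sigma2 / cost_per_sample):.1f})')
    ```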

  6. Coupling between eddy currents and rigid body rotation: analysis, computation, and experiments

    International Nuclear Information System (INIS)

    Hua, T.Q.; Turner, L.R.

    1985-01-01

    Computation and experiment show that the coupling between eddy currents and the angular deflections resulting from those eddy currents can reduce electromagnetic effects such as forces, torques, and power dissipation to levels far less severe than would be predicted without regard for the coupling. This paper explores the coupling effects beyond the parameter range that has been explored experimentally, using analytical means and the eddy-current computer code EDDYNET. The paper also describes upcoming FELIX experiments with cantilevered beams

  7. Dynamic provisioning of local and remote compute resources with OpenStack

    Science.gov (United States)

    Giffels, M.; Hauth, T.; Polgart, F.; Quast, G.

    2015-12-01

    Modern high-energy physics experiments rely on the extensive usage of computing resources, both for the reconstruction of measured events and for Monte-Carlo simulation. The Institut für Experimentelle Kernphysik (EKP) at KIT is participating in both the CMS and Belle experiments with computing and storage resources. In the upcoming years, these requirements are expected to increase due to the growing amount of recorded data and the rising complexity of the simulated events. It is therefore essential to increase the available computing capabilities by tapping into all resource pools. At the EKP institute, powerful desktop machines are available to users. Due to the multi-core nature of modern CPUs, vast amounts of CPU time are not utilized by common desktop usage patterns. Other important providers of compute capabilities are classical HPC data centers at universities or national research centers. Due to the shared nature of these installations, the standardized software stack required by HEP applications cannot be installed. A viable way to overcome this constraint and offer a standardized software environment in a transparent manner is the usage of virtualization technologies. The OpenStack project has become a widely adopted solution to virtualize hardware and offer additional services like storage and virtual machine management. This contribution will report on the incorporation of the institute's desktop machines into a private OpenStack Cloud. The additional compute resources provisioned via the virtual machines have been used for Monte-Carlo simulation and data analysis. Furthermore, a concept to integrate shared, remote HPC centers into regular HEP job workflows will be presented. In this approach, local and remote resources are merged to form a uniform, virtual compute cluster with a single point-of-entry for the user. Evaluations of the performance and stability of this setup and operational experiences will be discussed.
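
    The on-demand provisioning step described above can be scripted against the OpenStack API. Below is a minimal sketch using the openstacksdk library; the cloud name, image, flavor and network are placeholders, not values from the setup in the abstract.

    ```python
    # Minimal sketch of provisioning a worker VM through the OpenStack API
    # using the openstacksdk library. Cloud name, image, flavor and network
    # are placeholders, not values from the setup described in the abstract.
    import openstack

    conn = openstack.connect(cloud='ekp-cloud')   # credentials from clouds.yaml

    image = conn.compute.find_image('hep-worker-image')
    flavor = conn.compute.find_flavor('m1.large')
    network = conn.network.find_network('private')

    server = conn.compute.create_server(
        name='worker-001',
        image_id=image.id,
        flavor_id=flavor.id,
        networks=[{'uuid': network.id}],
    )
    server = conn.compute.wait_for_server(server)  # block until booted
    print(server.status)  # 'ACTIVE' once the VM is up
    ```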

  8. The current situation of uranium resources exploration in East China: Problems, thought and countermeasure

    International Nuclear Information System (INIS)

    He Xiaomei; Mao Mengcai

    2014-01-01

    Based on an analysis of the current situation of uranium resources and exploration effort in East China, the main existing problems, the technical thinking and the countermeasures for future exploration in East China are discussed in this paper. The degree of both uranium exploration and study in East China is relatively high, so a philosophy of scientific mineral prospecting should be established in the new round of prospecting. Under the guidance of the metallogenic theory of large mineralization cluster areas and the uranium metallogenic theory of multiple sources, previous data and research achievements should be analyzed and summarized. With the help of metallogenic models, useful methods and means should be applied to set up exploration models in order to realize a new phase of model exploration, comprehensive exploration, 3D exploration and quantitative exploration. The efficiency of uranium resource exploration should be strenuously increased. Highly profitable uranium resources, with rich, shallow, nearby and easily mined deposits, will be actively sought. The prospecting targets and strategic reserves of uranium resources in East China will be increased. (authors)

  9. Analisis Teknik-Teknik Keamanan Pada Future Cloud Computing vs Current Cloud Computing: Survey Paper

    Directory of Open Access Journals (Sweden)

    Beny Nugraha

    2016-08-01

    Full Text Available Cloud computing is one of the network technologies currently developing rapidly, because it can dynamically increase the flexibility and capability of computing processes without large investments in new infrastructure; improving the security of cloud computing networks is therefore essential. This study examines the security techniques present in current cloud computing and in a future cloud computing architecture, NEBULA. These security techniques are compared in terms of their ability to handle the security attacks that may occur in cloud computing. The method used in this study is attack-centric: the characteristics of each security attack are analyzed, and the security mechanisms to handle it are then examined. Four security attacks are studied in this work; by understanding how a security attack works, the security mechanisms able to counter it can also be identified. The study finds that NEBULA provides the highest level of security. NEBULA introduces three new techniques: Proof of Consent (PoC), Proof of Path (PoP), and the ICING cryptographic technique. These three techniques, combined with onion routing, can counter the security attacks analyzed in this study.

  10. PredMP: A Web Resource for Computationally Predicted Membrane Proteins via Deep Learning

    KAUST Repository

    Wang, Sheng; Fei, Shiyang; Zongan, Wang; Li, Yu; Zhao, Feng; Gao, Xin

    2018-01-01

    Currently there are only about 510 non-redundant MPs with solved structures in Protein Data Bank (PDB). To elucidate the MP structures computationally, we developed a novel web resource, denoted as PredMP (http://52.87.130.56:3001/#/proteinindex), that delivers one-dimensional (1D) annotation of the membrane topology and secondary structure, two-dimensional (2D) prediction of the contact/distance map, and three-dimensional (3D) modeling of the MP structure in the lipid bilayer.

  11. Resource-constrained project scheduling: computing lower bounds by solving minimum cut problems

    NARCIS (Netherlands)

    Möhring, R.H.; Nesetril, J.; Schulz, A.S.; Stork, F.; Uetz, Marc Jochen

    1999-01-01

    We present a novel approach to compute Lagrangian lower bounds on the objective function value of a wide class of resource-constrained project scheduling problems. The basis is a polynomial-time algorithm to solve the following scheduling problem: Given a set of activities with start-time dependent

  12. Selecting, Evaluating and Creating Policies for Computer-Based Resources in the Behavioral Sciences and Education.

    Science.gov (United States)

    Richardson, Linda B., Comp.; And Others

    This collection includes four handouts: (1) "Selection Criteria Considerations for Computer-Based Resources" (Linda B. Richardson); (2) "Software Collection Policies in Academic Libraries" (a 24-item bibliography, Jane W. Johnson); (3) "Circulation and Security of Software" (a 19-item bibliography, Sara Elizabeth Williams); and (4) "Bibliography of…

  13. The Usage of informal computer based communication in the context of organization’s technological resources

    OpenAIRE

    Raišienė, Agota Giedrė; Jonušauskas, Steponas

    2011-01-01

    The purpose of the article is to analyse, theoretically and practically, the features of informal computer-based communication in the context of an organization's technological resources. Methodology: meta-analysis, survey and descriptive analysis. According to scientists, the functions of informal communication cover the sharing of work-related information, coordination of team activities, the spread of organizational culture, and feelings of interdependence and affinity. Also, informal communication widens the ...

  14. Developing Online Learning Resources: Big Data, Social Networks, and Cloud Computing to Support Pervasive Knowledge

    Science.gov (United States)

    Anshari, Muhammad; Alas, Yabit; Guan, Lim Sei

    2016-01-01

    Utilizing online learning resources (OLR) from multiple channels in learning activities promises extended benefits, moving from traditional learning-centred approaches to a collaborative learning-centred approach that emphasises pervasive learning anywhere and anytime. While compiling big data, cloud computing, and the semantic web into OLR offers a broader spectrum of…

  15. Computer Processing 10-20-30. Teacher's Manual. Senior High School Teacher Resource Manual.

    Science.gov (United States)

    Fisher, Mel; Lautt, Ray

    Designed to help teachers meet the program objectives for the computer processing curriculum for senior high schools in the province of Alberta, Canada, this resource manual includes the following sections: (1) program objectives; (2) a flowchart of curriculum modules; (3) suggestions for short- and long-range planning; (4) sample lesson plans;…

  16. Modified stretched exponential model of computer system resources management limitations-The case of cache memory

    Science.gov (United States)

    Strzałka, Dominik; Dymora, Paweł; Mazurek, Mirosław

    2018-02-01

    In this paper we present some preliminary results in the field of computer systems management in relation to Tsallis thermostatistics and the ubiquitous problem of hardware-limited resources. In the case of systems with non-deterministic behaviour, management of their resources is a key point that guarantees their acceptable performance and proper operation. This is a very broad problem that poses many challenges in areas such as finance, transport, water and food, and health. We focus on computer systems, with particular attention to cache memory, and propose an analytical model that is able to connect the non-extensive entropy formalism, long-range dependencies, management of system resources, and queuing theory. The analytical results obtained are related to a practical experiment, showing interesting and valuable results.
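    Since the abstract does not give the modified model's exact form, the following sketch fits the classical stretched-exponential (Kohlrausch) decay, f(t) = A exp(-(t/tau)^beta), to synthetic stand-in data; beta < 1 is commonly read as a signature of long-range dependence. The data and starting values are hypothetical.

    # Minimal sketch: fitting a classical stretched exponential to synthetic
    # stand-in "cache response" data (the paper's modified form is not given).
    import numpy as np
    from scipy.optimize import curve_fit

    def stretched_exp(t, amplitude, tau, beta):
        """Kohlrausch decay; beta < 1 suggests long-range dependence."""
        return amplitude * np.exp(-((t / tau) ** beta))

    t = np.linspace(0.1, 50.0, 200)
    rng = np.random.default_rng(0)
    y = stretched_exp(t, 1.0, 8.0, 0.6) + rng.normal(0.0, 0.01, t.size)

    params, _ = curve_fit(stretched_exp, t, y, p0=(1.0, 5.0, 0.8))
    print("A=%.3f tau=%.3f beta=%.3f" % tuple(params))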

  17. Universal resources for approximate and stochastic measurement-based quantum computation

    International Nuclear Information System (INIS)

    Mora, Caterina E.; Piani, Marco; Miyake, Akimasa; Van den Nest, Maarten; Duer, Wolfgang; Briegel, Hans J.

    2010-01-01

    We investigate which quantum states can serve as universal resources for approximate and stochastic measurement-based quantum computation in the sense that any quantum state can be generated from a given resource by means of single-qubit (local) operations assisted by classical communication. More precisely, we consider the approximate and stochastic generation of states, resulting, for example, from a restriction to finite measurement settings or from possible imperfections in the resources or local operations. We show that entanglement-based criteria for universality obtained in M. Van den Nest et al. [New J. Phys. 9, 204 (2007)] for the exact, deterministic case can be lifted to the much more general approximate, stochastic case. This allows us to move from the idealized situation (exact, deterministic universality) considered in previous works to the practically relevant context of nonperfect state preparation. We find that any entanglement measure fulfilling some basic requirements needs to reach its maximum value on some element of an approximate, stochastic universal family of resource states, as the resource size grows. This allows us to rule out various families of states as being approximate, stochastic universal. We prove that approximate, stochastic universality is in general a weaker requirement than deterministic, exact universality and provide resources that are efficient approximate universal, but not exact deterministic universal. We also study the robustness of universal resources for measurement-based quantum computation under realistic assumptions about the (imperfect) generation and manipulation of entangled states, giving an explicit expression for the impact that errors made in the preparation of the resource have on the possibility to use it for universal approximate and stochastic state preparation. Finally, we discuss the relation between our entanglement-based criteria and recent results regarding the uselessness of states with a high

  18. Including Alternative Resources in State Renewable Portfolio Standards: Current Design and Implementation Experience

    Energy Technology Data Exchange (ETDEWEB)

    Heeter, J.; Bird, L.

    2012-11-01

    Currently, 29 states, the District of Columbia, and Puerto Rico have instituted a renewable portfolio standard (RPS). An RPS sets a minimum threshold for how much renewable energy must be generated in a given year. Each state policy is unique, varying in percentage targets, timetables, and eligible resources. This paper examines state experience with implementing renewable portfolio standards that include energy efficiency, thermal resources, and non-renewable energy and explores compliance experience, costs, and how states evaluate, measure, and verify energy efficiency and convert thermal energy. It aims to gain insights from the experience of states for possible federal clean energy policy as well as to share experience and lessons for state RPS implementation.

  19. Human resource aspects of antiretroviral treatment delivery models: current practices and recommendations.

    Science.gov (United States)

    Assefa, Yibeltal; Van Damme, Wim; Hermann, Katharina

    2010-01-01

    PURPOSE OF REVIEW: To illustrate and critically assess what is currently being published on the human resources for health dimension of antiretroviral therapy (ART) delivery models. The use of human resources for health can have an effect on two crucial aspects of successful ART programmes, namely the scale-up capacity and the long-term retention in care. Task shifting as the delegation of tasks from higher qualified to lower qualified cadres has become a widespread practice in ART delivery models in low-income countries in recent years. It is increasingly shown to effectively reduce the workload for scarce medical doctors without compromising the quality of care. At the same time, it becomes clear that task shifting can only be successful when accompanied by intensive training, supervision and support from existing health system structures. Although a number of recent publications have focussed on task shifting in ART delivery models, there is a lack of accessible information on the link between task shifting and patient outcomes. Current ART delivery models do not focus sufficiently on retention in care as arguably one of the most important issues for the long-term success of ART programmes. There is a need for context-specific re-designing of current ART delivery models in order to increase access to ART and improve long-term retention.

  20. The Indus basin in the framework of current and future water resources management

    Science.gov (United States)

    Laghari, A. N.; Vanham, D.; Rauch, W.

    2012-04-01

    The Indus basin is one of the regions in the world that is faced with major challenges for its water sector, due to population growth, rapid urbanisation and industrialisation, environmental degradation, unregulated utilization of the resources, inefficient water use and poverty, all aggravated by climate change. The Indus Basin is shared by 4 countries - Pakistan, India, Afghanistan and China. With a current population of 237 million people which is projected to increase to 319 million in 2025 and 383 million in 2050, already today water resources are abstracted almost entirely (more than 95% for irrigation). Climate change will result in increased water availability in the short term. However in the long term water availability will decrease. Some current aspects in the basin need to be re-evaluated. During the past decades water abstractions - and especially groundwater extractions - have augmented continuously to support a rice-wheat system where rice is grown during the kharif (wet, summer) season (as well as sugar cane, cotton, maize and other crops) and wheat during the rabi (dry, winter) season. However, the sustainability of this system in its current form is questionable. Additional water for domestic and industrial purposes is required for the future and should be made available by a reduction in irrigation requirements. This paper gives a comprehensive listing and description of available options for current and future sustainable water resources management (WRM) within the basin. Sustainable WRM practices include both water supply management and water demand management options. Water supply management options include: (1) reservoir management as the basin is characterised by a strong seasonal behaviour in water availability (monsoon and meltwater) and water demands; (2) water quality conservation and investment in wastewater infrastructure; (3) the use of alternative water resources like the recycling of wastewater and desalination; (4) land use

  1. The Indus basin in the framework of current and future water resources management

    Directory of Open Access Journals (Sweden)

    A. N. Laghari

    2012-04-01

    Full Text Available The Indus basin is one of the regions in the world that is faced with major challenges for its water sector, due to population growth, rapid urbanisation and industrialisation, environmental degradation, unregulated utilization of the resources, inefficient water use and poverty, all aggravated by climate change. The Indus Basin is shared by 4 countries – Pakistan, India, Afghanistan and China. With a current population of 237 million people which is projected to increase to 319 million in 2025 and 383 million in 2050, already today water resources are abstracted almost entirely (more than 95% for irrigation). Climate change will result in increased water availability in the short term. However in the long term water availability will decrease. Some current aspects in the basin need to be re-evaluated. During the past decades water abstractions – and especially groundwater extractions – have augmented continuously to support a rice-wheat system where rice is grown during the kharif (wet, summer) season (as well as sugar cane, cotton, maize and other crops) and wheat during the rabi (dry, winter) season. However, the sustainability of this system in its current form is questionable. Additional water for domestic and industrial purposes is required for the future and should be made available by a reduction in irrigation requirements. This paper gives a comprehensive listing and description of available options for current and future sustainable water resources management (WRM) within the basin. Sustainable WRM practices include both water supply management and water demand management options. Water supply management options include: (1) reservoir management as the basin is characterised by a strong seasonal behaviour in water availability (monsoon and meltwater) and water demands; (2) water quality conservation and investment in wastewater infrastructure; (3) the use of alternative water resources like the recycling of wastewater and desalination; (4) land use

  2. Integrating GRID tools to build a computing resource broker: activities of DataGrid WP1

    International Nuclear Information System (INIS)

    Anglano, C.; Barale, S.; Gaido, L.; Guarise, A.; Lusso, S.; Werbrouck, A.

    2001-01-01

    Resources on a computational Grid are geographically distributed, heterogeneous in nature, owned by different individuals or organizations with their own scheduling policies, and have different access cost models with dynamically varying loads and availability conditions. This makes traditional approaches to workload management, load balancing and scheduling inappropriate. The first work package (WP1) of the EU-funded DataGrid project is addressing the issue of optimizing the distribution of jobs onto Grid resources based on knowledge of the status and characteristics of these resources that is necessarily out-of-date (collected in a finite amount of time at a very loosely coupled site). The authors describe the DataGrid approach of integrating existing software components (from Condor, Globus, etc.) to build a Grid Resource Broker, and the early efforts to define a workable scheduling strategy
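    As a toy illustration of brokering with stale information (not the DataGrid WP1 algorithm), the following sketch ranks candidate sites by discounting their last-reported free capacity by the age of the report; the site names, figures, and ranking formula are all hypothetical.

    # Toy sketch: rank Grid sites when status reports are necessarily out-of-date.
    from dataclasses import dataclass

    @dataclass
    class SiteStatus:
        name: str
        free_cpus: int        # capacity as last reported
        report_age_s: float   # seconds since the report was collected

    def rank(site: SiteStatus) -> float:
        # Discount reported capacity by the age of the information,
        # since the broker works with out-of-date snapshots.
        staleness_discount = 1.0 / (1.0 + site.report_age_s / 300.0)
        return site.free_cpus * staleness_discount

    sites = [
        SiteStatus("cnaf", free_cpus=120, report_age_s=600.0),
        SiteStatus("lyon", free_cpus=80, report_age_s=30.0),
    ]
    best = max(sites, key=rank)
    print("submit to:", best.name)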

  3. Impact of remote sensing upon the planning, management and development of water resources. Summary of computers and computer growth trends for hydrologic modeling and the input of ERTS image data processing load

    Science.gov (United States)

    Castruccio, P. A.; Loats, H. L., Jr.

    1975-01-01

    An analysis of current computer usage by major water resources users was made to determine the trends of usage and costs for the principal hydrologic users/models. The laws and empirical relationships governing the growth of the data processing loads were described and applied to project the future data loads. Data loads for ERTS CCT image processing were computed and projected through the 1985 era. The analysis shows significant impact due to the utilization and processing of ERTS CCT data.

  4. Current Knowledge on the Use of Computational Toxicology in Hazard Assessment of Metallic Engineered Nanomaterials

    Directory of Open Access Journals (Sweden)

    Guangchao Chen

    2017-07-01

    Full Text Available As listed by the European Chemicals Agency, the three elements in evaluating the hazards of engineered nanomaterials (ENMs) include the integration and evaluation of toxicity data, categorization and labeling of ENMs, and derivation of hazard threshold levels for human health and the environment. Assessing the hazards of ENMs solely based on laboratory tests is time-consuming, resource intensive, and constrained by ethical considerations. The adoption of computational toxicology into this task has recently become a priority. Alternative approaches such as (quantitative) structure–activity relationships ((Q)SAR) and read-across are of significant help in predicting nanotoxicity and filling data gaps, and in classifying the hazards of ENMs to individual species. Thereupon, the species sensitivity distribution (SSD) approach is able to serve the establishment of ENM hazard thresholds sufficiently protecting the ecosystem. This article critically reviews the current knowledge on the development of in silico models in predicting and classifying the hazard of metallic ENMs, and the development of SSDs for metallic ENMs. Further discussion includes the significance of well-curated experimental datasets and the interpretation of toxicity mechanisms of metallic ENMs based on reported models. An outlook is also given on future directions of research in this frontier.

  5. Current Knowledge on the Use of Computational Toxicology in Hazard Assessment of Metallic Engineered Nanomaterials.

    Science.gov (United States)

    Chen, Guangchao; Peijnenburg, Willie; Xiao, Yinlong; Vijver, Martina G

    2017-07-12

    As listed by the European Chemicals Agency, the three elements in evaluating the hazards of engineered nanomaterials (ENMs) include the integration and evaluation of toxicity data, categorization and labeling of ENMs, and derivation of hazard threshold levels for human health and the environment. Assessing the hazards of ENMs solely based on laboratory tests is time-consuming, resource intensive, and constrained by ethical considerations. The adoption of computational toxicology into this task has recently become a priority. Alternative approaches such as (quantitative) structure-activity relationships ((Q)SAR) and read-across are of significant help in predicting nanotoxicity and filling data gaps, and in classifying the hazards of ENMs to individual species. Thereupon, the species sensitivity distribution (SSD) approach is able to serve the establishment of ENM hazard thresholds sufficiently protecting the ecosystem. This article critically reviews the current knowledge on the development of in silico models in predicting and classifying the hazard of metallic ENMs, and the development of SSDs for metallic ENMs. Further discussion includes the significance of well-curated experimental datasets and the interpretation of toxicity mechanisms of metallic ENMs based on reported models. An outlook is also given on future directions of research in this frontier.

  6. PredMP: A Web Resource for Computationally Predicted Membrane Proteins via Deep Learning

    KAUST Repository

    Wang, Sheng

    2018-02-06

    Experimental determination of membrane protein (MP) structures is challenging as they are often too large for nuclear magnetic resonance (NMR) experiments and difficult to crystallize. Currently there are only about 510 non-redundant MPs with solved structures in Protein Data Bank (PDB). To elucidate the MP structures computationally, we developed a novel web resource, denoted as PredMP (http://52.87.130.56:3001/#/proteinindex), that delivers one-dimensional (1D) annotation of the membrane topology and secondary structure, two-dimensional (2D) prediction of the contact/distance map, together with three-dimensional (3D) modeling of the MP structure in the lipid bilayer, for each MP target from a given model organism. The precision of the computationally constructed MP structures is leveraged by state-of-the-art deep learning methods as well as cutting-edge modeling strategies. In particular, (i) we annotate 1D property via DeepCNF (Deep Convolutional Neural Fields) that not only models complex sequence-structure relationship but also interdependency between adjacent property labels; (ii) we predict 2D contact/distance map through Deep Transfer Learning which learns the patterns as well as the complex relationship between contacts/distances and protein features from non-membrane proteins; and (iii) we model 3D structure by feeding its predicted contacts and secondary structure to the Crystallography & NMR System (CNS) suite combined with a membrane burial potential that is residue-specific and depth-dependent. PredMP currently contains more than 2,200 multi-pass transmembrane proteins (length<700 residues) from Human. These transmembrane proteins are classified according to IUPHAR/BPS Guide, which provides a hierarchical organization of receptors, channels, transporters, enzymes and other drug targets according to their molecular relationships and physiological functions. Among these MPs, we estimated that our approach could predict correct folds for 1

  7. Current Control and Performance Evaluation of Converter Interfaced Distribution Resources in Grid Connected Mode

    Directory of Open Access Journals (Sweden)

    SINGH Alka

    2012-10-01

    Full Text Available The use of distributed resources is growing in developing countries like India and in developed nations too. The increased acceptance of such resources is mainly due to their modularity, increased reliability, good power quality and environment-friendly operation. They are currently being interfaced to existing systems using voltage source inverters (VSCs). The control of such distributed resources differs significantly from that of conventional power systems, mainly because VSCs, unlike synchronous generators, have no inertia. This paper deals with the Matlab modeling and design of the control aspects of one such distributed source feeding a common load. A grid-connected supply is also available. The control algorithm is developed for real and reactive power sharing of the load between the distributed source and the grid. The developed control scheme is tested for a linear (R-L) load as well as nonlinear loads. With suitable modifications, the control algorithm can be extended to several distributed resources connected in parallel.

  8. Measuring the impact of computer resource quality on the software development process and product

    Science.gov (United States)

    Mcgarry, Frank; Valett, Jon; Hall, Dana

    1985-01-01

    The availability and quality of computer resources during the software development process were speculated to have a measurable, significant impact on the efficiency of the development process and the quality of the resulting product. Environment components such as the types of tools, machine responsiveness, and quantity of direct access storage may play a major role in the effort to produce the product and in its subsequent quality, as measured by factors such as reliability and ease of maintenance. During the past six years, the NASA Goddard Space Flight Center has conducted experiments with software projects in an attempt to better understand the impact of software development methodologies, environments, and general technologies on the software process and product. Data were extracted and examined from nearly 50 software development projects, all related to support of satellite flight dynamics ground-based computations. The relationship between computer resources and the software development process and product, as exemplified by the subject NASA data, was examined. Based upon the results, a number of computer resource-related implications are provided.

  9. [Development strategy of Paris based on combination of domestic patent and current resource application and development].

    Science.gov (United States)

    Zhao, Fei-Ya; Tao, Ai-En; Xia, Cong-Long

    2018-01-01

    Paris is a commonly used traditional Chinese medicine (TCM) with antitumor, antibacterial, sedative, analgesic and hemostatic effects. It has been used as an ingredient of 81 Chinese patent medicines, with wide application and large market demand. Based on the data retrieved from the State Intellectual Property Office patent database, a comprehensive analysis was made of Paris patents, so as to explore the current features of Paris patents in terms of domestic patent output, development trend, technology field distribution, time dimension, technology growth rate and patent applicants, and to reveal the development trend of China's Paris industry. In addition, based on the current application and development of Paris resources, a sustainable, multi-channel and multi-level industrial development approach was built. According to the results, studies of Paris in China are in a period of rapid development, with a good development trend. However, because wild Paris resources are close to exhaustion, studies of artificial cultivation technology should be strengthened to promote industrial development. Copyright© by the Chinese Pharmaceutical Association.

  10. The ATLAS Computing Agora: a resource web site for citizen science projects

    CERN Document Server

    Bourdarios, Claire; The ATLAS collaboration

    2016-01-01

    The ATLAS collaboration has recently set up a number of citizen science projects which have a strong IT component and could not have been envisaged without the growth of general public computing resources and network connectivity: event simulation through volunteer computing, algorithm improvement via Machine Learning challenges, event display analysis on citizen science platforms, use of open data, etc. Most of the interactions with volunteers are handled through message boards, but specific outreach material was also developed, giving enhanced visibility to the ATLAS software and computing techniques, challenges and community. In this talk the ATLAS Computing Agora (ACA) web platform will be presented, as well as some of the specific material developed for some of the projects.

  11. A 20-Year High-Resolution Wave Resource Assessment of Japan with Wave-Current Interactions

    Science.gov (United States)

    Webb, A.; Waseda, T.; Kiyomatsu, K.

    2016-02-01

    Energy harvested from surface ocean waves and tidal currents has the potential to be a significant source of green energy, particularly for countries with extensive coastlines such as Japan. As part of a larger marine renewable energy project*, The University of Tokyo (in cooperation with JAMSTEC) has conducted a state-of-the-art wave resource assessment (with uncertainty estimates) to assist with wave generator site identification and construction in Japan. This assessment will be publicly available and is based on a large-scale NOAA WAVEWATCH III (version 4.18) simulation using NCEP and JAMSTEC forcings. It includes several key components to improve model skill: a 20-year simulation to reduce aleatory uncertainty, a four-nested-layer approach to resolve a 1 km shoreline, and finite-depth and current effects included in all wave power density calculations. This latter component is particularly important for regions near strong currents such as the Kuroshio. Here, we will analyze the different wave power density equations, discuss the model setup, and present results from the 20-year assessment (with a focus on the role of wave-current interactions). Time permitting, a comparison will also be made with simulations using JMA MSM 5 km winds. *New Energy and Industrial Technology Development Organization (NEDO): "Research on the Framework and Infrastructure of Marine Renewable Energy; an Energy Potential Assessment"
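    For reference, a commonly used deep-water approximation of wave power density (per metre of wave crest) is shown below; the assessment described above additionally applies finite-depth and current corrections, which this baseline form omits.

    \[
      P \;\approx\; \frac{\rho g^{2}}{64\pi}\, H_{m0}^{2}\, T_e ,
    \]

    where \(\rho\) is the sea-water density, \(g\) the gravitational acceleration, \(H_{m0}\) the significant wave height, and \(T_e\) the wave energy period.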

  12. Advances in ATLAS@Home towards a major ATLAS computing resource

    CERN Document Server

    Cameron, David; The ATLAS collaboration

    2018-01-01

    The volunteer computing project ATLAS@Home has been providing a stable computing resource for the ATLAS experiment since 2013. It has recently undergone some significant developments and, as a result, has become one of the largest resources contributing to ATLAS computing, by expanding its scope beyond traditional volunteers and into the exploitation of idle computing power in ATLAS data centres. Removing the need for virtualization on Linux and instead using container technology has significantly lowered the entry barrier for data centre participation, and in this paper we describe the implementation and results of this change. We also present other recent changes and improvements in the project. In early 2017 the ATLAS@Home project was merged into a combined LHC@Home platform, providing a unified gateway to all CERN-related volunteer computing projects. The ATLAS Event Service shifts data processing from file-level to event-level, and we describe how ATLAS@Home was incorporated into this new paradigm. The finishing...

  13. Critical phenomena in communication/computation networks with various topologies and suboptimal to optimal resource allocation

    Science.gov (United States)

    Cogoni, Marco; Busonera, Giovanni; Anedda, Paolo; Zanetti, Gianluigi

    2015-01-01

    We generalize previous studies on critical phenomena in communication networks [1,2] by adding computational capabilities to the nodes. In our model, a set of tasks with random origin, destination and computational structure is distributed on a computational network, modeled as a graph. By varying the temperature of a Metropolis Monte Carlo, we explore the global latency for optimal to suboptimal resource assignment at a given time instant. By computing the two-point correlation function for the local overload, we study the behavior of the correlation distance (both for links and nodes) while approaching the congested phase: a transition from a peaked to a spread g(r) is seen above a critical (Monte Carlo) temperature Tc. The average latency trend of the system is predicted by averaging over several network traffic realizations while maintaining spatially detailed information for each node: a sharp decrease in performance is found above Tc independently of the workload. The globally optimized computational resource allocation and network routing defines a baseline for a future comparison of the transition behavior with respect to existing routing strategies [3,4] for different network topologies.
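    To make the Metropolis step concrete, here is a minimal sketch under assumed definitions: a task-to-node assignment is perturbed at random, and a move that increases a (hypothetical) global latency function is accepted only with probability exp(-delta/T). The node count, task count, and latency function are stand-ins, not the model of the paper.

    # Minimal sketch: Metropolis exploration of task-to-node assignments.
    import math, random

    NUM_NODES = 16

    # Hypothetical latency: penalize nodes hosting many tasks (local overload).
    def latency(assignment):
        loads = [assignment.count(n) for n in range(NUM_NODES)]
        return sum(l * l for l in loads)

    def metropolis_step(assignment, temperature):
        task = random.randrange(len(assignment))
        old_node = assignment[task]
        before = latency(assignment)
        assignment[task] = random.randrange(NUM_NODES)  # propose a move
        delta = latency(assignment) - before
        if delta > 0 and random.random() >= math.exp(-delta / temperature):
            assignment[task] = old_node  # reject: restore previous node
        return assignment

    state = [random.randrange(NUM_NODES) for _ in range(64)]
    for temperature in (10.0, 1.0, 0.1):   # sweep from suboptimal to near-optimal
        for _ in range(1000):
            state = metropolis_step(state, temperature)
    print("final latency:", latency(state))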

  14. The NILE system architecture: fault-tolerant, wide-area access to computing and data resources

    International Nuclear Information System (INIS)

    Ricciardi, Aleta; Ogg, Michael; Rothfus, Eric

    1996-01-01

    NILE is a multi-disciplinary project building a distributed computing environment for HEP. It provides wide-area, fault-tolerant, integrated access to processing and data resources for collaborators of the CLEO experiment, though the goals and principles are applicable to many domains. NILE has three main objectives: a realistic distributed system architecture design, the design of a robust data model, and a Fast-Track implementation providing a prototype design environment which will also be used by CLEO physicists. This paper focuses on the software and wide-area system architecture design and the computing issues involved in making NILE services highly-available. (author)

  15. Tracking the Flow of Resources in Electronic Waste - The Case of End-of-Life Computer Hard Disk Drives.

    Science.gov (United States)

    Habib, Komal; Parajuly, Keshav; Wenzel, Henrik

    2015-10-20

    Recovery of resources, in particular metals, from waste flows is widely seen as a prioritized option to reduce their potential supply constraints in the future. The current waste electrical and electronic equipment (WEEE) treatment system is more focused on bulk metals, and the recycling rate of specialty metals, such as rare earths, is negligible compared to their increasing use in modern products such as electronics. This study investigates the challenges in recovering these resources in the existing WEEE treatment system, illustrated by following the material flows of resources in a conventional WEEE treatment plant in Denmark. Computer hard disk drives (HDDs) containing neodymium-iron-boron (NdFeB) magnets were selected as the case product for this experiment. The resulting output fractions were tracked until their final treatment in order to estimate the recovery potential of rare earth elements (REEs) and other resources contained in HDDs. The results show that out of the 244 kg of HDDs treated, 212 kg, comprising mainly aluminum and steel, can be finally recovered from the metallurgical process. The results further demonstrate the complete loss of REEs in the existing shredding-based WEEE treatment processes. Dismantling and separate processing of NdFeB magnets from their end-use products can be a preferable option to shredding; however, it remains a technological and logistic challenge for the existing system.

  16. Discharge Fee Policy Analysis: A Computable General Equilibrium (CGE) Model of Water Resources and Water Environments

    OpenAIRE

    Guohua Fang; Ting Wang; Xinyi Si; Xin Wen; Yu Liu

    2016-01-01

    To alleviate increasingly serious water pollution and shortages in developing countries, various kinds of policies have been implemented by local governments. It is vital to quantify and evaluate the performance and potential economic impacts of these policies. This study develops a Computable General Equilibrium (CGE) model to simulate the regional economic and environmental effects of discharge fees. Firstly, water resources and water environment factors are separated from the input and out...

  17. Reducing usage of the computational resources by event driven approach to model predictive control

    Science.gov (United States)

    Misik, Stefan; Bradac, Zdenek; Cela, Arben

    2017-08-01

    This paper deals with real-time optimal control of dynamic systems, while also considering the constraints to which these systems might be subject. The main objective of this work is to propose a simple modification of the existing Model Predictive Control approach to better suit the needs of computationally resource-constrained real-time systems. An example using a model of a mechanical system is presented, and the performance of the proposed method is evaluated in a simulated environment.
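    A minimal sketch of the event-driven idea, under assumptions: the expensive MPC optimization is re-solved only when the measured state drifts from the model prediction beyond a threshold (or the stored plan runs out); otherwise the previously computed input sequence is replayed. The scalar plant, the threshold, and the placeholder solver are all hypothetical, not the paper's formulation.

    # Minimal sketch: event-triggered re-solving of an MPC plan.
    import numpy as np

    def solve_mpc(x0, horizon=10):
        """Placeholder for a full MPC solve; here a simple proportional plan."""
        return [-0.5 * x0 for _ in range(horizon)]  # hypothetical input sequence

    A, B = 0.95, 1.0       # scalar plant x+ = A x + B u (assumed model)
    threshold = 0.05       # event-trigger threshold (assumed)
    x, x_pred = 1.0, 1.0
    plan, k, solves = solve_mpc(x), 0, 1

    for step in range(50):
        if abs(x - x_pred) > threshold or k >= len(plan):
            plan, k = solve_mpc(x), 0   # event: re-solve from the current state
            solves += 1
        u = plan[k]; k += 1
        x_pred = A * x + B * u                           # model prediction
        x = A * x + B * u + np.random.normal(0, 0.01)    # noisy true plant
    print("MPC solves used:", solves, "out of 50 steps")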

  18. Piping data bank and erection system of Angra 2: structure, computational resources and systems

    International Nuclear Information System (INIS)

    Abud, P.R.; Court, E.G.; Rosette, A.C.

    1992-01-01

    The piping data bank of Angra 2, called the Erection Management System, was developed to manage the piping erection of the Angra 2 nuclear power plant. Beyond the erection follow-up of piping and supports, it manages the piping design, material procurement, the flow of fabrication documents, weld testing, and material stocks at the warehouse. The work carried out to define the structure of the data bank, the computational resources, and the systems is described here. (author)

  19. Blockchain-Empowered Fair Computational Resource Sharing System in the D2D Network

    Directory of Open Access Journals (Sweden)

    Zhen Hong

    2017-11-01

    Full Text Available Device-to-device (D2D) communication is becoming an increasingly important technology in future networks with the climbing demand for local services. For instance, resource sharing in the D2D network features ubiquitous availability, flexibility, low latency and low cost. However, these features also bring along challenges when building a satisfactory resource sharing system in the D2D network. Specifically, user mobility is one of the top concerns for designing a cooperative D2D computational resource sharing system, since mutual communication may not be stably available due to user mobility. A previous endeavour has demonstrated and proven how connectivity can be incorporated into cooperative task scheduling among users in the D2D network to effectively lower average task execution time. There are doubts about whether this type of task scheduling scheme, though effective, ensures fairness among users. In other words, it can be unfair to users who contribute many computational resources while receiving little when in need. In this paper, we propose a novel blockchain-based credit system that can be incorporated into the connectivity-aware task scheduling scheme to enforce fairness among users in the D2D network. Users' computational task cooperation will be recorded on the public blockchain ledger in the system as transactions, and each user's credit balance can be easily accessed from the ledger. A supernode at the base station is responsible for scheduling cooperative computational tasks based on user mobility and user credit balance. We investigated the performance of the credit system, and simulation results showed that with a minor sacrifice of average task execution time, the level of fairness obtains a major enhancement.
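    A toy sketch of the credit idea follows: cooperation events are recorded as hash-chained transactions, and a user's credit balance is derived from the ledger. This illustrates the concept only; the field names and crediting rule are hypothetical, not the protocol of the paper.

    # Toy sketch: a hash-chained ledger of cooperation transactions.
    import hashlib, json

    class Ledger:
        def __init__(self):
            self.blocks = []  # each block stores one cooperation transaction

        def record(self, helper, requester, credits):
            prev_hash = self.blocks[-1]["hash"] if self.blocks else "0" * 64
            body = {"helper": helper, "requester": requester,
                    "credits": credits, "prev": prev_hash}
            body["hash"] = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest()
            self.blocks.append(body)

        def balance(self, user):
            # Credits earned by helping minus credits spent requesting help.
            return sum(b["credits"] if b["helper"] == user else
                       -b["credits"] if b["requester"] == user else 0
                       for b in self.blocks)

    ledger = Ledger()
    ledger.record(helper="alice", requester="bob", credits=3)
    ledger.record(helper="bob", requester="alice", credits=1)
    print(ledger.balance("alice"))  # 2: earned 3 helping, spent 1 requesting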

  20. EDDYMULT: a computing system for solving eddy current problems in a multi-torus system

    International Nuclear Information System (INIS)

    Nakamura, Yukiharu; Ozeki, Takahisa

    1989-03-01

    A new computing system, EDDYMULT, based on the finite element circuit method, has been developed to solve actual eddy current problems in a multi-torus system consisting of many torus-conductors and various kinds of axisymmetric poloidal field coils. The EDDYMULT computing system can deal three-dimensionally with the modal decomposition of eddy currents in a multi-torus system, the transient behaviour of eddy current distributions, and the resultant magnetic field. Users can therefore apply the computing system to the solution of eddy current problems in a tokamak fusion device, such as the design of poloidal field coil power supplies, the mechanical stress design against intensive electromagnetic loading on device components, and the control analysis of plasma position. The present report gives a detailed description of the EDDYMULT system as a user's manual: 1) theory, 2) structure of the code system, 3) input description, 4) problem restrictions, 5) description of the subroutines, etc. (author)

  1. Collocational Relations in Japanese Language Textbooks and Computer-Assisted Language Learning Resources

    Directory of Open Access Journals (Sweden)

    Irena SRDANOVIĆ

    2011-05-01

    Full Text Available In this paper, we explore the presence of collocational relations in computer-assisted language learning systems and other language resources for Japanese, on the one hand, and in Japanese language learning textbooks and wordlists, on the other. After introducing the importance of learning collocational relations in a foreign language, we examine their coverage in various learners' resources for Japanese. We concentrate in particular on a few collocations at the beginner's level and demonstrate their treatment across the various resources. Special attention is paid to what are referred to as unpredictable collocations, which carry a greater foreign-language learning burden than predictable ones.

  2. Resource allocation on computational grids using a utility model and the knapsack problem

    CERN Document Server

    Van der ster, Daniel C; Parra-Hernandez, Rafael; Sobie, Randall J

    2009-01-01

    This work introduces a utility model (UM) for resource allocation on computational grids and formulates the allocation problem as a variant of the 0–1 multichoice multidimensional knapsack problem. The notion of task-option utility is introduced, and it is used to effect allocation policies. We present a variety of allocation policies, which are expressed as functions of metrics that are both intrinsic and external to the task and resources. An external user-defined credit-value metric is shown to allow users to intervene in the allocation of urgent or low priority tasks. The strategies are evaluated in simulation against random workloads as well as those drawn from real systems. We measure the sensitivity of the UM-derived schedules to variations in the allocation policies and their corresponding utility functions. The UM allocation strategy is shown to optimally allocate resources congruent with the chosen policies.
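    The 0-1 knapsack core of such an allocation problem can be illustrated with a short dynamic-programming sketch; the task utilities, CPU costs, and budget below are hypothetical, and the paper's multichoice multidimensional variant generalizes this to several options per task and several resource dimensions.

    # Minimal sketch: admit tasks on a CPU budget to maximize total utility.
    def knapsack(utilities, costs, budget):
        # best[c] = maximum utility achievable with capacity c
        best = [0] * (budget + 1)
        for u, w in zip(utilities, costs):
            for c in range(budget, w - 1, -1):  # reverse scan keeps items 0-1
                best[c] = max(best[c], best[c - w] + u)
        return best[budget]

    utilities = [60, 100, 120]   # task-option utilities (hypothetical)
    costs = [10, 20, 30]         # CPU-hours per task (hypothetical)
    print(knapsack(utilities, costs, budget=50))  # -> 220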

  3. Classroom demonstration: Foucault's currents explored by the computer hard disc (HD)

    Directory of Open Access Journals (Sweden)

    Jorge Roberto Pimentel

    2008-09-01

    Full Text Available This paper makes an experimental exploration of Foucault's currents (eddy currents) through a rotor magnetically coupled to a computer hard disc (HD) that is no longer in use. The set-up allows electromagnetism classes in high schools to be illustrated in a stimulating way, through qualitative observations of the currents created as a consequence of the movement of an electric conductor in a region where a magnetic field exists.

  4. Exploring the current application of professional competencies in human resource management in the South African context

    Directory of Open Access Journals (Sweden)

    Nico Schutte

    2015-11-01

    Full Text Available Orientation: Human resource (HR) practitioners have an important role to play in the sustainability and competitiveness of organisations, yet their strategic contribution and the value they add remain unrecognised. Research purpose: The main objective of this research was to explore the extent to which HR practitioners are currently allowed to display HR competencies in the workplace, and whether any significant differences exist between perceived HR competencies, based on the respondents' demographic characteristics. Motivation for the study: Limited empirical research exists on the extent to which HR practitioners are allowed to display key competencies in the South African workplace. Research approach, design, and method: A quantitative research approach was followed. A Human Resource Management Professional Competence Questionnaire was administered to HR practitioners and managers (N = 481). Main findings: The results showed that HR competencies are poorly applied in selected South African workplaces. The competencies with the poorest application were talent management, HR metrics, HR business knowledge, and innovation. The white ethnic group experienced a poorer application of all human resource management (HRM) competencies compared to the black African ethnic group. Practical/managerial implications: The findings highlight the need for management to evaluate the current application of HR practices in the workplace, and also the extent to which HR professionals are involved as strategic business partners. Contribution/value-add: This research highlights the need for the current application of HR competencies in South African workplaces to be improved.

  5. The Trope Tank: A Laboratory with Material Resources for Creative Computing

    Directory of Open Access Journals (Sweden)

    Nick Montfort

    2014-12-01

    Full Text Available http://dx.doi.org/10.5007/1807-9288.2014v10n2p53 Principles for organizing and making use of a laboratory with material computing resources are articulated. This laboratory, the Trope Tank, is a facility for teaching, research, and creative collaboration, and offers hardware (in working condition and set up for use) from the 1970s, 1980s, and 1990s, including videogame systems, home computers, and an arcade cabinet. To aid in investigating the material history of texts, the lab has a small 19th-century letterpress, a typewriter, a print terminal, and dot-matrix printers. Other resources include controllers, peripherals, manuals, books, and software on physical media. These resources are used for teaching, loaned for local exhibitions and presentations, and accessed by researchers and artists. The space is primarily a laboratory (rather than a library, studio, or museum), so materials are organized by platform and intended use. Textual information about the historical contexts of the available systems is provided, and resources are set up to allow easy operation, and even casual use, by researchers, teachers, students, and artists.

  6. Testing a computer-based ostomy care training resource for staff nurses.

    Science.gov (United States)

    Bales, Isabel

    2010-05-01

    Fragmented teaching and ostomy care provided by nonspecialized clinicians unfamiliar with state-of-the-art care and products have been identified as problems in teaching ostomy care to the new ostomate. After conducting a literature review of theories and concepts related to the impact of nurse behaviors and confidence on ostomy care, the author developed a computer-based learning resource and assessed its effect on staff nurse confidence. Of 189 staff nurses with a minimum of 1 year acute-care experience employed in the acute care, emergency, and rehabilitation departments of an acute care facility in the Midwestern US, 103 agreed to participate and returned completed pre- and post-tests, each comprising the same eight statements about providing ostomy care. F and P values were computed for differences between pre- and post-test scores. Based on a scale where 1 = totally disagree and 5 = totally agree with the statement, baseline confidence and perceived knowledge scores averaged 3.8; after viewing the resource program, post-test mean scores averaged 4.51, a statistically significant improvement (P = 0.000). The largest difference between pre- and post-test scores involved feeling confident in having the resources to learn ostomy skills independently. The availability of an electronic ostomy care resource was rated highly in both pre- and post-testing. Studies to assess the effects of increased confidence and knowledge on the quality and provision of care are warranted.

  7. Current situation of the facilities, equipments and human resources in nuclear medicine in Argentina

    International Nuclear Information System (INIS)

    Chiliutti, Claudia A.

    2008-01-01

    The current situation of nuclear medicine in Argentina, taking into account the facilities, their equipment and the human resources available, is presented in this paper. A review and analysis of the equipment, including technical characteristics, and a survey of the professionals and technicians of the area were carried out. In Argentina, there are 266 nuclear medicine centers distributed all over the country. The operating licenses are granted by the Nuclear Regulatory Authority (Autoridad Regulatoria Nuclear - ARN). Forty-four percent of the installed equipment are SPECT systems with 1 or 2 heads and 39.4% are gamma cameras. In addition, there are eleven PET scanners operating in Argentina. There are 416 nuclear medicine physicians with individual permits for diagnostic purposes, and 50% of them also hold individual permits for treatment purposes. With the purpose of analyzing the regional distribution of the available resources in nuclear medicine, the country was divided into 7 geographical regions: City of Buenos Aires, Province of Buenos Aires, Pampa, Cuyo, Northeast, Northwest and Patagonia. From the analysis of the gathered information it is possible to conclude that nuclear medicine equipment as well as personnel present an irregular distribution, with a major concentration in the City of Buenos Aires and the Province of Buenos Aires. The Northeast region presents the lowest number of nuclear medicine centers and the Patagonia region has the lowest number of nuclear medicine physicians with individual permits. The number of SPECT and gamma cameras is 7.3 per million inhabitants. The information about the available resources in nuclear medicine presented in this paper, and its comparison with the international information available, provides elements for better planning of future activities in the area, not only for the operators but also from the regulatory point of view. (author)

  8. [Current situation of human resources of parasitic disease control and prevention organizations in Henan Province].

    Science.gov (United States)

    Ya-Lan, Zhang; Yan-Kun, Zhu; Wei-Qi, Chen; Yan, Deng; Peng, Li

    2018-01-10

    To understand the current status of human resources in parasitic disease control and prevention organizations in Henan Province, so as to provide a reference for promoting the overall capability for the prevention and control of parasitic diseases in Henan Province. Questionnaires were designed and a census method was adopted. Information such as the numbers, majors, educational background, technical titles, working years, and turnover in each parasitic disease control and prevention organization was collected by the centers for disease control and prevention (CDCs) at all levels. The data were descriptively analyzed. Totally, 179 CDCs were investigated, of which only 19.0% (34/179) had an independent parasitic disease control institution (department). There were only 258 full-time staff members working on parasitic disease control and prevention in the whole province, of whom only 61.9% (159/258) were health professionals. Those with a junior college degree or below accounted for 60.3% (96/159) of the health professionals. The largest group (42.1%) had over 20 years of experience, but 57.9% (92/159) held technical post titles at the primary level or below. The proportion of health professionals in the parasitic disease control and prevention organizations in Henan Province is low. The human resource construction for parasitic disease control and prevention at all levels should be strengthened.

  9. Experiences on current national income measures with reference to environmental and natural resources

    International Nuclear Information System (INIS)

    Franzese, R.; Gaudioso, D.

    1995-06-01

    The environment provides both a source of goods and services and a 'sink' for residues of the production and consumption processes. This is not reflected in conventional estimates of GDP (gross domestic product), the most commonly used measure of aggregate income. The purpose of this paper is to explore whether environmentally-adjusted national income measures can be derived. In the first part, the authors discuss both the shortcomings of the current national income measures with reference to environmental and natural resources, and the debate on these issues; they then analyse the existing experiences in providing environmentally-adjusted indicators of national accounts. In the second part, the authors present an evaluation of the costs of environmental degradation in Italy in the period 1988-1990, based on the methodologies adopted in a pilot study carried out by UNSO (United Nations Statistical Office) and the World Bank for Mexico.

  10. Computation of quantum electron transport with local current conservation using quantum trajectories

    International Nuclear Information System (INIS)

    Alarcón, A; Oriols, X

    2009-01-01

    A recent proposal for modeling time-dependent quantum electron transport with Coulomb and exchange correlations using quantum (Bohm) trajectories (Oriols 2007 Phys. Rev. Lett. 98 066803) is extended towards the computation of the total (particle plus displacement) current in mesoscopic devices. In particular, two different methods for the practical computation of the total current are compared. The first method computes the particle and the displacement currents from the rate of Bohm particles crossing a particular surface and the time-dependent variations of the electric field there. The second method uses the Ramo–Shockley theorem to compute the total current on that surface from the knowledge of the Bohm particle dynamics in a 3D volume and the time-dependent variations of the electric field on the boundaries of that volume. From a computational point of view, it is shown that both methods achieve local current conservation, but the second is preferred because it is free from 'spurious' peaks. A numerical example, a Bohm trajectory crossing a double-barrier tunneling structure, is presented, supporting the conclusions
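    For reference, the Ramo-Shockley theorem used by the second method can be written as follows (sign conventions vary by author): the current induced on electrode k by a charge q at position r(t) with velocity v(t) is obtained from the weighting field, computed with electrode k held at unit potential and all other electrodes grounded.

    \[
      i_k(t) \;=\; q\,\mathbf{v}(t)\cdot\mathbf{E}_w^{k}\bigl(\mathbf{r}(t)\bigr),
      \qquad
      \mathbf{E}_w^{k} = -\nabla \phi_w^{k},\quad
      \phi_w^{k}\big|_{\text{electrode }k} = 1,\quad
      \phi_w^{k}\big|_{\text{others}} = 0 .
    \]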

  11. Optimizing qubit resources for quantum chemistry simulations in second quantization on a quantum computer

    International Nuclear Information System (INIS)

    Moll, Nikolaj; Fuhrer, Andreas; Staar, Peter; Tavernelli, Ivano

    2016-01-01

    Quantum chemistry simulations on a quantum computer suffer from the overhead needed for encoding the Fermionic problem in a system of qubits. By exploiting the block diagonality of a Fermionic Hamiltonian, we show that the number of required qubits can be reduced while the number of terms in the Hamiltonian will increase. All operations for this reduction can be performed in operator space. The scheme is conceived as a pre-computational step that would be performed prior to the actual quantum simulation. We apply this scheme to reduce the number of qubits necessary to simulate both the Hamiltonian of the two-site Fermi–Hubbard model and the hydrogen molecule. Both quantum systems can then be simulated with a two-qubit quantum computer. Despite the increase in the number of Hamiltonian terms, the scheme still remains a useful tool to reduce the dimensionality of specific quantum systems for quantum simulators with a limited number of resources. (paper)

  12. Applying computer modeling to eddy current signal analysis for steam generator and heat exchanger tube inspections

    International Nuclear Information System (INIS)

    Sullivan, S.P.; Cecco, V.S.; Carter, J.R.; Spanner, M.; McElvanney, M.; Krause, T.W.; Tkaczyk, R.

    2000-01-01

    Licensing requirements for eddy current inspections of nuclear steam generators and heat exchangers are becoming increasingly stringent. The traditional industry-standard method of comparing inspection signals with flaw signals from simple in-line calibration standards is proving to be inadequate. A more complete understanding of eddy current and magnetic field interactions with flaws and other anomalies is required for the industry to generate consistently reliable inspections. Computer modeling is a valuable tool for improving the reliability of eddy current signal analysis. Results from computer modeling are helping inspectors to properly discriminate between real flaw signals and false calls, and improving reliability in flaw sizing. This presentation will discuss complementary eddy current computer modeling techniques such as the Finite Element Method (FEM), the Volume Integral Method (VIM), the Layer Approximation and other analytic methods. Each of these methods has advantages and limitations. An extension of the Layer Approximation to model eddy current probe responses to ferromagnetic materials will also be presented. Finally, examples will be discussed demonstrating how some significant eddy current signal analysis problems have been resolved using appropriate electromagnetic computer modeling tools.

  13. Using ecological thresholds to inform resource management: current options and future possibilities

    Directory of Open Access Journals (Sweden)

    Melissa M Foley

    2015-11-01

    Full Text Available In the face of growing human impacts on ecosystems, scientists and managers recognize the need to better understand thresholds and nonlinear dynamics in ecological systems to help set management targets. However, our understanding of the factors that drive threshold dynamics, and when and how rapidly thresholds will be crossed is currently limited in many systems. In spite of these limitations, there are approaches available to practitioners today—including ecosystem monitoring, statistical methods to identify thresholds and indicators, and threshold-based adaptive management—that can be used to help avoid ecological thresholds or restore systems that have crossed them. We briefly review the current state of knowledge and then use real-world examples to demonstrate how resource managers can use available approaches to avoid crossing ecological thresholds. We also highlight new tools and indicators being developed that have the potential to enhance our ability to detect change, predict when a system is approaching an ecological threshold, or restore systems that have already crossed a tipping point.

  14. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  15. Pedagogical Utilization and Assessment of the Statistic Online Computational Resource in Introductory Probability and Statistics Courses.

    Science.gov (United States)

    Dinov, Ivo D; Sanchez, Juana; Christou, Nicolas

    2008-01-01

    Technology-based instruction represents a recent pedagogical paradigm that is rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational and computational tools with classical pedagogical instruments. Consequently, teaching with technology typically involves utilizing a variety of IT and multimedia resources for online learning, course management, electronic course materials, and novel tools for communication, engagement, experimentation, critical thinking and assessment. The NSF-funded Statistics Online Computational Resource (SOCR) provides a number of interactive tools for enhancing instruction in various undergraduate and graduate courses in probability and statistics. These resources include online instructional materials, statistical calculators, interactive graphical user interfaces, computational and simulation applets, and tools for data analysis and visualization. The tools provided as part of SOCR include conceptual simulations and statistical computing interfaces, which are designed to bridge between the introductory and the more advanced computational and applied probability and statistics courses. In this manuscript, we describe our designs for utilizing SOCR technology in instruction in a recent study. In addition, we present the results on the effectiveness of using SOCR tools at two different course intensity levels on three outcome measures: exam scores, student satisfaction and choice of technology to complete assignments. Learning styles assessment was completed at baseline. We used three very different designs for three different undergraduate classes. Each course included a treatment group, using the SOCR resources, and a control group, using classical instruction techniques. Our findings include marginal effects of the SOCR treatment per individual...

  16. A Safety Resource Allocation Mechanism against Connection Fault for Vehicular Cloud Computing

    Directory of Open Access Journals (Sweden)

    Tianpeng Ye

    2016-01-01

    Full Text Available The Intelligent Transportation System (ITS) becomes an important component of the smart city toward safer roads, better traffic control, and on-demand service by utilizing and processing the information collected from sensors of vehicles and road side infrastructure. In ITS, Vehicular Cloud Computing (VCC) is a novel technology balancing the requirement of complex services and the limited capability of on-board computers. However, the behaviors of the vehicles in VCC are dynamic, random, and complex. Thus, one of the key safety issues is the frequent disconnection between the vehicle and the Vehicular Cloud (VC) while this vehicle is computing for a service. More importantly, a connection fault will seriously disturb the normal services of VCC and affect the safety functions of the transportation system. In this paper, a safety resource allocation mechanism is proposed against connection faults in VCC, using a modified workflow with prediction capability. We first propose a probability model for vehicle movement which satisfies the high-dynamics and real-time requirements of VCC. We then propose a Prediction-based Reliability Maximization Algorithm (PRMA) to realize the safety resource allocation for VCC. The evaluation shows that our mechanism can improve the reliability and guarantee the real-time performance of the VCC.
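
    The abstract does not spell out PRMA itself, so the following is only a toy sketch of prediction-based, reliability-aware allocation: each vehicle's dwell time in the cloud is assumed exponential with a predicted mean, and tasks are greedily placed on the vehicles most likely to stay connected for the task's runtime. All names, the dwell-time model and the numbers are invented for illustration.

      import math

      # Toy sketch of prediction-based, reliability-aware task allocation in a
      # vehicular cloud. Assumption (not from the paper): a task succeeds only
      # if its host vehicle stays connected for the whole run time, and dwell
      # times are exponential with a predicted mean.

      def connection_reliability(mean_dwell_s, task_runtime_s):
          """P(vehicle stays connected for the task) under an exponential model."""
          return math.exp(-task_runtime_s / mean_dwell_s)

      def allocate(tasks, vehicles):
          """Greedy: give each task (longest first) the free vehicle that
          maximizes its predicted completion reliability."""
          free = dict(vehicles)                 # vehicle id -> predicted mean dwell (s)
          plan = {}
          for task_id, runtime in sorted(tasks.items(), key=lambda t: -t[1]):
              if not free:
                  break
              best = max(free, key=lambda v: connection_reliability(free[v], runtime))
              plan[task_id] = (best, connection_reliability(free[best], runtime))
              del free[best]
          return plan

      tasks = {"map-update": 120.0, "video-transcode": 300.0}      # runtimes in s
      vehicles = {"car-A": 600.0, "car-B": 150.0, "car-C": 900.0}  # predicted dwell
      print(allocate(tasks, vehicles))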

  17. Uniform physical theory of diffraction equivalent edge currents for implementation in general computer codes

    DEFF Research Database (Denmark)

    Johansen, Peter Meincke

    1996-01-01

    New uniform closed-form expressions for physical theory of diffraction equivalent edge currents are derived for truncated incremental wedge strips. In contrast to previously reported expressions, the new expressions are well-behaved for all directions of incidence and observation and take a finite value for zero strip length. Consequently, the new equivalent edge currents are, to the knowledge of the author, the first that are well-suited for implementation in general computer codes.

  18. Current state and future direction of computer systems at NASA Langley Research Center

    Science.gov (United States)

    Rogers, James L. (Editor); Tucker, Jerry H. (Editor)

    1992-01-01

    Computer systems have advanced at a rate unmatched by any other area of technology. As performance has dramatically increased, there has been an equally dramatic reduction in cost. This constant cost-performance improvement has precipitated the pervasiveness of computer systems into virtually all areas of technology. This improvement is due primarily to advances in microelectronics. Most people are now convinced that the new generation of supercomputers will be built using a large number (possibly thousands) of high performance microprocessors. Although the spectacular improvements in computer systems have come about because of these hardware advances, there has also been a steady improvement in software techniques. In an effort to understand how these hardware and software advances will affect research at NASA LaRC, the Computer Systems Technical Committee drafted this white paper to examine the current state and possible future directions of computer systems at the Center. This paper discusses selected important areas of computer systems including real-time systems, embedded systems, high performance computing, distributed computing networks, data acquisition systems, artificial intelligence, and visualization.

  19. Improved Cloud resource allocation: how INDIGO-DataCloud is overcoming the current limitations in Cloud schedulers

    Science.gov (United States)

    Lopez Garcia, Alvaro; Zangrando, Lisa; Sgaravatto, Massimo; Llorens, Vincent; Vallero, Sara; Zaccolo, Valentina; Bagnasco, Stefano; Taneja, Sonia; Dal Pra, Stefano; Salomoni, Davide; Donvito, Giacinto

    2017-10-01

    Performing efficient resource provisioning is a fundamental aspect for any resource provider. Local Resource Management Systems (LRMS) have been used in data centers for decades in order to obtain the best usage of the resources, providing their fair usage and partitioning for the users. In contrast, current cloud schedulers are normally based on the immediate allocation of resources on a first-come, first-served basis, meaning that a request will fail if there are no resources (e.g. OpenStack) or will be trivially queued, ordered by entry time (e.g. OpenNebula). Moreover, these scheduling strategies are based on a static partitioning of the resources, meaning that existing quotas cannot be exceeded, even if there are idle resources allocated to other projects. This is a consequence of the fact that cloud instances are not associated with a maximum execution time, and it leads to a situation where the resources are under-utilized. The INDIGO-DataCloud project has identified these strategies as too simplistic for accommodating scientific workloads efficiently, leading to an underutilization of the resources, an undesirable situation in scientific data centers. In this work, we will present the work done in the scheduling area during the first year of the INDIGO project and the foreseen evolutions.
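
    To make the contrast concrete, here is a toy fair-share queue in the spirit of the LRMS behaviour the record describes, prioritizing requests from projects furthest below their target share instead of pure first-come, first-served; the priority rule and all names are illustrative assumptions, not INDIGO code.

      import heapq
      from collections import defaultdict

      # Toy fair-share queue: pending requests are ordered by how far their
      # project is below its target share of past usage, with submission
      # order as a tie-breaker. Priorities are fixed at submit time here;
      # a real LRMS would recompute them continuously.

      class FairShareQueue:
          def __init__(self, shares):
              self.shares = shares              # project -> target share (sums to 1)
              self.used = defaultdict(float)    # project -> CPU-hours consumed
              self.total = 0.0
              self.queue = []
              self._seq = 0                     # FIFO tie-breaker

          def _deficit(self, project):
              if self.total == 0:
                  return self.shares[project]
              return self.shares[project] - self.used[project] / self.total

          def submit(self, project, cpu_hours):
              heapq.heappush(self.queue,
                             (-self._deficit(project), self._seq, project, cpu_hours))
              self._seq += 1

          def pop(self):
              _, _, project, cpu_hours = heapq.heappop(self.queue)
              self.used[project] += cpu_hours
              self.total += cpu_hours
              return project, cpu_hours

      q = FairShareQueue({"cms": 0.5, "alice": 0.3, "lhcb": 0.2})
      for job in [("cms", 10), ("cms", 10), ("alice", 5), ("lhcb", 2)]:
          q.submit(*job)
      while q.queue:
          print(q.pop())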

  20. Integrating resource efficiency and EU State aid. An evaluation of resource efficiency considerations in the current EU State aid framework

    Energy Technology Data Exchange (ETDEWEB)

    Bennink, D.; Faber, J.; Smit, M. [CE Delft, Delft (Netherlands); Goba, V. [SIA Estonian, Latvian and Lithuanian Environment ELLE, Tallinn (Estonia); Miller, K.; Williams, E. [AEA Technology plc, London (United Kingdom)

    2012-10-15

    This study, for the European Commission, analyses the issues that need to be addressed in the revision of the EU State aid framework to ensure that they do not hinder environmental, resource efficiency and sustainable development goals. In some cases, State aid can be considered an environmentally harmful subsidy (EHS). The study analyses (1) the extent to which the Environmental Aid Guidelines (EAG) need to be changed to take into account recent European environmental policy developments; (2) existing and potential resource efficiency considerations in a) the Regional Aid Guidelines; b) the Research, Development and Innovation (RDI) Guidelines and c) the Agriculture and Forestry Guidelines; assesses cases and schemes using these guidelines to identify whether resource efficiency considerations are taken into account. The study also considers the social, environmental and economic impacts of these cases and schemes. It develops recommendations for the review of the EAG and a number of horizontal guidelines. One of the conclusions of the analysis is that the way in which multiple objectives and impacts are balanced, when deciding to approve state aid, is unclear. Also, EU member states are not required to provide information on certain types of (estimated) impacts. To guarantee that multiple objectives and impacts are sufficiently balanced, it is recommended that the State aid framework prescribes that applicants identify social, economic and environmental objectives and impacts and describe how these are taken into account in the procedure of balancing multiple (conflicting) objectives. Objectives and impacts should be quantified as much as possible, for example by making use of the method of external cost calculation laid down in 'the Handbook on estimation of external costs in the transport Sector'. The results of the study are used by the European Commission as an input for evaluating and improving the EU State aid framework.

  2. An Architecture of IoT Service Delegation and Resource Allocation Based on Collaboration between Fog and Cloud Computing

    Directory of Open Access Journals (Sweden)

    Aymen Abdullah Alsaffar

    2016-01-01

    Full Text Available Despite the wide utilization of cloud computing (e.g., services, applications, and resources), some services, applications, and smart devices are not able to fully benefit from this attractive cloud computing paradigm due to the following issues: (1) smart devices might be lacking in capacity (e.g., processing, memory, storage, battery, and resource allocation), (2) they might be lacking in network resources, and (3) the high network latency to a centralized server in the cloud might not be efficient for delay-sensitive applications, services, and resource allocation requests. Fog computing is a promising paradigm that can extend cloud resources to the edge of the network, solving the abovementioned issues. As a result, in this work, we propose an architecture of IoT service delegation and resource allocation based on collaboration between fog and cloud computing. We provide a new algorithm, built on the decision rules of a linearized decision tree over three conditions (service size, completion time, and VM capacity), for managing and delegating user requests in order to balance the workload. Moreover, we propose an algorithm to allocate resources to meet service level agreement (SLA) and quality of service (QoS) requirements, as well as to optimize big data distribution in fog and cloud computing. Our simulation results show that our proposed approach can efficiently balance the workload, improve resource allocation efficiency, optimize big data distribution, and show better performance than other existing methods.
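
    As a hedged illustration of the delegation step, the sketch below encodes decision rules over the three conditions named in the abstract (service size, completion time, VM capacity); the thresholds and rule order are invented, since the paper's actual linearized decision tree is not given here.

      # Toy decision rules for delegating a request to fog or cloud, loosely
      # following the three conditions named in the record. All thresholds
      # are invented placeholders.

      def delegate(service_size_mb, deadline_s, fog_vm_free_mb):
          if service_size_mb > fog_vm_free_mb:
              return "cloud"             # fog VMs cannot hold the service
          if deadline_s < 1.0:
              return "fog"               # delay-sensitive: avoid WAN latency
          if service_size_mb < 50:
              return "fog"               # small services stay at the edge
          return "cloud"                 # large, delay-tolerant work goes up

      print(delegate(service_size_mb=20, deadline_s=0.2, fog_vm_free_mb=512))   # fog
      print(delegate(service_size_mb=800, deadline_s=30, fog_vm_free_mb=512))   # cloud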

  3. Elastic Extension of a CMS Computing Centre Resources on External Clouds

    Science.gov (United States)

    Codispoti, G.; Di Maria, R.; Aiftimiei, C.; Bonacorsi, D.; Calligola, P.; Ciaschini, V.; Costantini, A.; Dal Pra, S.; DeGirolamo, D.; Grandi, C.; Michelotto, D.; Panella, M.; Peco, G.; Sapunenko, V.; Sgaravatto, M.; Taneja, S.; Zizzi, G.

    2016-10-01

    After the successful LHC data taking in Run-I and in view of the future runs, the LHC experiments are facing new challenges in the design and operation of the computing facilities. The computing infrastructure for Run-II is dimensioned to cope at most with the average amount of data recorded. The usage peaks, as already observed in Run-I, may however originate large backlogs, thus delaying the completion of the data reconstruction and ultimately the data availability for physics analysis. In order to cope with the production peaks, CMS - along the lines followed by other LHC experiments - is exploring the opportunity to access Cloud resources provided by external partners or commercial providers. Specific use cases have already been explored and successfully exploited during Long Shutdown 1 (LS1) and the first part of Run 2. In this work we present the proof of concept of the elastic extension of a CMS site, specifically the Bologna Tier-3, on an external OpenStack infrastructure. We focus on the “Cloud Bursting” of a CMS Grid site using a newly designed LSF configuration that allows the dynamic registration of new worker nodes to LSF. In this approach, the dynamically added worker nodes instantiated on the OpenStack infrastructure are transparently accessed by the LHC Grid tools and at the same time serve as an extension of the farm for local usage. The amount of resources allocated can thus be elastically adjusted to cope with the needs of the CMS experiment and local users. Moreover, direct access/integration of OpenStack resources into the CMS workload management system is explored. In this paper we present this approach, report on the performance of the on-demand allocated resources, and discuss the lessons learned and the next steps.

  4. Current status of the HAL/S compiler on the Modcomp classic 7870 computer

    Science.gov (United States)

    Lytle, P. J.

    1981-01-01

    A brief history of the HAL/S language, including the experience of other users of the language at the Jet Propulsion Laboratory, is presented. The current status of the compiler, as implemented on the Modcomp 7870 Classic computer, and future applications in the Deep Space Network (DSN) are discussed. The primary applications in the DSN will be in the Mark IVA network.

  5. Computation of magnetic fields within source regions of ionospheric and magnetospheric currents

    DEFF Research Database (Denmark)

    Engels, U.; Olsen, Nils

    1998-01-01

    A general method of computing the magnetic effect caused by a predetermined three-dimensional external current density is presented. It takes advantage of the representation of solenoidal vector fields in terms of toroidal and poloidal modes expressed by two independent series of spherical harmonics...
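
    For reference, the standard toroidal-poloidal (Mie) representation that such a method builds on can be written as follows (textbook form; the exact conventions of this paper may differ):

      \mathbf{J} = \nabla \times \big( T\,\mathbf{r} \big) + \nabla \times \nabla \times \big( P\,\mathbf{r} \big),
      \qquad
      T(r,\theta,\phi) = \sum_{n,m} t_n^m(r)\, Y_n^m(\theta,\phi), \quad
      P(r,\theta,\phi) = \sum_{n,m} p_n^m(r)\, Y_n^m(\theta,\phi),

    so that any solenoidal current density is determined by the two independent series of coefficients t_n^m and p_n^m.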

  6. Cryptography in the Cloud Computing: the Current State and Logical Tasks

    OpenAIRE

    Sergey Nikolaevich Kyazhin; Andrey Vladimirovich Moiseev

    2013-01-01

    The current state of cloud computing (CC) information security is analysed and logical problems of storage and data transmission security in CC are identified. Cryptographic methods of data security in CC, in particular lightweight cryptography and cryptography based on bilinear pairings, are described.

  7. Cryptography in the Cloud Computing: the Current State and Logical Tasks

    Directory of Open Access Journals (Sweden)

    Sergey Nikolaevich Kyazhin

    2013-09-01

    Full Text Available The current state of cloud computing (CC) information security is analysed and logical problems of storage and data transmission security in CC are identified. Cryptographic methods of data security in CC, in particular lightweight cryptography and cryptography based on bilinear pairings, are described.

  8. Exploiting short-term memory in soft body dynamics as a computational resource.

    Science.gov (United States)

    Nakajima, K; Li, T; Hauser, H; Pfeifer, R

    2014-11-06

    Soft materials are not only highly deformable, but they also possess rich and diverse body dynamics. Soft body dynamics exhibit a variety of properties, including nonlinearity, elasticity and potentially infinitely many degrees of freedom. Here, we demonstrate that such soft body dynamics can be employed to conduct certain types of computation. Using body dynamics generated from a soft silicone arm, we show that they can be exploited to emulate functions that require memory and to embed robust closed-loop control into the arm. Our results suggest that soft body dynamics have a short-term memory and can serve as a computational resource. This finding paves the way towards exploiting passive body dynamics for control of a large class of underactuated systems. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
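
    The claim that a body with short-term memory can serve as a computational resource is the physical reservoir computing idea; the sketch below stands in for the silicone arm with a simulated random nonlinear network (our assumption, not the authors' setup) and trains only a linear readout on a task that needs memory.

      import numpy as np

      # Sketch of the reservoir-computing idea behind the record: a fixed
      # nonlinear dynamical system with fading memory (here a random echo-state
      # network standing in for the soft arm) plus a trained linear readout.
      # Task: reproduce a 3-step-delayed copy of the input, which requires
      # short-term memory.

      rng = np.random.default_rng(1)
      n_res, T, delay = 100, 2000, 3
      W = rng.normal(size=(n_res, n_res))
      W *= 0.9 / max(abs(np.linalg.eigvals(W)))    # spectral radius < 1: fading memory
      w_in = rng.normal(size=n_res)

      u = rng.uniform(-1, 1, size=T)               # input stream
      x = np.zeros(n_res)
      states = np.zeros((T, n_res))
      for t in range(T):
          x = np.tanh(W @ x + w_in * u[t])         # body/reservoir dynamics
          states[t] = x

      target = np.roll(u, delay)                   # u(t - delay)
      washout = 100                                # discard initial transient
      X, y = states[washout:], target[washout:]
      w_out, *_ = np.linalg.lstsq(X, y, rcond=None)   # train linear readout only
      print("NMSE:", np.mean((X @ w_out - y) ** 2) / np.var(y))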

  9. Topics in Current Science Research: Closing the Achievement Gap for Under Resourced Students of Color

    Science.gov (United States)

    Loya Villalpando, Alvaro; Daal, Miguel; Phipps, Arran; Speller, Danielle; Sadoulet, Bernard; Winheld, Rachel; Cryogenic Dark Matter Search Collaboration

    2015-04-01

    Topics in Current Science Research (TCSR) is a five-week summer course offered at the University of California, Berkeley through a collaboration between the Level Playing Field Institute's Summer Math and Science Honors Academy (SMASH) Program and the Cryogenic Dark Matter Search (CDMS) group at UC Berkeley. SMASH is an academic enrichment program geared towards under-resourced, high school students of color. The goals of the course are to expand the students' conception of STEM, to teach the students that science is a method of inquiry and not just a collection of facts that are taught in school, and to expose the scholars to critical thinking within a scientific setting. The course's curriculum engages the scholars in hands-on scientific research, project proposal writing, and presentation of their scientific work to their peers as well as to a panel of UC Berkeley scientists. In this talk, we describe the course and the impact it has had on previous scholars, we discuss how the course's pedagogy has evolved over the past 10 years to enhance students' perception and understanding of science, and we present previous participants' reflections and feedback about the course and its success in providing high school students a genuine research experience at the university level.

  10. The Usage of Informal Computer Based Communication in the Context of Organization’s Technological Resources

    Directory of Open Access Journals (Sweden)

    Steponas Jonušauskas

    2011-12-01

    Full Text Available The purpose of the article is to theoretically and practically analyze the features of informal computer-based communication in the context of an organization's technological resources. Methodology—meta analysis, survey and descriptive analysis. Findings. According to scientists, the functions of informal communication cover sharing of work related information, coordination of team activities, spread of organizational culture and feeling of interdependence and affinity. Also, informal communication widens the individuals' recognition of reality, creates general context of environment between talkers, and strengthens interpersonal attraction. For these reasons, informal communication is desirable and even necessary in organizations because it helps to ensure efficient functioning of the enterprise. However, communicating through electronic channels suppresses informal connections or addresses them to the outside of the organization. So, electronic communication is not beneficial for developing ties in informal organizational network. The empirical research showed that a significant part of the court administration staff is prone to use the technological resources of their office for informal communication. Representatives of the court administration choose friends for computer-based communication much more often than colleagues (72% and 63%, respectively). 93% of the research respondents use an additional e-mail box serviced by commercial providers for non-work communication. The high intensity of informal electronic communication with friends and acquaintances shows that workers of the court administration are used to meeting their psycho-emotional needs outside the workplace. The survey confirmed the conclusion of the theoretical analysis: computer-based communication is not beneficial for developing informal contacts between workers. In order for informal communication to carry out its functions and for the technological resources of the organization to be used effectively, staff...

  11. The Usage of Informal Computer Based Communication in the Context of Organization’s Technological Resources

    Directory of Open Access Journals (Sweden)

    Agota Giedrė Raišienė

    2013-08-01

    Full Text Available The purpose of the article is to theoretically and practically analyze the features of informal computer-based communication in the context of an organization's technological resources. Methodology—meta analysis, survey and descriptive analysis. Findings. According to scientists, the functions of informal communication cover sharing of work related information, coordination of team activities, spread of organizational culture and feeling of interdependence and affinity. Also, informal communication widens the individuals' recognition of reality, creates general context of environment between talkers, and strengthens interpersonal attraction. For these reasons, informal communication is desirable and even necessary in organizations because it helps to ensure efficient functioning of the enterprise. However, communicating through electronic channels suppresses informal connections or addresses them to the outside of the organization. So, electronic communication is not beneficial for developing ties in informal organizational network. The empirical research showed that a significant part of the court administration staff is prone to use the technological resources of their office for informal communication. Representatives of the court administration choose friends for computer-based communication much more often than colleagues (72% and 63%, respectively). 93% of the research respondents use an additional e-mail box serviced by commercial providers for non-work communication. The high intensity of informal electronic communication with friends and acquaintances shows that workers of the court administration are used to meeting their psycho-emotional needs outside the workplace. The survey confirmed the conclusion of the theoretical analysis: computer-based communication is not beneficial for developing informal contacts between workers. In order for informal communication to carry out its functions and for the technological resources of the organization to be used effectively, staff...

  12. Computer simulation of current percolation in polycrystalline high-temperature superconductors

    Energy Technology Data Exchange (ETDEWEB)

    Zeimetz, B [Department of Materials Science and Interdisciplinary Research Centre in Superconductivity, Cambridge University, Pembroke Street, Cambridge (United Kingdom); Rutter, N A; Glowacki, B A; Evetts, J E [Department of Materials Science and Interdisciplinary Research Centre in Superconductivity, Cambridge University, Pembroke Street, Cambridge (United Kingdom)

    2001-09-01

    YBCO-coated conductors were modelled in a computer simulation using a resistor network concept, with the resistors representing the grain boundaries. Dissipation above the critical current, accompanied by flux penetration into the grain boundaries, was described by a linear (flux-flow) resistivity. The model allowed calculation of the combined percolation of current and magnetic flux. Current-voltage data showed scaling in agreement with percolation theory for two-dimensional systems. The influence of grain alignment and electromagnetic parameters on conductor performance was investigated. (author)
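
    A minimal version of the resistor-network concept can be written down directly: grains on a grid, grain boundaries as resistors, node potentials from Kirchhoff's current law under an applied voltage. The sketch below uses random linear resistances for illustration; the paper's flux-flow physics would enter through current-dependent boundary resistivities, which are not modelled here.

      import numpy as np

      # Toy grain-boundary resistor network: an n x n grid of grains, each
      # horizontal/vertical neighbour pair joined by a random resistance.
      # A voltage is applied across the network and node potentials are
      # obtained by solving the Kirchhoff (conductance-Laplacian) equations.

      def solve_network(n, rng):
          def node(i, j):
              return i * n + j
          R = {}   # (node_a, node_b) -> grain-boundary resistance
          for i in range(n):
              for j in range(n):
                  if j + 1 < n:
                      R[(node(i, j), node(i, j + 1))] = rng.uniform(0.5, 2.0)
                  if i + 1 < n:
                      R[(node(i, j), node(i + 1, j))] = rng.uniform(0.5, 2.0)

          N = n * n
          G = np.zeros((N, N))                      # conductance matrix
          for (a, b), r in R.items():
              g = 1.0 / r
              G[a, a] += g; G[b, b] += g
              G[a, b] -= g; G[b, a] -= g

          # Boundary conditions: left column at 1 V, right column at 0 V.
          fixed = {node(i, 0): 1.0 for i in range(n)}
          fixed.update({node(i, n - 1): 0.0 for i in range(n)})
          free = [k for k in range(N) if k not in fixed]
          b = np.zeros(N)
          V = np.zeros(N)
          for k, v in fixed.items():
              b -= G[:, k] * v                      # move known potentials to RHS
              V[k] = v
          V[free] = np.linalg.solve(G[np.ix_(free, free)], b[free])

          # Total current entering through the 1 V electrode.
          return sum((V[a] - V[b]) / r
                     for (a, b), r in R.items()
                     if a in fixed and fixed[a] == 1.0)

      print("total current:", solve_network(10, np.random.default_rng(2)))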

  13. Computer modelling of the UK wind energy resource. Phase 2. Application of the methodology

    Energy Technology Data Exchange (ETDEWEB)

    Burch, S F; Makari, M; Newton, K; Ravenscroft, F; Whittaker, J

    1993-12-31

    This report presents the results of the second phase of a programme to estimate the UK wind energy resource. The overall objective of the programme is to provide quantitative resource estimates using a mesoscale (resolution about 1 km) numerical model for the prediction of wind flow over complex terrain, in conjunction with digitised terrain data and wind data from surface meteorological stations. A network of suitable meteorological stations has been established and long term wind data obtained. Digitised terrain data for the whole UK were obtained, and wind flow modelling using the NOABL computer program has been performed. Maps of extractable wind power have been derived for various assumptions about wind turbine characteristics. Validation of the methodology indicates that the results are internally consistent, and in good agreement with available comparison data. Existing isovent maps, based on standard meteorological data which take no account of terrain effects, indicate that 10m annual mean wind speeds vary between about 4.5 and 7 m/s over the UK, with only a few coastal areas over 6 m/s. The present study indicates that 28% of the UK land area has speeds over 6 m/s, with many hill sites having 10m speeds over 10 m/s. It is concluded that these 'first order' resource estimates represent a substantial improvement over the presently available 'zero order' estimates. The results will be useful for broad resource studies and initial site screening. Detailed resource evaluation for local sites will require more detailed local modelling or ideally long term field measurements. (12 figures, 14 tables, 21 references). (Author)

  14. Data Security, Privacy, Availability and Integrity in Cloud Computing: Issues and Current Solutions

    OpenAIRE

    Sultan Aldossary; William Allen

    2016-01-01

    Cloud computing changed the world around us. Now people are moving their data to the cloud since data is getting bigger and needs to be accessible from many devices. Therefore, storing data on the cloud has become a norm. However, there are many issues that affect data stored in the cloud, ranging from the virtual machine, which is the means of sharing resources in the cloud, to the cloud storage itself. In this paper, we present those issues that are preventing people from adopting the cl...

  15. A Resource Service Model in the Industrial IoT System Based on Transparent Computing.

    Science.gov (United States)

    Li, Weimin; Wang, Bin; Sheng, Jinfang; Dong, Ke; Li, Zitong; Hu, Yixiang

    2018-03-26

    The Internet of Things (IoT) has received a lot of attention, especially in industrial scenarios. One of the typical applications is the intelligent mine, which actually constructs the Six-Hedge underground systems with IoT platforms. Based on a case study of the Six Systems in an underground metal mine, this paper summarizes the main challenges of industrial IoT from the aspects of heterogeneity in devices and resources, security, reliability, and deployment and maintenance costs. Then, a novel resource service model for industrial IoT applications based on Transparent Computing (TC) is presented, which supports centralized management of all resources, including the operating system (OS), programs and data, on the server side for the IoT devices, thus offering an effective, reliable, secure and cross-OS IoT service and reducing the costs of IoT system deployment and maintenance. The model has five layers: sensing layer, aggregation layer, network layer, service and storage layer, and interface and management layer. We also present a detailed analysis of the system architecture and key technologies of the model. Finally, the efficiency of the model is shown by an experimental prototype system.

  16. Monitoring of Computing Resource Use of Active Software Releases in ATLAS

    CERN Document Server

    Limosani, Antonio; The ATLAS collaboration

    2016-01-01

    The LHC is the world's most powerful particle accelerator, colliding protons at a centre of mass energy of 13 TeV. As the energy and frequency of collisions have grown in the search for new physics, so too has the demand for computing resources needed for event reconstruction. We will report on the evolution of resource usage in terms of CPU and RAM in key ATLAS offline reconstruction workflows at the Tier0 at CERN and on the WLCG. Monitoring of workflows is achieved using the ATLAS PerfMon package, which is the standard ATLAS performance monitoring system running inside Athena jobs. Systematic daily monitoring has recently been expanded to include all workflows beginning at Monte Carlo generation through to end user physics analysis, beyond that of event reconstruction. Moreover, the move to a multiprocessor mode in production jobs has facilitated the use of tools, such as "MemoryMonitor", to measure the memory shared across processors in jobs. Resource consumption is broken down into software domains and displayed...

  17. Monitoring of computing resource use of active software releases at ATLAS

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00219183; The ATLAS collaboration

    2017-01-01

    The LHC is the world’s most powerful particle accelerator, colliding protons at a centre of mass energy of 13 TeV. As the energy and frequency of collisions have grown in the search for new physics, so too has the demand for computing resources needed for event reconstruction. We will report on the evolution of resource usage in terms of CPU and RAM in key ATLAS offline reconstruction workflows at the Tier0 at CERN and on the WLCG. Monitoring of workflows is achieved using the ATLAS PerfMon package, which is the standard ATLAS performance monitoring system running inside Athena jobs. Systematic daily monitoring has recently been expanded to include all workflows beginning at Monte Carlo generation through to end-user physics analysis, beyond that of event reconstruction. Moreover, the move to a multiprocessor mode in production jobs has facilitated the use of tools, such as “MemoryMonitor”, to measure the memory shared across processors in jobs. Resource consumption is broken down into software domains and dis...

  18. Modeling of tube current modulation methods in computed tomography dose calculations for adult and pregnant patients

    International Nuclear Information System (INIS)

    Caracappa, Peter F.; Xu, X. George; Gu, Jianwei

    2011-01-01

    The comparatively high dose and increasing frequency of computed tomography (CT) examinations have spurred the development of techniques for reducing radiation dose to imaging patients. Among these is the application of tube current modulation (TCM), which can be applied either longitudinally along the body or rotationally around the body, or both. Existing computational models for calculating dose from CT examinations do not include TCM techniques. Dose calculations using Monte Carlo methods have been previously prepared for constant-current rotational exposures at various positions along the body and for the principal exposure projections for several sets of computational phantoms, including adult male and female and pregnant patients. Dose calculations from CT scans with TCM are prepared by appropriately weighting the existing dose data. Longitudinal TCM doses can be obtained by weighting the dose at the z-axis scan position by the relative tube current at that position. Rotational TCM doses are weighted using the relative organ doses from the principal projections as a function of the current at the rotational angle. Significant dose reductions of 15% to 25% to fetal tissues are found from simulations of longitudinal TCM schemes for pregnant patients of different gestational ages. Weighting factors for each organ in rotational TCM schemes applied to adult male and female patients have also been found. As the application of TCM techniques becomes more prevalent, the need for including TCM in CT dose estimates will necessarily increase. (author)
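
    The longitudinal weighting described here reduces to a per-slice weighted sum, sketched below with invented placeholder numbers: pre-computed constant-current dose coefficients are scaled by the modulated tube current at each z position.

      import numpy as np

      # Sketch of longitudinal TCM dose weighting: organ dose is accumulated
      # slice by slice, weighting a pre-computed per-slice dose coefficient
      # (from constant-current Monte Carlo runs) by the tube current actually
      # delivered at that z position. All numbers are invented placeholders.

      def tcm_organ_dose(dose_per_mAs, tube_current_mA, rotation_time_s):
          """dose_per_mAs[i]: organ dose per mAs for a rotation at slice i;
          tube_current_mA[i]: (modulated) current at slice i."""
          mAs = np.asarray(tube_current_mA) * rotation_time_s
          return float(np.sum(np.asarray(dose_per_mAs) * mAs))

      dose_per_mAs = [0.02, 0.05, 0.08, 0.05, 0.02]   # mGy/mAs at 5 z positions
      constant     = [200, 200, 200, 200, 200]        # fixed-current protocol, mA
      modulated    = [150, 180, 240, 180, 150]        # longitudinal TCM, mA
      print(tcm_organ_dose(dose_per_mAs, constant, 0.5))
      print(tcm_organ_dose(dose_per_mAs, modulated, 0.5))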

  19. [Current situation of the organisation, resources and activity in paediatric cardiology in Spain].

    Science.gov (United States)

    Sánchez Ferrer, Francisco; Castro García, Francisco José; Pérez-Lescure Picarzo, Javier; Roses Noguer, Ferrán; Centeno Malfaz, Fernándo; Grima Murcia, María Dolores; Brotons, Dimpna Albert

    2018-04-26

    The results are presented on the «current situation of the organisation, resources and activity in paediatric cardiology in Spain». It was promoted by the Spanish Society of Paediatric Cardiology and Congenital Heart disease. An analysis was carried out on the results obtained from a specifically designed questionnaire, prepared by the Spanish Society of Paediatric Cardiology and Congenital Heart disease, that was sent to all hospitals around the country that offer the speciality of paediatric cardiology. A total of 86 questionnaires were obtained, including 14 hospitals that perform cardiac surgery on children. A total of 190 paediatric cardiology consultants, 40 cardiac surgeons, and 27 middle grade doctors performing their paediatric residency (MIR program) were identified. All hospitals had adequate equipment to perform an optimal initial evaluation of any child with a possible cardiac abnormality, but only tertiary centres could perform complex diagnostic procedures, interventional cardiology, and cardiac surgery. In almost all units around the country, paediatric cardiology consultants were responsible for outpatient clinics and hospital admissions, whereas foetal cardiology units were still mainly managed by obstetricians. The number of diagnostic and therapeutic procedures was similar to those reported in the first survey, except for a slight decrease in the total number of closed cardiac surgery procedures, and a proportional increase in the number of therapeutic catheterisations. Paediatric Cardiology in Spain is performed by paediatric cardiology consultants that were trained initially as general paediatricians, and then completed a paediatric cardiology training period. Almost all units have adequate means for diagnosis and treatment. Efforts should be directed to create a national registry that would not only allow a prospective quantification of diagnostic and therapeutic procedures, but also focus on their clinical outcomes. Copyright © 2018

  20. Water Resource Impacts Embedded in the Western US Electrical Energy Trade; Current Patterns and Adaptation to Future Drought

    Science.gov (United States)

    Adams, E. A.; Herron, S.; Qiu, Y.; Tidwell, V. C.; Ruddell, B. L.

    2013-12-01

    Water resources are a key element in the global coupled natural-human (CNH) system, because they are tightly coupled with the world's social, environmental, and economic subsystems, and because water resources are under increasing pressure worldwide. A fundamental adaptive tool used especially by cities to overcome local water resource scarcity is the outsourcing of water resource impacts through substitutionary economic trade. This is generally understood as the indirect component of a water footprint, and as 'virtual water' trade. This work employs generalized CNH methods to reveal the trade in water resource impacts embedded in electrical energy within the Western US power grid, and utilizes a general equilibrium economic trade model combined with drought and demand growth constraints to estimate the future status of this trade. Trade in embedded water resource impacts currently increases total water used for electricity production in the Western US and shifts water use to more water-limited States. Extreme drought and large increases in electrical energy demand increase the need for embedded water resource impact trade, while motivating a shift to more water-efficient generation technologies and more water-abundant generating locations. Cities are the largest users of electrical energy, and in the 21st Century will outsource a larger fraction of their water resource impacts through trade. This trade exposes cities to risks associated with disruption of long-distance transmission and distant hydrological droughts.

  1. Evaluation of the individual tube current setting in electrocardiogram-gated cardiac computed tomography estimated from plain chest computed tomography using computed tomography automatic exposure control

    International Nuclear Information System (INIS)

    Kishimoto, Junichi; Sakou, Toshio; Ohta, Yasutoshi

    2013-01-01

    The aim of this study was to estimate the tube current for cardiac computed tomography (CT) from a plain chest CT using CT automatic exposure control (CT-AEC), to obtain consistent image noise, and to optimize the scan tube current by individualizing it. Sixty-five patients (Group A) underwent cardiac CT at fixed tube current. The mAs value for plain chest CT using CT-AEC (AEC value) and the cardiac CT image noise were measured. The tube current needed to obtain the intended level of image noise in the cardiac CT was determined from their correlation. Another 65 patients (Group B) underwent cardiac CT with tube currents individually determined from the AEC value. Image noise was compared between Groups A and B. Image noise of cardiac CT in Group B was 24.4±3.1 Hounsfield units (HU) and was more uniform than in Group A (21.2±6.1 HU). The error relative to the desired image noise of 25 HU was lower in Group B (2.4%) than in Group A (15.2%). Individualized tube current selection based on the AEC value thus provided consistent image noise and a scan tube current optimized for cardiac CT. (author)
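
    The calibration step can be sketched as follows, under the assumption (ours, not necessarily the authors') that noise scales as 1/sqrt(mAs), so that noise squared times tube current is roughly linear in the AEC value; the fitted relation is then inverted to pick a per-patient current for the 25 HU target. All data below are invented.

      import numpy as np

      # Sketch of AEC-based tube current individualization: fit the relation
      # between the plain-chest AEC value and cardiac image noise at fixed
      # current (Group A), then invert it to hit a target noise level.

      def fit_calibration(aec_mAs, noise_at_fixed, fixed_mA):
          """Least-squares fit of noise^2 * mA = c0 + c1 * AEC."""
          y = np.asarray(noise_at_fixed) ** 2 * fixed_mA
          A = np.vstack([np.ones(len(aec_mAs)), aec_mAs]).T
          c, *_ = np.linalg.lstsq(A, y, rcond=None)
          return c

      def individual_mA(c, aec_mAs, target_noise=25.0):
          """Tube current predicted to give the target noise for this patient."""
          return (c[0] + c[1] * aec_mAs) / target_noise ** 2

      # Invented example data for five Group A patients:
      aec = np.array([80, 100, 120, 150, 180])      # plain-chest AEC value, mAs
      noise = np.array([18, 20, 23, 26, 29])        # cardiac noise at 300 mA, HU
      c = fit_calibration(aec, noise, fixed_mA=300)
      print(individual_mA(c, aec_mAs=130))          # suggested mA for a new patient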

  2. Internet resources for dentistry: computer, Internet, reference, and sites for enhancing personal productivity of the dental professional.

    Science.gov (United States)

    Guest, G F

    2000-08-15

    At the onset of the new millennium, the Internet has become the new standard means of distributing information. In the last two to three years there has been an explosion of e-commerce, with hundreds of new web sites being created every minute. For most corporate entities, a web site is as essential as the phone book listing used to be. Twenty years ago, technologists directed how computer-based systems were utilized. Now it is the end users of personal computers who have gained expertise and drive the functionality of software applications. The computer, initially invented for mathematical functions, has transitioned from this role to an integrated communications device that provides the portal to the digital world. The Web needs to be used by healthcare professionals, not only for professional activities, but also for instant access to information and services "just when they need it." This will facilitate the longitudinal use of information as society continues to gain better information access skills. With the demand for current "just in time" information and the standards established by Internet protocols, reference sources of information may be maintained in dynamic fashion. News services have been available through the Internet for several years, but now reference materials such as online journals and digital textbooks have become available and have the potential to change the traditional publishing industry. The pace of change should make us consider Will Rogers' advice, "It isn't good enough to be moving in the right direction. If you are not moving fast enough, you can still get run over!" The intent of this article is to complement previous articles on Internet Resources published in this journal by presenting information about web sites that present information on computer and Internet technologies, reference materials, news information, and information that lets us improve personal productivity. Neither the author, nor the Journal endorses any of the

  3. Computer-modeling codes to improve exploration nuclear-logging methods. National Uranium Resource Evaluation

    International Nuclear Information System (INIS)

    Wilson, R.D.; Price, R.K.; Kosanke, K.L.

    1983-03-01

    As part of the Department of Energy's National Uranium Resource Evaluation (NURE) project's Technology Development effort, a number of computer codes and accompanying data bases were assembled for use in modeling responses of nuclear borehole logging sondes. The logging methods include fission neutron, active and passive gamma-ray, and gamma-gamma. These CDC-compatible computer codes and data bases are available on magnetic tape from the DOE Technical Library at its Grand Junction Area Office. Some of the computer codes are standard radiation-transport programs that have been available to the radiation shielding community for several years. Other codes were specifically written to model the response of borehole radiation detectors or are specialized borehole modeling versions of existing Monte Carlo transport programs. Results from several radiation modeling studies are available as two large data bases (neutron and gamma-ray). These data bases are accompanied by appropriate processing programs that permit the user to model a wide range of borehole and formation-parameter combinations for fission-neutron, neutron-activation and gamma-gamma logs. The first part of this report consists of a brief abstract for each code or data base. The abstract gives the code name and title, short description, auxiliary requirements, typical running time (CDC 6600), and a list of references. The next section gives format specifications and/or a directory for the tapes. The final section of the report presents listings for programs used to convert data bases between machine floating-point and EBCDIC

  4. Integral Images: Efficient Algorithms for Their Computation and Storage in Resource-Constrained Embedded Vision Systems.

    Science.gov (United States)

    Ehsan, Shoaib; Clark, Adrian F; Naveed ur Rehman; McDonald-Maier, Klaus D

    2015-07-10

    The integral image, an intermediate image representation, has found extensive use in multi-scale local feature detection algorithms, such as Speeded-Up Robust Features (SURF), allowing fast computation of rectangular features at constant speed, independent of filter size. For resource-constrained real-time embedded vision systems, computation and storage of the integral image present several design challenges due to strict timing and hardware limitations. Although calculation of the integral image only consists of simple addition operations, the total number of operations is large owing to the generally large size of image data. Recursive equations allow a substantial decrease in the number of operations but require calculation in a serial fashion. This paper presents two new hardware algorithms that are based on the decomposition of these recursive equations, allowing calculation of up to four integral image values in a row-parallel way without significantly increasing the number of operations. An efficient design strategy is also proposed for a parallel integral image computation unit to reduce the size of the required internal memory (nearly 35% for common HD video). Addressing the storage problem of the integral image in embedded vision systems, the paper presents two algorithms which allow a substantial decrease (at least 44.44%) in the memory requirements. Finally, the paper provides a case study that highlights the utility of the proposed architectures in embedded vision systems.
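
    For context, the underlying recurrence and the constant-time rectangle sum are shown below in a plain serial form (the paper's contribution is hardware decompositions of this recurrence, which are not reproduced here):

      import numpy as np

      # The classic integral-image recurrence the record builds on:
      #   I(x, y) = i(x, y) + I(x-1, y) + I(x, y-1) - I(x-1, y-1)
      # After one pass, any rectangular sum costs four lookups, independent
      # of rectangle size.

      def integral_image(img):
          # Zero-padded on the top/left so no boundary checks are needed.
          I = np.zeros((img.shape[0] + 1, img.shape[1] + 1), dtype=np.int64)
          for y in range(img.shape[0]):
              for x in range(img.shape[1]):
                  I[y + 1, x + 1] = img[y, x] + I[y, x + 1] + I[y + 1, x] - I[y, x]
          return I

      def rect_sum(I, top, left, h, w):
          return I[top + h, left + w] - I[top, left + w] - I[top + h, left] + I[top, left]

      img = np.arange(16).reshape(4, 4)
      I = integral_image(img)
      assert rect_sum(I, 1, 1, 2, 2) == img[1:3, 1:3].sum()
      print(rect_sum(I, 1, 1, 2, 2))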

  5. Integral Images: Efficient Algorithms for Their Computation and Storage in Resource-Constrained Embedded Vision Systems

    Directory of Open Access Journals (Sweden)

    Shoaib Ehsan

    2015-07-01

    Full Text Available The integral image, an intermediate image representation, has found extensive use in multi-scale local feature detection algorithms, such as Speeded-Up Robust Features (SURF), allowing fast computation of rectangular features at constant speed, independent of filter size. For resource-constrained real-time embedded vision systems, computation and storage of the integral image present several design challenges due to strict timing and hardware limitations. Although calculation of the integral image only consists of simple addition operations, the total number of operations is large owing to the generally large size of image data. Recursive equations allow a substantial decrease in the number of operations but require calculation in a serial fashion. This paper presents two new hardware algorithms that are based on the decomposition of these recursive equations, allowing calculation of up to four integral image values in a row-parallel way without significantly increasing the number of operations. An efficient design strategy is also proposed for a parallel integral image computation unit to reduce the size of the required internal memory (nearly 35% for common HD video). Addressing the storage problem of the integral image in embedded vision systems, the paper presents two algorithms which allow a substantial decrease (at least 44.44%) in the memory requirements. Finally, the paper provides a case study that highlights the utility of the proposed architectures in embedded vision systems.

  6. Including alternative resources in state renewable portfolio standards: Current design and implementation experience

    International Nuclear Information System (INIS)

    Heeter, Jenny; Bird, Lori

    2013-01-01

    As of October 2012, 29 states, the District of Columbia, and Puerto Rico have instituted a renewable portfolio standard (RPS). Each state policy is unique, varying in percentage targets, timetables, and eligible resources. Increasingly, new RPS polices have included alternative resources. Alternative resources have included energy efficiency, thermal resources, and, to a lesser extent, non-renewables. This paper examines state experience with implementing renewable portfolio standards that include energy efficiency, thermal resources, and non-renewable energy and explores compliance experience, costs, and how states evaluate, measure, and verify energy efficiency and convert thermal energy. It aims to gain insights from the experience of states for possible federal clean energy policy as well as to share experience and lessons for state RPS implementation. - Highlights: • Increasingly, new RPS policies have included alternative resources. • Nearly all states provide a separate tier or cap on the quantity of eligible alternative resources. • Where allowed, non-renewables and energy efficiency are being heavily utilized

  7. Computer simulation of induced electric currents and fields in biological bodies by 60 Hz magnetic fields

    International Nuclear Information System (INIS)

    Xi Weiguo; Stuchly, M.A.; Gandhi, O.P.

    1993-01-01

    Possible health effects of human exposure to 60 Hz magnetic fields are a subject of increasing concern. An understanding of the coupling of electromagnetic fields to human body tissues is essential for assessment of their biological effects. A method is presented for the computerized simulation of induced electric currents and fields in the bodies of humans and rodents exposed to power-line frequency magnetic fields. In the impedance method, the body is represented by a 3-dimensional impedance network. The computational model consists of several tens of thousands of cubic numerical cells and thus represents a realistic body shape. The modelling for humans is performed with two models, a heterogeneous model based on cross-section anatomy and a homogeneous one using an average tissue conductivity. A summary of computed results of induced electric currents and fields is presented. It is confirmed that induced currents are lower than endogenous current levels for most environmental exposures. However, the induced current density varies greatly, with the maximum being at least 10 times larger than the average. This difference is likely to be greater when more detailed anatomy and morphology are considered. 15 refs., 2 figs., 1 tab

  8. The Current Status of Germplum Database: a Tool for Characterization of Plum Genetic Resources in Romania

    Directory of Open Access Journals (Sweden)

    Monica Harta

    2016-11-01

    Full Text Available In Romania, Prunus genetic resources are kept in collections of varieties, populations and biotypes, mainly located in research and development institutes or fruit growing stations and, in recent years, by some private enterprises. Creating the experimental model for the Germplum database, based on phenotypic descriptors and SSR molecular marker analysis, is an important and topical objective for the efficient characterization of genetic resources and also for establishing a public-private partnership for the effective management of plum germplasm resources in Romania. The technical development of the Germplum database has been completed, and data will be added continuously as each new accession is characterized.

  9. Current state and problems of integrated development of mineral resources base in Russia

    Science.gov (United States)

    Filimonova, I. V.; Eder, L. V.; Mishenin, M. V.; Mamakhatov, T. M.

    2017-09-01

    The article deals with the issues of integrated development of subsoil resources, taking into account the actual problems facing the Russian oil and gas complex. The key factors determining the need for integrated development of subsoil resources have been systematized and investigated. These factors are the declining quality of the hydrocarbon resource base, the growing depletion of the basic (unique and major) oil fields, the increasing number of small and very small oil fields discovered and brought into development, the rising capital intensity and riskiness of geological exploration, and the territorial location of new subsoil-use facilities.

  10. Interactive Whiteboards and Computer Games at Highschool Level: Digital Resources for Enhancing Reflection in Teaching and Learning

    DEFF Research Database (Denmark)

    Sorensen, Elsebeth Korsgaard; Poulsen, Mathias; Houmann, Rita

    The general potential of computer games for teaching and learning is becoming widely recognized. In particular, within the application contexts of primary and lower secondary education, the relevance and value of computer games seem more accepted, and the possibility and willingness to incorporate computer games as a possible resource at the level of other educational resources seem more frequent. For some reason, however, applying computer games in processes of teaching and learning at the high school level seems an almost non-existent event. This paper reports on a study of incorporating the learning game “Global Conflicts: Latin America” as a resource into the teaching and learning of a course involving the two subjects “English language learning” and “Social studies” in the final year of a Danish high school. The study adopts an explorative research design approach and investigates...

  11. Invasive alien plants and water resources in South Africa: current understanding, predictive ability and research challenges

    CSIR Research Space (South Africa)

    Gorgens, AHM

    2004-01-01

    Full Text Available were made by combining the results of hydrological experiments, conducted to assess the effects of afforestation with alien trees on water resources, with an ecological understanding of the spread and establishment of invasive trees. The forecasts were...

  12. THE UNITED RESCUE SYSTEM IN BULGARIA. CURRENT RESOURCE RELATED ISSUES AND PROSPECTIVE SOLUTIONS

    Directory of Open Access Journals (Sweden)

    Daniela Baleva

    2017-11-01

    Full Text Available The article presents some problems related to securing the Bulgarian system for disaster management with the necessary resources for its proper functioning. The main challenges for the united rescue system in the country are analyzed, including those related to ensuring the system with the necessary material, financial and human resources. Some possibilities for solving these problems with the use of funds from the European Union are presented.

  13. Managing carbon regulatory risk in utility resource planning: Current practices in the Western United States

    International Nuclear Information System (INIS)

    Barbose, Galen; Wiser, Ryan; Phadke, Amol; Goldman, Charles

    2008-01-01

    Concerns about global climate change have substantially increased the likelihood that future policy will seek to minimize carbon dioxide emissions. As such, even today, electric utilities are making resource planning and investment decisions that consider the possible implications of these future carbon regulations. In this article, we examine the manner in which utilities assess the financial risks associated with future carbon regulations within their long-term resource plans. We base our analysis on a review of the most recent resource plans filed by 15 electric utilities in the Western United States. Virtually all of these utilities made some effort to quantitatively evaluate the potential cost of future carbon regulations when analyzing alternate supply- and demand-side resource options for meeting customer load. Even without federal climate regulation in the US, the prospect of that regulation is already having an impact on utility decision-making and resource choices. That said, the methods and assumptions used by utilities to analyze carbon regulatory risk, and the impact of that analysis on their choice of a particular resource strategy, vary considerably, revealing a number of opportunities for analytic improvement. Though our review focuses on a subset of US electric utilities, this work holds implications for all electric utilities and energy policymakers who are seeking to minimize the compliance costs associated with future carbon regulations

  14. Problems of providing the agrarian sector with financial resources under current conditions of economic management

    Directory of Open Access Journals (Sweden)

    Grischuk Nadiya Viktorivna

    2016-12-01

    Full Text Available Research in financial science on the provision of financial resources is far from exhausted and requires continued study, as the subject constantly acquires new characteristics and vectors of development that call for examination under present-day conditions. The state of provision of financial resources to the agrarian sector of the economy, with emphasis on its main segment (loan and attracted financial resources), is therefore a topical subject today. The article considers the essence and sources of the financial resources of agricultural enterprises and the problems associated with the formation and use of financial resources in the modern world, as well as the problems that arise in improving the process by which agricultural enterprises raise funds. It is shown that the issue of convertible bonds and the introduction of agricultural receipts are effective tools for attracting financial resources. It is argued that, in an unstable environment, further development of the system of agrarian relations must be carried out on the basis of government programmes and normative-legal regulation that take into account not only the existing state of affairs in the market for agroindustrial products but also the economic provision of enterprises in the national agrarian sector.

  15. Using multiple metaphors and multimodalities as a semiotic resource when teaching year 2 students computational strategies

    Science.gov (United States)

    Mildenhall, Paula; Sherriff, Barbara

    2017-06-01

    Recent research indicates that using multimodal learning experiences can be effective in teaching mathematics. Using a social semiotic lens within a participationist framework, this paper reports on a professional learning collaboration with a primary school teacher designed to explore the use of metaphors and modalities in mathematics instruction. This video case study was conducted in a year 2 classroom over two terms, with the focus on building children's understanding of computational strategies. The findings revealed that the teacher was able to successfully plan both multimodal and multiple metaphor learning experiences that acted as semiotic resources to support the children's understanding of abstract mathematics. The study also led to implications for teaching when using multiple metaphors and multimodalities.

  16. Adaptive resource allocation scheme using sliding window subchannel gain computation: context of OFDMA wireless mobiles systems

    International Nuclear Information System (INIS)

    Khelifa, F.; Samet, A.; Ben Hassen, W.; Afif, M.

    2011-01-01

    Multiuser diversity combined with Orthogonal Frequency Division Multiple Access (OFDMA) is a promising technique for achieving high downlink capacities in new generations of cellular and wireless network systems. The total capacity of an OFDMA-based system is maximized when each subchannel is assigned to the mobile station with the best channel-to-noise ratio for that subchannel, with power uniformly distributed among all subchannels. A contiguous method of subchannel construction is adopted in the IEEE 802.16m standard in order to reduce OFDMA system complexity. In this context, a new subchannel gain computation method can contribute, jointly with optimal subchannel assignment, to maximizing total system capacity. In this paper, two new methods are proposed in order to achieve a better trade-off between fairness and efficient use of resources. Numerical results show that the proposed algorithms provide low complexity, higher total system capacity and better fairness among users compared with other recent methods.
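
    As a concrete illustration of the max-CNR baseline this abstract builds on (not the paper's proposed methods), a minimal NumPy sketch with invented system parameters:

        import numpy as np

        rng = np.random.default_rng(0)

        N_USERS, N_SUBCH = 4, 16   # hypothetical system size
        BW = 180e3                 # per-subchannel bandwidth [Hz], assumed
        P_TOTAL = 1.0              # total power [W], spread uniformly
        NOISE = 1e-7               # noise power per subchannel [W], assumed

        # Rayleigh-faded channel-to-noise ratios, one per (user, subchannel)
        cnr = rng.exponential(scale=1.0, size=(N_USERS, N_SUBCH)) / NOISE

        # Baseline rule from the abstract: give each subchannel to the user
        # with the best CNR; power is uniform across subchannels.
        assignment = cnr.argmax(axis=0)
        p = P_TOTAL / N_SUBCH

        # Shannon capacity of each subchannel under that assignment
        rates = BW * np.log2(1.0 + p * cnr[assignment, np.arange(N_SUBCH)])
        print(f"total capacity: {rates.sum() / 1e6:.2f} Mbit/s")
        per_user = np.bincount(assignment, weights=rates, minlength=N_USERS)
        print("per-user rates [Mbit/s]:", np.round(per_user / 1e6, 2))

    The fairness/efficiency trade-off the paper addresses arises exactly here: the max-CNR rule maximizes the sum rate but can starve users with persistently poor channels.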

  17. Application of a Resource Theory for Magic States to Fault-Tolerant Quantum Computing.

    Science.gov (United States)

    Howard, Mark; Campbell, Earl

    2017-03-03

    Motivated by their necessity for most fault-tolerant quantum computation schemes, we formulate a resource theory for magic states. First, we show that robustness of magic is a well-behaved magic monotone that operationally quantifies the classical simulation overhead for a Gottesman-Knill-type scheme using ancillary magic states. Our framework subsequently finds immediate application in the task of synthesizing non-Clifford gates using magic states. When magic states are interspersed with Clifford gates, Pauli measurements, and stabilizer ancillas (the most general synthesis scenario), the class of synthesizable unitaries is hard to characterize. Our techniques can place nontrivial lower bounds on the number of magic states required for implementing a given target unitary. Guided by these results, we have found new and optimal examples of such synthesis.
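
    For reference, the monotone named here, robustness of magic, is standardly defined as the minimal l1-norm over affine decompositions into stabilizer states; a sketch of that definition in LaTeX (notation assumed):

        % Robustness of magic: minimal \ell_1-norm over affine decompositions
        % of \rho into (pure) stabilizer states \sigma_i.
        \mathcal{R}(\rho) \;=\; \min \Big\{ \sum_i |x_i| \;:\;
        \rho = \sum_i x_i \, \sigma_i,\ \sum_i x_i = 1,\
        \sigma_i \in \mathrm{STAB} \Big\}.

    Its operational content is that a quasi-probability sampling simulation pays an overhead that grows with R(rho) (quadratically, in the standard sampling-cost analyses).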

  18. Radiotherapy infrastructure and human resources in Switzerland : Present status and projected computations for 2020.

    Science.gov (United States)

    Datta, Niloy Ranjan; Khan, Shaka; Marder, Dietmar; Zwahlen, Daniel; Bodis, Stephan

    2016-09-01

    The purpose of this study was to evaluate the present status of radiotherapy infrastructure and human resources in Switzerland and compute projections for 2020. The European Society of Therapeutic Radiation Oncology "Quantification of Radiation Therapy Infrastructure and Staffing" guidelines (ESTRO-QUARTS) and those of the International Atomic Energy Agency (IAEA) were applied to estimate the requirements for teleradiotherapy (TRT) units, radiation oncologists (RO), medical physicists (MP) and radiotherapy technologists (RTT). The databases used for computation of the present gap and additional requirements are (a) Global Cancer Incidence, Mortality and Prevalence (GLOBOCAN) for cancer incidence (b) the Directory of Radiotherapy Centres (DIRAC) of the IAEA for existing TRT units (c) human resources from the recent ESTRO "Health Economics in Radiation Oncology" (HERO) survey and (d) radiotherapy utilization (RTU) rates for each tumour site, published by the Ingham Institute for Applied Medical Research (IIAMR). In 2015, 30,999 of 45,903 cancer patients would have required radiotherapy. By 2020, this will have increased to 34,041 of 50,427 cancer patients. Switzerland presently has an adequate number of TRTs, but a deficit of 57 ROs, 14 MPs and 36 RTTs. By 2020, an additional 7 TRTs, 72 ROs, 22 MPs and 66 RTTs will be required. In addition, a realistic dynamic model for calculation of staff requirements due to anticipated changes in future radiotherapy practices has been proposed. This model could be tailor-made and individualized for any radiotherapy centre. A 9.8 % increase in radiotherapy requirements is expected for cancer patients over the next 5 years. The present study should assist the stakeholders and health planners in designing an appropriate strategy for meeting future radiotherapy needs for Switzerland.

  19. Radiotherapy infrastructure and human resources in Switzerland. Present status and projected computations for 2020

    International Nuclear Information System (INIS)

    Datta, Niloy Ranjan; Khan, Shaka; Marder, Dietmar; Zwahlen, Daniel; Bodis, Stephan

    2016-01-01

    The purpose of this study was to evaluate the present status of radiotherapy infrastructure and human resources in Switzerland and compute projections for 2020. The European Society of Therapeutic Radiation Oncology "Quantification of Radiation Therapy Infrastructure and Staffing" guidelines (ESTRO-QUARTS) and those of the International Atomic Energy Agency (IAEA) were applied to estimate the requirements for teleradiotherapy (TRT) units, radiation oncologists (RO), medical physicists (MP) and radiotherapy technologists (RTT). The databases used for computation of the present gap and additional requirements are (a) Global Cancer Incidence, Mortality and Prevalence (GLOBOCAN) for cancer incidence (b) the Directory of Radiotherapy Centres (DIRAC) of the IAEA for existing TRT units (c) human resources from the recent ESTRO "Health Economics in Radiation Oncology" (HERO) survey and (d) radiotherapy utilization (RTU) rates for each tumour site, published by the Ingham Institute for Applied Medical Research (IIAMR). In 2015, 30,999 of 45,903 cancer patients would have required radiotherapy. By 2020, this will have increased to 34,041 of 50,427 cancer patients. Switzerland presently has an adequate number of TRTs, but a deficit of 57 ROs, 14 MPs and 36 RTTs. By 2020, an additional 7 TRTs, 72 ROs, 22 MPs and 66 RTTs will be required. In addition, a realistic dynamic model for calculation of staff requirements due to anticipated changes in future radiotherapy practices has been proposed. This model could be tailor-made and individualized for any radiotherapy centre. A 9.8 % increase in radiotherapy requirements is expected for cancer patients over the next 5 years. The present study should assist the stakeholders and health planners in designing an appropriate strategy for meeting future radiotherapy needs for Switzerland. (orig.) [de]
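
    A minimal sketch of the gap computation these two records describe; the benchmark ratios below are illustrative placeholders, not the ESTRO-QUARTS/IAEA figures the study used:

        import math

        def radiotherapy_gap(cancer_incidence, rtu_rate,
                             existing,              # current TRT/RO/MP/RTT counts
                             patients_per_trt=450,  # assumed courses per machine-year
                             patients_per_ro=250,   # assumed patients per oncologist
                             patients_per_mp=500,   # assumed patients per physicist
                             patients_per_rtt=150): # assumed patients per technologist
            """Return the additional staff/equipment needed for one year."""
            rt_patients = cancer_incidence * rtu_rate   # patients needing radiotherapy
            need = {
                "TRT": math.ceil(rt_patients / patients_per_trt),
                "RO":  math.ceil(rt_patients / patients_per_ro),
                "MP":  math.ceil(rt_patients / patients_per_mp),
                "RTT": math.ceil(rt_patients / patients_per_rtt),
            }
            return {k: max(0, need[k] - existing.get(k, 0)) for k in need}

        # Example with the 2020 projection quoted in the abstract (50,427 incident
        # cases, ~67.5% requiring radiotherapy) and hypothetical existing capacity.
        print(radiotherapy_gap(50427, 34041 / 50427,
                               existing={"TRT": 70, "RO": 100, "MP": 60, "RTT": 220}))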

  20. Computer-aided detection in computed tomography colonography. Current status and problems with detection of early colorectal cancer

    International Nuclear Information System (INIS)

    Morimoto, Tsuyoshi; Nakijima, Yasuo; Iinuma, Gen; Arai, Yasuaki; Shiraishi, Junji; Moriyama, Noriyuki; Beddoe, G.

    2008-01-01

    The aim of this study was to evaluate the usefulness of computer-aided detection (CAD) in diagnosing early colorectal cancer using computed tomography colonography (CTC). A total of 30 CTC data sets for 30 early colorectal cancers in 30 patients were retrospectively reviewed by three radiologists. After primary evaluation, a second reading was performed using the CAD findings. The readers evaluated each colorectal segment for the presence or absence of colorectal cancer using five confidence rating levels. To compare the assessment results, the sensitivity and specificity with and without CAD were calculated on the basis of the confidence ratings, and differences in these variables were analyzed by receiver operating characteristic (ROC) analysis. The average sensitivities for detection without and with CAD for the three readers were 81.6% and 75.6%, respectively. Among the three readers, only one improved in sensitivity with CAD compared to without. CAD decreased specificity for all three readers. CAD detected 100% of protruding lesions but only 69.2% of flat lesions. On ROC analysis, the diagnostic performance of all three readers was decreased by the use of CAD. Currently available CAD with CTC does not improve diagnostic performance for detecting early colorectal cancer. An improved CAD algorithm is required for detecting flat lesions and reducing the false-positive rate. (author)

  1. Efficient Nash Equilibrium Resource Allocation Based on Game Theory Mechanism in Cloud Computing by Using Auction.

    Science.gov (United States)

    Nezarat, Amin; Dastghaibifard, G H

    2015-01-01

    One of the most complex issues in the cloud computing environment is the problem of resource allocation: on one hand, the cloud provider expects the most profitability, and on the other hand, users expect to have the best resources at their disposal given their budget and time constraints. In most previous work, heuristic and evolutionary approaches have been used to solve this problem. Nevertheless, since the nature of this environment is economic, using economic methods can decrease response time and reduce the complexity of the problem. In this paper, an auction-based method is proposed which determines the auction winner by applying a game-theoretic mechanism and holding a repeated game with incomplete information in a non-cooperative environment. In this method, users calculate a suitable price bid with their objective function over several rounds and repetitions and send it to the auctioneer, and the auctioneer chooses the winning player based on the suggested utility function. In the proposed method, the end point of the game is the Nash equilibrium point, where players are no longer inclined to alter their bids for the resource and the final bid also satisfies the auctioneer's utility function. To prove the convexity of the response space, the Lagrange method is used; the proposed model is simulated in CloudSim and the results are compared with previous work. It is concluded that this method converges to a response in a shorter time, provides the fewest service-level-agreement violations and the most utility to the provider.
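
    As a toy illustration of auction-based allocation (not the paper's repeated incomplete-information game), an ascending auction for a single cloud resource already converges to the equilibrium-like outcome in which the highest-valuation user wins at roughly the second-highest valuation:

        # Toy ascending (English) auction for one cloud resource. Each user's
        # budget/deadline utility is collapsed into a single private value here.
        def english_auction(valuations, start=0.0, increment=0.01):
            """Bidders stay in while the price is below their private value.
            Returns (winner index, final price)."""
            price, active = start, list(range(len(valuations)))
            while len(active) > 1:
                price += increment
                # a bidder drops out once the price exceeds its private value
                active = [i for i in active if valuations[i] >= price]
            return active[0], price

        vals = [0.35, 0.80, 0.55, 0.62]     # hypothetical private values
        winner, price = english_auction(vals)
        print(f"user {winner} wins at ~{price:.2f} "
              "(just above the second-highest value)")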

  2. Computer modelling of the UK wind energy resource: UK wind speed data package and user manual

    Energy Technology Data Exchange (ETDEWEB)

    Burch, S F; Ravenscroft, F

    1993-12-31

    A software package has been developed for IBM-PC or true compatibles. It is designed to provide easy access to the results of a programme of work to estimate the UK wind energy resource. Mean wind speed maps and quantitative resource estimates were obtained using the NOABL mesoscale (1 km resolution) numerical model for the prediction of wind flow over complex terrain. NOABL was used in conjunction with digitised terrain data and wind data from surface meteorological stations for a ten year period (1975-1984) to provide digital UK maps of mean wind speed at 10m, 25m and 45m above ground level. Also included in the derivation of these maps was the use of the Engineering Science Data Unit (ESDU) method to model the effect on wind speed of the abrupt change in surface roughness that occurs at the coast. With the wind speed software package, the user is able to obtain a display of the modelled wind speed at 10m, 25m and 45m above ground level for any location in the UK. The required co-ordinates are simply supplied by the user, and the package displays the selected wind speed. This user manual summarises the methodology used in the generation of these UK maps and shows computer generated plots of the 25m wind speeds in 200 x 200 km regions covering the whole UK. The uncertainties inherent in the derivation of these maps are also described, and notes given on their practical usage. The present study indicated that 23% of the UK land area had speeds over 6 m/s, with many hill sites having 10m speeds over 10 m/s. It is concluded that these 'first order' resource estimates represent a substantial improvement over the presently available 'zero order' estimates. (18 figures, 3 tables, 6 references). (author)
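
    Conceptually, the package is a gridded lookup: given user coordinates, it returns a modelled mean speed from the 1 km grid. A minimal sketch with bilinear interpolation on placeholder data (not the NOABL results):

        import numpy as np

        rng = np.random.default_rng(1)
        grid = rng.uniform(4.0, 11.0, size=(200, 200))   # 25 m speeds, 1 km cells
        X0, Y0, CELL = 0.0, 0.0, 1000.0                  # grid origin, spacing [m]

        def wind_speed(x, y):
            """Bilinear interpolation of the gridded mean speed at (x, y)."""
            fx, fy = (x - X0) / CELL, (y - Y0) / CELL
            i, j = int(fx), int(fy)
            tx, ty = fx - i, fy - j
            v00, v10 = grid[j, i],     grid[j, i + 1]
            v01, v11 = grid[j + 1, i], grid[j + 1, i + 1]
            return (v00 * (1 - tx) * (1 - ty) + v10 * tx * (1 - ty)
                    + v01 * (1 - tx) * ty     + v11 * tx * ty)

        print(f"{wind_speed(123456.7, 98765.4):.2f} m/s")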

  3. Monitoring of computing resource use of active software releases at ATLAS

    Science.gov (United States)

    Limosani, Antonio; ATLAS Collaboration

    2017-10-01

    The LHC is the world's most powerful particle accelerator, colliding protons at a centre-of-mass energy of 13 TeV. As the energy and frequency of collisions have grown in the search for new physics, so too has the demand for the computing resources needed for event reconstruction. We report on the evolution of resource usage, in terms of CPU and RAM, in key ATLAS offline reconstruction workflows at the Tier-0 at CERN and on the WLCG. Monitoring of workflows is achieved using the ATLAS PerfMon package, which is the standard ATLAS performance monitoring system running inside Athena jobs. Systematic daily monitoring has recently been expanded to include all workflows, from Monte Carlo generation through to end-user physics analysis, beyond event reconstruction alone. Moreover, the move to a multiprocessor mode in production jobs has facilitated the use of tools, such as "MemoryMonitor", to measure the memory shared across processes in a job. Resource consumption is broken down into software domains and displayed in plots generated using Python visualization libraries and collected into pre-formatted, auto-generated web pages, which allow the ATLAS developer community to track the performance of their algorithms. This information is, however, preferentially channelled to domain leaders and developers through JIRA and via reports given at ATLAS software meetings. Finally, we take a glimpse of the future by reporting on the expected CPU and RAM usage in benchmark workflows associated with the High Luminosity LHC and anticipate the ways performance monitoring will evolve to understand and benchmark future workflows.

  4. Resource Assessment of Tidal Current Energy in Hangzhou Bay Based on Long Term Measurement

    Science.gov (United States)

    Zhang, Feng; Dai, Chun-Ni; Xu, Xue-Feng; Wang, Chuan-Kun; Ye, Qin

    2017-05-01

    Compared with other marine renewable energies, tidal current energy benefits from high energy density and good predictability. Based on tidal current data measured in Hangzhou Bay from Nov 2012 to Oct 2012, this paper analyses temporal and spatial changes of tidal current energy at the site. This is the first time that measured data covering such a long period have been used in tidal current energy analysis. Occurrence frequencies and durations of currents of different speeds are given in the paper. According to the analysis results, the monthly average power density varies considerably from month to month, and the installation orientation of a tidal current turbine significantly affects energy capture. Finally, the annual average power density of tidal current energy with coefficient Cp at the site was calculated, and the final output of a tidal current plant was also estimated.
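
    The quantity at the heart of this analysis is the mean kinetic power density of the flow, a frequency-weighted sum of ½ρv³ over speed bins. A minimal sketch with invented occurrence frequencies (not the Hangzhou Bay data):

        import numpy as np

        RHO = 1025.0                                        # seawater [kg/m^3]
        speeds = np.array([0.25, 0.75, 1.25, 1.75, 2.25])   # bin centres [m/s]
        freq   = np.array([0.30, 0.25, 0.20, 0.15, 0.10])   # occurrence frequency

        # mean kinetic power density of the flow: sum_i f_i * (1/2) rho v_i^3
        p_flow = np.sum(freq * 0.5 * RHO * speeds**3)

        CP = 0.35                                           # assumed turbine Cp
        print(f"mean flow power density  : {p_flow:8.1f} W/m^2")
        print(f"mean extractable (Cp={CP}): {CP * p_flow:8.1f} W/m^2")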

  5. A resource facility for kinetic analysis: modeling using the SAAM computer programs.

    Science.gov (United States)

    Foster, D M; Boston, R C; Jacquez, J A; Zech, L

    1989-01-01

    Kinetic analysis and integrated system modeling have contributed significantly to understanding the physiology and pathophysiology of metabolic systems in humans and animals. Many experimental biologists are aware of the usefulness of these techniques and recognize that kinetic modeling requires special expertise. The Resource Facility for Kinetic Analysis (RFKA) provides this expertise through: (1) development and application of modeling technology for biomedical problems, and (2) development of computer-based kinetic modeling methodologies concentrating on the computer program Simulation, Analysis, and Modeling (SAAM) and its conversational version, CONversational SAAM (CONSAM). The RFKA offers consultation to the biomedical community in the use of modeling to analyze kinetic data and trains individuals in using this technology for biomedical research. Early versions of SAAM were widely applied in solving dosimetry problems; many users, however, are not familiar with recent improvements to the software. The purpose of this paper is to acquaint biomedical researchers in the dosimetry field with RFKA, which, together with the joint National Cancer Institute-National Heart, Lung and Blood Institute project, is overseeing SAAM development and applications. In addition, RFKA provides many service activities to the SAAM user community that are relevant to solving dosimetry problems.

  6. A comprehensive overview of computational resources to aid in precision genome editing with engineered nucleases.

    Science.gov (United States)

    Periwal, Vinita

    2017-07-01

    Genome editing with engineered nucleases (zinc finger nucleases, TAL effector nucleases, and clustered regularly interspaced short palindromic repeats (CRISPR)/CRISPR-associated systems) has recently been shown to have great promise in a variety of therapeutic and biotechnological applications. However, their exploitation in genetic analysis and clinical settings largely depends on their specificity for the intended genomic target. Large and complex genomes often contain highly homologous/repetitive sequences, which limits the specificity of genome editing tools and could result in off-target activity. Over the past few years, various computational approaches have been developed to assist the design process and predict/reduce the off-target activity of these nucleases. These tools could be efficiently used to guide the design of constructs for engineered nucleases and evaluate results after genome editing. This review provides a comprehensive overview of various databases, tools, web servers and resources for genome editing and compares their features and functionalities. Additionally, it also describes tools that have been developed to analyse post-genome-editing results. The article also discusses important design parameters that could be considered while designing these nucleases. This review is intended to be a quick reference guide for experimentalists as well as computational biologists working in the field of genome editing with engineered nucleases. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
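
    At their simplest, the surveyed off-target predictors score candidate genomic sites by their distance from the guide sequence. A toy mismatch count (real tools add position weights, bulges and PAM rules; all sequences below are hypothetical):

        def mismatches(guide, site):
            """Hamming distance between a guide and an equal-length site."""
            assert len(guide) == len(site)
            return sum(a != b for a, b in zip(guide, site))

        guide = "GACGTTAGCCTAGGATCCAA"            # hypothetical 20-nt spacer
        candidates = {                             # hypothetical loci -> protospacer
            "chr1:1045": "GACGTTAGCCTAGGATCCAA",
            "chr4:5532": "GACGTTAGCATAGGATCGAA",
            "chr7:9010": "GTCGATAGCCTACGATCCAT",
        }

        # rank candidate sites from most to least similar to the guide
        for locus, seq in sorted(candidates.items(),
                                 key=lambda kv: mismatches(guide, kv[1])):
            print(f"{locus}: {mismatches(guide, seq)} mismatches")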

  7. [Funding, public spending and management of health resources: the current situation in a Brazilian state].

    Science.gov (United States)

    Leite, Valéria Rodrigues; Lima, Kenio Costa; de Vasconcelos, Cipriano Maia

    2012-07-01

    This article investigates the issue of funding and the decentralization process in order to examine the composition, application and management of resources in the healthcare area. The sample surveyed involved 14 municipalities in the state of Rio Grande do Norte, Brazil. The research involved data gathering of financial transfers, the municipality's own resources and primary healthcare expenses. Management analysis included a survey of local managers and counselors. It was seen that the Unified Health System is funded mainly by federal transfers and municipal revenues and to a far lesser extent by state resources. Funds have been applied predominantly in primary healthcare. The management process saw centralization of actions in the city governments. Municipal secretarial offices and councils comply partially with legislation, though they have problems with autonomy and social control. The results show that planning and management instruments are limited, due to the contradictions inherent to the institutional, political and cultural context of the region.

  8. A Critical Assessment of the Resource Depletion Potential of Current and Future Lithium-Ion Batteries

    Directory of Open Access Journals (Sweden)

    Jens F. Peters

    2016-12-01

    Full Text Available Resource depletion aspects are repeatedly used as an argument for a shift towards new battery technologies. However, whether serious shortages due to the increased demand for traction and stationary batteries can actually be expected is subject to ongoing discussion. In order to identify the principal drivers of resource depletion for battery production, we assess different lithium-ion battery types and a new lithium-free battery technology (sodium-ion) under this aspect, applying different assessment methodologies. The findings show that very different results are obtained with existing impact assessment methodologies, which hinders clear interpretation. While cobalt, nickel and copper can generally be considered critical metals, the magnitude of their depletion impacts in comparison with those of other battery materials like lithium, aluminum or manganese differs substantially. A high importance is also found for indirect resource depletion effects caused by the co-extraction of metals from mixed ores. Remarkably, the resource depletion potential per kg of produced battery is driven only partially by the electrode materials and thus depends comparatively little on the battery chemistry itself. One of the key drivers of resource depletion seems to be the metals (and co-products) in the electronic parts required for the battery management system, a component rather independent of the actual battery chemistry. However, when assessing the batteries on a capacity basis (per kWh of storage capacity), a high energy density also turns out to be relevant, since it reduces the mass of battery required for providing one kWh, and thus the associated resource depletion impacts.

  9. Computation of stationary 3D halo currents in fusion devices with accuracy control

    Science.gov (United States)

    Bettini, Paolo; Specogna, Ruben

    2014-09-01

    This paper addresses the calculation of the resistive distribution of halo currents in three-dimensional structures of large magnetic confinement fusion machines. A Neumann electrokinetic problem is solved on a geometry so complicated that complementarity is used to monitor the discretization error. An irrotational electric field is obtained by a geometric formulation based on the electric scalar potential, whereas three geometric formulations are compared to obtain a solenoidal current density: a formulation based on the electric vector potential and two geometric formulations inspired from mixed and mixed-hybrid Finite Elements. The electric vector potential formulation is usually considered impractical since an enormous computing power is wasted by the topological pre-processing it requires. To solve this challenging problem, we present novel algorithms based on lazy cohomology generators that enable to save orders of magnitude computational time with respect to all other state-of-the-art solutions proposed in literature. Believing that our results are useful in other fields of scientific computing, the proposed algorithm is presented as a detailed pseudocode in such a way that it can be easily implemented.
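
    For orientation, a sketch (with assumed notation) of the scalar-potential Neumann problem the paper's first formulation solves: conductivity sigma, structure domain Omega, halo current density j_h injected through the wetted boundary Gamma_h.

        \nabla \cdot \big( \sigma \, \nabla V \big) = 0 \quad \text{in } \Omega,
        \qquad \vec{J} = -\sigma \, \nabla V,
        \qquad \vec{J} \cdot \vec{n} = j_h \ \text{on } \Gamma_h,
        \qquad \vec{J} \cdot \vec{n} = 0 \ \text{on } \partial\Omega \setminus \Gamma_h.

    Complementarity then brackets the dissipated power between the value given by this potential-based solution and the value given by a current-based (electric vector potential or mixed) solution; the gap between the two discrete energies is the discretization-error monitor the abstract mentions.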

  10. Resource Loss Moderates the Association Between Child Abuse and Current PTSD Symptoms Among Women in Primary-Care Settings.

    Science.gov (United States)

    Costa, Eleonora C V; Guimarães, Sara; Ferreira, Domingos; Pereira, M Graça

    2016-09-01

    This study examined whether abuse during childhood, rape in adulthood, and loss of resources predict a woman's probability of reporting symptoms of posttraumatic stress disorder (PTSD), and whether resource loss moderates the association between reported childhood abuse and PTSD symptoms. The sample included 767 women and was collected in publicly funded primary-care settings. Women who reported having been abused during childhood also reported more resource loss, more acute PTSD symptoms, and more adult rape than those who reported no childhood abuse. Hierarchical logistic regression yielded a two-variable additive model in which child abuse and adult rape predict the probability of reporting any PTSD symptoms, explaining 59.7% of the variance. Women abused as children were 1 to 2 times more likely to report PTSD symptoms, with sexual abuse during childhood contributing most strongly to this result. Similarly, women reporting adult rape were almost twice as likely to report symptoms of PTSD as those not reporting it. Resource loss was, unexpectedly, not among the predictors, but a moderation analysis showed that such loss moderated the association between child abuse and current PTSD symptoms, with resource loss increasing the number and severity of PTSD symptoms in women who also reported childhood abuse. The findings highlight the importance of early assessment and intervention in providing mental health care to abused, neglected, and impoverished women to help them prevent and reverse resource loss and revictimization.
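
    In regression terms, the moderation reported here corresponds to an interaction term. A minimal sketch with statsmodels on simulated data (all coefficients and the data-generating process below are invented, not the study's):

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(42)
        n = 767
        abuse = rng.binomial(1, 0.3, n)          # childhood abuse (0/1)
        loss = rng.normal(0, 1, n)               # standardized resource loss
        # simulated moderation: the abuse effect grows with resource loss
        logit_p = -1.0 + 0.8 * abuse + 0.3 * loss + 0.6 * abuse * loss
        ptsd = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

        df = pd.DataFrame({"ptsd": ptsd, "abuse": abuse, "loss": loss})
        # "abuse * loss" expands to both main effects plus their interaction
        model = smf.logit("ptsd ~ abuse * loss", data=df).fit(disp=False)
        print(model.summary().tables[1])   # inspect the abuse:loss coefficient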

  11. Police at School: A Brief History and Current Status of School Resource Officers

    Science.gov (United States)

    Weiler, Spencer C.; Cray, Martha

    2011-01-01

    The school resource officer (SRO) program began in the United States in the early to mid-1950s; however, the program did not gain prominence until the 1990s, in response to various school shootings. According to national data, SROs can be found in 35 percent of schools across America, regardless of level (elementary, middle, or high school),…

  12. Functional requirements of computer systems for the U.S. Geological Survey, Water Resources Division, 1988-97

    Science.gov (United States)

    Hathaway, R.M.; McNellis, J.M.

    1989-01-01

    Investigating the occurrence, quantity, quality, distribution, and movement of the Nation's water resources is the principal mission of the U.S. Geological Survey's Water Resources Division. Reports of these investigations are published and available to the public. To accomplish this mission, the Division requires substantial computer technology to process, store, and analyze data from more than 57,000 hydrologic sites. The Division's computer resources are organized through the Distributed Information System Program Office, which manages the nationwide network of computers. The contract that provides the major computer components for the Water Resources Division's Distributed Information System expires in 1991. Five work groups were organized to collect the information needed to procure a new generation of computer systems for the U.S. Geological Survey, Water Resources Division. Each group was assigned a major Division activity and asked to describe its functional requirements of computer systems for the next decade. The work groups and major activities are: (1) hydrologic information; (2) hydrologic applications; (3) geographic information systems; (4) reports and electronic publishing; and (5) administrative. The work groups identified 42 functions and described their functional requirements for 1988, 1992, and 1997. A few new functions, such as Decision Support Systems and Executive Information Systems, were identified, but most are the same as those performed today. Although the number of functions will remain about the same, steady growth in the size, complexity, and frequency of many functions is predicted for the next decade. No compensating increase in the Division's staff is anticipated during this period. To handle the increased workload and perform these functions, new approaches will be developed that use advanced computer technology. The advanced technology is required in a unified, tightly coupled system that will support all functions simultaneously.

  13. Social media as an open-learning resource in medical education: current perspectives.

    Science.gov (United States)

    Sutherland, S; Jalali, A

    2017-01-01

    Numerous studies evaluate the use of social media as an open-learning resource in education, but there is little published empirical evidence that such open-learning resources produce educative outcomes, particularly with regard to student performance. This study undertook a systematic review of the published literature in medical education to determine the state of the evidence from empirical studies that conduct an evaluation or research regarding social media and open-learning resources. The authors searched MEDLINE, ERIC, Embase, PubMed, Scopus, and Google Scholar from 2012 to 2017. This search used keywords related to social media, medical education, research, and evaluation, while restricting the results to peer reviewed, English language articles only. To meet inclusion criteria, manuscripts had to employ evaluative methods and undertake empirical research. Empirical work designed to evaluate the impact of social media as an open-learning resource in medical education is limited, as only 13 studies met the inclusion criteria. The majority of these studies used undergraduate medical education as the backdrop to investigate open-learning resources such as Facebook, Twitter, and YouTube. YouTube appears to have little educational value due to the unsupervised nature of content added on a daily basis. Overall, extant reviews have demonstrated that we know a considerable amount about social media use, although to date its impacts remain unclear. There is a paucity of outcome-based, empirical studies assessing the impact of social media in medical education. The few empirical studies identified tend to focus on evaluating the affective outcomes of social media and medical education as opposed to understanding any linkages between social media and performance outcomes. Given the potential for social media use in medical education, more empirical evaluative studies are required to determine educational value.

  14. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was drawn up at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  15. TomoMINT - the current status and future plan of MINT's computed tomography scanner

    International Nuclear Information System (INIS)

    Kanesan Sinnakaruppan; Jaafar Abdullah

    2000-01-01

    TomoMINT, a second generation computed tomography scanner developed by MINT, is a powerful non-destructive evaluation (NDE) technique for producing two-dimensional cross-sectional images of an object without physically sectioning it. Characteristics of the internal structure of an object such as dimensions, shape, internal defects, density and component distribution are readily available from the scan. Tomographs of wood, metal components and concrete slabs have been successfully obtained from TomoMINT. This paper deals with the current status and future development of this scanner. (author)

  16. Advanced computer techniques for inverse modeling of electric current in cardiac tissue

    Energy Technology Data Exchange (ETDEWEB)

    Hutchinson, S.A.; Romero, L.A.; Diegert, C.F.

    1996-08-01

    For many years, ECGs and vector cardiograms have been the tools of choice for non-invasive diagnosis of cardiac conduction problems, such as found in reentrant tachycardia or Wolff-Parkinson-White (WPW) syndrome. Through skillful analysis of these skin-surface measurements of cardiac-generated electric currents, a physician can deduce the general location of heart conduction irregularities. Using a combination of high-fidelity geometry modeling, advanced mathematical algorithms and massively parallel computing, Sandia's approach would provide much more accurate information and thus allow the physician to pinpoint the source of an arrhythmia or abnormal conduction pathway.

  17. FELIX experiments and computational needs for eddy current analysis of fusion reactors

    International Nuclear Information System (INIS)

    Turner, L.R.

    1984-01-01

    In a fusion reactor, changing magnetic fields are closely coupled to the electrically-conducting metal structure. This coupling is particularly pronounced in a tokamak reactor in which magnetic fields are used to confine, stabilize, drive, and heat the plasma. Electromagnetic effects in future fusion reactors will have far-reaching implications in the configuration, operation, and maintenance of the reactors. This paper describes the impact of eddy-current effects on future reactors, the requirements of computer codes for analyzing those effects, and the FELIX experiments which will provide needed data for code validation.

  18. Computer-aided diagnosis in radiological imaging: current status and future challenges

    Science.gov (United States)

    Doi, Kunio

    2009-10-01

    Computer-aided diagnosis (CAD) has become one of the major research subjects in medical imaging and diagnostic radiology. Many different types of CAD schemes are being developed for detection and/or characterization of various lesions in medical imaging, including conventional projection radiography, CT, MRI, and ultrasound imaging. Commercial systems for detection of breast lesions on mammograms have been developed and have received FDA approval for clinical use. CAD may be defined as a diagnosis made by a physician who takes into account the computer output as a "second opinion". The purpose of CAD is to improve the quality and productivity of physicians in their interpretation of radiologic images. The quality of their work can be improved in terms of the accuracy and consistency of their radiologic diagnoses. In addition, the productivity of radiologists is expected to be improved by a reduction in the time required for their image readings. The computer output is derived from quantitative analysis of radiologic images by use of various methods and techniques in computer vision, artificial intelligence, and artificial neural networks (ANNs). The computer output may indicate a number of important parameters, for example, the locations of potential lesions such as lung cancer and breast cancer, the likelihood of malignancy of detected lesions, and the likelihood of various diseases based on differential diagnosis in a given image and clinical parameters. In this review article, the basic concept of CAD is first defined, and the current status of CAD research is then described. In addition, the potential of CAD in the future is discussed and predicted.

  19. Modeling of Groundwater Resources Heavy Metals Concentration Using Soft Computing Methods: Application of Different Types of Artificial Neural Networks

    Directory of Open Access Journals (Sweden)

    Meysam Alizamir

    2017-09-01

    Full Text Available Nowadays, groundwater resources play a vital role as a source of drinking water in arid and semiarid regions, and forecasting of pollutant content in these resources is very important. Therefore, this study aimed to compare two soft computing methods for modeling Cd, Pb and Zn concentrations in the groundwater resources of Asadabad Plain, Western Iran. The relative accuracy of several soft computing models, namely the multi-layer perceptron (MLP) and the radial basis function (RBF) network, for forecasting heavy metal concentrations was investigated. In addition, the Levenberg-Marquardt, gradient descent and conjugate gradient training algorithms were utilized for the MLP models. The ANN models for this study were developed using the MATLAB R2014 software. The MLP performed better than the other models for heavy metal concentration estimation. The simulation results revealed that the MLP model was able to model heavy metal concentrations in groundwater resources favorably, and it can be utilized effectively in environmental applications and water quality estimation. In addition, of the three algorithms, Levenberg-Marquardt performed best. This study proposed soft computing modeling techniques for the prediction and estimation of heavy metal concentrations in the groundwater resources of Asadabad Plain. Based on data collected from the plain, MLP and RBF models were developed for each heavy metal, and the MLP can be utilized effectively in predicting heavy metal concentrations in the groundwater resources of Asadabad Plain.
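
    A rough scikit-learn analogue of the modeling setup described, on synthetic placeholder data (note that scikit-learn has no Levenberg-Marquardt trainer, so 'lbfgs' stands in as a quasi-second-order optimizer):

        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.model_selection import train_test_split
        from sklearn.preprocessing import StandardScaler
        from sklearn.metrics import r2_score

        rng = np.random.default_rng(7)
        X = rng.normal(size=(200, 5))      # e.g. EC, pH, depth, distance, ...
        # synthetic "Cd concentration" with a mild nonlinearity plus noise
        y = 0.5 * X[:, 0] - 0.3 * X[:, 1] ** 2 + 0.1 * rng.normal(size=200)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        scaler = StandardScaler().fit(X_tr)

        mlp = MLPRegressor(hidden_layer_sizes=(10,), solver="lbfgs",
                           max_iter=2000, random_state=0)
        mlp.fit(scaler.transform(X_tr), y_tr)
        print("test R^2:",
              round(r2_score(y_te, mlp.predict(scaler.transform(X_te))), 3))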

  20. Genetic resources of perennial forage grasses in Serbia: Current state, broadening and evaluation

    Directory of Open Access Journals (Sweden)

    Sokolović Dejan

    2017-01-01

    Full Text Available Due to the historical background of vegetation development and its geographical position, climate and relief, Serbia represents one of the 158 world biodiversity centres, based upon the number of plant species relative to territory size (biodiversity index 0.72). Large areas in Serbia are under natural grasslands and pastures, composed of forage grass species, and important as a source of natural plant genetic diversity and germplasm for breeding. These ecosystems represent a basic prerequisite for sustainable forage production, but very little of their potential is utilized and their genetic resources are not protected. The family Poaceae is present in the Serbian flora with 70 genera, and among them, from the aspect of forage production and quality, the most important are the perennial Festuca, Lolium, Dactylis, Phleum, Bromus, Arrhenatherum, Poa and Agrostis species. Most of these grasses have been bred in Serbia and many cultivars have been released. These cultivars contain autochthonous Serbian material and represent a great and important resource of genetic variability. Therefore, collecting new samples that are acclimatised to local eco-geographical conditions and including them in an ex situ plant gene bank is of exceptional importance for further utilization in different plant breeding programmes as well as for the protection of genetic resources. These autochthonous populations possess natural variability and very often show satisfactory yielding performance in comparison with introduced cultivars, which recommends them for direct phenotypic selection in cultivar release. Broadening the collection of forage grass genotypes is a permanent objective of Serbian scientists. Collected accessions are being characterized and evaluated for important phenological, morphological and agronomic traits. In this paper the genetic resources of forage grass species, their diversity and potential, the state of the grass gene banks, as well as the possibility of breeding new cultivars, are analysed.

  1. VME computer monitoring system of KEK-PS fast pulsed magnet currents and beam intensities

    International Nuclear Information System (INIS)

    Kawakubo, T.; Akiyama, A.; Kadokura, E.; Ishida, T.

    1992-01-01

    For beam transfer from the KEK-PS Linac to the Booster synchrotron ring and from the Booster to the Main ring, many pulse magnets have been installed. It is very important for machine operation to monitor the firing time, rise time and peak value of the pulsed magnet currents. It is also very important for magnet tuning to obtain good injection efficiency in the Booster and the Main ring, and to observe the last circulating bunched beam in the Booster as well as the first circulating in the Main ring. These magnet currents and beam intensity signals are digitized by a digital oscilloscope with signal multiplexers and then shown on a graphic display screen of the console via a VME computer. (author)
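
    A minimal sketch of the waveform parameters such a monitor extracts from a digitized pulse (the trace, thresholds and sample rate below are invented for illustration):

        import numpy as np

        FS = 1e6                                   # sample rate [Hz], assumed
        t = np.arange(0, 5e-3, 1 / FS)
        # synthetic pulse: ramp to 1 kA starting at t = 1 ms, ending at 3 ms
        i = 1000.0 * np.clip((t - 1e-3) / 0.5e-3, 0, 1)
        i[t > 3e-3] = 0.0

        peak = i.max()
        fire_idx = np.argmax(i > 0.05 * peak)      # first sample above 5% of peak
        t10 = t[np.argmax(i >= 0.1 * peak)]        # 10% crossing
        t90 = t[np.argmax(i >= 0.9 * peak)]        # 90% crossing
        print(f"firing time : {t[fire_idx] * 1e3:.3f} ms")
        print(f"rise time   : {(t90 - t10) * 1e6:.1f} us (10-90%)")
        print(f"peak current: {peak:.0f} A")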

  2. Computer programs for the acquisition and analysis of eddy-current array probe data

    International Nuclear Information System (INIS)

    Pate, J.R.; Dodd, C.V.

    1996-07-01

    The objective of the Improved Eddy-Current ISI (in-service inspection) for Steam Generators Tubing program is to upgrade and validate eddy-current inspections, including probes, instrumentation, and data processing techniques, for ISI of new, used, and repaired steam generator tubes; to improve defect detection, classification and characterization as affected by diameter and thickness variations, denting, probe wobble, tube sheet, tube supports, copper and sludge deposits, even when defect types and other variables occur in combination; and to transfer this advanced technology to NRC's mobile NDE laboratory and staff. This report documents computer programs that were developed for the acquisition of eddy-current data from specially designed 16-coil array probes. Complete code as well as instructions for use are provided.

  3. Nanocrystalline material in toroidal cores for current transformer: analytical study and computational simulations

    Directory of Open Access Journals (Sweden)

    Benedito Antonio Luciano

    2005-12-01

    Full Text Available Based on electrical and magnetic properties such as saturation magnetization, initial permeability, and coercivity, this work presents some considerations about possible applications of nanocrystalline alloys in toroidal cores for current transformers (CTs). It discusses how the magnetic characteristics of the core material affect the performance of the current transformer. From the magnetic characterization and computational simulations using the finite element method (FEM), it has been verified that, at flux densities typical of CT operation, the properties of nanocrystalline alloys reinforce the hypothesis that using these materials in measurement CT cores can reduce the ratio and phase errors and improve the accuracy class.
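
    For reference, the two accuracy figures mentioned, ratio error and phase displacement, are conventionally defined as follows (IEC-style notation assumed: rated transformation ratio K_n, primary current I_p, secondary current I_s):

        \varepsilon \,[\%] \;=\; \frac{K_n \, I_s - I_p}{I_p} \times 100,
        \qquad
        \delta \;=\; \arg(I_s) - \arg(I_p) \quad \text{(phase displacement)}.

    A higher-permeability, lower-coercivity core reduces the magnetizing current drawn by the CT, which is what shrinks both figures.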

  4. Computer-aided diagnosis in medical imaging: historical review, current status and future potential.

    Science.gov (United States)

    Doi, Kunio

    2007-01-01

    Computer-aided diagnosis (CAD) has become one of the major research subjects in medical imaging and diagnostic radiology. In this article, the motivation and philosophy for early development of CAD schemes are presented together with the current status and future potential of CAD in a PACS environment. With CAD, radiologists use the computer output as a "second opinion" and make the final decisions. CAD is a concept established by taking into account equally the roles of physicians and computers, whereas automated computer diagnosis is a concept based on computer algorithms only. With CAD, the performance by computers does not have to be comparable to or better than that by physicians, but needs to be complementary to that by physicians. In fact, a large number of CAD systems have been employed for assisting physicians in the early detection of breast cancers on mammograms. A CAD scheme that makes use of lateral chest images has the potential to improve the overall performance in the detection of lung nodules when combined with another CAD scheme for PA chest images. Because vertebral fractures can be detected reliably by computer on lateral chest radiographs, radiologists' accuracy in the detection of vertebral fractures would be improved by the use of CAD, and thus early diagnosis of osteoporosis would become possible. In MRA, a CAD system has been developed for assisting radiologists in the detection of intracranial aneurysms. On successive bone scan images, a CAD scheme for detection of interval changes has been developed by use of temporal subtraction images. In the future, many CAD schemes could be assembled as packages and implemented as a part of PACS. For example, the package for chest CAD may include the computerized detection of lung nodules, interstitial opacities, cardiomegaly, vertebral fractures, and interval changes in chest radiographs as well as the computerized classification of benign and malignant nodules and the differential diagnosis of

  5. Building an application for computing the resource requests such as disk, CPU, and tape and studying the time evolution of computing model

    CERN Document Server

    Noormandipour, Mohammad Reza

    2017-01-01

    The goal of this project was to build an application to calculate the computing resources needed by the LHCb experiment for data processing and analysis, and to predict their evolution in future years. The source code was developed in the Python programming language, and the application was built and developed in CERN GitLab. This application will facilitate the calculation of the resources required by LHCb in both qualitative and quantitative terms. The granularity of the computations is improved to a weekly basis, in contrast with the yearly basis used so far. The LHCb computing model will benefit from the new possibilities and options added, as the new predictions and calculations are aimed at giving more realistic and accurate estimates.
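
    A minimal sketch of the kind of weekly-granularity resource bookkeeping described; every parameter name and value below is a hypothetical placeholder, not an LHCb computing-model figure:

        WEEKS = 52

        def weekly_requests(events_per_week,       # events recorded that week
                            cpu_s_per_event=50.0,  # assumed HS06-seconds per event
                            kb_per_event=60.0,     # assumed event size on disk [kB]
                            replicas=2):           # assumed number of disk replicas
            cpu = events_per_week * cpu_s_per_event            # HS06 x seconds
            disk = events_per_week * kb_per_event * replicas   # kB on disk
            tape = events_per_week * kb_per_event              # one archival copy
            return cpu, disk, tape

        # e.g. flat data taking for 30 weeks of the year, nothing otherwise
        profile = [2e9 if 10 <= w < 40 else 0 for w in range(WEEKS)]
        totals = [sum(x) for x in zip(*(weekly_requests(e) for e in profile))]
        print(f"CPU : {totals[0] / 3.15e7 / 1e3:,.0f} kHS06-years")
        print(f"Disk: {totals[1] / 1e12:.1f} PB   Tape: {totals[2] / 1e12:.1f} PB")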

  6. Social media as an open-learning resource in medical education: current perspectives

    Directory of Open Access Journals (Sweden)

    Sutherland S

    2017-06-01

    Full Text Available S Sutherland,1 A Jalali2 1Department of Critical Care, The Ottawa Hospital, 2Division of Clinical and Functional Anatomy, Faculty of Medicine, University of Ottawa, Ottawa, ON, Canada Purpose: Numerous studies evaluate the use of social media as an open-learning resource in education, but there is little published empirical evidence that such open-learning resources produce educative outcomes, particularly with regard to student performance. This study undertook a systematic review of the published literature in medical education to determine the state of the evidence from empirical studies that conduct an evaluation or research regarding social media and open-learning resources. Methods: The authors searched MEDLINE, ERIC, Embase, PubMed, Scopus, and Google Scholar from 2012 to 2017. This search used keywords related to social media, medical education, research, and evaluation, while restricting the results to peer reviewed, English language articles only. To meet inclusion criteria, manuscripts had to employ evaluative methods and undertake empirical research. Results: Empirical work designed to evaluate the impact of social media as an open-learning resource in medical education is limited, as only 13 studies met the inclusion criteria. The majority of these studies used undergraduate medical education as the backdrop to investigate open-learning resources such as Facebook, Twitter, and YouTube. YouTube appears to have little educational value due to the unsupervised nature of content added on a daily basis. Overall, extant reviews have demonstrated that we know a considerable amount about social media use, although to date its impacts remain unclear. Conclusion: There is a paucity of outcome-based, empirical studies assessing the impact of social media in medical education. The few empirical studies identified tend to focus on evaluating the affective outcomes of social media and medical education as opposed to understanding any linkages between social media and performance outcomes.

  7. Cross stratum resources protection in fog-computing-based radio over fiber networks for 5G services

    Science.gov (United States)

    Guo, Shaoyong; Shao, Sujie; Wang, Yao; Yang, Hui

    2017-09-01

    In order to meet the requirements of the internet of things (IoT) and 5G, the cloud radio access network is a paradigm that converges all base stations' computational resources into a cloud baseband unit (BBU) pool, while the distributed radio frequency signals are collected by remote radio heads (RRHs). A precondition for centralized processing in the BBU pool is an interconnecting fronthaul network with high capacity and low delay. However, the interaction between RRH and BBU, and resource scheduling among BBUs in the cloud, have become more complex and frequent. A cloud radio over fiber network has already been proposed in our previous work. In order to overcome the complexity and latency, in this paper we first present a novel cross stratum resources protection (CSRP) architecture in fog-computing-based radio over fiber networks (F-RoFN) for 5G services. Additionally, a cross stratum protection (CSP) scheme considering network survivability is introduced in the proposed architecture. The CSRP with the CSP scheme can effectively pull remote processing resources locally to implement cooperative radio resource management, enhance the responsiveness and resilience to dynamic end-to-end 5G service demands, and globally optimize optical network, wireless and fog resources. The feasibility and efficiency of the proposed architecture with the CSP scheme are verified on our software defined networking testbed in terms of service latency, transmission success rate, resource occupation rate and blocking probability.

  8. On a numerical strategy to compute gravity currents of non-Newtonian fluids

    International Nuclear Information System (INIS)

    Vola, D.; Babik, F.; Latche, J.-C.

    2004-01-01

    This paper is devoted to the presentation of a numerical scheme for the simulation of gravity currents of non-Newtonian fluids. The two-dimensional computational grid is fixed, and the free surface is described as a polygonal interface, independent of the grid, that is advanced in time by a Lagrangian technique. The Navier-Stokes equations are semi-discretized in time by the characteristic-Galerkin method, which finally leads to solving a generalized Stokes problem posed on a physical domain, limited by the free surface, that covers only part of the computational grid. To this purpose, we implement a Galerkin technique with a particular approximation space, defined as the restriction to the fluid domain of functions of a finite element space. The decomposition-coordination method allows us to deal, without any regularization, with a variety of non-linear and possibly non-differentiable constitutive laws. Besides more analytical tests, we revisit with this numerical method some simulations of gravity currents from the literature, up to now investigated within the simplified thin-flow approximation framework.
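
    One standard example of the non-differentiable constitutive laws such a scheme must handle without regularization is the Bingham model (a sketch with assumed notation: yield stress tau_y, plastic viscosity mu, strain-rate tensor D(u)):

        \boldsymbol{\tau} \;=\; 2\mu\, D(u) \;+\; \tau_y \,\frac{D(u)}{|D(u)|}
        \quad \text{if } |D(u)| \neq 0,
        \qquad
        |\boldsymbol{\tau}| \;\le\; \tau_y \quad \text{if } D(u) = 0.

    The law is multivalued at D(u) = 0 (unyielded, rigid zones), which is precisely what a regularized viscosity model would smooth away and what the decomposition-coordination approach treats exactly.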

  9. Current strategies for improving access and adherence to antiretroviral therapies in resource-limited settings

    Directory of Open Access Journals (Sweden)

    Scanlon ML

    2013-01-01

    Full Text Available Michael L Scanlon,1,2 Rachel C Vreeman1,2 1Department of Pediatrics, Indiana University School of Medicine, Indianapolis, IN, USA; 2USAID, Academic Model Providing Access to Healthcare (AMPATH) Partnership, Eldoret, Kenya Abstract: The rollout of antiretroviral therapy (ART) significantly reduced human immunodeficiency virus (HIV)-related morbidity and mortality, but good clinical outcomes depend on access and adherence to treatment. In resource-limited settings, where over 90% of the world’s HIV-infected population resides, data on barriers to treatment are emerging that contribute to low rates of uptake in HIV testing, linkage to and retention in HIV care systems, and suboptimal adherence rates to therapy. A review of the literature reveals limited evidence to inform strategies to improve access and adherence, with the majority of studies coming from sub-Saharan Africa. Data from observational studies and randomized controlled trials support home-based, mobile and antenatal care HIV testing, task-shifting from doctor-based to nurse-based and lower-level provider care, and adherence support through education, counseling and mobile phone messaging services. Strategies with more limited evidence include targeted HIV testing for couples and family members of ART patients, decentralization of HIV care, including through home- and community-based ART programs, and adherence promotion through peer health workers, treatment supporters, and directly observed therapy. There is little evidence for improving access and adherence among vulnerable groups such as women, children and adolescents, and other high-risk populations, or for addressing major barriers. Overall, studies are few in number and suffer from methodological issues. Recommendations for further research include health information technology, social-level factors like HIV stigma, and new research directions in cost-effectiveness, operations, and implementation. Findings from this review make a

  10. Core clerkship directors: their current resources and the rewards of the role.

    Science.gov (United States)

    Ephgrave, Kimberly; Margo, Katherine L; White, Christopher; Hammoud, Maya; Brodkey, Amy; Painter, Thomas; Juel, Vern C; Shaw, Darlene; Ferguson, Kristi

    2010-04-01

    To conduct a national multidisciplinary investigation assessing core clinical clerkships and their directors, variances in resources from national guidelines, and the impact of the clerkship director role on faculty members' academic productivity, advancement, and satisfaction. A multidisciplinary working group of the Alliance for Clinical Education (ACE), representing all seven core clinical disciplines, created and distributed a survey to clerkship directors at 125 U.S. MD-granting medical schools, in academic year 2006-2007. A total of 544 clerkship directors from Internal Medicine (96), Family Medicine (91), Psychiatry, (91), Pediatrics (79), Surgery (71), Neurology (60), and Obstetrics-Gynecology (56) responded, representing over 60% of U.S. core clinical clerkships. The clerkship directors were similar across disciplines in demographics and academic productivity, though clinical and clerkship activities varied. Departmental staff support for clerkships averaged 0.69 people, distinctly less than the ACE's 2003 guideline of a full-time coordinator in all disciplines' clerkships. Clerkship directors reported heavy clinical responsibilities, which, as in previous studies, were negatively related to academic productivity. However, many clerkship directors felt the role enhanced their academic advancement; a large majority felt it significantly enhanced their career satisfaction. The resources and rewards of the clerkship director role were similar across disciplines. Expectations of clerkship directors were considerable, including responsibility for clinical material and the learning environment. Resources for many fall short of those stated in the ACE guidelines, particularly regarding support staff. However, the findings indicate that the clerkship director role can have benefits for academic advancement and strongly enhances career satisfaction.

  11. Discharge Fee Policy Analysis: A Computable General Equilibrium (CGE Model of Water Resources and Water Environments

    Directory of Open Access Journals (Sweden)

    Guohua Fang

    2016-09-01

    Full Text Available To alleviate increasingly serious water pollution and shortages in developing countries, various kinds of policies have been implemented by local governments. It is vital to quantify and evaluate the performance and potential economic impacts of these policies. This study develops a Computable General Equilibrium (CGE) model to simulate the regional economic and environmental effects of discharge fees. Firstly, water resources and water environment factors are separated from the input and output sources of the National Economic Production Department. Secondly, an extended Social Accounting Matrix (SAM) of Jiangsu province is developed to simulate various scenarios. By changing the values of the discharge fees (increased by 50%, 100%, and 150%), three scenarios are simulated to examine their influence on the overall economy and each industry. The simulation results show that an increased fee will have a negative impact on Gross Domestic Product (GDP); however, waste water may be effectively controlled. Also, this study demonstrates that along with the economic costs, the increase of the discharge fee will lead to the upgrading of industrial structures from a situation of heavy pollution to one of light pollution, which is beneficial to the sustainable development of the economy and the protection of the environment.

  12. Disposal of waste computer hard disk drive: data destruction and resources recycling.

    Science.gov (United States)

    Yan, Guoqing; Xue, Mianqiang; Xu, Zhenming

    2013-06-01

    An increasing quantity of discarded computers is accompanied by a sharp increase in the number of hard disk drives to be eliminated. A waste hard disk drive is a special form of waste electrical and electronic equipment because it holds large amounts of information that is closely connected with its user. Therefore, the treatment of waste hard disk drives is an urgent issue in terms of data security, environmental protection and sustainable development. In the present study the degaussing method was adopted to destroy the residual data on the waste hard disk drives, and the housing of the disks was used as an example to explore the coating removal process, which is the most important pretreatment for aluminium alloy recycling. The key operating points determined for degaussing were: (1) keep the platter plate parallel with the magnetic field direction; and (2) increasing the magnetic field intensity B and the action time t leads to a significant improvement in the degaussing effect. The coating removal experiment indicated that heating the waste hard disk drive housing at a temperature of 400 °C for 24 min was the optimum condition. A novel integrated technique for the treatment of waste hard disk drives is proposed herein. This technique offers the possibility of destroying residual data, recycling the recovered resources and disposing of the disks in an environmentally friendly manner.

  13. Increasing efficiency of job execution with resource co-allocation in distributed computer systems

    OpenAIRE

    Cankar, Matija

    2014-01-01

    The field of distributed computer systems, while not new in computer science, is still the subject of a lot of interest in both industry and academia. More powerful computers, faster and more ubiquitous networks, and complex distributed applications are accelerating the growth of distributed computing. Large numbers of computers interconnected in a single network provide additional computing power to users whenever required. Such systems are, however, expensive and complex to manage, which ca...

  14. BK Virus-Associated Nephropathy: Current Situation in a Resource-Limited Country.

    Science.gov (United States)

    Yooprasert, P; Rotjanapan, P

    Data on BK virus-associated nephropathy (BKVAN) and treatment strategy in a resource-limited country are scarce. This study aimed to evaluate the epidemiology of BKVAN and its situation in Thailand. A retrospective analysis was conducted among adult kidney transplant recipients at Ramathibodi Hospital from October 2011 to September 2016. Patients' demographic data, information on kidney transplantation, immunosuppressive therapy, cytomegalovirus and BK virus infections, and allograft outcomes were retrieved and analyzed. This study included 623 kidney transplant recipients. Only 327 patients (52.49%) received BK virus infection screening, and 176 of 327 patients had allograft dysfunction as a trigger for screening. BKVAN was identified in 39 of 327 patients (11.93%). Deceased donor transplantation and cytomegalovirus infection were associated with a higher risk of BKVAN (odds ratio = 2.2, P = .024, 95% confidence interval [1.1, 4.43], and odds ratio = 2.6, P = .006, 95% confidence interval [1.29, 5.26], respectively). BKVAN patients were at significantly higher risk for allograft rejection (P < .001) and allograft failure (P = .036). At the end of the study, 4 graft losses were documented (12.12%). BKVAN was associated with a high rate of allograft rejection and failure. However, surveillance of its complications has been underperformed at our facility. Implementing a formal practice guideline may improve allograft outcomes in resource-limited countries. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
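    The odds ratios and 95% confidence intervals reported above are standard epidemiological quantities. As a minimal illustration of how such figures are derived from a 2x2 exposure-outcome table, the following Python sketch computes an odds ratio with a Wald-type interval; the counts used are hypothetical and are not taken from the study.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)   # standard error of ln(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, (lo, hi)

# Hypothetical counts, for illustration only
print(odds_ratio_ci(20, 100, 19, 188))
```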

  15. Doctors as managers of healthcare resources in Nigeria: Evolving roles and current challenges.

    Science.gov (United States)

    Ojo, Temitope Olumuyiwa; Akinwumi, Adebowale Femi

    2015-01-01

    Over the years, medical practice in Nigeria has evolved in scope and practice, in terms of changing disease patterns, patients' needs, and social expectations. In addition, there is a growing sentiment, especially among the general public and some health workers, that most doctors are bad managers. Besides drawing examples from some doctors in top management positions who have performed less creditably, critics also stress that more needs to be done to improve the training of doctors in health management. This article describes the role of doctors in this changing scene of practice and highlights the core areas where doctors' managerial competencies are required to improve the quality of healthcare delivery. Areas such as health care financing, essential drugs and supplies management, and human resource management are emphasized. Resources to be managed and the various skills needed to function effectively at the different levels of management are also discussed. To ensure that doctors are well skilled in managerial competencies, the article concludes by suggesting a curriculum review at the undergraduate and postgraduate levels of medical training to include newer but relevant courses on health management in addition to the existing ones, while also advocating that doctors be incentivized to pursue professional training in health management and not only in the core clinical specialties.

  16. Construction and assessment of hierarchical edge elements for three-dimensional computations of eddy currents

    Energy Technology Data Exchange (ETDEWEB)

    Midtgaard, Ole-Morten

    1997-12-31

    This thesis considers the feasibility of doing calculations to optimize electrical machines without the need to build expensive prototypes. It deals with the construction and assessment of new, hierarchical, hexahedral edge elements for three-dimensional computations of eddy currents with the electric vector potential formulation. The new elements, five in all, gave up to second-order approximations for both the magnetic field and the current density. Theoretical arguments showed these elements to be more economical for a given polynomial order of the approximated fields than the serendipity family of nodal elements. Further, it was pointed out how the support of a source field computed by using edge elements could be made very small, provided that a proper spanning tree was used in the edge element mesh. This was exploited for the voltage forcing technique, where source fields were used as basis functions, with unknown total currents in voltage-forced conductors as degrees of freedom. The practical assessment of the edge elements showed the accuracy to improve with increasing polynomial order, both for local and global quantities. The most economical element was, however, one giving only complete first-order approximations for both fields. Further, the edge elements turned out to be better than the nodal elements in practice as well. For the voltage forcing technique, source field basis functions which had small support resulted in a large reduction of the CPU time for solving the main equation system, compared to source fields which had large support. The new elements can be used in a p-type adaptive scheme, and they should also be applicable to other tangentially continuous field problems. 67 refs., 34 figs., 10 tabs.

  17. The Development of an Individualized Instructional Program in Beginning College Mathematics Utilizing Computer Based Resource Units. Final Report.

    Science.gov (United States)

    Rockhill, Theron D.

    Reported is an attempt to develop and evaluate an individualized instructional program in pre-calculus college mathematics. Four computer based resource units were developed in the areas of set theory, relations and functions, algebra, trigonometry, and analytic geometry. Objectives were determined by experienced calculus teachers, and…

  18. Iterative reconstruction for quantitative computed tomography analysis of emphysema: consistent results using different tube currents

    Directory of Open Access Journals (Sweden)

    Yamashiro T

    2015-02-01

    Full Text Available Tsuneo Yamashiro,1 Tetsuhiro Miyara,1 Osamu Honda,2 Noriyuki Tomiyama,2 Yoshiharu Ohno,3 Satoshi Noma,4 Sadayuki Murayama1 On behalf of the ACTIve Study Group 1Department of Radiology, Graduate School of Medical Science, University of the Ryukyus, Nishihara, Okinawa, Japan; 2Department of Radiology, Osaka University Graduate School of Medicine, Suita, Osaka, Japan; 3Department of Radiology, Kobe University Graduate School of Medicine, Kobe, Hyogo, Japan; 4Department of Radiology, Tenri Hospital, Tenri, Nara, Japan Purpose: To assess the advantages of iterative reconstruction for quantitative computed tomography (CT) analysis of pulmonary emphysema. Materials and methods: Twenty-two patients with pulmonary emphysema underwent chest CT imaging using identical scanners with three different tube currents: 240, 120, and 60 mA. Scan data were converted to CT images using Adaptive Iterative Dose Reduction using Three Dimensional Processing (AIDR3D) and a conventional filtered back projection mode. Thus, six scans with and without AIDR3D were generated per patient. All other scanning and reconstruction settings were fixed. The percent low attenuation area (LAA%; < -950 Hounsfield units) and the lung density 15th percentile were automatically measured using a commercial workstation. Comparisons of LAA% and 15th percentile results between scans with and without AIDR3D were made by Wilcoxon signed-rank tests. Associations between body weight and measurement errors among these scans were evaluated by Spearman rank correlation analysis. Results: Overall, scan series without AIDR3D had higher LAA% and lower 15th percentile values than those with AIDR3D at each tube current (P<0.0001). For scan series without AIDR3D, lower tube currents resulted in higher LAA% values and lower 15th percentiles. The extent of emphysema was significantly different between each pair among scans when not using AIDR3D (LAA%, P<0.0001; 15th percentile, P<0.01), but was not
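    The two emphysema indices used in this study, the percent low attenuation area (LAA%, the share of lung voxels below -950 Hounsfield units) and the 15th percentile of the lung density histogram, are straightforward to compute once the lung voxels have been segmented. A minimal NumPy sketch, run here on synthetic voxel values rather than real CT data:

```python
import numpy as np

def emphysema_indices(lung_hu, threshold=-950.0):
    """LAA% (percentage of lung voxels below the HU threshold)
    and the 15th percentile of the lung density histogram."""
    lung_hu = np.asarray(lung_hu, dtype=float)
    laa_percent = 100.0 * np.mean(lung_hu < threshold)
    perc15 = np.percentile(lung_hu, 15)
    return laa_percent, perc15

# Synthetic lung voxels, for illustration only
rng = np.random.default_rng(0)
voxels = rng.normal(loc=-870, scale=60, size=100_000)
print(emphysema_indices(voxels))
```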

  19. Current computational modelling trends in craniomandibular biomechanics and their clinical implications.

    Science.gov (United States)

    Hannam, A G

    2011-03-01

    Computational models of interactions in the craniomandibular apparatus are used with increasing frequency to study biomechanics in normal and abnormal masticatory systems. Methods and assumptions in these models can be difficult to assess by those unfamiliar with current practices in this field; health professionals are often faced with evaluating the appropriateness, validity and significance of models which are perhaps more familiar to the engineering community. This selective review offers a foundation for assessing the strength and implications of a craniomandibular modelling study. It explores different models used in general science and engineering and focuses on current best practices in biomechanics. The problem of validation is considered at some length, because this is not always fully realisable in living subjects. Rigid-body, finite element and combined approaches are discussed, with examples of their application to basic and clinically relevant problems. Some advanced software platforms currently available for modelling craniomandibular systems are mentioned. Recent studies of the face, masticatory muscles, tongue, craniomandibular skeleton, temporomandibular joint, dentition and dental implants are reviewed, and the significance of non-linear and non-isotropic material properties is emphasised. The unique challenges in clinical application are discussed, and the review concludes by posing some questions which one might reasonably expect to find answered in plausible modelling studies of the masticatory apparatus. © 2010 Blackwell Publishing Ltd.

  20. Computer programmes for high current ion trajectories in a magnetic sector-type mass separator

    International Nuclear Information System (INIS)

    Nakai, Akira

    1988-01-01

    According to theoretical calculations previously proposed by the author, a new programme 'MALT' for electronic computers has been developed for numerical calculations of ion trajectories of a high current ion beam traversing a magnetic sector-type mass separator. In the programme, both effects of the fringing field and the space charge are taken into account in an analytical way, so that numerical calculations can be done straightforwardly. Furthermore, it also becomes possible to analyze and control the trajectories of the high current ion beam. The programme MALT contains several subroutine programmes which are separated individually for the convenience of various calculations with respect to the high current ion beam. To demonstrate the calculations by the use of these subroutine programmes, a main programme for the calculation of the trajectories in the whole region of the separator is shown, which also makes it possible to draw the traces of the trajectories. The trajectories calculated by the proposed programme have been compared with the images of the ion beams recorded on novel dry plates developed by the author: the comparison enables us to evaluate the effective space charge and the effective space charge potential, and to analyze the behaviour of the beam of neutral particles accompanying the ion beam. (author)
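    The MALT programme itself is not reproduced in the record, but the underlying task, tracing ions through a magnetic sector field, reduces to integrating the Lorentz-force equations of motion. A minimal Python sketch for a uniform field follows; the fringing-field and space-charge corrections that MALT treats analytically are omitted, and all numeric values are illustrative only.

```python
import numpy as np

def trace_ion(q_over_m, v0, r0, B, dt=1e-9, steps=20_000):
    """Integrate dr/dt = v, dv/dt = (q/m) v x B with RK4.
    Uniform field only: fringing fields and space charge,
    which the MALT programme handles analytically, are omitted."""
    def acc(v):
        return q_over_m * np.cross(v, B)
    r, v = np.array(r0, float), np.array(v0, float)
    path = [r.copy()]
    for _ in range(steps):
        k1v = acc(v);                k1r = v
        k2v = acc(v + 0.5*dt*k1v);   k2r = v + 0.5*dt*k1v
        k3v = acc(v + 0.5*dt*k2v);   k3r = v + 0.5*dt*k2v
        k4v = acc(v + dt*k3v);       k4r = v + dt*k3v
        v = v + dt/6 * (k1v + 2*k2v + 2*k3v + k4v)
        r = r + dt/6 * (k1r + 2*k2r + 2*k3r + k4r)
        path.append(r.copy())
    return np.array(path)

# Singly charged ion of mass 40 u in a 0.5 T field (illustrative values)
qm = 1.602e-19 / (40 * 1.661e-27)
path = trace_ion(qm, v0=[2e5, 0.0, 0.0], r0=[0.0, 0.0, 0.0], B=[0.0, 0.0, 0.5])
print(path[-1])
```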

  1. Historical Overview, Current Status, and Future Trends in Human-Computer Interfaces for Process Control

    International Nuclear Information System (INIS)

    Owre, Fridtjov

    2003-01-01

    Approximately 25 yr ago, the first computer-based process control systems, including computer-generated displays, appeared. It is remarkable how slowly the human-computer interfaces (HCI's) of such systems have developed over the years. The display design approach in those early days had its roots in the topology of the process. Usually, the information came from the piping and instrumentation diagrams. Later, some important additional functions were added to the basic system, such as alarm and trend displays. Today, these functions are still the basic ones, and the end-user displays have not changed much except for improved display quality in terms of colors, font types and sizes, resolution, and object shapes, resulting from improved display hardware.Today, there are two schools of display design competing for supremacy in the process control segment of the HCI community. One can be characterized by extension and integration of current practice, while the other is more revolutionary.The extension of the current practice approach can be described in terms of added system functionality and integration. This means that important functions for the plant operator - such as signal validation, plant overview information, safety parameter displays, procedures, prediction of future states, and plant performance optimization - are added to the basic functions and integrated in a total unified HCI for the plant operator.The revolutionary approach, however, takes as its starting point the design process itself. The functioning of the plant is described in terms of the plant goals and subgoals, as well as the means available to reach these goals. Then, displays are designed representing this functional structure - in clear contrast to the earlier plant topology representation. Depending on the design approach used, the corresponding displays have various designations, e.g., function-oriented, task-oriented, or ecological displays.This paper gives a historical overview of past

  2. Computer Assisted Surgery and Current Trends in Orthopaedics Research and Total Joint Replacements

    Science.gov (United States)

    Amirouche, Farid

    2008-06-01

    Musculoskeletal research has brought about revolutionary changes in our ability to perform high precision surgery in joint replacement procedures. Recent advances in computer assisted surgery, as well as better materials, have led to reduced wear and greatly enhanced the quality of life of patients. New surgical techniques to reduce the size of the incision and damage to underlying structures have been the primary advance toward this goal. These new techniques are known as MIS, or Minimally Invasive Surgery. Total hip and knee arthroplasties are at an all-time high, reaching 1.2 million surgeries per year in the USA. Primary joint failures are usually due to osteoarthritis, rheumatoid arthritis, osteonecrosis and other inflammatory arthritis conditions. The methods for THR and TKA are critical to the initial stability and longevity of the prostheses. This research aims at understanding the fundamental mechanics of joint arthroplasty and providing an insight into current challenges in patient-specific fitting, fixing, and stability. Both experimental and analytical work will be presented. We will examine the success of cementless total hip arthroplasty over the last 10 years and the role computer assisted navigation is playing in follow-up studies. Cementless total hip arthroplasty attains permanent fixation by the ingrowth of bone into a porous coated surface. Loosening of an ingrown total hip arthroplasty occurs as a result of osteolysis of the periprosthetic bone and degradation of the bone prosthetic interface. The osteolytic process occurs as a result of polyethylene wear particles produced by the metal polyethylene articulation of the prosthesis. The total hip arthroplasty is a congruent joint, and the submicron wear particles produced are phagocytized by macrophages, initiating an inflammatory cascade. This cascade produces cytokines ultimately implicated in osteolysis. The resulting bone loss both on the acetabular and femoral sides eventually leads to component instability. As

  3. Computational dosimetry for grounded and ungrounded human models due to contact current

    International Nuclear Information System (INIS)

    Chan, Kwok Hung; Hattori, Junya; Laakso, Ilkka; Hirata, Akimasa; Taki, Masao

    2013-01-01

    This study presents the computational dosimetry of contact currents for grounded and ungrounded human models. The uncertainty of the quasi-static (QS) approximation of the in situ electric field induced in a grounded/ungrounded human body due to the contact current is first estimated. Different scenarios of cylindrical and anatomical human body models are considered, and the results are compared with the full-wave analysis. In the QS analysis, the induced field in the grounded cylindrical model is calculated by the QS finite-difference time-domain (QS-FDTD) method, and compared with the analytical solution. Because no analytical solution is available for the grounded/ungrounded anatomical human body model, the results of the QS-FDTD method are then compared with those of the conventional FDTD method. The upper frequency limit for the QS approximation in the contact current dosimetry is found to be 3 MHz, with a relative local error of less than 10%. The error increases above this frequency, which can be attributed to the neglect of the displacement current. The QS or conventional FDTD method is used for the dosimetry of induced electric field and/or specific absorption rate (SAR) for a contact current injected into the index finger of a human body model in the frequency range from 10 Hz to 100 MHz. The in situ electric fields or SAR are compared with the basic restrictions in the international guidelines/standards. The maximum electric field or the 99th percentile value of the electric fields appear not only in the fat and muscle tissues of the finger, but also around the wrist, forearm, and the upper arm. Some discrepancies are observed between the basic restrictions for the electric field and SAR and the reference levels for the contact current, especially in the extremities. These discrepancies are shown by an equation that relates the current density, tissue conductivity, and induced electric field in the finger with a cross-sectional area of 1 cm². (paper)
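    The closing equation mentioned above relates the injected current to the in situ field through the current density and tissue conductivity: E = J/σ, with J = I/A over the finger's 1 cm² cross section. A minimal sketch; the muscle-like conductivity of 0.35 S/m is an illustrative assumption, not a value quoted from the paper.

```python
def in_situ_e_field(current_a, area_m2=1e-4, sigma_s_per_m=0.35):
    """E = J / sigma, with J = I / A.
    Area of 1 cm^2 as in the abstract; the conductivity of
    0.35 S/m (muscle-like) is an illustrative assumption."""
    j = current_a / area_m2        # current density, A/m^2
    return j / sigma_s_per_m       # in situ electric field, V/m

# 0.5 mA contact current through the finger cross section
print(in_situ_e_field(0.5e-3))     # ~14.3 V/m
```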

  4. THE CURRENT SITUATION OF WATER RESOURCES IN IRRIGATED AGRICULTURE OF UZBEKISTAN

    OpenAIRE

    Djalalov, Sandjar

    1998-01-01

    Irrigation in Uzbekistan is of great importance since the country is an arid zone. The use of water in agriculture is described and its relationship as a constraint to economic development discussed. The current technical and organizational characteristics of irrigation systems need study and analysis to identify opportunities for improvements. The characteristics of demand for water at the farm level are described and irrigation and land improvement activities are outlined. Reform of water u...

  5. Current role of multidetector computed tomography in imaging of wrist injuries.

    Science.gov (United States)

    Syed, Mohd Arif; Raj, Vimal; Jeyapalan, Kanagaratnam

    2013-01-01

    Imaging of the wrist is challenging to both radiologists and orthopedic surgeons. This is primarily because of the complex anatomy and functionality of the wrist, and the high frequency of injuries sustained to the hand. Ongoing developments in multidetector computed tomography (MDCT) technology with its "state of the art" postprocessing capabilities have revolutionized this field. Apart from routine imaging of wrist trauma, it is now possible to assess intrinsic ligaments with MDCT arthrography, thereby avoiding invasive diagnostic arthroscopies. Postoperative wrist imaging can be a diagnostic challenge, and MDCT can be helpful in the assessment of these cases because volume acquisition and excellent postprocessing abilities help to evaluate these wrists in any desired plane and with thinner slices. This article pictorially reviews the current clinical role of MDCT imaging of the wrist in our practice. It also describes the arthrography technique and scanning parameters used at our center. Copyright © 2013 Mosby, Inc. All rights reserved.

  6. Current status of dental caries diagnosis using cone beam computed tomography

    International Nuclear Information System (INIS)

    Park, Young Seok; Ahn, Jin Soo; Kwon, Ho Beom; Lee, Seung Pyo

    2011-01-01

    The purpose of this article is to review the current status of dental caries diagnosis using cone beam computed tomography (CBCT). An online PubMed search was performed to identify studies on caries research using CBCT. Despite its usefulness, there are inherent limitations in the detection of caries lesions on conventional radiographs, mainly due to the two-dimensional (2D) representation of caries lesions. Several efforts were made to investigate three-dimensional (3D) imaging of lesions, but these gained little popularity. Recently, CBCT was introduced and has been used for the diagnosis of caries in several reports. Some of these reports maintained the superiority of CBCT systems; however, this remains controversial. The CBCT systems are promising, but they should not yet be considered a primary choice for caries diagnosis in everyday practice. Further studies under more standardized conditions should be performed in the near future.

  7. Current status of dental caries diagnosis using cone beam computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Park, Young Seok; Ahn, Jin Soo; Kwon, Ho Beom; Lee, Seung Pyo [School of Dentistry, Seoul National University, Seoul (Korea, Republic of)

    2011-06-15

    The purpose of this article is to review the current status of dental caries diagnosis using cone beam computed tomography (CBCT). An online PubMed search was performed to identify studies on caries research using CBCT. Despite its usefulness, there are inherent limitations in the detection of caries lesions on conventional radiographs, mainly due to the two-dimensional (2D) representation of caries lesions. Several efforts were made to investigate three-dimensional (3D) imaging of lesions, but these gained little popularity. Recently, CBCT was introduced and has been used for the diagnosis of caries in several reports. Some of these reports maintained the superiority of CBCT systems; however, this remains controversial. The CBCT systems are promising, but they should not yet be considered a primary choice for caries diagnosis in everyday practice. Further studies under more standardized conditions should be performed in the near future.

  8. Current Status of Human Resource Training Program for Fostering RIBiomics Professionals

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Dong-Eun; Jang, Beom-Su; Choi, Dae Seong [Advanced Radiation Technology Institute, Korea Atomic Energy Research Institute, Jeongeup 580-185 (Korea, Republic of); Park, Tai-jin [Radiation Research Division, Korea Radioisotope Association, Seoul 132-822, Republic of Korea (Korea, Republic of); Park, Sang Hyun [Advanced Radiation Technology Institute, Korea Atomic Energy Research Institute, Jeongeup 580-185 (Korea, Republic of); Radiation Research Division, Korea Radioisotope Association, Seoul 132-822, Republic of Korea (Korea, Republic of); Department of Radiobiotechnology and Applied Radioisotope Science, Korea University of Science and Technology, Deajeon 305-350 (Korea, Republic of)

    2015-07-01

    RI-Biomics is a state-of-the-art radiation fusion technology for evaluating in-vivo dynamics such as absorption, distribution, metabolism and excretion (ADME) of new drug candidates and biomaterials using radioisotopes (RI), and for the quantitative evaluation of their efficacy via molecular imaging techniques and animal models. The RI-Biomics center is the sole comprehensive research and experiment complex in Korea that can simultaneously perform the radio-synthesis of drug candidates with radioisotopes, their analysis, and molecular imaging evaluation with animal models. Molecular imaging techniques, including nuclear imaging (SPECT and PET), near-infrared fluorescent (NIRF) imaging, and magnetic resonance imaging (MRI), are the cutting-edge technologies for evaluating drug candidates. Since they allow in vivo real-time imaging of the diseased site, monitoring of the biodistribution of a drug, and determination of the optimal therapeutic efficacy following treatment, we have integrated RI-ADME and molecular imaging to provide useful information for drug evaluation and to accelerate the development of new drugs and biomaterials. The RI-Biomics center was established with a total investment of $18 million over four years, from 2009 to 2012, in order to develop a comprehensive analysis system using RI for new drug development as an axis for national growth in the next generation. The RI-Biomics center has a labeling synthesis facility for the radiosynthesis of drug candidates with radioisotopes such as Tc-99m, I-125, I-131, F-18, H-3 and C-14 using hot cells. It also includes RI general analysis facilities, such as radio-HPLC, LC/MS, GC/MS and gamma counters capable of analyzing the radio-synthesized materials, and animal image analysis facilities equipped with small-animal imaging systems such as SPECT/PET/CT, 7 T MRI, an in-vivo optical imaging system and others. In order to achieve a system to verify the safety and effectiveness of new drugs using RI, it is necessary to establish a human resource

  9. Current Status of Human Resource Training Program for Fostering RIBiomics Professionals

    International Nuclear Information System (INIS)

    Lee, Dong-Eun; Jang, Beom-Su; Choi, Dae Seong; Park, Tai-jin; Park, Sang Hyun

    2015-01-01

    RI-Biomics is a state-of-the-art radiation fusion technology for evaluating in-vivo dynamics such as absorption, distribution, metabolism and excretion (ADME) of new drug candidates and biomaterials using radioisotopes (RI), and for the quantitative evaluation of their efficacy via molecular imaging techniques and animal models. The RI-Biomics center is the sole comprehensive research and experiment complex in Korea that can simultaneously perform the radio-synthesis of drug candidates with radioisotopes, their analysis, and molecular imaging evaluation with animal models. Molecular imaging techniques, including nuclear imaging (SPECT and PET), near-infrared fluorescent (NIRF) imaging, and magnetic resonance imaging (MRI), are the cutting-edge technologies for evaluating drug candidates. Since they allow in vivo real-time imaging of the diseased site, monitoring of the biodistribution of a drug, and determination of the optimal therapeutic efficacy following treatment, we have integrated RI-ADME and molecular imaging to provide useful information for drug evaluation and to accelerate the development of new drugs and biomaterials. The RI-Biomics center was established with a total investment of $18 million over four years, from 2009 to 2012, in order to develop a comprehensive analysis system using RI for new drug development as an axis for national growth in the next generation. The RI-Biomics center has a labeling synthesis facility for the radiosynthesis of drug candidates with radioisotopes such as Tc-99m, I-125, I-131, F-18, H-3 and C-14 using hot cells. It also includes RI general analysis facilities, such as radio-HPLC, LC/MS, GC/MS and gamma counters capable of analyzing the radio-synthesized materials, and animal image analysis facilities equipped with small-animal imaging systems such as SPECT/PET/CT, 7 T MRI, an in-vivo optical imaging system and others. In order to achieve a system to verify the safety and effectiveness of new drugs using RI, it is necessary to establish a human resource

  10. Computer-aided diagnosis for screening of breast cancer on mammograms. Current status and future potential

    International Nuclear Information System (INIS)

    Doi, Kunio

    2007-01-01

    Described are the history, current status and future potential of computer-aided diagnosis (CAD), with particular emphasis on screening mammography for breast cancer. Systematic basic and clinical studies on CAD began around 20 years ago, and the significance of CAD is now well recognized, because human errors inevitably occur when doctors visually check such large numbers of screening images. Improvement of diagnostic accuracy by CAD has been demonstrated by statistical analysis of ROC (receiver operating characteristic) curves. In mammography, detection of early-stage breast cancer, such as microcalcifications, is reviewed for reading by computer alone, for CAD combined with reading by one or more doctors, and for practical clinical CAD diagnosis. For the differential diagnosis of malignancy, microcalcifications and masses are characterized by their image properties; the Az value (area under the ROC curve) was higher for CAD than for doctors (0.80 vs 0.61) for microcalcifications, and doctors' performance (0.93) was improved by CAD to 0.96 for masses. In this diagnosis, similar images in a digital database are useful, and the database can learn through the repeated input of individual cases via a neural network. Detection of lesions, and especially their differential diagnosis, will become more important as databases develop, and CAD will also be useful for doctors' careers as an educational means. (R.T.)

  11. Current-voltage curves for molecular junctions computed using all-electron basis sets

    International Nuclear Information System (INIS)

    Bauschlicher, Charles W.; Lawson, John W.

    2006-01-01

    We present current-voltage (I-V) curves computed using all-electron basis sets on the conducting molecule. The all-electron results are very similar to previous results obtained using effective core potentials (ECP). A hybrid integration scheme is used that keeps the all-electron calculations cost competitive with respect to the ECP calculations. By neglecting the coupling of states to the contacts below a fixed energy cutoff, the density matrix for the core electrons can be evaluated analytically. The full density matrix is formed by adding this core contribution to the valence part that is evaluated numerically. Expanding the definition of the core in the all-electron calculations significantly reduces the computational effort and, up to biases of about 2 V, the results are very similar to those obtained using more rigorous approaches. The convergence of the I-V curves and transmission coefficients with respect to basis set is discussed. The addition of diffuse functions is critical in approaching basis set completeness
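    The paper's hybrid core/valence density-matrix scheme is not reproduced here, but I-V curves in such transport calculations are conventionally obtained from the Landauer expression I(V) = (2e/h) ∫ T(E,V) [f(E-μL) - f(E-μR)] dE. A minimal sketch with a toy Lorentzian transmission function (not the molecule studied in the paper) and an assumed symmetric bias drop:

```python
import numpy as np

E_CHARGE = 1.602176634e-19   # C
H_PLANCK = 6.62607015e-34    # J s
KB_T = 0.025                 # eV, roughly room temperature

def fermi(e, mu):
    return 1.0 / (1.0 + np.exp((e - mu) / KB_T))

def landauer_current(transmission, bias_v, e_grid):
    """I(V) = (2e/h) * integral of T(E) [f(E-muL) - f(E-muR)] dE.
    Energies in eV; a symmetric bias drop is an assumption."""
    mu_l, mu_r = +bias_v / 2, -bias_v / 2
    integrand = transmission(e_grid) * (fermi(e_grid, mu_l) - fermi(e_grid, mu_r))
    integral_ev = np.trapz(integrand, e_grid)                   # in eV
    return (2 * E_CHARGE / H_PLANCK) * integral_ev * E_CHARGE   # in A

# Toy Lorentzian transmission peak; illustrative only
T = lambda e: 0.5 / (1 + ((e - 0.8) / 0.2) ** 2)
e_grid = np.linspace(-3, 3, 4001)
for v in (0.5, 1.0, 2.0):
    print(v, landauer_current(T, v, e_grid))
```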

  12. Current practice in software development for computational neuroscience and how to improve it.

    Science.gov (United States)

    Gewaltig, Marc-Oliver; Cannon, Robert

    2014-01-01

    Almost all research work in computational neuroscience involves software. As researchers try to understand ever more complex systems, there is a continual need for software with new capabilities. Because of the wide range of questions being investigated, new software is often developed rapidly by individuals or small groups. In these cases, it can be hard to demonstrate that the software gives the right results. Software developers are often open about the code they produce and willing to share it, but there is little appreciation among potential users of the great diversity of software development practices and end results, and how this affects the suitability of software tools for use in research projects. To help clarify these issues, we have reviewed a range of software tools and asked how the culture and practice of software development affects their validity and trustworthiness. We identified four key questions that can be used to categorize software projects and correlate them with the type of product that results. The first question addresses what is being produced. The other three concern why, how, and by whom the work is done. The answers to these questions show strong correlations with the nature of the software being produced, and its suitability for particular purposes. Based on our findings, we suggest ways in which current software development practice in computational neuroscience can be improved and propose checklists to help developers, reviewers, and scientists to assess the quality of software and whether particular pieces of software are ready for use in research.

  13. Current practice in software development for computational neuroscience and how to improve it.

    Directory of Open Access Journals (Sweden)

    Marc-Oliver Gewaltig

    2014-01-01

    Full Text Available Almost all research work in computational neuroscience involves software. As researchers try to understand ever more complex systems, there is a continual need for software with new capabilities. Because of the wide range of questions being investigated, new software is often developed rapidly by individuals or small groups. In these cases, it can be hard to demonstrate that the software gives the right results. Software developers are often open about the code they produce and willing to share it, but there is little appreciation among potential users of the great diversity of software development practices and end results, and how this affects the suitability of software tools for use in research projects. To help clarify these issues, we have reviewed a range of software tools and asked how the culture and practice of software development affects their validity and trustworthiness. We identified four key questions that can be used to categorize software projects and correlate them with the type of product that results. The first question addresses what is being produced. The other three concern why, how, and by whom the work is done. The answers to these questions show strong correlations with the nature of the software being produced, and its suitability for particular purposes. Based on our findings, we suggest ways in which current software development practice in computational neuroscience can be improved and propose checklists to help developers, reviewers, and scientists to assess the quality of software and whether particular pieces of software are ready for use in research.

  14. Radiotherapy infrastructure and human resources in Switzerland. Present status and projected computations for 2020

    Energy Technology Data Exchange (ETDEWEB)

    Datta, Niloy Ranjan; Khan, Shaka; Marder, Dietmar [KSA-KSB, Kantonsspital Aarau, RadioOnkologieZentrum, Aarau (Switzerland); Zwahlen, Daniel [Kantonsspital Graubuenden, Department of Radiotherapy, Chur (Switzerland); Bodis, Stephan [KSA-KSB, Kantonsspital Aarau, RadioOnkologieZentrum, Aarau (Switzerland); University Hospital Zurich, Department of Radiation Oncology, Zurich (Switzerland)

    2016-09-15

    The purpose of this study was to evaluate the present status of radiotherapy infrastructure and human resources in Switzerland and compute projections for 2020. The European Society of Therapeutic Radiation Oncology "Quantification of Radiation Therapy Infrastructure and Staffing" guidelines (ESTRO-QUARTS) and those of the International Atomic Energy Agency (IAEA) were applied to estimate the requirements for teleradiotherapy (TRT) units, radiation oncologists (RO), medical physicists (MP) and radiotherapy technologists (RTT). The databases used for computation of the present gap and additional requirements are (a) Global Cancer Incidence, Mortality and Prevalence (GLOBOCAN) for cancer incidence, (b) the Directory of Radiotherapy Centres (DIRAC) of the IAEA for existing TRT units, (c) human resources from the recent ESTRO "Health Economics in Radiation Oncology" (HERO) survey and (d) radiotherapy utilization (RTU) rates for each tumour site, published by the Ingham Institute for Applied Medical Research (IIAMR). In 2015, 30,999 of 45,903 cancer patients would have required radiotherapy. By 2020, this will have increased to 34,041 of 50,427 cancer patients. Switzerland presently has an adequate number of TRT units, but a deficit of 57 ROs, 14 MPs and 36 RTTs. By 2020, an additional 7 TRT units, 72 ROs, 22 MPs and 66 RTTs will be required. In addition, a realistic dynamic model for the calculation of staff requirements due to anticipated changes in future radiotherapy practices has been proposed. This model could be tailor-made and individualized for any radiotherapy centre. A 9.8% increase in radiotherapy requirements is expected for cancer patients over the next 5 years. The present study should assist the stakeholders and health planners in designing an appropriate strategy for meeting future radiotherapy needs for Switzerland. (orig.) [German] The aim of this study was to evaluate the current state of the infrastructure and staffing of
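    The arithmetic behind such projections is simple: the number of patients needing radiotherapy is the cancer incidence multiplied by the aggregate RTU rate, and the machine requirement follows from an assumed annual throughput per TRT unit. A minimal sketch; the throughput of 450 treatment courses per unit per year is an assumed QUARTS-style planning value and the existing-unit count is hypothetical, neither figure being quoted from the paper.

```python
def radiotherapy_needs(cancer_incidence, rtu_rate, existing_units,
                       courses_per_unit_per_year=450):
    """Patients needing radiotherapy and the TRT-unit gap.
    450 courses per unit per year is an assumed planning value
    in the spirit of ESTRO-QUARTS, not a figure from this paper."""
    patients_rt = cancer_incidence * rtu_rate
    units_needed = patients_rt / courses_per_unit_per_year
    gap = max(0.0, units_needed - existing_units)
    return patients_rt, units_needed, gap

# 2020 figures from the abstract: 34,041 of 50,427 patients need RT;
# the existing-unit count below is hypothetical
incidence = 50_427
rtu = 34_041 / 50_427
print(radiotherapy_needs(incidence, rtu, existing_units=70))
```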

  15. Historical perspective of traditional indigenous medical practices: the current renaissance and conservation of herbal resources.

    Science.gov (United States)

    Pan, Si-Yuan; Litscher, Gerhard; Gao, Si-Hua; Zhou, Shu-Feng; Yu, Zhi-Ling; Chen, Hou-Qi; Zhang, Shuo-Feng; Tang, Min-Ke; Sun, Jian-Ning; Ko, Kam-Ming

    2014-01-01

    In recent years, increasing numbers of people have been choosing herbal medicines or products to improve their health conditions, either alone or in combination with others. Herbs are staging a comeback and herbal "renaissance" occurs all over the world. According to the World Health Organization, 75% of the world's populations are using herbs for basic healthcare needs. Since the dawn of mankind, in fact, the use of herbs/plants has offered an effective medicine for the treatment of illnesses. Moreover, many conventional/pharmaceutical drugs are derived directly from both nature and traditional remedies distributed around the world. Up to now, the practice of herbal medicine entails the use of more than 53,000 species, and a number of these are facing the threat of extinction due to overexploitation. This paper aims to provide a review of the history and status quo of Chinese, Indian, and Arabic herbal medicines in terms of their significant contribution to the health promotion in present-day over-populated and aging societies. Attention will be focused on the depletion of plant resources on earth in meeting the increasing demand for herbs.

  16. Historical Perspective of Traditional Indigenous Medical Practices: The Current Renaissance and Conservation of Herbal Resources

    Directory of Open Access Journals (Sweden)

    Si-Yuan Pan

    2014-01-01

    Full Text Available In recent years, increasing numbers of people have been choosing herbal medicines or products to improve their health conditions, either alone or in combination with others. Herbs are staging a comeback and herbal “renaissance” occurs all over the world. According to the World Health Organization, 75% of the world’s populations are using herbs for basic healthcare needs. Since the dawn of mankind, in fact, the use of herbs/plants has offered an effective medicine for the treatment of illnesses. Moreover, many conventional/pharmaceutical drugs are derived directly from both nature and traditional remedies distributed around the world. Up to now, the practice of herbal medicine entails the use of more than 53,000 species, and a number of these are facing the threat of extinction due to overexploitation. This paper aims to provide a review of the history and status quo of Chinese, Indian, and Arabic herbal medicines in terms of their significant contribution to the health promotion in present-day over-populated and aging societies. Attention will be focused on the depletion of plant resources on earth in meeting the increasing demand for herbs.

  17. Historical Perspective of Traditional Indigenous Medical Practices: The Current Renaissance and Conservation of Herbal Resources

    Science.gov (United States)

    Pan, Si-Yuan; Gao, Si-Hua; Zhou, Shu-Feng; Yu, Zhi-Ling; Chen, Hou-Qi; Zhang, Shuo-Feng; Tang, Min-Ke; Sun, Jian-Ning; Ko, Kam-Ming

    2014-01-01

    In recent years, increasing numbers of people have been choosing herbal medicines or products to improve their health conditions, either alone or in combination with others. Herbs are staging a comeback and herbal “renaissance” occurs all over the world. According to the World Health Organization, 75% of the world's populations are using herbs for basic healthcare needs. Since the dawn of mankind, in fact, the use of herbs/plants has offered an effective medicine for the treatment of illnesses. Moreover, many conventional/pharmaceutical drugs are derived directly from both nature and traditional remedies distributed around the world. Up to now, the practice of herbal medicine entails the use of more than 53,000 species, and a number of these are facing the threat of extinction due to overexploitation. This paper aims to provide a review of the history and status quo of Chinese, Indian, and Arabic herbal medicines in terms of their significant contribution to the health promotion in present-day over-populated and aging societies. Attention will be focused on the depletion of plant resources on earth in meeting the increasing demand for herbs. PMID:24872833

  18. Computation of groundwater resources and recharge in Chithar River Basin, South India.

    Science.gov (United States)

    Subramani, T; Babu, Savithri; Elango, L

    2013-01-01

    Groundwater recharge and available groundwater resources in Chithar River basin, Tamil Nadu, India, spread over an area of 1,722 km², have been estimated by considering various hydrological, geological, and hydrogeological parameters, such as rainfall infiltration, drainage, geomorphic units, land use, rock types, depth of weathered and fractured zones, nature of soil, water level fluctuation, saturated thickness of aquifer, and groundwater abstraction. The digital ground elevation models indicate that the regional slope of the basin is towards the east. The Proterozoic (Post-Archaean) basement of the study area consists of quartzite, calc-granulite, crystalline limestone, charnockite, and biotite gneiss with or without garnet. Three major soil types were identified, namely black cotton, deep red, and red sandy soils. The rainfall intensity gradually decreases from west to east. Groundwater occurs under water table conditions in the weathered zone and fluctuates between 0 and 25 m. The water table reaches its maximum during January, after the northeast monsoon, and its minimum during October. Groundwater abstraction for domestic/stock and irrigational needs in Chithar River basin has been estimated as 148.84 MCM (million m³). Groundwater recharge due to monsoon rainfall infiltration has been estimated as 170.05 MCM based on the water level rise during the monsoon period. It is also estimated as 173.9 MCM using a rainfall infiltration factor. An amount of 53.8 MCM of water is contributed to groundwater from surface water bodies. Recharge of groundwater due to return flow from irrigation has been computed as 147.6 MCM. The static groundwater reserve in Chithar River basin is estimated as 466.66 MCM and the dynamic reserve as about 187.7 MCM. In the present scenario, the aquifer is under safe conditions for the extraction of groundwater for domestic and irrigation purposes. If the existing water bodies are maintained properly, the extraction rate can be increased in the future by about 10% to 15%.
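    The two recharge estimates quoted above correspond to two standard formulas: the water-table fluctuation method, R = Sy · Δh · A, and the rainfall infiltration factor method, R = f · P · A. A minimal sketch; the specific yield, water-level rise, infiltration factor, and rainfall values below are illustrative assumptions chosen only to land near the reported magnitudes, not the parameters calibrated in the study.

```python
def recharge_wtf(specific_yield, water_level_rise_m, area_m2):
    """Water-table fluctuation method: R = Sy * dh * A (m^3)."""
    return specific_yield * water_level_rise_m * area_m2

def recharge_rif(infiltration_factor, rainfall_m, area_m2):
    """Rainfall infiltration factor method: R = f * P * A (m^3)."""
    return infiltration_factor * rainfall_m * area_m2

AREA = 1_722e6   # basin area from the abstract, in m^2
MCM = 1e6        # million cubic metres

# Illustrative parameter values, not those calibrated in the study
print(recharge_wtf(0.03, 3.3, AREA) / MCM)   # ~170 MCM for Sy=0.03, dh=3.3 m
print(recharge_rif(0.10, 1.0, AREA) / MCM)   # ~172 MCM for f=0.10, P=1.0 m
```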

  19. The harmonic distortion evolution of current in computers; A evolucao da distorcao harmonica de corrente em computadores

    Energy Technology Data Exchange (ETDEWEB)

    Bollen, Math; Larsson, Anders; Lundmark, Martin [Universidade de Tecnologia de Lulea (LTU) (Sweden); Wahlberg, Mats; Roennberg, Sarah [Skelleftea Kraft (Sweden)

    2010-05-15

    This project measured the power supply of large groups of computers during gaming events between 2002 and 2008, including the magnitude of the current in each phase and in the neutral conductor, the energy consumption, and the harmonic spectrum. The results presented show that the harmonic distortion of the current has decreased significantly over this period, while the energy consumption per computer shows no significant increase.
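    Harmonic distortion of the supply current is conventionally summarized as total harmonic distortion, THD_I = sqrt(Σ I_h², h ≥ 2) / I_1, computed from the measured harmonic spectrum. A minimal sketch with invented spectra contrasting an older rectifier-input power supply with one using active power-factor correction, consistent with the decline the measurements show:

```python
import math

def thd_current(harmonics):
    """THD_I = sqrt(sum of I_h^2 for h >= 2) / I_1,
    where harmonics[0] is the fundamental RMS current I_1."""
    i1, rest = harmonics[0], harmonics[1:]
    return math.sqrt(sum(i ** 2 for i in rest)) / i1

# Invented spectra (A, RMS), for illustration only
old_psu = [1.00, 0.80, 0.55, 0.35, 0.20]   # rectifier input: strong harmonics
pfc_psu = [1.00, 0.08, 0.05, 0.03, 0.02]   # active power-factor correction
print(f"old: {thd_current(old_psu):.0%}, pfc: {thd_current(pfc_psu):.0%}")
```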

  20. The ISMAR high frequency coastal radar network: Monitoring surface currents for management of marine resources

    DEFF Research Database (Denmark)

    Carlson, Daniel Frazier

    2015-01-01

    The Institute of Marine Sciences (ISMAR) of the National Research Council of Italy (CNR) established a High Frequency (HF) Coastal Radar Network for the measurement of the velocity of surface currents in coastal seas. The network consists of four HF radar systems located on the coast of the Gargano Promontory (Southern Adriatic, Italy). The network has been operational since May 2013 and covers an area of approximately 1700 square kilometers in the Gulf of Manfredonia. Quality Assessment (QA) procedures are applied for system deployment and maintenance, and Quality Control (QC) procedures

  1. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  2. Groundwater resources of the Devils Postpile National Monument—Current conditions and future vulnerabilities

    Science.gov (United States)

    Evans, William C.; Bergfeld, Deborah

    2017-06-15

    This study presents an extensive database on groundwater conditions in and around Devils Postpile National Monument. The database contains chemical analyses of springs and the monument water-supply well, including major-ion chemistry, trace element chemistry, and the first information on a list of organic compounds known as emerging contaminants. Diurnal, seasonal, and annual variations in groundwater discharge and chemistry are evaluated from data collected at five main monitoring sites, where streams carry the aggregate flow from entire groups of springs. These springs drain the Mammoth Mountain area and, during the fall months, contribute a significant fraction of the San Joaquin River flow within the monument. The period of this study, from fall 2012 to fall 2015, includes some of the driest years on record, though the seasonal variability observed in 2013 might have been near normal. The spring-fed streams generally flowed at rates well below those observed during a sequence of wet years in the late 1990s. However, persistence of flow and reasonably stable water chemistry through the recent dry years are indicative of a sizeable groundwater system that should provide a reliable resource during similar droughts in the future. Only a few emerging contaminants were detected at trace levels below 1 microgram per liter (μg/L), suggesting that local human visitation is not degrading groundwater quality. No indication of salt from the ski area on the north side of Mammoth Mountain could be found in any of the groundwaters. Chemical data instead show that natural mineral water, such as that discharged from local soda springs, is the main source of anomalous chloride in the monument supply well and in the San Joaquin River. The results of the study are used to develop a set of recommendations for future monitoring to enable detection of deleterious impacts to groundwater quality and quantity

  3. Current state of standardization in the field of dimensional computed tomography

    International Nuclear Information System (INIS)

    Bartscher, Markus; Härtig, Frank; Neuschaefer-Rube, Ulrich; Sato, Osamu

    2014-01-01

    Industrial x-ray computed tomography (CT) is a well-established non-destructive testing (NDT) technology and has been in use for decades. Moreover, CT has also started to become an important technology for dimensional metrology. But the requirements on dimensional CT, i.e., on performing coordinate measurements with CT, are different from those of NDT. For dimensional measurements, the position of interfaces or surfaces is of importance, while this is often less critical in NDT. Standardization plays an important role here, as it can create trust in new measurement technologies such as dimensional CT. At the international level, ISO TC 213 WG 10 is working on specifications for dimensional CT. This paper highlights the demands on international standards in the field of dimensional CT and describes the current developments from the viewpoint of representatives of national and international standardization committees. Key aspects of the discussion are the material influence on the length measurement error E and how E can best be measured. A respective study was performed on hole plates as new reference standards for the error testing of length measurements incorporating the material influence. We performed the corresponding measurement data analysis and present a further elaborated hole plate design. The authors also comment on different approaches currently pursued and give an outlook on upcoming developments as far as they can be foreseen. (paper)

  4. Noninvasive imaging of coronary arteries: current and future role of multidetector row computer tomography

    International Nuclear Information System (INIS)

    Nedevska, M.; Stoinova, V.

    2006-01-01

    Full text: This review will present the current and future role of cardiac computer tomography (CCT), in particular multidetector CCT, for imaging of atherosclerotic pathologic changes of the coronary arteries. Atherosclerosis and its cardiovascular complications represent one of the major issues of public health in industrialized countries. Different imaging modalities, including invasive coronarography, have been aimed at diagnosing the disease once it provokes a symptomatic decrease in blood flow. In spite of the development of surgical and percutaneous methods for coronary revascularization, coronary artery disease remains the major cause of death in North America and Europe. This demonstrates the need for novel, complementary diagnostic strategies aimed at identifying asymptomatic stages as the basis for pharmacological interventions. Noninvasive coronary angiography with multidetector CT allows assessment of both luminal stenosis and subclinical disease of the arterial wall. Large trials are still needed to establish the role of this technology in the comprehensive assessment of patients suspected of having CAD. Based on experience and current potential, we will describe how tomographic coronary imaging may eventually supplement traditional angiographic techniques in understanding the patterns of atherosclerotic CAD development

  5. BioSPICE: access to the most current computational tools for biologists.

    Science.gov (United States)

    Garvey, Thomas D; Lincoln, Patrick; Pedersen, Charles John; Martin, David; Johnson, Mark

    2003-01-01

    The goal of the BioSPICE program is to create a framework that provides biologists access to the most current computational tools. At the program midpoint, the BioSPICE member community has produced a software system that comprises contributions from approximately 20 participating laboratories integrated under the BioSPICE Dashboard and a methodology for continued software integration. These contributed software modules are the BioSPICE Dashboard, a graphical environment that combines Open Agent Architecture and NetBeans software technologies in a coherent, biologist-friendly user interface. The current Dashboard permits data sources, models, simulation engines, and output displays provided by different investigators and running on different machines to work together across a distributed, heterogeneous network. Among several other features, the Dashboard enables users to create graphical workflows by configuring and connecting available BioSPICE components. Anticipated future enhancements to BioSPICE include a notebook capability that will permit researchers to browse and compile data to support model building, a biological model repository, and tools to support the development, control, and data reduction of wet-lab experiments. In addition to the BioSPICE software products, a project website supports information exchange and community building.

  6. Nanoinformatics workshop report: Current resources, community needs, and the proposal of a collaborative framework for data sharing and information integration.

    Science.gov (United States)

    Harper, Stacey L; Hutchison, James E; Baker, Nathan; Ostraat, Michele; Tinkle, Sally; Steevens, Jeffrey; Hoover, Mark D; Adamick, Jessica; Rajan, Krishna; Gaheen, Sharon; Cohen, Yoram; Nel, Andre; Cachau, Raul E; Tuominen, Mark

    2013-01-01

    The quantity of information on nanomaterial properties and behavior continues to grow rapidly. Without a concerted effort to collect, organize and mine disparate information coming out of current research efforts, the value and effective use of this information will be limited at best. Data will not be translated to knowledge. At worst, erroneous conclusions will be drawn and future research may be misdirected. Nanoinformatics can be a powerful approach to enhance the value of global information in nanoscience and nanotechnology. Much progress has been made through grassroots efforts in nanoinformatics resulting in a multitude of resources and tools for nanoscience researchers. In 2012, the nanoinformatics community believed it was important to critically evaluate and refine currently available nanoinformatics approaches in order to best inform the science and support the future of predictive nanotechnology. The Greener Nano 2012: Nanoinformatics Tools and Resources Workshop brought together informatics groups with materials scientists active in nanoscience research to evaluate and reflect on the tools and resources that have recently emerged in support of predictive nanotechnology. The workshop goals were to establish a better understanding of current nanoinformatics approaches and to clearly define immediate and projected informatics infrastructure needs of the nanotechnology community. The theme of nanotechnology environmental health and safety (nanoEHS) was used to provide real-world, concrete examples on how informatics can be utilized to advance our knowledge and guide nanoscience. The benefit here is that the same properties that impact the performance of products could also be the properties that inform EHS. From a decision management standpoint, the dual use of such data should be considered a priority. Key outcomes include a proposed collaborative framework for data collection, data sharing and information integration.

  7. Nanoinformatics workshop report: current resources, community needs and the proposal of a collaborative framework for data sharing and information integration

    International Nuclear Information System (INIS)

    Harper, Stacey L; Hutchison, James E; Baker, Nathan; Ostraat, Michele; Tinkle, Sally; Steevens, Jeffrey; Hoover, Mark D; Adamick, Jessica; Rajan, Krishna; Gaheen, Sharon; Cohen, Yoram; Nel, Andre; Cachau, Raul E; Tuominen, Mark

    2013-01-01

    The quantity of information on nanomaterial properties and behavior continues to grow rapidly. Without a concerted effort to collect, organize and mine disparate information coming out of current research efforts, the value and effective use of this information will be limited at best. Data will not be translated to knowledge. At worst, erroneous conclusions will be drawn and future research may be misdirected. Nanoinformatics can be a powerful approach to enhance the value of global information in nanoscience and nanotechnology. Much progress has been made through grassroots efforts in nanoinformatics resulting in a multitude of resources and tools for nanoscience researchers. In 2012, the nanoinformatics community believed it was important to critically evaluate and refine currently available nanoinformatics approaches in order to best inform the science and support the future of predictive nanotechnology. The Greener Nano 2012: Nanoinformatics Tools and Resources Workshop brought together informatics groups with materials scientists active in nanoscience research to evaluate and reflect on the tools and resources that have recently emerged in support of predictive nanotechnology. The workshop goals were to establish a better understanding of current nanoinformatics approaches and to clearly define immediate and projected informatics infrastructure needs of the nanotechnology community. The theme of nanotechnology environmental health and safety (nanoEHS) was used to provide real-world, concrete examples on how informatics can be utilized to advance our knowledge and guide nanoscience. The benefit here is that the same properties that impact the performance of products could also be the properties that inform EHS. From a decision management standpoint, the dual use of such data should be considered a priority. Key outcomes include a proposed collaborative framework for data collection, data sharing and information integration. (paper)

  8. Dynamic allocation of computing resources for business-oriented objects

    Institute of Scientific and Technical Information of China (English)

    尚海鹰

    2017-01-01

This paper summarizes development trends in computer system infrastructure. For the business scenarios of transaction processing systems in the current "Internet plus" era, we analyze the basic methods of computing resource allocation and load balancing. To meet the differentiated service-level requirements of business objects and make full use of the overall processing capacity of the transaction processing system, we propose a method for dynamically allocating computing resources to business objects. Based on reference values for the processing performance of the actual application platform, the method determines a computing resource allocation plan and a dynamic adjustment strategy for each business object. Testing with large data volumes from the actual clearing business of a city transit card achieved the expected results.
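
    A rough illustration of the allocation scheme described above: compute units are assigned to each business object in proportion to demand, expressed against a measured performance baseline, and rescaled when the platform is oversubscribed. All names and numbers below are invented for illustration; this is a sketch of the idea, not the paper's actual plan or adjustment strategy.

        # Sketch: allocate compute units to business objects from measured
        # baselines; names and numbers are illustrative only.
        BASELINE_TPS = {"clearing": 1200.0, "settlement": 800.0, "reporting": 400.0}
        TOTAL_UNITS = 32  # compute units available to the whole platform

        def rebalance(demand_tps):
            """Assign units in proportion to each object's demand, expressed
            in multiples of its measured baseline throughput."""
            need = {k: demand_tps[k] / BASELINE_TPS[k] for k in demand_tps}
            scale = min(1.0, TOTAL_UNITS / sum(need.values()))  # shrink if oversubscribed
            return {k: max(1, round(v * scale)) for k, v in need.items()}

        # Peak clearing load, light reporting load:
        print(rebalance({"clearing": 24000, "settlement": 6000, "reporting": 500}))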

  9. Review of current results in computational studies of hydrocarbon phase and transport properties in nanoporous structures

    Science.gov (United States)

    Stroev, N.; Myasnikov, A.

    2017-12-01

This article provides a general overview of the main simulation results on the behavior of gases and liquids under confinement, namely hydrocarbons in shale formations, and of the current understanding of such phenomena. In addition to the key effects reported by different research groups, which have to be taken into account when building reservoir simulation software, the main methods are briefly covered. A comprehensive understanding of both fluid phase equilibrium and transport properties in nanoscale structures is of great importance for many scientific and technical disciplines, especially for petroleum engineering, given the behavior of hydrocarbons in complex shale formations, whose development increases with time. Recent estimates show that a significant amount of resources is trapped inside organic matter and clays, which have extremely low permeability and yet great economic potential. The issue is not only of practical importance, since existing conventional approaches are by definition unable to capture the complicated physics needed for reliable results, but also of fundamental value. Research into the processes at work in such deposits is needed both for evaluating petroleum reservoir deposits and for building hydrodynamic simulators. The review is therefore divided into two major parts: equilibrium states of hydrocarbons, and their transport properties under strong confinement.

  10. Computation of classical triton burnup with high plasma temperature and current

    International Nuclear Information System (INIS)

    Batistoni, P.

    1990-09-01

For comparison with experiment, the expected production of 14-MeV neutrons from the burnup of tritons produced in the d(d,t)p reaction must be computed. An effort was undertaken to compare in detail the computer codes used for this purpose at TFTR and JET. The calculation of the confined fraction of tritons by the different codes agrees to within a few percent. The high electron temperature in the experiments has raised the critical energy of the tritons that are slowing down to near or above the peak of the D-T reactivity, making the ion drag terms more important. When the different codes use the same slowing down formulas, the calculated burnup was within 6% for a case where orbit effects are expected to be small. Then results from codes with and without the effects of finite radial orbit excursions were compared for two test cases. For medium to high current discharges the finite radius effects are only of order 10%. A new version of the TFTR burnup code using an implicit Fokker-Planck solution was written to include the effects of energy diffusion and charge exchange. These effects change the time-integrated yields by only a few percent, but can significantly affect the instantaneous rates in time. Significant populations of hot ions can affect the fusion reactivity, and this effect was also studied. In particular, the d(d,p)t rate can be 10%-15% less than the d(d,³He)n rate, which is usually used as a direct monitor of the triton source. Finally, a finite particle confinement time for the thermalized tritons can increase the apparent "burn-up" either if there is a high thermal deuteron temperature or if there exists a significant beam deuteron density.
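
    The critical energy mentioned above, at which ion drag on the slowing-down tritons becomes comparable to electron drag, can be estimated with the standard Stix-type expression. The sketch below applies it to tritons in a pure deuterium plasma; the formula and its coefficient are textbook plasma physics, not values quoted from this report.

        # Sketch: Stix-type critical energy for tritons (mass number 3),
        #   E_c = 14.8 * A_b * T_e * [(1/n_e) * sum_i n_i Z_i^2 / A_i]^(2/3);
        # textbook plasma physics, not a formula taken from this report.
        def critical_energy_kev(t_e_kev, ions, n_e):
            """ions: iterable of (n_i, Z_i, A_i); returns E_c in keV."""
            a_b = 3.0  # triton mass number
            s = sum(n * z ** 2 / a for n, z, a in ions) / n_e
            return 14.8 * a_b * t_e_kev * s ** (2.0 / 3.0)

        # Pure deuterium plasma (n_D = n_e) at T_e = 10 keV: E_c ~ 280 keV,
        # illustrating how high T_e pushes E_c toward the D-T reactivity peak.
        n_e = 5e19
        print(critical_energy_kev(10.0, [(n_e, 1, 2)], n_e))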

  11. Heats of formation of phosphorus compounds determined by current methods of computational quantum chemistry

    Science.gov (United States)

    Haworth, Naomi L.; Bacskay, George B.

    2002-12-01

The heats of formation of a range of phosphorus-containing molecules (P₂, P₄, PH, PH₂, PH₃, P₂H₂, P₂H₄, PO, PO₂, PO₃, P₂O, P₂O₂, HPO, HPOH, H₂POH, H₃PO, HOPO, and HOPO₂) have been determined by high level quantum chemical calculations. The equilibrium geometries and vibrational frequencies were computed via density functional theory, utilizing the B3LYP/6-31G(2df,p) functional and basis set. Atomization energies were obtained by the application of ab initio coupled cluster theory with single and double excitations from (spin)-restricted Hartree-Fock reference states with perturbative correction for triples [CCSD(T)], in conjunction with cc-pVnZ basis sets (n = T, Q, 5) which include an extra d function on the phosphorus atoms and diffuse functions on the oxygens, as recommended by Bauschlicher [J. Phys. Chem. A 103, 11126 (1999)]. The valence correlated atomization energies were extrapolated to the complete basis limit and corrected for core-valence (CV) correlation and scalar relativistic effects, as well as for basis set superposition errors (BSSE) in the CV terms. This methodology is effectively the same as the one adopted by Bauschlicher in his study of PO, PO₂, PO₃, HPO, HOPO, and HOPO₂. Consequently, for these molecules the results of this work closely match Bauschlicher's computed values. The theoretical heats of formation, whose accuracy is estimated as ranging from ±1.0 to ±2.5 kcal mol⁻¹, are consistent with the available experimental data. The current set of theoretical data represent a convenient benchmark, against which the results of other computational procedures, such as G3, G3X, and G3X2, can be compared. Despite the fact that G3X2 [which is an approximation to the quadratic CI procedure QCISD(T,Full)/G3Xlarge] is a formally higher level theory than G3X, the heats of formation obtained by these two methods are found to be of comparable accuracy. Both reproduce the benchmark heats of formation on the average to within ±2 kcal mol⁻¹ and, for these
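
    The complete-basis-set extrapolation step mentioned above is commonly done with a two-point inverse-cubic form. The sketch below shows that common scheme, under the assumption that it resembles the procedure used here; the input energies are placeholders, not values from the paper.

        # Sketch: two-point inverse-cubic CBS extrapolation,
        #   E(n) = E_CBS + B / n**3,
        # a common scheme for cc-pVnZ correlation energies; the input
        # energies below are placeholders, not values from this paper.
        def cbs_two_point(e_small, n_small, e_large, n_large):
            b = (e_small - e_large) / (n_small ** -3 - n_large ** -3)
            return e_large - b * n_large ** -3

        # Hypothetical correlation energies (hartree) at Q (n=4) and 5 (n=5):
        print(cbs_two_point(-0.512345, 4, -0.518765, 5))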

  12. Planning Committee for a National Resource for Computation in Chemistry. Final report, October 1, 1974--June 30, 1977

    International Nuclear Information System (INIS)

    1978-11-01

    The Planning Committee for a National Resource for Computation in Chemistry (NRCC) was charged with the responsibility of formulating recommendations regarding organizational structure for an NRCC including the composition, size, and responsibilities of its policy board, the relationship of such a board to the operating structure of the NRCC, to federal funding agencies, and to user groups; desirable priorities, growth rates, and levels of operations for the first several years; and facilities, access and site requirements for such a Resource. By means of site visits, questionnaires, and a workshop, the Committee sought advice from a wide range of potential users and organizations interested in chemical computation. Chemical kinetics, crystallography, macromolecular science, nonnumerical methods, physical organic chemistry, quantum chemistry, and statistical mechanics are covered

  13. Planning Committee for a National Resource for Computation in Chemistry. Final report, October 1, 1974--June 30, 1977

    Energy Technology Data Exchange (ETDEWEB)

Bigeleisen, Jacob; Berne, Bruce J.; Cotton, F. Albert; Scheraga, Harold A.; Simmons, Howard E.; Snyder, Lawrence C.; Wiberg, Kenneth B.; Wipke, W. Todd

    1978-11-01

    The Planning Committee for a National Resource for Computation in Chemistry (NRCC) was charged with the responsibility of formulating recommendations regarding organizational structure for an NRCC including the composition, size, and responsibilities of its policy board, the relationship of such a board to the operating structure of the NRCC, to federal funding agencies, and to user groups; desirable priorities, growth rates, and levels of operations for the first several years; and facilities, access and site requirements for such a Resource. By means of site visits, questionnaires, and a workshop, the Committee sought advice from a wide range of potential users and organizations interested in chemical computation. Chemical kinetics, crystallography, macromolecular science, nonnumerical methods, physical organic chemistry, quantum chemistry, and statistical mechanics are covered.

  14. National Uranium Resource Evaluation Program. Hydrogeochemical and Stream Sediment Reconnaissance Basic Data Reports Computer Program Requests Manual

    International Nuclear Information System (INIS)

    1980-01-01

    This manual is intended to aid those who are unfamiliar with ordering computer output for verification and preparation of Uranium Resource Evaluation (URE) Project reconnaissance basic data reports. The manual is also intended to help standardize the procedures for preparing the reports. Each section describes a program or group of related programs. The sections are divided into three parts: Purpose, Request Forms, and Requested Information

  15. Fast computational scheme for feedback control of high current fusion tokamaks

    International Nuclear Information System (INIS)

    Dong, J.Q.; Khayrutdinov, R.; Azizov, E.; Jardin, S.

    1992-01-01

An accurate and fast numerical model of tokamak plasma evolution is presented. In this code (DINA) the equilibrium problem of plasmas with free boundaries in externally changing magnetic fields is solved simultaneously with the plasma transport equation. The circuit equations are solved for the vacuum vessel and for the passive and active coils. The code includes pellet injection, neutral beam heating, auxiliary heating, and alpha particle heating. Bootstrap and beam-driven plasma currents are accounted for. An inverse variable technique is utilized to obtain the coordinates of the equilibrium magnetic surfaces. This numerical algorithm makes it possible to determine the flux coordinates very quickly and accurately. The authors show that with the fully resistive MHD analysis the region of stability (to vertical motions) is wider than with the rigid displacement model. Comparing plasma motions with the same gain, the plasma oscillates more in the rigid analysis than in the MHD analysis. They study the influence of the pick-up coils' location and the possibility of controlling the plasma vertical position. They use a simple modification of the standard control law that enables control of the plasma with pick-up coils located at any position. This flexibility becomes critical in the design of future complex high current tokamak systems. The fully resistive MHD model yields accurate estimates of the plasma response. This approach yields computational time savings of one to two orders of magnitude with respect to other existing MHD models. In this sense, conventional numerical algorithms do not provide suitable models for the application of modern control techniques in real-time expert systems. The proposed inverse variable technique is well suited for incorporation in a comprehensive expert system for feedback control of fusion tokamaks in real time.
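
    For intuition, vertical-position feedback of the kind discussed above is often built around a proportional-derivative law. The sketch below is only illustrative: the gains and signal names are invented, and it is not the modified control law the paper describes.

        # Sketch: proportional-derivative feedback for vertical position,
        #   I = -Gp * (z - z_ref) - Gd * dz/dt;
        # gains and names are illustrative, not the paper's modified law.
        def control_current(z, z_ref, dz_dt, gain_p=5.0e4, gain_d=2.0e3):
            """Current (A) requested from the active coils; a positive
            displacement error drives a restoring current."""
            return -gain_p * (z - z_ref) - gain_d * dz_dt

        # Plasma 2 cm above the reference position, moving upward at 0.5 m/s:
        print(control_current(z=0.02, z_ref=0.0, dz_dt=0.5))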

  16. Computational intelligence in gait research: a perspective on current applications and future challenges.

    Science.gov (United States)

    Lai, Daniel T H; Begg, Rezaul K; Palaniswami, Marimuthu

    2009-09-01

Our mobility is an important daily requirement, so much so that any disruption to it severely degrades our perceived quality of life. Studies in gait and human movement sciences, therefore, play a significant role in maintaining the well-being of our mobility. Current gait analysis involves numerous interdependent gait parameters that are difficult to adequately interpret due to the large volume of recorded data and lengthy assessment times in gait laboratories. A proposed solution to these problems is computational intelligence (CI), an emerging paradigm in biomedical engineering, most notably in pathology detection and prosthesis design. The integration of CI technology in gait systems facilitates studies in disorders caused by lower limb defects, cerebral disorders, and aging effects by learning data relationships through a combination of signal processing and machine learning techniques. Learning paradigms, such as supervised learning, unsupervised learning, and fuzzy and evolutionary algorithms, provide advanced modeling capabilities for biomechanical systems that in the past have relied heavily on statistical analysis. CI offers the ability to investigate nonlinear data relationships, enhance data interpretation, design more efficient diagnostic methods, and extrapolate model functionality. These are envisioned to result in more cost-effective, efficient, and easy-to-use systems, which would address global shortages in medical personnel and rising medical costs. This paper surveys current signal processing and CI methodologies, followed by gait applications ranging from normal gait studies and disorder detection to artificial gait simulation. We review recent systems focusing on the existing challenges and issues involved in making them successful. We also examine new research in sensor technologies for gait that could be combined with these intelligent systems to develop more effective healthcare solutions.
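
    To make the supervised-learning idea concrete, the sketch below classifies two synthetic gait-style features (loosely modeled on toe clearance and its variability). The data are invented stand-ins, and the pipeline is a generic example, not a method taken from the surveyed papers.

        # Sketch: supervised classification of two synthetic gait features;
        # stand-in data, not gait recordings.
        import numpy as np
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)
        healthy = rng.normal([1.5, 0.10], [0.2, 0.02], size=(50, 2))
        at_risk = rng.normal([1.1, 0.18], [0.2, 0.04], size=(50, 2))
        X = np.vstack([healthy, at_risk])
        y = np.array([0] * 50 + [1] * 50)

        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
        print(cross_val_score(clf, X, y, cv=5).mean())  # fraction correct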

  17. Multidimensional Environmental Data Resource Brokering on Computational Grids and Scientific Clouds

    Science.gov (United States)

    Montella, Raffaele; Giunta, Giulio; Laccetti, Giuliano

Grid computing has evolved widely over the past years, and its capabilities have found their way even into business products and are no longer relegated to scientific applications. Today, grid computing technology is not restricted to a set of specific open-source or industrial grid products; rather, it comprises a set of capabilities available virtually within any kind of software to create shared and highly collaborative production environments. These environments are focused on computational (workload) capabilities and the integration of information (data) into those computational capabilities. An active field of grid computing applications is the full virtualization of scientific instruments, which increases their availability and decreases operating and maintenance costs. Computational and information grids make it possible to manage real-world objects in a service-oriented way using widespread industrial standards.

  18. A REVIEW ON SECURITY ISSUES AND CHALLENGES IN CLOUD COMPUTING MODEL OF RESOURCE MANAGEMENT

    OpenAIRE

    T. Vaikunth Pai; Dr. P. S. Aithal

    2017-01-01

Cloud computing services refer to a set of IT-enabled services delivered to a customer over the Internet on a leased basis, with the capability to scale service requirements up or down according to needs. Usually, cloud computing services are delivered by third-party vendors who own the infrastructure. Its advantages include scalability, elasticity, flexibility, efficiency and the outsourcing of non-core activities of an organization. Cloud computing offers an innovative busines...

  19. Domain Decomposition for Computing Extremely Low Frequency Induced Current in the Human Body

    OpenAIRE

Perrussel, Ronan; Voyer, Damien; Nicolas, Laurent; Scorretti, Riccardo; Burais, Noël

    2011-01-01

Computation of electromagnetic fields in high resolution computational phantoms requires solving large linear systems. We present an application of Schwarz preconditioners with Krylov subspace methods for computing extremely low frequency induced fields in a phantom derived from the Visible Human.
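
    The combination named above, an additive Schwarz preconditioner inside a Krylov solver, can be illustrated on a toy problem. The sketch below uses a 1D Laplacian as a stand-in for the phantom's linear system; subdomain sizes and overlap are arbitrary choices, not values from the paper.

        # Sketch: one-level additive Schwarz preconditioner for CG on a 1D
        # Laplacian; a toy stand-in for the Schwarz/Krylov combination.
        import numpy as np
        import scipy.sparse as sp
        import scipy.sparse.linalg as spla

        n = 200
        A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csc")
        b = np.ones(n)

        # Overlapping index blocks (subdomains) and pre-factored local solves.
        blocks = [np.arange(max(0, s - 10), min(n, s + 60)) for s in range(0, n, 50)]
        local = [spla.factorized(sp.csc_matrix(A[np.ix_(i, i)])) for i in blocks]

        def precond(r):
            """M^-1 r = sum_i R_i^T A_i^-1 R_i r (additive Schwarz)."""
            z = np.zeros_like(r)
            for idx, solve in zip(blocks, local):
                z[idx] += solve(r[idx])
            return z

        M = spla.LinearOperator((n, n), matvec=precond, dtype=float)
        x, info = spla.cg(A, b, M=M)
        print(info, np.linalg.norm(A @ x - b))  # info == 0 means converged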

  20. Computer-Assisted Language Learning: Current Programs and Projects. ERIC Digest.

    Science.gov (United States)

    Higgins, Chris

    For many years, foreign language teachers have used the computer to provide supplemental exercises in the instruction of foreign languages. In recent years, advances in computer technology have motivated teachers to reassess the computer and consider it a valuable part of daily foreign language learning. Innovative software programs, authoring…

  1. Assessment of knowledge and awareness among radiology personnel regarding current computed tomography technology and radiation dose

    Science.gov (United States)

    Karim, M. K. A.; Hashim, S.; Bradley, D. A.; Bahruddin, N. A.; Ang, W. C.; Salehhon, N.

    2016-03-01

In this paper, we evaluate the level of knowledge and awareness concerning Computed Tomography (CT) technology and radiation dose among 120 radiology personnel working in 7 public hospitals in Johor, Malaysia, based on a set of questionnaires. Subjects were divided into two groups: the medical professions (Med, n=32) and the allied health professions (AH, n=88). The questionnaire addressed (1) demographic data, (2) relative radiation dose, and (3) knowledge of current CT technology. One-third of respondents from both groups were able to estimate the relative radiation dose for routine CT examinations. 68% of the allied health personnel knew of the Malaysian regulations entitled ‘Basic Safety Standard (BSS) 2010’, although notably 80% of them had previously attended a radiation protection course. No significant difference in mean CT technology knowledge scores was detected between the two groups at the p < 0.05 level, with the medical professions producing a mean score of 26.7 ± 2.7 and the allied health professions a mean score of 25.2 ± 4.3. This study points to considerable variation among the respondents in their knowledge and awareness of radiation risks and CT optimization techniques.

  2. Assessment of knowledge and awareness among radiology personnel regarding current computed tomography technology and radiation dose

    International Nuclear Information System (INIS)

    Karim, M K A; Hashim, S; Bahruddin, N A; Ang, W C; Salehhon, N; Bradley, D A

    2016-01-01

In this paper, we evaluate the level of knowledge and awareness concerning Computed Tomography (CT) technology and radiation dose among 120 radiology personnel working in 7 public hospitals in Johor, Malaysia, based on a set of questionnaires. Subjects were divided into two groups: the medical professions (Med, n=32) and the allied health professions (AH, n=88). The questionnaire addressed (1) demographic data, (2) relative radiation dose, and (3) knowledge of current CT technology. One-third of respondents from both groups were able to estimate the relative radiation dose for routine CT examinations. 68% of the allied health personnel knew of the Malaysian regulations entitled ‘Basic Safety Standard (BSS) 2010’, although notably 80% of them had previously attended a radiation protection course. No significant difference in mean CT technology knowledge scores was detected between the two groups at the p < 0.05 level, with the medical professions producing a mean score of 26.7 ± 2.7 and the allied health professions a mean score of 25.2 ± 4.3. This study points to considerable variation among the respondents in their knowledge and awareness of radiation risks and CT optimization techniques. (paper)

  3. Use of time space Green's functions in the computation of transient eddy current fields

    International Nuclear Information System (INIS)

    Davey, K.; Turner, L.

    1988-01-01

The utility of integral equations for solving eddy current problems has been borne out by numerous computations in the past few years, principally for sinusoidal steady-state problems. This paper examines the applicability of the integral approaches, in both time and space, to the more generic transient problem. The basic formulation of the time-space Green's function approach is laid out. A technique employing Gauss-Laguerre integration is used to obtain the temporal solution, while Gauss-Legendre integration is used to resolve the spatial field character. The technique is then applied to the fusion electromagnetic induction experiments (FELIX) cylinder experiments in both two and three dimensions. It is found that quite accurate solutions can be obtained using rather coarse time steps and very few unknowns; the three-dimensional field solution worked out in this context used only four unknowns. The solution appears to be somewhat sensitive to the choice of time step, a consequence of a numerical instability embedded in the Green's function near the origin.
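
    The Gauss-Laguerre idea used for the temporal integration can be shown in a few lines: the rule evaluates integrals of the form ∫₀^∞ e⁻ᵗ f(t) dt from a handful of nodes and weights. The integrand below is a toy decaying response chosen so the exact answer is known; it is not the paper's Green's-function kernel.

        # Sketch: an 8-point Gauss-Laguerre rule for
        #   integral_0^inf e^(-t) f(t) dt.
        import numpy as np

        t, w = np.polynomial.laguerre.laggauss(8)  # nodes and weights

        f = lambda x: np.cos(0.5 * x)  # toy "field response"
        print(np.sum(w * f(t)), 1 / 1.25)  # exact value is 1/(1 + 0.5**2) = 0.8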

  4. Clinical utility of dental cone-beam computed tomography: current perspectives

    Directory of Open Access Journals (Sweden)

    Jaju PP

    2014-04-01

Full Text Available. Prashant P Jaju,¹ Sushma P Jaju²; ¹Oral Medicine and Radiology, ²Conservative Dentistry and Endodontics, Rishiraj College of Dental Sciences and Research Center, Bhopal, India. Abstract: Panoramic radiography and computed tomography were the pillars of maxillofacial diagnosis. With the advent of cone-beam computed tomography, dental practice has seen a paradigm shift. This review article highlights the potential applications of cone-beam computed tomography in the fields of dental implantology and forensic dentistry, and its limitations in maxillofacial diagnosis. Keywords: dental implants, cone-beam computed tomography, panoramic radiography, computed tomography

  5. Using Free Computational Resources to Illustrate the Drug Design Process in an Undergraduate Medicinal Chemistry Course

    Science.gov (United States)

    Rodrigues, Ricardo P.; Andrade, Saulo F.; Mantoani, Susimaire P.; Eifler-Lima, Vera L.; Silva, Vinicius B.; Kawano, Daniel F.

    2015-01-01

    Advances in, and dissemination of, computer technologies in the field of drug research now enable the use of molecular modeling tools to teach important concepts of drug design to chemistry and pharmacy students. A series of computer laboratories is described to introduce undergraduate students to commonly adopted "in silico" drug design…

  6. University Students and Ethics of Computer Technology Usage: Human Resource Development

    Science.gov (United States)

    Iyadat, Waleed; Iyadat, Yousef; Ashour, Rateb; Khasawneh, Samer

    2012-01-01

    The primary purpose of this study was to determine the level of students' awareness about computer technology ethics at the Hashemite University in Jordan. A total of 180 university students participated in the study by completing the questionnaire designed by the researchers, named the Computer Technology Ethics Questionnaire (CTEQ). Results…

  7. Project Final Report: Ubiquitous Computing and Monitoring System (UCoMS) for Discovery and Management of Energy Resources

    Energy Technology Data Exchange (ETDEWEB)

    Tzeng, Nian-Feng; White, Christopher D.; Moreman, Douglas

    2012-07-14

The UCoMS research cluster has spearheaded three research areas since August 2004: wireless and sensor networks, Grid computing, and petroleum applications. The primary goals of UCoMS research are three-fold: (1) creating new knowledge to push forward the technology forefronts of pertinent research on the computing and monitoring aspects of energy resource management, (2) developing and disseminating software codes and toolkits for the research community and the public, and (3) establishing system prototypes and testbeds for evaluating innovative techniques and methods. Substantial progress and diverse accomplishments have been made by research investigators in their respective areas of expertise, working cooperatively on such topics as sensors and sensor networks, wireless communication and systems, and computational Grids, all of particular relevance to petroleum applications.

  8. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  9. Policies for Resource Efficient and Effective Solutions : A review of concepts, current policy landscape and future policy considerations for the transition to a Circular Economy

    OpenAIRE

    Milios, Leonidas

    2016-01-01

    This report presents basic concepts around resources, resource efficiency and the Circular Economy. The limitations and the opportunities within the Circular Economy are identified and clearly presented. The current policy landscape in the EU as well as in Sweden is thoroughly analysed and a set of policy areas with a significant untapped potential for resource efficiency is identified. The policy areas which have been underutilised so far include policies for re-use, repair and remanufacturi...

  10. Toward Production From Gas Hydrates: Current Status, Assessment of Resources, and Simulation-Based Evaluation of Technology and Potential

    Energy Technology Data Exchange (ETDEWEB)

    Reagan, Matthew; Moridis, George J.; Collett, Timothy; Boswell, Ray; Kurihara, M.; Reagan, Matthew T.; Koh, Carolyn; Sloan, E. Dendy

    2008-02-12

    Gas hydrates are a vast energy resource with global distribution in the permafrost and in the oceans. Even if conservative estimates are considered and only a small fraction is recoverable, the sheer size of the resource is so large that it demands evaluation as a potential energy source. In this review paper, we discuss the distribution of natural gas hydrate accumulations, the status of the primary international R&D programs, and the remaining science and technological challenges facing commercialization of production. After a brief examination of gas hydrate accumulations that are well characterized and appear to be models for future development and gas production, we analyze the role of numerical simulation in the assessment of the hydrate production potential, identify the data needs for reliable predictions, evaluate the status of knowledge with regard to these needs, discuss knowledge gaps and their impact, and reach the conclusion that the numerical simulation capabilities are quite advanced and that the related gaps are either not significant or are being addressed. We review the current body of literature relevant to potential productivity from different types of gas hydrate deposits, and determine that there are consistent indications of a large production potential at high rates over long periods from a wide variety of hydrate deposits. Finally, we identify (a) features, conditions, geology and techniques that are desirable in potential production targets, (b) methods to maximize production, and (c) some of the conditions and characteristics that render certain gas hydrate deposits undesirable for production.

  11. Clinical utility of dental cone-beam computed tomography: current perspectives

    OpenAIRE

    Jaju, Prashant P; Jaju, Sushma P

    2014-01-01

Prashant P Jaju,¹ Sushma P Jaju²; ¹Oral Medicine and Radiology, ²Conservative Dentistry and Endodontics, Rishiraj College of Dental Sciences and Research Center, Bhopal, India. Abstract: Panoramic radiography and computed tomography were the pillars of maxillofacial diagnosis. With the advent of cone-beam computed tomography, dental practice has seen a paradigm shift. This review article highlights the potential applications of cone-beam computed tomography in the fields of dental implantology an...

  12. Assessment of an organ‐based tube current modulation in thoracic computed tomography

    Science.gov (United States)

    Sugai, Mai; Toyoda, Asami; Koshida, Haruka; Sakuta, Keita; Takata, Tadanori; Koshida, Kichiro; Iida, Hiroji; Matsui, Osamu

    2012-01-01

    Recently, specific computed tomography (CT) scanners have been equipped with organ‐based tube current modulation (TCM) technology. It is possible that organ‐based TCM will replace the conventional dose‐reduction technique of reducing the effective milliampere‐second. The aim of this study was to determine if organ‐based TCM could reduce radiation exposure to the breasts without compromising the image uniformity and beam hardening effect in thoracic CT examinations. Breast and skin radiation doses and the absorbed radiation dose distribution within a single section were measured with an anthropomorphic phantom and radiophotoluminescent glass dosimeters using four approaches to thoracic CT (reference, organ‐based TCM, copper shielding, and the combination of the above two techniques, hereafter referred to as the combination technique). The CT value and noise level were measured using the same calibration phantom. Organ‐based TCM and copper shielding reduced radiation doses to the breast by 23.7% and 21.8%, respectively. However, the CT value increased, especially in the anterior region, using copper shielding. In contrast, the CT value and noise level barely increased using organ‐based TCM. The combination technique reduced the radiation dose to the breast by 38.2%, but greatly increased the absorbed radiation dose from the central to the posterior regions. Moreover, the CT value increased in the anterior region and the noise level increased by more than 10% in the entire region. Therefore, organ‐based TCM can reduce radiation doses to breasts with only small increases in noise levels, making it preferable for specific groups of patients, such as children and young women. PACS numbers: 87.53.Bn; 87.57.Q‐; 87.57.qp PMID:22402390

  13. Assessment of an organ-based tube current modulation in thoracic computed tomography.

    Science.gov (United States)

    Matsubara, Kosuke; Sugai, Mai; Toyoda, Asami; Koshida, Haruka; Sakuta, Keita; Takata, Tadanori; Koshida, Kichiro; Iida, Hiroji; Matsui, Osamu

    2012-03-08

    Recently, specific computed tomography (CT) scanners have been equipped with organ-based tube current modulation (TCM) technology. It is possible that organ-based TCM will replace the conventional dose-reduction technique of reducing the effective milliampere-second. The aim of this study was to determine if organ-based TCM could reduce radiation exposure to the breasts without compromising the image uniformity and beam hardening effect in thoracic CT examinations. Breast and skin radiation doses and the absorbed radiation dose distribution within a single section were measured with an anthropomorphic phantom and radiophotoluminescent glass dosimeters using four approaches to thoracic CT (reference, organ-based TCM, copper shielding, and the combination of the above two techniques, hereafter referred to as the combination technique). The CT value and noise level were measured using the same calibration phantom. Organ-based TCM and copper shielding reduced radiation doses to the breast by 23.7% and 21.8%, respectively. However, the CT value increased, especially in the anterior region, using copper shielding. In contrast, the CT value and noise level barely increased using organ-based TCM. The combination technique reduced the radiation dose to the breast by 38.2%, but greatly increased the absorbed radiation dose from the central to the posterior regions. Moreover, the CT value increased in the anterior region and the noise level increased by more than 10% in the entire region. Therefore, organ-based TCM can reduce radiation doses to breasts with only small increases in noise levels, making it preferable for specific groups of patients, such as children and young women.

  14. Computer-Based Simulations for Maintenance Training: Current ARI Research. Technical Report 544.

    Science.gov (United States)

    Knerr, Bruce W.; And Others

    Three research efforts that used computer-based simulations for maintenance training were in progress when this report was written: Game-Based Learning, which investigated the use of computer-based games to train electronics diagnostic skills; Human Performance in Fault Diagnosis Tasks, which evaluated the use of context-free tasks to train…

  15. Computational Fragment-Based Drug Design: Current Trends, Strategies, and Applications.

    Science.gov (United States)

    Bian, Yuemin; Xie, Xiang-Qun Sean

    2018-04-09

Fragment-based drug design (FBDD) has been an effective methodology for drug development for decades. Successful applications of this strategy have brought both opportunities and challenges to the field of pharmaceutical science. Recent progress in computational fragment-based drug design provides an additional approach for future research in a time- and labor-efficient manner. Combining multiple in silico methodologies, computational FBDD offers flexibility in fragment library selection, protein model generation, and prediction of fragment/compound docking modes. These characteristics give computational FBDD an advantage in designing novel and promising compounds for a given target. The purpose of this review is to discuss the latest advances, ranging from commonly used strategies to novel concepts and technologies in computational fragment-based drug design. In particular, this review compares the specifications and advantages of experimental and computational FBDD, and discusses limitations and future prospects.
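
    One common first step in computational FBDD, building a fragment library from known molecules, can be sketched with RDKit's BRICS decomposition. This assumes RDKit is installed; the input molecule is an arbitrary demo, not an example from the review.

        # Sketch: fragment library generation by BRICS decomposition.
        from rdkit import Chem
        from rdkit.Chem import BRICS

        mol = Chem.MolFromSmiles("CC(=O)Nc1ccc(O)cc1")  # paracetamol as demo input
        for smi in sorted(BRICS.BRICSDecompose(mol)):
            print(smi)  # fragment SMILES with [n*] attachment points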

  16. Virtual partitioning for robust resource sharing: computational techniques for heterogeneous traffic

    NARCIS (Netherlands)

    Borst, S.C.; Mitra, D.

    1998-01-01

    We consider virtual partitioning (VP), which is a scheme for sharing a resource among several traffic classes in an efficient, fair, and robust manner. In the preliminary design stage, each traffic class is allocated a nominal capacity, which is based on expected offered traffic and required quality
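
    A rough sketch of the virtual-partitioning idea follows: a class is served freely while below its nominal allocation and admitted only if headroom remains once above it. The thresholds and state handling here are simplified inventions for illustration, not the scheme's actual admission rules.

        # Sketch: simplified VP-style admission control.
        NOMINAL = {"voice": 40, "video": 30, "data": 30}  # nominal capacities
        CAPACITY, RESERVE = 100, 5                         # shared units, headroom
        in_use = {k: 0 for k in NOMINAL}

        def admit(cls, demand=1):
            total = sum(in_use.values())
            if total + demand > CAPACITY:
                return False  # shared resource exhausted
            if in_use[cls] + demand > NOMINAL[cls]:  # class over its nominal share
                if total + demand > CAPACITY - RESERVE:
                    return False  # keep headroom for under-nominal classes
            in_use[cls] += demand
            return True

        print(admit("voice"), admit("data", demand=35))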

  17. Application of computer graphics to generate coal resources of the Cache coal bed, Recluse geologic model area, Campbell County, Wyoming

    Science.gov (United States)

    Schneider, G.B.; Crowley, S.S.; Carey, M.A.

    1982-01-01

    Low-sulfur subbituminous coal resources have been calculated, using both manual and computer methods, for the Cache coal bed in the Recluse Model Area, which covers the White Tail Butte, Pitch Draw, Recluse, and Homestead Draw SW 7 1/2 minute quadrangles, Campbell County, Wyoming. Approximately 275 coal thickness measurements obtained from drill hole data are evenly distributed throughout the area. The Cache coal and associated beds are in the Paleocene Tongue River Member of the Fort Union Formation. The depth from the surface to the Cache bed ranges from 269 to 1,257 feet. The thickness of the coal is as much as 31 feet, but in places the Cache coal bed is absent. Comparisons between hand-drawn and computer-generated isopach maps show minimal differences. Total coal resources calculated by computer show the bed to contain 2,316 million short tons or about 6.7 percent more than the hand-calculated figure of 2,160 million short tons.

  18. Recommendations for protecting National Library of Medicine Computing and Networking Resources

    Energy Technology Data Exchange (ETDEWEB)

    Feingold, R.

    1994-11-01

Protecting Information Technology (IT) involves a number of interrelated factors. These include mission, available resources, technologies, existing policies and procedures, internal culture, contemporary threats, and strategic enterprise direction. In the face of this formidable list, a structured approach provides cost-effective actions that allow the organization to manage its risks. We face fundamental challenges that will persist for at least the next several years. It is difficult if not impossible to precisely quantify risk. IT threats and vulnerabilities change rapidly and continually. Limited organizational resources, combined with mission constraints such as availability and connectivity requirements, will ensure that most systems will not be absolutely secure (if such security were even possible). In short, there is no technical (or administrative) "silver bullet." Protection means employing a stratified series of recommendations, matching protection levels against information sensitivities. Adaptive and flexible risk management is the key to effective protection of IT resources. The cost of the protection must be kept less than the expected loss, and one must take into account that an adversary will not expend more to attack a resource than the value of its compromise to that adversary. Notwithstanding the difficulty, if not impossibility, of precisely quantifying risk, the foregoing allows us to avoid the trap of choosing a course of action simply because "it's safer" or ignoring an area because no one has explored its potential risk. The recommendations for protecting IT resources begin with a discussion of contemporary threats and vulnerabilities, and then proceed from general to specific preventive measures. From a risk management perspective, it is imperative to understand that today the vast majority of threats are against UNIX hosts connected to the Internet.

  19. Wealth geography, environment and hunger: small critic contribution to the current agrarian/agricultural model of the natural resources usage

    Directory of Open Access Journals (Sweden)

    Carlos Walter Porto Gonçalves

    2004-01-01

Full Text Available The text questions the geopolitical issues implied in the debate about hunger and the environment. It criticizes the current agrarian/agricultural model of natural resource use, arguing that it is a model of economic development from temperate regions that has been imposed all over the world at a very high ecological, cultural and political cost. This model has confronted the patrimonial, collective and community knowledge characteristic of populations whose rationality differs from the Western atomistic-individualistic one, with severe risks to food security. It analyzes the social-environmental consequences of the current agrarian/agricultural model, the contradictory results of the increase in the world's food production capacity, hunger in the world, the meanings of the Green Revolution from the seventies onward, the social-environmental impacts of agribusiness in the Brazilian cerrado, and the complexity of the use of transgenic crops. It criticizes a restricted notion of ecological sustainability based on political realism, and proposes a reflection on a new rationality for the environmental challenge. It concludes that hunger is not a technical problem, for it results not from a lack of food but from the way food is produced and distributed. Today hunger coexists with the means necessary to overcome it.

  20. Current costing models: are they suitable for allocating health resources? The example of fall injury prevention in Australia.

    Science.gov (United States)

    Moller, Jerry

    2005-01-01

    The example of fall injury among older people is used to define and illustrate how current Australian systems for allocation of health resources perform for funding emerging public health issues. While the examples are Australian, the allocation and priority setting methods are common in the health sector in all developed western nations. With an ageing population the number of falls injuries in Australia and the cost of treatment will rise dramatically over the next 20-50 years. Current methods of allocating funds within the health system are not well suited to meeting this coming epidemic. The information requirements for cost-benefit and cost-effectiveness measures cannot be met. Marginal approaches to health funding are likely to continue to fund already well-funded treatment or politically driven prevention processes and to miss the opportunity for new prevention initiatives in areas that do not have a high political profile. Fall injury is one of many emerging areas that struggle to make claims for funding because the critical mass of intervention and evidence of its impact is not available. The beneficiaries of allocation failure may be those who treat the disease burden that could have been easily prevented. Changes to allocation mechanisms, data systems and new initiative funding practices are required to ensure that preventative strategies are able to compete on an equal footing with treatment approaches for mainstream health funding.

  1. Dynamic resource allocation engine for cloud-based real-time video transcoding in mobile cloud computing environments

    Science.gov (United States)

    Adedayo, Bada; Wang, Qi; Alcaraz Calero, Jose M.; Grecos, Christos

    2015-02-01

    The recent explosion in video-related Internet traffic has been driven by the widespread use of smart mobile devices, particularly smartphones with advanced cameras that are able to record high-quality videos. Although many of these devices offer the facility to record videos at different spatial and temporal resolutions, primarily with local storage considerations in mind, most users only ever use the highest quality settings. The vast majority of these devices are optimised for compressing the acquired video using a single built-in codec and have neither the computational resources nor battery reserves to transcode the video to alternative formats. This paper proposes a new low-complexity dynamic resource allocation engine for cloud-based video transcoding services that are both scalable and capable of being delivered in real-time. Firstly, through extensive experimentation, we establish resource requirement benchmarks for a wide range of transcoding tasks. The set of tasks investigated covers the most widely used input formats (encoder type, resolution, amount of motion and frame rate) associated with mobile devices and the most popular output formats derived from a comprehensive set of use cases, e.g. a mobile news reporter directly transmitting videos to the TV audience of various video format requirements, with minimal usage of resources both at the reporter's end and at the cloud infrastructure end for transcoding services.
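
    The benchmark-driven allocation described above can be pictured as a lookup from an (input format, output format) pair to a measured cost, converted into worker slots with some headroom. The table entries and names below are invented placeholders, not the paper's benchmark values.

        # Sketch: benchmark-driven slot allocation for a transcoding task.
        import math

        BENCH = {  # CPU-seconds per second of video, from offline benchmarks
            ("h264_1080p30", "h264_720p30"): 1.8,
            ("h264_1080p30", "vp8_480p30"): 2.6,
            ("h264_2160p30", "h264_1080p30"): 6.4,
        }

        def slots_needed(src, dst, headroom=1.2, slot_capacity=2.0):
            """Slots required to keep the transcode running in real time."""
            return max(1, math.ceil(BENCH[(src, dst)] * headroom / slot_capacity))

        print(slots_needed("h264_2160p30", "h264_1080p30"))  # -> 4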

  2. Mobile clusters of single board computers: an option for providing resources to student projects and researchers.

    Science.gov (United States)

    Baun, Christian

    2016-01-01

Clusters usually consist of servers, workstations or personal computers as nodes. But especially for academic purposes like student projects or scientific projects, the cost of purchase and operation can be a challenge. Single board computers cannot compete with the performance or energy-efficiency of higher-value systems, but they are an option for building inexpensive cluster systems. Because of their compact design and modest energy consumption, it is possible to build clusters of single board computers in such a way that they are mobile and can be easily transported by the users. This paper describes the construction of such a cluster, useful applications, and the performance of the single nodes. Furthermore, the cluster's performance and energy-efficiency are analyzed by executing the High Performance Linpack benchmark with different numbers of nodes and different proportions of the system's total main memory utilized.

  3. Distributed Factorization Computation on Multiple Volunteered Mobile Resource to Break RSA Key

    Science.gov (United States)

    Jaya, I.; Hardi, S. M.; Tarigan, J. T.; Zamzami, E. M.; Sihombing, P.

    2017-01-01

Like other asymmetric encryption schemes, RSA can be cracked using a series of mathematical calculations. The private key used to decrypt the message can be computed from the public key. However, finding the private key may require a massive amount of calculation. In this paper, we propose a method to perform distributed computing to calculate RSA's private key. The proposed method uses multiple volunteered mobile devices to contribute during the calculation process. Our objective is to demonstrate how the use of volunteer computing on mobile devices may be a feasible option to reduce the time required to break a weak RSA encryption, and to observe the behavior and running time of the application on mobile devices.
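
    The divide-and-distribute idea can be sketched by splitting the candidate divisors of a modulus N among workers; below, local processes stand in for volunteered phones. Trial division is used only to illustrate the partitioning; the paper's actual factoring method is not reproduced here.

        # Sketch: partition the divisor search space of N across 4 workers.
        import math
        from multiprocessing import Pool

        N = 10403  # = 101 * 103, a toy stand-in for an RSA modulus

        def search(bounds):
            lo, hi = bounds
            for d in range(lo | 1, hi, 2):  # odd candidates only
                if N % d == 0:
                    return d
            return None

        if __name__ == "__main__":
            limit = math.isqrt(N) + 1
            step = max(2, limit // 4)  # four workers
            chunks = [(max(3, i), min(i + step, limit + 1))
                      for i in range(3, limit + 1, step)]
            with Pool(4) as pool:
                p = next(f for f in pool.map(search, chunks) if f)
            print(p, N // p)  # 101 103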

  4. Computational modeling as a tool for water resources management: an alternative approach to problems of multiple uses

    Directory of Open Access Journals (Sweden)

    Haydda Manolla Chaves da Hora

    2012-04-01

Full Text Available Today in Brazil there are many cases of incompatibility between water use and water availability. Due to the increase in the required variety and volume, the concept of multiple uses was created, as stated by Pinheiro et al. (2007). The use of the same resource to satisfy different needs under several restrictions (qualitative and quantitative) creates conflicts. Aiming to minimize these conflicts, this work was applied to the particular cases of Hydrographic Regions VI and VIII of Rio de Janeiro State, using computational modeling techniques (based on the MOHID software, Water Modeling System) as a tool for water resources management.

  5. Development of the method of aggregation to determine the current storage area using computer vision and radiofrequency identification

    Science.gov (United States)

    Astafiev, A.; Orlov, A.; Privezencev, D.

    2018-01-01

The article is devoted to the development of technology and software for positioning and control systems in industrial plants, based on aggregation to determine the current storage area using computer vision and radio-frequency identification. It describes the hardware design of a positioning system for industrial products on the plant territory based on a radio-frequency grid, and of a positioning system based on computer vision methods. It then describes the method of aggregation that combines the two to determine the current storage area. Experimental studies under laboratory and production conditions have been conducted and are described in the article.

  6. Computer simulation of electron beams. II. Low-cost beam-current reconstruction

    International Nuclear Information System (INIS)

    de Wolf, D.A.

    1985-01-01

Reconstruction of current density in electron beams is complicated by distortion of phase space, which can require very fine discretization of the beam into trajectories. An efficient discretization of phase space is exploited, using conservation of charge and current in hypertriangle patches, to reconstruct the current density by fitting Gaussians through the distorted hypertriangles. Advantages and limitations are discussed.

  7. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

Computing operation has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office: Data processing was slow in the second half of 2013, with only the legacy re-reconstruction pass of 2011 data being processed at the sites.
[Figure 1: MC production and processing was more in demand, with a peak of over 750 million GEN-SIM events in a single month.]
[Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week, with peaks close to 1.2 PB.]
[Figure 3: The volume of data moved between CMS sites in the last six months.]
The tape utilisation was a focus for the operation teams, with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  8. Assessment of rheumatic diseases with computational radiology: current status and future potential

    DEFF Research Database (Denmark)

    Peloschek, Philipp; Boesen, Mikael; Donner, Rene

    2009-01-01

In recent years, several computational image analysis methods for assessing disease progression in rheumatic diseases have been presented. This review article explains the basics of these methods as well as their potential application in rheumatic disease monitoring; it covers radiography, sonography

  9. Current Trend Towards Using Soft Computing Approaches to Phase Synchronization in Communication Systems

    Science.gov (United States)

    Drake, Jeffrey T.; Prasad, Nadipuram R.

    1999-01-01

This paper surveys recent advances in communications that utilize soft computing approaches to phase synchronization. Soft computing, as opposed to hard computing, is a collection of complementary methodologies that act together to produce the most desirable control, decision, or estimation strategies. Recently, the communications area has explored the use of the principal constituents of soft computing, namely fuzzy logic, neural networks, and genetic algorithms, for modeling, control, and most recently for the estimation of phase in phase-coherent communications. If the receiver in a digital communications system is phase-coherent, as is often the case, phase synchronization is required. Synchronization thus requires estimation and/or control at the receiver of an unknown or random phase offset.
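
    For context, the classical baseline that the surveyed soft-computing estimators build on is correlation-based: project the received samples onto in-phase and quadrature references and recover the offset from their ratio. The sketch below is a generic illustration, not a method from the surveyed papers.

        # Sketch: estimate an unknown carrier phase offset by I/Q correlation.
        import numpy as np

        fs, f0, phase = 1e4, 1e3, 0.7  # sample rate, carrier (Hz), offset (rad)
        t = np.arange(1000) / fs       # exactly 100 carrier periods
        rng = np.random.default_rng(1)
        r = np.cos(2 * np.pi * f0 * t + phase) + 0.3 * rng.standard_normal(t.size)

        i_corr = np.sum(r * np.cos(2 * np.pi * f0 * t))
        q_corr = np.sum(r * np.sin(2 * np.pi * f0 * t))
        print(np.arctan2(-q_corr, i_corr))  # close to 0.7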

  10. Attentional Resource Allocation and Cultural Modulation in a Computational Model of Ritualized Behavior

    DEFF Research Database (Denmark)

    Nielbo, Kristoffer Laigaard; Sørensen, Jesper

    2016-01-01

How do cultural and religious rituals influence human perception and cognition, and what separates the highly patterned behaviors of communal ceremonies from perceptually similar precautionary and compulsive behaviors? These are some of the questions that recent theoretical models and empirical studies have tried to answer by focusing on ritualized behavior instead of ritual. Ritualized behavior (i.e., a set of behavioral features embedded in rituals) increases attention to detail and induces cognitive resource depletion, which together support distinct modes of action categorization. While... patterns and the simulation data were subjected to linear and non-linear analysis. The results are used to exemplify how action perception of ritualized behavior a) might influence allocation of attentional resources; and b) can be modulated by cultural priors. Further explorations of the model show why...

  11. Computer and Video Games in Family Life: The Digital Divide as a Resource in Intergenerational Interactions

    Science.gov (United States)

    Aarsand, Pal Andre

    2007-01-01

    In this ethnographic study of family life, intergenerational video and computer game activities were videotaped and analysed. Both children and adults invoked the notion of a digital divide, i.e. a generation gap between those who master and do not master digital technology. It is argued that the digital divide was exploited by the children to…

  12. Integrating Computing Resources: A Shared Distributed Architecture for Academics and Administrators.

    Science.gov (United States)

    Beltrametti, Monica; English, Will

    1994-01-01

    Development and implementation of a shared distributed computing architecture at the University of Alberta (Canada) are described. Aspects discussed include design of the architecture, users' views of the electronic environment, technical and managerial challenges, and the campuswide human infrastructures needed to manage such an integrated…

  13. Toward production from gas hydrates: Current status, assessment of resources, and simulation-based evaluation of technology and potential

    Science.gov (United States)

    Moridis, G.J.; Collett, T.S.; Boswell, R.; Kurihara, M.; Reagan, M.T.; Koh, C.; Sloan, E.D.

    2009-01-01

Gas hydrates (GHs) are a vast energy resource with global distribution in the permafrost and in the oceans. Even if conservative estimates are considered and only a small fraction is recoverable, the sheer size of the resource is so large that it demands evaluation as a potential energy source. In this review paper, we discuss the distribution of natural GH accumulations, the status of the primary international research and development (R&D) programs, and the remaining science and technological challenges facing the commercialization of production. After a brief examination of GH accumulations that are well characterized and appear to be models for future development and gas production, we analyze the role of numerical simulation in the assessment of the hydrate-production potential, identify the data needs for reliable predictions, evaluate the status of knowledge with regard to these needs, discuss knowledge gaps and their impact, and reach the conclusion that the numerical-simulation capabilities are quite advanced and that the related gaps either are not significant or are being addressed. We review the current body of literature relevant to potential productivity from different types of GH deposits and determine that there are consistent indications of a large production potential at high rates across long periods from a wide variety of hydrate deposits. Finally, we identify (a) features, conditions, geology and techniques that are desirable in potential production targets; (b) methods to maximize production; and (c) some of the conditions and characteristics that render certain GH deposits undesirable for production. Copyright © 2009 Society of Petroleum Engineers.

  14. Computer modelling of the UK wind energy resource: final overview report

    Energy Technology Data Exchange (ETDEWEB)

    Burch, S F; Ravenscroft, F

    1993-12-31

This report describes the results of a programme of work to estimate the UK wind energy resource. Mean wind speed maps and quantitative resource estimates were obtained using the NOABL mesoscale (1 km resolution) numerical model for the prediction of wind flow over complex terrain. NOABL was used in conjunction with digitised terrain data and wind data from surface meteorological stations for a ten year period (1975-1984) to provide digital UK maps of mean wind speed at 10m, 25m and 45m above ground level. Also included in the derivation of these maps was the use of the Engineering Science Data Unit (ESDU) method to model the effect on wind speed of the abrupt change in surface roughness that occurs at the coast. Existing isovent maps, based on standard meteorological data which take no account of terrain effects, indicate that 10m annual mean wind speeds vary between about 4.5 and 7 m/s over the UK with only a few coastal areas over 6 m/s. The present study indicated that 23% of the UK land area had speeds over 6 m/s, with many hill sites having 10m speeds over 10 m/s. It is concluded that these 'first order' resource estimates represent a substantial improvement over the presently available 'zero order' estimates. (20 figures, 7 tables, 10 references). (author)
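
    The three map heights (10 m, 25 m and 45 m) are related, over terrain of known roughness, by the standard logarithmic wind profile. The sketch below illustrates that generic relation only; it is not the NOABL or ESDU methodology itself, and the roughness length is an assumed example value:

```python
import math

def log_law_extrapolate(u_ref: float, z_ref: float, z_target: float, z0: float) -> float:
    """Extrapolate mean wind speed between heights with the neutral log profile:
    u(z) = u(z_ref) * ln(z / z0) / ln(z_ref / z0)."""
    return u_ref * math.log(z_target / z0) / math.log(z_ref / z0)

# Example: 6 m/s at 10 m over grassland (z0 ~ 0.03 m, assumed) scaled to 45 m.
print(round(log_law_extrapolate(6.0, 10.0, 45.0, 0.03), 2), "m/s at 45 m")
```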

  15. On the Cutting Edge Professional Development Program: Workshop and Web Resources for Current and Future Geoscience Faculty

    Science.gov (United States)

    MacDonald, R.; Manduca, C. A.; Mogk, D. W.; Tewksbury, B. J.

    2004-12-01

    Recognizing that many college and university faculty receive little formal training in teaching, are largely unaware of advances in research on teaching and learning, and face a variety of challenges in advancing in academic careers, the National Science Foundation-funded program On the Cutting Edge provides professional development for current and future faculty in the geosciences at various stages in their careers. The program includes a series of six multi-day workshops, sessions and one-day workshops at professional meetings, and a website with information about workshop opportunities and a variety of resources that bring workshop content to faculty (http://serc.carleton.edu/NAGTWorkshops). The program helps faculty improve their teaching and their job satisfaction by providing resources on instructional methods, geoscience content, and strategies for career planning. Workshop and website resources address innovative and effective practices in teaching, course design, delivery of instructional materials, and career planning, as well as approaches for teaching particular topics and strategies for starting and maintaining a research program in various institutional settings. Each year, special workshops for graduate students and post-doctoral fellows interested in academic careers and for early career faculty complement offerings on course design and emerging topics that are open to the full geoscience community. These special workshops include sessions on topics such as dual careers, gender issues, family-work balance, interviewing and negotiating strategies. The workshops serve as opportunities for networking and community building, with participants building connections with other participants as well as workshop leaders. Workshop participants reflect the full range of institutional diversity as well as ethnic and racial diversity beyond that of the geoscience faculty workforce. More than 40 percent of the faculty participants are female. Of the faculty

  16. An Analysis of Current Energy Policy Initiatives in New Mexico. What are the Potential Impacts to the State's Water Resources?

    Science.gov (United States)

    Klise, G. T.; Hart, W. E.; Kobos, P. H.; Malczynski, L. A.; Tidwell, V. C.

    2008-12-01

    Population in New Mexico is increasing rapidly with recent projections showing that the state will add more than 1 million people by 2035. This growth will create a demand for additional energy and water supplies that have yet to be developed. New Mexico currently exports about 50% of the energy generated within the state to neighboring states, and existing power plants predominately utilize traditional fossil fuels such as coal, oil and natural gas. Because traditional electric generation technologies utilize large quantities of water, New Mexico can also be seen as exporting water for the benefit of electricity consumed in neighboring states. As it is, both surface water and groundwater supplies are stretched thin and these internal and external stresses stemming from population growth will have a substantial impact on the state's water resources. In 2004, the Governor laid out a plan to make New Mexico a "Clean Energy State" by implementing renewable portfolio standards, developing renewable energy transmission infrastructure, creating an alternative energy innovation fund and creating state specific tax credits for renewable energy production and manufacturing. Recent work in the National Energy-Water Roadmap has pointed out that certain renewable sources of energy utilize less water than traditional power plants, and technological fixes to existing power plants will result in less water consumption. If New Mexico carries out its energy initiative, what will be the impacts to the state's water resources? Will it be possible to meet competing demands for this water? These questions and others will be analyzed in a decision-support tool that can look at the connection between both the physical and economic systems to see what the tradeoffs might be as a result of specific policy decisions. The ability to plan for future energy needs and understanding potential impacts to the state's limited water resources will be an invaluable tool for decision-makers in New

  17. Alaskan resources, current development. Traditional cultural values, and the role of LANDSAT data in current and future land use management planning

    Science.gov (United States)

    Laperriere, A. J.

    1975-01-01

    Past, present, and proposed applications of LANDSAT data for renewable resource assessments in Alaska are described. Specific projects briefly discussed include: a feasibility investigation applying LANDSAT data to caribou habitat mapping in northeast Alaska, analysis of a native corporate region in southwest Alaska, analysis of a game management unit in interior Alaska, and two proposed analyses in northwest Alaska. These analyses principally address range evaluations concerning caribou, moose, and Dall sheep, but results have application to other renewable resource themes. Application of resource assessment results to a statewide land use management plan is discussed.

  18. Terrestrial hydro-climatic change, lake shrinkage and water resource deterioration: Analysis of current to future drivers across Asia

    Science.gov (United States)

    Jarsjo, J.; Beygi, H.; Thorslund, J.

    2016-12-01

Due to overlapping effects of different anthropogenic pressures and natural variability, main drivers behind on-going changes in the water cycle have in many cases not been identified, which complicates management of water resources. For instance, in many parts of the world, and not least in semi-arid and arid parts of Asia, lowered groundwater levels and shrinkage of surface water bodies with associated salinization and water quality deterioration constitute great challenges. With the aim to identify main drivers and mechanisms behind such changes, we here combine (i) historical observations of long-term, large scale change, (ii) ensemble projections of expected future change from the climate models of the Coupled Model Intercomparison Project Phase 5 (CMIP 5) and (iii) output from water balance modelling. Our particular focus is on regions near shrinking lakes. For the principal Lake Urmia in Iran, results show that agricultural intensification including irrigation expansion has clearly contributed to the surprisingly rapid water quality deterioration and lake shrinkage, from 10% lake area reduction in 2002 to the current value of about 75% (leaving billions of tons of salt exposed in its basin). Nevertheless, runoff decrease due to climate change has had an even larger effect. For the Aral Sea in Central Asia, where problems accelerated much earlier (in the 1990's), land-use change and irrigation expansion can fully explain the disastrous surface water deficits and water quality problems in the extensive low-lying parts of the basin. However, projections show that climate-driven runoff decrease in the headwaters of the Aral Sea basin may become a dominant driver of continued change in the near-future. More generally, present results illustrate that mitigation measures that compensate only for land-use driven effects may not reverse current trends of decreasing water availability, due to increasingly strong impacts of climate-driven runoff decrease. This has

  19. The Radiation Safety Information Computational Center (RSICC): A Resource for Nuclear Science Applications

    International Nuclear Information System (INIS)

    Kirk, Bernadette Lugue

    2009-01-01

    The Radiation Safety Information Computational Center (RSICC) has been in existence since 1963. RSICC collects, organizes, evaluates and disseminates technical information (software and nuclear data) involving the transport of neutral and charged particle radiation, and shielding and protection from the radiation associated with: nuclear weapons and materials, fission and fusion reactors, outer space, accelerators, medical facilities, and nuclear waste management. RSICC serves over 12,000 scientists and engineers from about 100 countries. An important activity of RSICC is its participation in international efforts on computational and experimental benchmarks. An example is the Shielding Integral Benchmarks Archival Database (SINBAD), which includes shielding benchmarks for fission, fusion and accelerators. RSICC is funded by the United States Department of Energy, Department of Homeland Security and Nuclear Regulatory Commission.

  20. The Radiation Safety Information Computational Center (RSICC): A Resource for Nuclear Science Applications

    Energy Technology Data Exchange (ETDEWEB)

Kirk, Bernadette Lugue [ORNL]

    2009-01-01

    The Radiation Safety Information Computational Center (RSICC) has been in existence since 1963. RSICC collects, organizes, evaluates and disseminates technical information (software and nuclear data) involving the transport of neutral and charged particle radiation, and shielding and protection from the radiation associated with: nuclear weapons and materials, fission and fusion reactors, outer space, accelerators, medical facilities, and nuclear waste management. RSICC serves over 12,000 scientists and engineers from about 100 countries.

  1. A resource letter CSSMD-1: computer simulation studies by the method of molecular dynamics

    International Nuclear Information System (INIS)

    Goel, S.P.; Hockney, R.W.

    1974-01-01

A comprehensive bibliography on computer simulation studies by the method of Molecular Dynamics is presented. The bibliography includes references to relevant literature published up to mid 1973, starting from the first paper of Alder and Wainwright, published in 1957. The procedure of the method of Molecular Dynamics, the main fields of study in which it has been used, its limitations and how these have been overcome in some cases are also discussed [pt]

  2. Out-patient management and non-attendance in the current economic climate. How best to manage our resources?

    LENUS (Irish Health Repository)

    Hennessy, D

    2010-03-01

Outpatient non-attendance is a considerable source of inefficiency in the health service, wasting time and resources and potentially lengthening waiting lists. Given the current economic climate, methods need to be employed to reduce non-attendance. The aim was to analyse outpatient non-attendance and determine what factors influence attendance. A prospective audit over a two-month period to a tertiary-referral Urological service was performed to determine the clinical and demographic profile of non-attendees. Of 737 appointments, 148 (20%) patients did not attend (DNA). A benign urological condition was evident in 116 cases (78%). This group of patients also accounted for the majority of new patients not attending (40/47), the majority of returning patients not attending (101/148) and the majority of patients who missed multiple appointments (43/49). Patients with benign conditions make up the majority of clinic non-attendance. Consideration may be given to discharging such patients back to their general practitioner after one unexplained non-attendance until other alternatives of follow-up are available.

  3. Exploring dark current voltage characteristics of micromorph silicon tandem cells with computer simulations

    NARCIS (Netherlands)

    Sturiale, A.; Li, H. B. T.; Rath, J.K.; Schropp, R.E.I.; Rubinelli, F.A.

    2009-01-01

    The transport mechanisms controlling the forward dark current-voltage characteristic of the silicon micromorph tandem solar cell were investigated with numerical modeling techniques. The dark current-voltage characteristics of the micromorph tandem structure at forward voltages show three regions:

  4. Cloud computing geospatial application for water resources based on free and open source software and open standards - a prototype

    Science.gov (United States)

    Delipetrev, Blagoj

    2016-04-01

Presently, most of the existing software is desktop-based, designed to work on a single computer, which represents a major limitation in many ways, starting from limited computer processing and storage power, accessibility, availability, etc. The only feasible solution lies in the web and cloud. This abstract presents research and development of a cloud computing geospatial application for water resources based on free and open source software and open standards, using a hybrid deployment model of public-private cloud running on two separate virtual machines (VMs). The first one (VM1) is running on Amazon web services (AWS) and the second one (VM2) is running on a Xen cloud platform. The presented cloud application is developed using free and open source software, open standards and prototype code. The cloud application presents a framework for how to develop a specialized cloud geospatial application that needs only a web browser to be used. This cloud application is the ultimate collaboration geospatial platform, because multiple users across the globe with an internet connection and a browser can jointly model geospatial objects, enter attribute data and information, execute algorithms, and visualize results. The presented cloud application is available all the time, accessible from everywhere, scalable, works in a distributed computer environment, creates a real-time multiuser collaboration platform, uses interoperable programming language code and components, and is flexible in including additional components. The cloud geospatial application is implemented as a specialized water resources application with three web services for 1) data infrastructure (DI), 2) support for water resources modelling (WRM), 3) user management. The web services are running on two VMs that are communicating over the internet, providing services to users. The application was tested on the Zletovica river basin case study with concurrent multiple users. The application is a state

  5. Computer-simulation movie of ionospheric electric fields and currents for a magnetospheric substorm life cycle. Technical note

    International Nuclear Information System (INIS)

    Kamide, Y.; Matsushita, S.

    1980-07-01

    Numerical solution of the current conservation equation gives the distributions of electric fields and currents in the global ionosphere produced by the field-aligned currents. By altering ionospheric conductivity distributions as well as the field-aligned current densities and configurations to simulate a magnetospheric substorm life cycle, which is assumed to last for five hours, various patterns of electric fields and currents are computed for every 30-second interval in the life cycle. The simulated results are compiled in the form of a color movie, where variations of electric equi-potential curves are the first sequence, electric current-vector changes are the second, and fluctuations of the electric current system are the third. The movie compresses real time by a factor of 1/180, taking 1.7 minutes of running time for one sequence. One of the most striking features of this simulation is the clear demonstration of rapid and large scale interactions between the auroral zone and middle-low latitudes during the substorm sequences. This technical note provides an outline of the numerical scheme and world-wide contour maps of the electric potential, ionospheric current vectors, and the equivalent ionospheric current system at 5-minute intervals as an aid in viewing the movie and to further detailed study of the 'model' substorms
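
    The current conservation equation solved here is, in the usual height-integrated thin-shell formulation, an elliptic equation for the ionospheric electrostatic potential driven by the field-aligned currents. A hedged statement of that standard form follows; the notation is the conventional one, not necessarily that of the note itself:

```latex
% Height-integrated current continuity on the ionospheric shell: the
% divergence of the horizontal sheet current, carried by the conductance
% tensor \Sigma acting on E = -\nabla_\perp \Phi, balances the field-aligned
% current density j_\parallel entering the shell (I is the magnetic dip angle).
\nabla_\perp \cdot \left( \boldsymbol{\Sigma} \, \nabla_\perp \Phi \right) = j_\parallel \sin I
```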

  6. Resource utilization and costs during the initial years of lung cancer screening with computed tomography in Canada.

    Science.gov (United States)

    Cressman, Sonya; Lam, Stephen; Tammemagi, Martin C; Evans, William K; Leighl, Natasha B; Regier, Dean A; Bolbocean, Corneliu; Shepherd, Frances A; Tsao, Ming-Sound; Manos, Daria; Liu, Geoffrey; Atkar-Khattra, Sukhinder; Cromwell, Ian; Johnston, Michael R; Mayo, John R; McWilliams, Annette; Couture, Christian; English, John C; Goffin, John; Hwang, David M; Puksa, Serge; Roberts, Heidi; Tremblay, Alain; MacEachern, Paul; Burrowes, Paul; Bhatia, Rick; Finley, Richard J; Goss, Glenwood D; Nicholas, Garth; Seely, Jean M; Sekhon, Harmanjatinder S; Yee, John; Amjadi, Kayvan; Cutz, Jean-Claude; Ionescu, Diana N; Yasufuku, Kazuhiro; Martel, Simon; Soghrati, Kamyar; Sin, Don D; Tan, Wan C; Urbanski, Stefan; Xu, Zhaolin; Peacock, Stuart J

    2014-10-01

It is estimated that millions of North Americans would qualify for lung cancer screening and that billions of dollars of national health expenditures would be required to support population-based computed tomography lung cancer screening programs. The decision to implement such programs should be informed by data on resource utilization and costs. Resource utilization data were collected prospectively from 2059 participants in the Pan-Canadian Early Detection of Lung Cancer Study using low-dose computed tomography (LDCT). Participants who had 2% or greater lung cancer risk over 3 years using a risk prediction tool were recruited from seven major cities across Canada. A cost analysis was conducted from the Canadian public payer's perspective for resources that were used for the screening and treatment of lung cancer in the initial years of the study. The average per-person cost for screening individuals with LDCT was $453 (95% confidence interval [CI], $400-$505) for the initial 18 months of screening following a baseline scan. The screening costs were highly dependent on the detected lung nodule size, presence of cancer, screening intervention, and the screening center. The mean per-person cost of treating lung cancer with curative surgery was $33,344 (95% CI, $31,553-$34,935) over 2 years. This was lower than the cost of treating advanced-stage lung cancer with chemotherapy, radiotherapy, or supportive care alone ($47,792; 95% CI, $43,254-$52,200; p = 0.061). In the Pan-Canadian study, the average cost to screen individuals with a high risk for developing lung cancer using LDCT and the average initial cost of curative intent treatment were lower than the average per-person cost of treating advanced-stage lung cancer, which infrequently results in a cure.
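
    The confidence intervals quoted above are of the standard form for a sample mean; whether the study used a normal approximation or a bootstrap is not stated here. A minimal sketch of the normal-approximation version, with made-up per-participant costs rather than study data:

```python
import numpy as np

# Illustrative per-participant screening costs (assumed, not study data).
costs = np.array([310.0, 420.0, 515.0, 610.0, 390.0, 470.0])

mean = costs.mean()
sem = costs.std(ddof=1) / np.sqrt(len(costs))   # standard error of the mean
lo, hi = mean - 1.96 * sem, mean + 1.96 * sem   # normal-approximation 95% CI
print(f"mean cost ${mean:.0f} (95% CI ${lo:.0f}-${hi:.0f})")
```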

  7. Computer simulation modeling of recreation use: Current status, case studies, and future directions

    Science.gov (United States)

    David N. Cole

    2005-01-01

    This report compiles information about recent progress in the application of computer simulation modeling to planning and management of recreation use, particularly in parks and wilderness. Early modeling efforts are described in a chapter that provides an historical perspective. Another chapter provides an overview of modeling options, common data input requirements,...

  8. Assessement of rheumatic diseases with computational radiology: Current status and future potential

    International Nuclear Information System (INIS)

    Peloschek, Philipp; Boesen, Mikael; Donner, Rene; Kubassova, Olga; Birngruber, Erich; Patsch, Janina; Mayerhoefer, Marius; Langs, Georg

    2009-01-01

In recent years, several computational image analysis methods to assess disease progression in rheumatic diseases were presented. This review article explains the basics of these methods as well as their potential application in rheumatic disease monitoring; it covers radiography and sonography as well as magnetic resonance imaging in quantitative analysis frameworks.

  9. Test of some current ideas in quark confinement physics by Monte Carlo computations for finite lattices

    International Nuclear Information System (INIS)

    Mack, G.; Pietarinen, E.

    1980-06-01

We present some new results of Monte Carlo computations for pure SU(2) Yang-Mills theory on a finite lattice. They support consistency of asymptotic freedom with quark confinement, validity of a block cell picture, and ideas based on a vortex condensation picture of quark confinement. (orig.)

  10. Unlocking the Treasures of the Ocean: Current Assessment and Future Perspectives of Seafloor Resources (C.F Gauss Lecture)

    Science.gov (United States)

    Jegen, Marion

    2016-04-01

Oceans cover 70% of the Earth's surface, and there is reason to believe that the wealth of mineral and carbon resources on the seafloor is similar to deposits on land. While off-shore energy resources such as oil and gas are nowadays regarded as conventional, energy resources in the form of methane hydrates and seafloor mineral deposits are as yet unconventional and at best marginally economic. However, taking into account global population growth, geopolitics and technological development (both in terms of increasing industrialization and the possibility to explore and mine seafloor resources), these resources might play a more fundamental role in the future. Resource assessment and understanding of the geological formation process of resources are topics in marine geosciences with broad relevance to society. The lecture presents an overview of the geophysical exploration of the seafloor and its resource potential. Starting from the link between physical parameter anomalies and resources, I will explore marine technological developments for sensing them remotely from the seafloor. The question of how well we can actually quantify the amount of resources from geophysical data will also be addressed. The process will be illustrated based on theoretical work as well as case studies from around the world.

  11. The current status and future prospects of computer-assisted hip surgery.

    Science.gov (United States)

    Inaba, Yutaka; Kobayashi, Naomi; Ike, Hiroyuki; Kubota, So; Saito, Tomoyuki

    2016-03-01

The advances in computer assistance technology have allowed detailed three-dimensional preoperative planning and simulation of preoperative plans. The use of a navigation system as an intraoperative assistance tool allows more accurate execution of the preoperative plan, compared to manual operation without assistance of the navigation system. In total hip arthroplasty using CT-based navigation, three-dimensional preoperative planning with computer software allows the surgeon to determine the optimal angle of implant placement at which implant impingement is unlikely to occur in the range of hip joint motion necessary for activities of daily living, and to determine the amount of three-dimensional correction for leg length and offset. With the use of computer navigation for intraoperative assistance, the preoperative plan can be precisely executed. In hip osteotomy using CT-based navigation, the navigation allows three-dimensional preoperative planning, intraoperative confirmation of osteotomy sites, safe performance of osteotomy even under poor visual conditions, and a reduction in exposure doses from intraoperative fluoroscopy. Positions of the tips of chisels can be displayed on the computer monitor during surgery in real time, and staff other than the operator can also be aware of the progress of surgery. Thus, computer navigation also has an educational value. On the other hand, its limitations include the need for placement of trackers, increased radiation exposure from preoperative CT scans, and prolonged operative time. Moreover, because the position of a bone fragment cannot be traced after osteotomy, methods to find its precise position after its movement need to be developed. Despite the need to develop methods for the postoperative evaluation of accuracy for osteotomy, further application and development of these systems are expected in the future. Copyright © 2016 The Japanese Orthopaedic Association. Published by Elsevier B.V. All rights reserved.

  12. Computer circuit analysis of induced currents in the MFTF-B magnet system

    International Nuclear Information System (INIS)

    Magnuson, G.D.; Woods, E.L.

    1981-01-01

    An analysis was made of the induced current behavior of the MFTF-B magnet system. Although the magnet system consists of 22 coils, because of its symmetry we considered only 11 coils in the analysis. Various combinations of the coils were dumped either singly or in groups, with the current behavior in all magnets calculated as a function of time after initiation of the dump

  13. Near DC eddy current measurement of aluminum multilayers using MR sensors and commodity low-cost computer technology

    Science.gov (United States)

    Perry, Alexander R.

    2002-06-01

    Low Frequency Eddy Current (EC) probes are capable of measurement from 5 MHz down to DC through the use of Magnetoresistive (MR) sensors. Choosing components with appropriate electrical specifications allows them to be matched to the power and impedance characteristics of standard computer connectors. This permits direct attachment of the probe to inexpensive computers, thereby eliminating external power supplies, amplifiers and modulators that have heretofore precluded very low system purchase prices. Such price reduction is key to increased market penetration in General Aviation maintenance and consequent reduction in recurring costs. This paper examines our computer software CANDETECT, which implements this approach and permits effective probe operation. Results are presented to show the intrinsic sensitivity of the software and demonstrate its practical performance when seeking cracks in the underside of a thick aluminum multilayer structure. The majority of the General Aviation light aircraft fleet uses rivets and screws to attach sheet aluminum skin to the airframe, resulting in similar multilayer lap joints.

  14. Computational Modeling of Cobalt-based Water Oxidation: Current Status and Future Challenges

    Science.gov (United States)

    Schilling, Mauro; Luber, Sandra

    2018-04-01

A lot of effort is nowadays put into the development of novel water oxidation catalysts. In this context, mechanistic studies are crucial in order to elucidate the reaction mechanisms governing this complex process and to identify new design paradigms and strategies for improving the stability and efficiency of those catalysts. This review is focused on recent theoretical mechanistic studies in the field of homogeneous cobalt-based water oxidation catalysts. In the first part, computational methodologies and protocols are summarized and evaluated on the basis of their applicability towards real catalytic or smaller model systems, whereby special emphasis is laid on the choice of an appropriate model system. In the second part, an overview of mechanistic studies is presented, from which conceptual guidelines are drawn on how to approach novel studies of catalysts and how to further develop the field of computational modeling of water oxidation reactions.

  15. Computational Modeling of Cobalt-Based Water Oxidation: Current Status and Future Challenges

    Directory of Open Access Journals (Sweden)

    Mauro Schilling

    2018-04-01

Full Text Available A lot of effort is nowadays put into the development of novel water oxidation catalysts. In this context, mechanistic studies are crucial in order to elucidate the reaction mechanisms governing this complex process and to identify new design paradigms and strategies for improving the stability and efficiency of those catalysts. This review is focused on recent theoretical mechanistic studies in the field of homogeneous cobalt-based water oxidation catalysts. In the first part, computational methodologies and protocols are summarized and evaluated on the basis of their applicability toward real catalytic or smaller model systems, whereby special emphasis is laid on the choice of an appropriate model system. In the second part, an overview of mechanistic studies is presented, from which conceptual guidelines are drawn on how to approach novel studies of catalysts and how to further develop the field of computational modeling of water oxidation reactions.

  16. Computer-aided diagnosis and volumetry of pulmonary nodules: current concepts and future perspectives

    International Nuclear Information System (INIS)

    Marten, K.; Rummeny, E.J.; Engelke, C.

    2005-01-01

For computer-aided detection (CAD) and volumetry of small pulmonary nodules, a number of algorithms have been developed for multislice CT data sets in recent years, with the goal of improving the diagnostic work-up and the follow-up of findings. Recent data show that the detection of small lesions may improve with CAD, suggesting that especially experienced readers may benefit from using CAD systems. This has led to the recommendation of CAD as a replacement of the second reader in clinical practice. Furthermore, computer-aided volumetry of pulmonary nodules allows a precise determination of nodular growth rates as a prerequisite for a better classification of nodules as benign or malignant. In this article, we review recent developments of CAD and volumetry tools for pulmonary nodules, and address open questions regarding the use of these software tools in clinical routine. (orig.)

  17. Use of cone beam computed tomography in implant dentistry: current concepts, indications and limitations for clinical practice and research.

    Science.gov (United States)

    Bornstein, Michael M; Horner, Keith; Jacobs, Reinhilde

    2017-02-01

    Diagnostic radiology is an essential component of treatment planning in the field of implant dentistry. This narrative review will present current concepts for the use of cone beam computed tomography imaging, before and after implant placement, in daily clinical practice and research. Guidelines for the selection of three-dimensional imaging will be discussed, and limitations will be highlighted. Current concepts of radiation dose optimization, including novel imaging modalities using low-dose protocols, will be presented. For preoperative cross-sectional imaging, data are still not available which demonstrate that cone beam computed tomography results in fewer intraoperative complications such as nerve damage or bleeding incidents, or that implants inserted using preoperative cone beam computed tomography data sets for planning purposes will exhibit higher survival or success rates. The use of cone beam computed tomography following the insertion of dental implants should be restricted to specific postoperative complications, such as damage of neurovascular structures or postoperative infections in relation to the maxillary sinus. Regarding peri-implantitis, the diagnosis and severity of the disease should be evaluated primarily based on clinical parameters and on radiological findings based on periapical radiographs (two dimensional). The use of cone beam computed tomography scans in clinical research might not yield any evident beneficial effect for the patient included. As many of the cone beam computed tomography scans performed for research have no direct therapeutic consequence, dose optimization measures should be implemented by using appropriate exposure parameters and by reducing the field of view to the actual region of interest. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  18. I - Detector Simulation for the LHC and beyond: how to match computing resources and physics requirements

    CERN Multimedia

    CERN. Geneva

    2016-01-01

    Detector simulation at the LHC is one of the most computing intensive activities. In these lectures we will show how physics requirements were met for the LHC experiments and extrapolate to future experiments (FCC-hh case). At the LHC, detectors are complex, very precise and ambitious: this implies modern modelisation tools for geometry and response. Events are busy and characterised by an unprecedented energy scale with hundreds of particles to be traced and high energy showers to be accurately simulated. Furthermore, high luminosities imply many events in a bunch crossing and many bunch crossings to be considered at the same time. In addition, backgrounds not directly correlated to bunch crossings have also to be taken into account. Solutions chosen for ATLAS (a mixture of detailed simulation and fast simulation/parameterisation) will be described and CPU and memory figures will be given. An extrapolation to the FCC-hh case will be tried by taking as example the calorimeter simulation.

  19. II - Detector simulation for the LHC and beyond : how to match computing resources and physics requirements

    CERN Multimedia

    CERN. Geneva

    2016-01-01

    Detector simulation at the LHC is one of the most computing intensive activities. In these lectures we will show how physics requirements were met for the LHC experiments and extrapolate to future experiments (FCC-hh case). At the LHC, detectors are complex, very precise and ambitious: this implies modern modelisation tools for geometry and response. Events are busy and characterised by an unprecedented energy scale with hundreds of particles to be traced and high energy showers to be accurately simulated. Furthermore, high luminosities imply many events in a bunch crossing and many bunch crossings to be considered at the same time. In addition, backgrounds not directly correlated to bunch crossings have also to be taken into account. Solutions chosen for ATLAS (a mixture of detailed simulation and fast simulation/parameterisation) will be described and CPU and memory figures will be given. An extrapolation to the FCC-hh case will be tried by taking as example the calorimeter simulation.

  20. Menu-driven cloud computing and resource sharing for R and Bioconductor.

    Science.gov (United States)

    Bolouri, Hamid; Dulepet, Rajiv; Angerman, Michael

    2011-08-15

    We report CRdata.org, a cloud-based, free, open-source web server for running analyses and sharing data and R scripts with others. In addition to using the free, public service, CRdata users can launch their own private Amazon Elastic Computing Cloud (EC2) nodes and store private data and scripts on Amazon's Simple Storage Service (S3) with user-controlled access rights. All CRdata services are provided via point-and-click menus. CRdata is open-source and free under the permissive MIT License (opensource.org/licenses/mit-license.php). The source code is in Ruby (ruby-lang.org/en/) and available at: github.com/seerdata/crdata. hbolouri@fhcrc.org.

  1. Energy efficiency models and optimization algorithm to enhance on-demand resource delivery in a cloud computing environment / Thusoyaone Joseph Moemi

    OpenAIRE

    Moemi, Thusoyaone Joseph

    2013-01-01

Online hosted services are what is referred to as Cloud Computing. Access to these services is via the internet. It shifts the traditional IT resource ownership model to renting. Thus, the high cost of infrastructure cannot limit the less privileged from experiencing the benefits that this new paradigm brings. Therefore, cloud computing provides flexible services to cloud users in the form of software, platform and infrastructure as services. The goal behind cloud computing is to provi...

  2. Computational resources to filter gravitational wave data with P-approximant templates

    International Nuclear Information System (INIS)

    Porter, Edward K

    2002-01-01

The prior knowledge of the gravitational waveform from compact binary systems makes matched filtering an attractive detection strategy. This detection method involves the filtering of the detector output with a set of theoretical waveforms or templates. One of the most important factors in this strategy is knowing how many templates are needed in order to reduce the loss of possible signals. In this study, we calculate the number of templates and computational power needed for a one-step search for gravitational waves from inspiralling binary systems. We build on previous works by first expanding the post-Newtonian waveforms to 2.5-PN order and second, for the first time, calculating the number of templates needed when using P-approximant waveforms. The analysis is carried out for the four main first-generation interferometers, LIGO, GEO600, VIRGO and TAMA. As well as template number, we also calculate the computational cost of generating banks of templates for filtering GW data. We carry out the calculations for two initial conditions. In the first case we assume a minimum individual mass of 1 M⊙ and in the second, we assume a minimum individual mass of 5 M⊙. We find that, in general, we need more P-approximant templates to carry out a search than if we use standard PN templates. This increase varies according to the order of PN-approximation, but can be as high as a factor of 3 and is explained by the smaller span of the P-approximant templates as we go to higher masses. The promising outcome is that for 2-PN templates, the increase is small and is outweighed by the known robustness of the 2-PN P-approximant templates
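
    The template count in a one-step search of this kind is conventionally estimated from the proper (metric) volume of the parameter space and the chosen minimal match. The sketch below implements that generic counting rule with placeholder values, not the paper's numbers:

```python
import math

def n_templates(metric_volume: float, dim: int, minimal_match: float) -> int:
    """Estimate the number of templates needed to cover a parameter space of
    given proper (metric) volume with a hypercubic lattice, using the standard
    counting N ~ V / (2 * sqrt((1 - MM) / dim))**dim."""
    cell_side = 2.0 * math.sqrt((1.0 - minimal_match) / dim)
    return math.ceil(metric_volume / cell_side**dim)

# Example: a 2-D mass parameter space of proper volume 50 at MM = 0.97 (assumed).
print(n_templates(50.0, 2, 0.97))
```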

  3. SuperB R&D computing program: HTTP direct access to distributed resources

    Science.gov (United States)

    Fella, A.; Bianchi, F.; Ciaschini, V.; Corvo, M.; Delprete, D.; Diacono, D.; Di Simone, A.; Franchini, P.; Donvito, G.; Giacomini, F.; Gianoli, A.; Longo, S.; Luitz, S.; Luppi, E.; Manzali, M.; Pardi, S.; Perez, A.; Rama, M.; Russo, G.; Santeramo, B.; Stroili, R.; Tomassetti, L.

    2012-12-01

The SuperB asymmetric energy e⁺e⁻ collider and detector to be built at the newly founded Nicola Cabibbo Lab will provide a uniquely sensitive probe of New Physics in the flavor sector of the Standard Model. Studying minute effects in the heavy quark and heavy lepton sectors requires a data sample of 75 ab⁻¹ and a luminosity target of 10³⁶ cm⁻²s⁻¹. The increasing network performance also in the Wide Area Network environment and the capability to read data remotely with good efficiency are providing new possibilities and opening new scenarios in the data access field. Subjects like data access and data availability in a distributed environment are key points in the definition of the computing model for an HEP experiment like SuperB. R&D efforts in such a field have been brought on during the last year in order to release the Computing Technical Design Report within 2013. WAN direct access to data has been identified as one of the more interesting viable options; robust and reliable protocols such as HTTP/WebDAV and xrootd are the subjects of a specific R&D line in a mid-term scenario. In this work we present the R&D results obtained in the study of new data access technologies for typical HEP use cases, focusing on specific protocols such as HTTP and WebDAV in Wide Area Network scenarios. Reports on efficiency, performance and reliability tests performed in a data analysis context are described. Future R&D plans include HTTP and xrootd protocol comparison tests, in terms of performance, efficiency, security and features available.
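
    Of the protocols under study, plain HTTP already provides the partial remote reads that WAN data access relies on, via byte-range requests. A minimal client-side sketch follows; the URL is hypothetical and the snippet is an illustration of the protocol feature, not the SuperB R&D code:

```python
import requests  # third-party HTTP client library

# Read only bytes 0..1023 of a remote file, as a WAN-access client might
# when fetching one record or branch rather than a whole dataset.
url = "https://storage.example.org/superb/sample.root"  # hypothetical endpoint
resp = requests.get(url, headers={"Range": "bytes=0-1023"}, timeout=30)
resp.raise_for_status()
assert resp.status_code == 206, "server must support partial content"
chunk = resp.content  # first 1 KiB of the remote file
print(len(chunk), "bytes fetched")
```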

  4. COMPUTING

    CERN Multimedia

    M. Kasemann

CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase availability of more sites such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a four-fold increase in throughput with respect to the LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and keep exercised all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...

  5. Computing and the Crisis: The Significant Role of New Information Technologies in the Current Socio-economic Meltdown

    Directory of Open Access Journals (Sweden)

    David Hakken

    2010-08-01

Full Text Available There is good reason to be concerned about the long-term implications of the current crisis for the reproduction of contemporary social formations. Thus there is an urgent need to understand its character, especially its distinctive features. This article identifies profound ambiguities in the valuing of assets as new and key economic features of this crisis, ambiguities traceable to the dominant, “computationalist” computing used to develop new financial instruments. After some preliminaries, the article identifies four specific ways in which the computerization of finance is generative of crisis. It then demonstrates how computationalist computing is linked to other efforts to extend commodification based on the ideology of so-called “intellectual property” (IP). Several other accounts of the crisis are considered and then demonstrated to have less explanatory value. After considering how some commons-oriented forms of computing (e.g., Free/Libre and/or Open Source Software development projects) also undermine the IP project, the article concludes with a brief discussion of what research on Socially Robust and Enduring Computing might contribute to fostering alternative, non-crisis-generative ways to compute.

  6. Knowledge-driven computational modeling in Alzheimer's disease research: Current state and future trends.

    Science.gov (United States)

    Geerts, Hugo; Hofmann-Apitius, Martin; Anastasio, Thomas J

    2017-11-01

Neurodegenerative diseases such as Alzheimer's disease (AD) follow a slowly progressing dysfunctional trajectory, with a large presymptomatic component and many comorbidities. Using preclinical models and large-scale omics studies ranging from genetics to imaging, a large number of processes that might be involved in AD pathology at different stages and levels have been identified. The sheer number of putative hypotheses makes it almost impossible to estimate their contribution to the clinical outcome and to develop a comprehensive view on the pathological processes driving the clinical phenotype. Traditionally, bioinformatics approaches have provided correlations and associations between processes and phenotypes. Focusing on causality, a new breed of advanced and more quantitative modeling approaches that use formalized domain expertise offers new opportunities to integrate these different modalities and outline possible paths toward new therapeutic interventions. This article reviews three different computational approaches and their possible complementarities. Process algebras, implemented using declarative programming languages such as Maude, facilitate simulation and analysis of complicated biological processes on a comprehensive but coarse-grained level. A model-driven Integration of Data and Knowledge, based on the OpenBEL platform and using reverse causative reasoning and network jump analysis, can generate mechanistic knowledge and a new, mechanism-based taxonomy of disease. Finally, Quantitative Systems Pharmacology is based on formalized implementation of domain expertise in a more fine-grained, mechanism-driven, quantitative, and predictive humanized computer model. We propose a strategy to combine the strengths of these individual approaches for developing powerful modeling methodologies that can provide actionable knowledge for rational development of preventive and therapeutic interventions. Development of these computational approaches is likely to

  7. Analysis of Ion Currents Contribution to Repolarization in Human Heart Failure Using Computer Models

    Energy Technology Data Exchange (ETDEWEB)

    Marotta, F.; Paci, M.A.; Severi, S.; Trenor, B.

    2016-07-01

The mechanisms underlying repolarization of the ventricular action potential (AP) are a subject of research for anti-arrhythmic drugs. In fact, prolongation of the AP occurs in several conditions of heart disease, such as heart failure, a major precursor of serious arrhythmias. In this study, we investigated the phenomena of repolarization reserve, defined as the capacity of the cell to repolarize in case of a functional loss, and all-or-none repolarization, which depends on the delicate balance of inward and outward currents in the different phases of the AP, under conditions of human heart failure (HF). To simulate HF conditions, the O'Hara et al. human AP model was modified and specific protocols for all-or-none repolarization were applied. Our results show that in early repolarization the threshold for all-or-none repolarization is not altered in HF, even if a decrease in potassium currents can be observed. To quantify the contribution of the individual ion currents to HF-induced AP prolongation, we used a novel piecewise-linear approximation approach proposed by Paci et al. In particular, INaL and ICaL are mainly responsible for APD prolongation due to HF (85 and 35 ms, respectively). Our results highlight this novel algorithm as a powerful tool to obtain a more complete picture of the complex ionic mechanisms underlying this disease and confirm the important role of the late sodium current in HF repolarization. (Author)
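
    The APD prolongation figures above presuppose a way of measuring action potential duration from a voltage trace. One common convention (APD90, measured here from the upstroke peak) is sketched below as a generic helper; it is not the specific algorithm of Paci et al., and the demo trace is synthetic:

```python
import numpy as np

def apd90(t: np.ndarray, v: np.ndarray) -> float:
    """Action potential duration at 90% repolarization: time from the upstroke
    peak to the first crossing below rest + 10% of the AP amplitude
    (one common convention among several)."""
    i_peak = int(np.argmax(v))
    v_rest = v[0]
    v_thresh = v_rest + 0.1 * (v[i_peak] - v_rest)
    below = np.nonzero(v[i_peak:] < v_thresh)[0]
    if below.size == 0:
        raise ValueError("trace does not repolarize to 90%")
    return float(t[i_peak + below[0]] - t[i_peak])

# Synthetic demo: a crude triangular AP from -85 mV up to +30 mV and back.
t = np.linspace(0.0, 400.0, 4001)                       # ms
v = np.where(t < 5, -85 + 23 * t, 30 - 0.35 * (t - 5))  # mV
v = np.maximum(v, -85.0)
print(f"APD90 ~ {apd90(t, v):.1f} ms")
```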

  8. A Two-Tier Energy-Aware Resource Management for Virtualized Cloud Computing System

    Directory of Open Access Journals (Sweden)

    Wei Huang

    2016-01-01

Full Text Available The economic costs caused by electric power take the most significant part in the total cost of a data center; thus energy conservation is an important issue in cloud computing systems. One well-known technique to reduce the energy consumption is the consolidation of Virtual Machines (VMs). However, it may lose some performance points on energy saving and the Quality of Service (QoS) for dynamic workloads. Fortunately, Dynamic Voltage and Frequency Scaling (DVFS) is an efficient technique to save energy in dynamic environments. In this paper, combined with the DVFS technology, we propose a cooperative two-tier energy-aware management method including local DVFS control and global VM deployment. The DVFS controller adjusts the frequencies of homogeneous processors in each server at run-time based on the practical energy prediction. On the other hand, the Global Scheduler assigns VMs onto the designated servers based on cooperation with the local DVFS controller. The final evaluation results demonstrate the effectiveness of our two-tier method in energy saving.
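
    A hedged sketch of the local tier's decision logic: choose the lowest processor frequency that still covers the predicted demand, under the usual convex power model. The frequency menu and power coefficient are assumptions for illustration, not the paper's controller:

```python
def pick_frequency(load: float, freqs: list[float]) -> float:
    """Local DVFS control: return the lowest available frequency (normalized
    so that capacity equals frequency) that still covers the demand."""
    for f in sorted(freqs):
        if f >= load:
            return f
    return max(freqs)  # saturate at the top frequency when overloaded

def dynamic_power(f: float, c: float = 1.0) -> float:
    """Classic CMOS approximation: dynamic power grows roughly cubically with
    frequency when supply voltage is scaled together with it (P ~ c * f**3)."""
    return c * f**3

freqs = [0.8, 1.2, 1.6, 2.0, 2.4]  # assumed frequency menu, GHz
for load in (0.5, 1.1, 1.9):
    chosen = pick_frequency(load, freqs)
    print(f"load {load}: run at {chosen} GHz, power ~ {dynamic_power(chosen):.2f} (arb. units)")
```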

  9. Reconfiguration of Computation and Communication Resources in Multi-Core Real-Time Embedded Systems

    DEFF Research Database (Denmark)

    Pezzarossa, Luca

This thesis investigates the use of reconfiguration in the context of multicore real-time systems targeting embedded applications. We address the reconfiguration of both the computation and the communication resources of a multi-core platform. Our approach is to associate reconfiguration with operational mode changes, where the system, during normal operation, changes a subset of the executing tasks to adapt its behaviour to new conditions. Reconfiguration is therefore used during a mode change to modify the real-time guaranteed services of the communication channels between the tasks that are affected by the reconfiguration. ... by the communication fabric between the cores of the platform. To support this, we present a new network-on-chip architecture, named Argo 2, that allows instantaneous and time-predictable reconfiguration of the communication channels. Our reconfiguration-capable architecture is prototyped using the existing time...

  10. Neurobionics and the brain-computer interface: current applications and future horizons.

    Science.gov (United States)

    Rosenfeld, Jeffrey V; Wong, Yan Tat

    2017-05-01

    The brain-computer interface (BCI) is an exciting advance in neuroscience and engineering. In a motor BCI, electrical recordings from the motor cortex of paralysed humans are decoded by a computer and used to drive robotic arms or to restore movement in a paralysed hand by stimulating the muscles in the forearm. Simultaneously integrating a BCI with the sensory cortex will further enhance dexterity and fine control. BCIs are also being developed to: provide ambulation for paraplegic patients through controlling robotic exoskeletons; restore vision in people with acquired blindness; detect and control epileptic seizures; and improve control of movement disorders and memory enhancement. High-fidelity connectivity with small groups of neurons requires microelectrode placement in the cerebral cortex. Electrodes placed on the cortical surface are less invasive but produce inferior fidelity. Scalp surface recording using electroencephalography is much less precise. BCI technology is still in an early phase of development and awaits further technical improvements and larger multicentre clinical trials before wider clinical application and impact on the care of people with disabilities. There are also many ethical challenges to explore as this technology evolves.

  11. A computational model of the ionic currents, Ca2+ dynamics and action potentials underlying contraction of isolated uterine smooth muscle.

    Directory of Open Access Journals (Sweden)

    Wing-Chiu Tong

    2011-04-01

Full Text Available Uterine contractions during labor are discretely regulated by rhythmic action potentials (APs) of varying duration and form that serve to determine calcium-dependent force production. We have employed a computational biology approach to develop a fuller understanding of the complexity of excitation-contraction (E-C) coupling of uterine smooth muscle cells (USMCs). Our overall aim is to establish a mathematical platform of sufficient biophysical detail to quantitatively describe known uterine E-C coupling parameters and thereby inform future empirical investigations of physiological and pathophysiological mechanisms governing normal and dysfunctional labors. From published and unpublished data we construct mathematical models for fourteen ionic currents of USMCs: Ca2+ currents (L- and T-type), a Na+ current, an hyperpolarization-activated current, three voltage-gated K+ currents, two Ca2+-activated K+ currents, a Ca2+-activated Cl− current, a non-specific cation current, the Na+-Ca2+ exchanger, the Na+-K+ pump and a background current. The magnitudes and kinetics of each current system in a spindle-shaped single cell with a specified surface area:volume ratio are described by differential equations, in terms of maximal conductances, electrochemical gradients, voltage-dependent activation/inactivation gating variables and temporal changes in intracellular Ca2+ computed from known Ca2+ fluxes. These quantifications are validated by the reconstruction of the individual experimental ionic currents obtained under voltage-clamp. Phasic contraction is modeled in relation to the time constant of changing [Ca2+]i. This integrated model is validated by its reconstruction of the different USMC AP configurations (spikes, plateau and bursts of spikes), the change from bursting to plateau-type AP produced by estradiol, and simultaneous experimental recordings of spontaneous AP, [Ca2+]i and phasic force. In summary, our advanced mathematical model provides a powerful tool to
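
    Each of the fourteen current systems described follows the familiar Hodgkin-Huxley template of maximal conductance, driving force and gating variables. A generic sketch of one such current and one Euler step of its gating ODE is given below; all parameter values are placeholders, not the USMC model's:

```python
def gate_step(x: float, x_inf: float, tau: float, dt: float) -> float:
    """One explicit Euler step of the gating ODE dx/dt = (x_inf(V) - x) / tau(V)."""
    return x + dt * (x_inf - x) / tau

def i_channel(g_max: float, m: float, h: float, v: float, e_rev: float) -> float:
    """Ohmic current I = g_max * m * h * (V - E_rev), with activation gate m
    and inactivation gate h each between 0 and 1."""
    return g_max * m * h * (v - e_rev)

# Illustrative step at V = -20 mV with assumed steady states and time constants.
m = gate_step(x=0.1, x_inf=0.8, tau=2.0, dt=0.1)   # fast activation, ms units
h = gate_step(x=0.9, x_inf=0.3, tau=50.0, dt=0.1)  # slow inactivation
print(i_channel(g_max=0.06, m=m, h=h, v=-20.0, e_rev=60.0), "pA/pF (illustrative)")
```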

  12. Dynamical model of computation of the rhodium self-powered neutron detector current

    International Nuclear Information System (INIS)

    Erben, O.; Slovacek, M.; Zerola, L.

    1992-01-01

A model is presented for the calculation of the rhodium self-powered neutron detector current as a function of the neutron flux density during reactor core transients. The total signal consists of a beta-emission component, a prompt component, a gamma component and a background signal. The model has been verified by means of experimental data obtained during measurements on the LVR-15 research reactor and at the Dukovany nuclear power plant. (author) 9 figs., 21 refs
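
    The decomposition into delayed beta, prompt and background components can be illustrated with a first-order model in which the beta part follows the flux with the activation product's decay constant. A toy sketch under assumed weights: the roughly 42 s Rh-104 half-life is a textbook figure, everything else is a placeholder, and the prompt and gamma contributions are lumped into a single instantaneous term for simplicity:

```python
import math

HALF_LIFE_RH104 = 42.3                 # seconds, textbook value for Rh-104
LAMBDA = math.log(2) / HALF_LIFE_RH104

def spnd_current(phi, dt, w_beta=0.9, w_prompt=0.1, background=0.0):
    """Toy SPND signal: the prompt (plus gamma) part tracks the flux instantly,
    while the beta part lags it via dA/dt = lambda * (phi - A), the normalized
    activation/decay balance of the emitter."""
    a = phi[0]                         # assume initial equilibrium with the flux
    out = []
    for p in phi:
        a += dt * LAMBDA * (p - a)
        out.append(w_beta * a + w_prompt * p + background)
    return out

# Step change in flux: the prompt term jumps, the beta term relaxes over ~minutes.
flux = [1.0] * 50 + [2.0] * 250
signal = spnd_current(flux, dt=1.0)
print(signal[49], signal[51], signal[-1])
```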

  13. The current status of cone beam computed tomography imaging in orthodontics

    Science.gov (United States)

    Kapila, S; Conley, R S; Harrell, W E

    2011-01-01

    Cone beam CT (CBCT) has become an increasingly important source of three dimensional (3D) volumetric data in clinical orthodontics since its introduction into dentistry in 1998. The purpose of this manuscript is to highlight the current understanding of, and evidence for, the clinical use of CBCT in orthodontics, and to review the findings to answer clinically relevant questions. Currently available information from studies using CBCT can be organized into five broad categories: 1, the assessment of CBCT technology; 2, its use in craniofacial morphometric analyses; 3, incidental and missed findings; 4, analysis of treatment outcomes; and 5, efficacy of CBCT in diagnosis and treatment planning. The findings in these topical areas are summarized, followed by current indications and protocols for the use of CBCT in specific cases. Despite the increasing popularity of CBCT in orthodontics, and its advantages over routine radiography in specific cases, the effects of information derived from these images in altering diagnosis and treatment decisions has not been demonstrated in several types of cases. It has therefore been recommended that CBCT be used in select cases in which conventional radiography cannot supply satisfactory diagnostic information; these include cleft palate patients, assessment of unerupted tooth position, supernumerary teeth, identification of root resorption and for planning orthognathic surgery. The need to image other types of cases should be made on a case-by-case basis following an assessment of benefits vs risks of scanning in these situations. PMID:21159912

  14. Multislice Spiral Computed Tomography of the Heart: Technique, Current Applications, and Perspective

    International Nuclear Information System (INIS)

    Mahnken, Andreas H.; Wildberger, Joachim E.; Koos, Ralf; Guenther, Rolf W.

    2005-01-01

    Multislice spiral computed tomography (MSCT) is a rapidly evolving, noninvasive technique for cardiac imaging. Knowledge of the principle of electrocardiogram-gated MSCT and of its limitations in clinical routine is needed to optimize image quality. Therefore, the basic technical principle, including the essentials of image postprocessing, is described. Cardiac MSCT imaging was initially focused on coronary calcium scoring, MSCT coronary angiography, and analysis of left ventricular function. Recent studies have also evaluated the ability of cardiac MSCT to visualize myocardial infarction and assess valvular morphology. In combination with experimental approaches toward the assessment of aortic valve function and myocardial viability, cardiac MSCT holds the potential for a comprehensive examination of the heart using one single examination technique

  15. Twenty Years of Creativity Research in Human-Computer Interaction: Current State and Future Directions

    DEFF Research Database (Denmark)

    Frich Pedersen, Jonas; Biskjaer, Michael Mose; Dalsgaard, Peter

    2018-01-01

    Creativity has been a growing topic within the ACM community since the 1990s. However, no clear overview of this trend has been offered. We present a thorough survey of 998 creativity-related publications in the ACM Digital Library collected using keyword search to determine prevailing approaches, topics, and characteristics of creativity-oriented Human-Computer Interaction (HCI) research. A selected sample based on yearly citations yielded 221 publications, which were analyzed using constant comparison analysis. We found that HCI is almost exclusively responsible for creativity-oriented publications; they focus on collaborative creativity rather than individual creativity; there is a general lack of definition of the term ‘creativity’; empirically based contributions are prevalent; and many publications focus on new tools, often developed by researchers. On this basis, we present three...

  16. The current status of the development of the technology on 3D computer simulation in Japan

    International Nuclear Information System (INIS)

    Kim, Hee Reyoung; Park, Seung Kook; Chung, Un Soo; Jung, Ki Jung

    2002-05-01

    The development background and properties of COSIDA, the 3D computer simulation system for analyzing the dismantling procedures of nuclear facilities in Japan, were reviewed. The functions of its subsystems, namely work-area visualization, kinematics analysis and dismantling scenario analysis, were investigated. In the work-area visualization subsystem, the physical, geometrical and radiological properties are modelled in 2D or 3D. In the kinematics analysis subsystem, a command set of basic work procedures was derived for controlling the motion of the models in a cyber space. The suitability of the command set was assessed by applying COSIDA to program the motions of the remote dismantling tools for dismantling the components of the nuclear facilities in the cyber space

  17. Computationally derived points of fragility of a human cascade are consistent with current therapeutic strategies.

    Directory of Open Access Journals (Sweden)

    Deyan Luan

    2007-07-01

    Full Text Available The role that mechanistic mathematical modeling and systems biology will play in molecular medicine and clinical development remains uncertain. In this study, mathematical modeling and sensitivity analysis were used to explore the working hypothesis that mechanistic models of human cascades, despite model uncertainty, can be computationally screened for points of fragility, and that these sensitive mechanisms could serve as therapeutic targets. We tested our working hypothesis by screening a model of the well-studied coagulation cascade, developed and validated from literature. The predicted sensitive mechanisms were then compared with the treatment literature. The model, composed of 92 proteins and 148 protein-protein interactions, was validated using 21 published datasets generated from two different quiescent in vitro coagulation models. Simulated platelet activation and thrombin generation profiles in the presence and absence of natural anticoagulants were consistent with measured values, with a mean correlation of 0.87 across all trials. Overall state sensitivity coefficients, which measure the robustness or fragility of a given mechanism, were calculated using a Monte Carlo strategy. In the absence of anticoagulants, fluid and surface phase factor X/activated factor X (fX/FXa) activity and thrombin-mediated platelet activation were found to be fragile, while fIX/FIXa and fVIII/FVIIIa activation and activity were robust. Both anti-fX/FXa and direct thrombin inhibitors are important classes of anticoagulants; for example, anti-fX/FXa inhibitors have FDA approval for the prevention of venous thromboembolism following surgical intervention and as an initial treatment for deep venous thrombosis and pulmonary embolism. Both in vitro and in vivo experimental evidence is reviewed supporting the prediction that fIX/FIXa activity is robust. When taken together, these results support our working hypothesis that computationally derived points of
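
    A toy illustration of the Monte Carlo sensitivity screening described above. The 92-protein model is not reproduced; a two-step cascade stands in, but the scheme is the same: sample the rate parameters, perturb each one slightly, and average the normalized change in the output to obtain an overall sensitivity (fragility) ranking.

    ```python
    import numpy as np

    def cascade_output(k, t_end=50.0, dt=0.05):
        """Final product of a toy cascade A -> B -> C with rates k[0], k[1]."""
        a, b, c = 1.0, 0.0, 0.0
        for _ in range(int(t_end / dt)):
            da = -k[0] * a
            db = k[0] * a - k[1] * b
            dc = k[1] * b
            a, b, c = a + dt * da, b + dt * db, c + dt * dc
        return c

    rng = np.random.default_rng(0)
    n_trials, delta = 200, 0.01
    sens = np.zeros(2)
    for _ in range(n_trials):
        k = rng.lognormal(np.log(0.1), 0.5, size=2)   # sampled parameter set
        base = cascade_output(k)
        for j in range(2):
            kp = k.copy()
            kp[j] *= 1.0 + delta                      # 1% perturbation
            # normalized sensitivity coefficient for mechanism j
            sens[j] += abs((cascade_output(kp) - base) / (base * delta))
    print("mean sensitivities:", sens / n_trials)     # larger => more fragile
    ```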

  18. Analysis of problem solving on project based learning with resource based learning approach computer-aided program

    Science.gov (United States)

    Kuncoro, K. S.; Junaedi, I.; Dwijanto

    2018-03-01

    This study aimed to reveal the effectiveness of Project Based Learning with a Resource Based Learning approach in a computer-aided program, and to analyze problem-solving abilities in terms of problem-solving steps based on Polya's stages. The research method was a mixed method with a sequential explanatory design. The subjects were fourth-semester mathematics students. The results showed that the S-TPS (Strong Top Problem Solving) and W-TPS (Weak Top Problem Solving) subjects had good problem-solving abilities on each problem-solving indicator. The problem-solving ability of the S-MPS (Strong Middle Problem Solving) and W-MPS (Weak Middle Problem Solving) subjects on each indicator was also good. The S-BPS (Strong Bottom Problem Solving) subject had difficulty solving the problem with the computer program, was less precise in writing the final conclusion, and could not reflect on the problem-solving process using Polya's steps. The W-BPS (Weak Bottom Problem Solving) subject failed to meet almost all of the problem-solving indicators and could not precisely construct the initial completion table, so the completion phase following Polya's steps was constrained.

  19. Analysis on Current Situation and Countermeasure of Domestic Electronic Commerce Logistics in the Internet Age——Based on Resource Dependence Theory

    Directory of Open Access Journals (Sweden)

    Zhang Jiapeng

    2017-01-01

    Full Text Available This paper analyzes the status of e-commerce logistics in China in the current Internet era, combines SWOT analysis with the Analytic Hierarchy Process (AHP) for the empirical analysis, and then, based on resource dependence theory, puts forward the countermeasure that e-commerce logistics resources should be shared. The empirical analysis finds that, in the Internet era, the disadvantages of and the opportunities for the current logistics situation carry the most weight, and that a resource-sharing strategy grounded in resource dependence theory is the more scientific choice. The rational use of Internet technology in the e-commerce logistics industry can achieve such 'sharing', which is of great significance for the industry's balanced, intelligent and optimized development.
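
    As a concrete illustration of the AHP step mentioned above, the sketch below derives priority weights from a pairwise comparison matrix via the principal eigenvector and checks the consistency ratio. The judgment values in the matrix are invented; the paper's actual comparisons are not reproduced.

    ```python
    import numpy as np

    # Pairwise judgments for four factors (e.g., S, W, O, T), Saaty's 1-9 scale.
    A = np.array([[1.0, 3.0, 5.0, 1.0],
                  [1/3, 1.0, 3.0, 1/3],
                  [1/5, 1/3, 1.0, 1/5],
                  [1.0, 3.0, 5.0, 1.0]])

    eigvals, eigvecs = np.linalg.eig(A)
    i = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, i].real)
    w /= w.sum()                          # priority weights, sum to 1

    n = A.shape[0]
    ci = (eigvals[i].real - n) / (n - 1)  # consistency index
    cr = ci / 0.90                        # random index RI = 0.90 for n = 4
    print("weights:", w.round(3), "CR:", round(cr, 3))  # CR < 0.1 acceptable
    ```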

  20. Using Multiple Seasonal Holt-Winters Exponential Smoothing to Predict Cloud Resource Provisioning

    OpenAIRE

    Ashraf A. Shahin

    2016-01-01

    Elasticity is one of the key features of cloud computing that attracts many SaaS providers seeking to minimize their services' cost. Cost is minimized by automatically provisioning and releasing computational resources depending on actual computational needs. However, delays in starting up new virtual resources can cause Service Level Agreement violations. Consequently, predicting cloud resource provisioning has gained a lot of attention as a way to scale computational resources in advance. However, most of current approac...
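
    A sketch of the forecasting step under discussion. statsmodels fits a single seasonal period, so this stands in for the paper's multiple-seasonal variant, which layers several periods; the synthetic hourly workload and the 50-requests-per-second VM capacity are invented for illustration.

    ```python
    import numpy as np
    from statsmodels.tsa.holtwinters import ExponentialSmoothing

    rng = np.random.default_rng(1)
    hours = np.arange(24 * 21, dtype=float)       # three weeks of hourly samples
    load = 100 + 30 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 3, hours.size)

    # Additive trend + additive daily seasonality (period = 24 hours).
    model = ExponentialSmoothing(load, trend="add", seasonal="add",
                                 seasonal_periods=24).fit()
    forecast = model.forecast(24)                 # next day's hourly demand

    # Provision ahead of time for the predicted peak plus 20% headroom,
    # so that VM start-up delay does not translate into SLA violations.
    vms_needed = int(np.ceil(forecast.max() * 1.2 / 50.0))
    print(vms_needed)
    ```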

  1. Current trends in hardware and software for brain-computer interfaces (BCIs).

    Science.gov (United States)

    Brunner, P; Bianchi, L; Guger, C; Cincotti, F; Schalk, G

    2011-04-01

    A brain-computer interface (BCI) provides a non-muscular communication channel to people with and without disabilities. BCI devices consist of hardware and software. BCI hardware records signals from the brain, either invasively or non-invasively, using a series of device components. BCI software then translates these signals into device output commands and provides feedback. One may categorize different types of BCI applications into the following four categories: basic research, clinical/translational research, consumer products, and emerging applications. These four categories use BCI hardware and software, but have different sets of requirements. For example, while basic research needs to explore a wide range of system configurations, and thus requires a wide range of hardware and software capabilities, applications in the other three categories may be designed for relatively narrow purposes and thus may only need a very limited subset of capabilities. This paper summarizes technical aspects for each of these four categories of BCI applications. The results indicate that BCI technology is in transition from isolated demonstrations to systematic research and commercial development. This process requires several multidisciplinary efforts, including the development of better integrated and more robust BCI hardware and software, the definition of standardized interfaces, and the development of certification, dissemination and reimbursement procedures.

  2. Addressing current challenges in cancer immunotherapy with mathematical and computational modelling.

    Science.gov (United States)

    Konstorum, Anna; Vella, Anthony T; Adler, Adam J; Laubenbacher, Reinhard C

    2017-06-01

    The goal of cancer immunotherapy is to boost a patient's immune response to a tumour. Yet, the design of an effective immunotherapy is complicated by various factors, including a potentially immunosuppressive tumour microenvironment, immune-modulating effects of conventional treatments and therapy-related toxicities. These complexities can be incorporated into mathematical and computational models of cancer immunotherapy that can then be used to aid in rational therapy design. In this review, we survey modelling approaches under the umbrella of the major challenges facing immunotherapy development, which encompass tumour classification, optimal treatment scheduling and combination therapy design. Although overlapping, each challenge has presented unique opportunities for modellers to make contributions using analytical and numerical analysis of model outcomes, as well as optimization algorithms. We discuss several examples of models that have grown in complexity as more biological information has become available, showcasing how model development is a dynamic process interlinked with the rapid advances in tumour-immune biology. We conclude the review with recommendations for modellers both with respect to methodology and biological direction that might help keep modellers at the forefront of cancer immunotherapy development. © 2017 The Author(s).

  4. Computation of antenna pattern correlation and MIMO performance by means of surface current distribution and spherical wave theory

    Directory of Open Access Journals (Sweden)

    O. Klemp

    2006-01-01

    Full Text Available In order to satisfy the stringent demand for an accurate prediction of MIMO channel capacity and diversity performance in wireless communications, more effective and suitable models that account for real antenna radiation behavior have to be employed. One of the main challenges is the accurate modeling of antenna correlation, which is directly related to the amount of channel capacity or diversity gain that might be achieved in multi-element antenna configurations. Spherical wave theory in electromagnetics is a well-known technique for expressing antenna far fields by means of a compact field expansion with a reduced number of unknowns, and it was recently applied to derive an analytical approach to the computation of antenna pattern correlation. In this paper we present a novel and efficient computational technique to determine antenna pattern correlation based on the evaluation of the surface current distribution by means of a spherical mode expansion.
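
    For orientation, a brute-force counterpart of the quantity being computed: the envelope correlation of two far-field patterns, evaluated directly as a surface integral over sampled complex field components (assuming an isotropic incident field). The paper's spherical-mode expansion reduces this integral to a short sum over mode coefficients; the patterns below are synthetic.

    ```python
    import numpy as np

    def envelope_correlation(e1t, e1p, e2t, e2p, theta):
        """Envelope correlation of two sampled patterns on a (theta, phi) grid."""
        w = np.sin(theta)[:, None]                 # spherical surface element
        num = np.sum((e1t * e2t.conj() + e1p * e2p.conj()) * w)
        d1 = np.sum((np.abs(e1t) ** 2 + np.abs(e1p) ** 2) * w)
        d2 = np.sum((np.abs(e2t) ** 2 + np.abs(e2p) ** 2) * w)
        return np.abs(num) ** 2 / (d1 * d2)

    theta = np.linspace(1e-3, np.pi, 90)
    phi = np.linspace(0.0, 2 * np.pi, 180)
    T, P = np.meshgrid(theta, phi, indexing="ij")
    e1t, e1p = np.sin(T), np.zeros_like(T, dtype=complex)   # dipole-like pattern
    e2t, e2p = np.sin(T) * np.exp(1j * P), np.zeros_like(T, dtype=complex)
    print(envelope_correlation(e1t, e1p, e2t, e2p, theta))  # near zero here
    ```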

  5. Evaluation of Current Computer Models Applied in the DOE Complex for SAR Analysis of Radiological Dispersion & Consequences

    Energy Technology Data Exchange (ETDEWEB)

    O' Kula, K. R. [Savannah River Site (SRS), Aiken, SC (United States); East, J. M. [Savannah River Site (SRS), Aiken, SC (United States); Weber, A. H. [Savannah River Site (SRS), Aiken, SC (United States); Savino, A. V. [Savannah River Site (SRS), Aiken, SC (United States); Mazzola, C. A. [Savannah River Site (SRS), Aiken, SC (United States)

    2003-01-01

    The evaluation of atmospheric dispersion/radiological dose analysis codes included fifteen models identified in authorization basis safety analyses at DOE facilities, or from regulatory and research agencies where past or current work warranted inclusion of a computer model. All computer codes examined were reviewed using general and specific evaluation criteria developed by the Working Group. The criteria were based on DOE Orders and other regulatory standards and guidance for performing bounding and conservative dose calculations. Three categories of criteria were included: (1) Software Quality/User Interface; (2) Technical Model Adequacy; and (3) Application/Source Term Environment. A consensus-based, limited quantitative ranking process was used to establish an order of model preference, both as an overall conclusion and under specific conditions.

  6. A GIS-based Computational Tool for Multidimensional Flow Velocity by Acoustic Doppler Current Profilers

    International Nuclear Information System (INIS)

    Kim, D; Winkler, M; Muste, M

    2015-01-01

    Acoustic Doppler Current Profilers (ADCPs) provide efficient and reliable flow measurements compared to other tools for characterizing riverine environments. In addition to the originally targeted discharge measurements, ADCPs are increasingly utilized to assess river flow characteristics. The newly developed VMS (Velocity Mapping Software) aims at providing an efficient process for quality assurance, mapping velocity vectors for visualization, and facilitating comparison with physical and numerical model results. VMS was designed to provide efficient and smooth workflows for processing groups of transects. The software allows the user to select a group of files and subsequently to conduct statistical and graphical quality assurance on the files as a group or individually, as appropriate. VMS also enables spatial averaging in the horizontal and vertical planes for ADCP data in single or multiple transects over the same or consecutive cross sections. The analysis results are displayed in numerical and graphical formats. (paper)

  7. Water Resources Status and Availability Assessment in Current and Future Climate Change Scenarios for Beas River Basin of North Western Himalaya

    Science.gov (United States)

    Aggarwal, S. P.; Thakur, P. K.; Garg, V.; Nikam, B. R.; Chouksey, A.; Dhote, P.; Bhattacharya, T.

    2016-10-01

    The water resources status and availability of a river basin is of primary importance for its overall and sustainable development. This study was carried out in the Beas river basin, located in the North Western Himalaya, to assess the status of water resources under present and future climate change scenarios. A hydrological modelling approach was used to quantify the water balance components of the Beas river basin up to Pandoh. The variable infiltration capacity (VIC) model was used in energy balance mode at a 1 km grid scale. The VIC model was run with snow elevation zone files to simulate its snow module. The model was driven with National Centre for Environmental Prediction (NCEP) forcing data (Tmax, Tmin, rainfall and wind speed at 0.5 degree resolution) from 1 Jan 1999 to 31 Dec 2006 for calibration purposes. The additional component of glacier melt was added to the overall river runoff using a semi-empirical approach utilizing air temperature and glacier type and extent data. The groundwater component was computed from the overall recharge of groundwater by a water balance approach. The overall water balance approach was validated with river discharge data provided by the Bhakra Beas Management Board (BBMB) from 1994-2014. The VIC routing module was used to assess pixel-wise flow availability at daily, monthly and annual time scales. The mean monthly flow at Pandoh during the study period varied from 19 to 1581 m3/s from VIC and from 50 to 1556 m3/s in the observation data, with minimum flow occurring in January and maximum flow in August, with an annual R2 of 0.68. The future climate change data were taken from the CORDEX database. The NOAA-GFDL-ESM2M climate model for IPCC RCP scenarios 4.5 and 8.5 was used for South Asia on a 0.44 deg. grid from 2006 to 2100. The climate forcing data for the VIC model were prepared using daily maximum and minimum near-surface air temperature, daily precipitation and

  8. WATER RESOURCES STATUS AND AVAILABILITY ASSESSMENT IN CURRENT AND FUTURE CLIMATE CHANGE SCENARIOS FOR BEAS RIVER BASIN OF NORTH WESTERN HIMALAYA

    Directory of Open Access Journals (Sweden)

    S. P. Aggarwal

    2016-10-01

    Full Text Available The water resources status and availability of a river basin is of primary importance for its overall and sustainable development. This study has been done in the Beas river basin, located in the North Western Himalaya, for assessing the status of water resources in present and future climate change scenarios. In this study a hydrological modelling approach has been used for quantifying the water balance components of the Beas river basin up to Pandoh. The variable infiltration capacity (VIC) model has been used in energy balance mode for the Beas river basin at a 1 km grid scale. The VIC model has been run with snow elevation zone files to simulate the snow module of VIC. The model was run with National Centre for Environmental Prediction (NCEP) forcing data (Tmax, Tmin, rainfall and wind speed at 0.5 degree resolution) from 1 Jan 1999 to 31 Dec 2006 for calibration purposes. The additional component of glacier melt was added into the overall river runoff using a semi-empirical approach utilizing air temperature and glacier type and extent data. The groundwater component is computed from the overall recharge of groundwater by a water balance approach. The overall water balance approach is validated with river discharge data provided by the Bhakra Beas Management Board (BBMB) from 1994-2014. The VIC routing module was used to assess pixel-wise flow availability at daily, monthly and annual time scales. The mean monthly flow at Pandoh during the study period varied from 19 to 1581 m3/s from VIC and from 50 to 1556 m3/s from observation data, with minimum water flow occurring in January and maximum flow in August, with an annual R2 of 0.68. The future climate change data is taken from the CORDEX database. The climate model NOAA-GFDL-ESM2M for IPCC RCP scenarios 4.5 and 8.5 was used for South Asia at a 0.44 deg. grid from year 2006 to 2100. The climate forcing data for the VIC model was prepared using daily maximum and minimum near-surface air temperature, daily

  9. Application of the Right to Permanent Sovereignty over Natural Resources for Indigenous Peoples: Assessment of Current Legal Developments

    Directory of Open Access Journals (Sweden)

    Endalew Lijalem Enyew

    2017-11-01

    Full Text Available The right to Permanent Sovereignty over Natural Resources (PSNR emerged in the era of decolonization. As a reaction to the irresponsible exploitation of natural resources by colonial powers, peoples under colonial rule and newly independent developing states asserted the right to control and dispose of their own natural resources. The UN General Assembly recognized and reinforced these claims by adopting a series of resolutions relating to the right to PSNR so as to facilitate the process of decolonization. However, the subjects of the right to PSNR have expanded to include ‘all peoples’ due to legal developments in international law pertaining to the right to self-determination of peoples and other human rights standards. This article explores the contemporary application of the right to PSNR for indigenous peoples, by virtue of their being ‘peoples’, tracing various developments in international law relating to indigenous peoples since the inception of PSNR in the 1950s.

  10. Precision of dosimetry-related measurements obtained on current multidetector computed tomography scanners

    International Nuclear Information System (INIS)

    Mathieu, Kelsey B.; McNitt-Gray, Michael F.; Zhang, Di; Kim, Hyun J.; Cody, Dianna D.

    2010-01-01

    Purpose: Computed tomography (CT) intrascanner and interscanner variability has not been well characterized. Thus, the purpose of this study was to examine the within-run, between-run, and between-scanner precision of physical dosimetry-related measurements collected over the course of 1 yr on three different makes and models of multidetector row CT (MDCT) scanners. Methods: Physical measurements were collected using nine CT scanners (three scanners each of GE VCT, GE LightSpeed 16, and Siemens Sensation 64 CT). Measurements were made using various combinations of technical factors, including kVp, type of bowtie filter, and x-ray beam collimation, for several dosimetry-related quantities, including (a) free-in-air CT dose index (CTDI100,air); (b) calculated half-value layers and quarter-value layers; and (c) weighted CT dose index (CTDIw) calculated from exposure measurements collected in both a 16 and 32 cm diameter CTDI phantom. Data collection was repeated at several different time intervals, ranging from seconds (for CTDI100,air values) to weekly for 3 weeks and then quarterly or triannually for 1 yr. Precision of the data was quantified by the percent coefficient of variation (%CV). Results: The maximum relative precision error (maximum %CV value) across all dosimetry metrics, time periods, and scanners included in this study was 4.33%. The median observed %CV values for CTDI100,air ranged from 0.05% to 0.19% over several seconds, 0.12%-0.52% over 1 week, and 0.58%-2.31% over 3-4 months. For CTDIw for a 16 and 32 cm CTDI phantom, respectively, the range of median %CVs was 0.38%-1.14% and 0.62%-1.23% in data gathered weekly for 3 weeks and 1.32%-2.79% and 0.84%-2.47% in data gathered quarterly or triannually for 1 yr. Conclusions: From a dosimetry perspective, the MDCT scanners tested in this study demonstrated a high degree of within-run, between-run, and between-scanner precision (with relative precision errors typically well under 5%).
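
    For concreteness, the precision metric used throughout the study, the percent coefficient of variation, is simply %CV = 100 x (sample standard deviation) / mean. The repeated CTDI readings below are made-up numbers:

    ```python
    import numpy as np

    def percent_cv(x):
        """Percent coefficient of variation of repeated measurements."""
        x = np.asarray(x, dtype=float)
        return 100.0 * x.std(ddof=1) / x.mean()     # ddof=1: sample SD

    ctdi_air = [21.38, 21.41, 21.37, 21.44, 21.40]  # mGy, hypothetical repeats
    print(f"%CV = {percent_cv(ctdi_air):.2f}%")     # small => high precision
    ```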

  11. Educational Experiences in Oceanography through Hands-On Involvement with Surface Drifters: an Introduction to Ocean Currents, Engineering, Data Collection, and Computer Science

    Science.gov (United States)

    Anderson, T.

    2015-12-01

    The Northeast Fisheries Science Center's (NEFSC) Student Drifters Program is providing education opportunities for students of all ages. Using GPS-tracked ocean drifters, various educational institutions can provide students with hands-on experience in physical oceanography, engineering, and computer science. In building drifters many high school and undergraduate students may focus on drifter construction, sometimes designing their own drifter or attempting to improve current NEFSC models. While learning basic oceanography younger students can build drifters with the help of an educator and directions available on the studentdrifters.org website. Once drifters are deployed, often by a local mariner or oceanographic partner, drifter tracks can be visualised on maps provided at http://nefsc.noaa.gov/drifter. With the lesson plans available for those interested in computer science, students may download, process, and plot the drifter position data with basic Python code provided. Drifter tracks help students to visualize ocean currents, and also allow them to understand real particle tracking applications such as in search and rescue, oil spill dispersion, larval transport, and the movement of injured sea animals. Additionally, ocean circulation modelers can use student drifter paths to validate their models. The Student Drifters Program has worked with over 100 schools, several of them having deployed drifters on the West Coast. Funding for the program often comes from individual schools and small grants but in the future will preferably come from larger government grants. NSF, Sea-Grant, NOAA, and EPA are all possible sources of funding, especially with the support of multiple schools and large marine education associations. The Student Drifters Program is a unique resource for educators, students, and scientists alike.
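
    The lesson-plan code itself lives with the program's materials; as a stand-in, the sketch below shows the kind of basic Python processing described, assuming a hypothetical CSV of GPS fixes with columns id, time, lat and lon (not the actual NEFSC data format).

    ```python
    import pandas as pd
    import matplotlib.pyplot as plt

    # Load drifter GPS fixes; file name and column layout are hypothetical.
    tracks = pd.read_csv("drifter_positions.csv", parse_dates=["time"])

    fig, ax = plt.subplots(figsize=(7, 6))
    for drifter_id, grp in tracks.sort_values("time").groupby("id"):
        ax.plot(grp["lon"], grp["lat"], marker=".", ms=2, label=str(drifter_id))
    ax.set_xlabel("Longitude")
    ax.set_ylabel("Latitude")
    ax.set_title("Student drifter tracks")
    ax.legend(title="Drifter ID", fontsize=8)
    plt.show()
    ```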

  12. Computability theory

    CERN Document Server

    Weber, Rebecca

    2012-01-01

    What can we compute--even with unlimited resources? Is everything within reach? Or are computations necessarily drastically limited, not just in practice, but theoretically? These questions are at the heart of computability theory. The goal of this book is to give the reader a firm grounding in the fundamentals of computability theory and an overview of currently active areas of research, such as reverse mathematics and algorithmic randomness. Turing machines and partial recursive functions are explored in detail, and vital tools and concepts including coding, uniformity, and diagonalization are described explicitly. From there the material continues with universal machines, the halting problem, parametrization and the recursion theorem, and thence to computability for sets, enumerability, and Turing reduction and degrees. A few more advanced topics round out the book before the chapter on areas of research. The text is designed to be self-contained, with an entire chapter of preliminary material including re...

  13. Cloud Computing Security: A Survey

    OpenAIRE

    Khalil, Issa; Khreishah, Abdallah; Azeem, Muhammad

    2014-01-01

    Cloud computing is an emerging technology paradigm that migrates current technological and computing concepts into utility-like solutions similar to electricity and water systems. Clouds bring out a wide range of benefits including configurable computing resources, economic savings, and service flexibility. However, security and privacy concerns are shown to be the primary obstacles to a wide adoption of clouds. The new concepts that clouds introduce, such as multi-tenancy, resource sharing a...

  14. Economics and resources analysis of the potential use of reprocessing options by the current Spanish nuclear reactor park

    Energy Technology Data Exchange (ETDEWEB)

    Alvarez-Velarde, F.; Merino Rodriguez, I.; Gonzalez-Romero, E.

    2014-07-01

    Reprocessing of irradiated nuclear fuel serves multiple purposes, from Pu separation and recovery for MOX fuel fabrication to reduction of high level waste volume, and is nowadays being implemented in several countries like France, Japan, Russia or United Kingdom. This work is aimed at exploring the possibility (in resources and economic terms) of implementing reprocessing for MOX fabrication in Spain. (Author)

  15. Current challenges in the management of sepsis in ICUs in resource-poor settings and suggestions for the future.

    Science.gov (United States)

    Schultz, Marcus J; Dunser, Martin W; Dondorp, Arjen M; Adhikari, Neill K J; Iyer, Shivakumar; Kwizera, Arthur; Lubell, Yoel; Papali, Alfred; Pisani, Luigi; Riviello, Beth D; Angus, Derek C; Azevedo, Luciano C; Baker, Tim; Diaz, Janet V; Festic, Emir; Haniffa, Rashan; Jawa, Randeep; Jacob, Shevin T; Kissoon, Niranjan; Lodha, Rakesh; Martin-Loeches, Ignacio; Lundeg, Ganbold; Misango, David; Mer, Mervyn; Mohanty, Sanjib; Murthy, Srinivas; Musa, Ndidiamaka; Nakibuuka, Jane; Serpa Neto, Ary; Nguyen Thi Hoang, Mai; Nguyen Thien, Binh; Pattnaik, Rajyabardhan; Phua, Jason; Preller, Jacobus; Povoa, Pedro; Ranjit, Suchitra; Talmor, Daniel; Thevanayagam, Jonarthan; Thwaites, C Louise

    2017-05-01

    Sepsis is a major reason for intensive care unit (ICU) admission, also in resource-poor settings. ICUs in low- and middle-income countries (LMICs) face many challenges that could affect patient outcome. To describe differences between resource-poor and resource-rich settings regarding the epidemiology, pathophysiology, economics and research aspects of sepsis. We restricted this manuscript to the ICU setting even knowing that many sepsis patients in LMICs are treated outside an ICU. Although many bacterial pathogens causing sepsis in LMICs are similar to those in high-income countries, resistance patterns to antimicrobial drugs can be very different; in addition, causes of sepsis in LMICs often include tropical diseases in which direct damaging effects of pathogens and their products can sometimes be more important than the response of the host. There are substantial and persisting differences in ICU capacities around the world; not surprisingly the lowest capacities are found in LMICs, but with important heterogeneity within individual LMICs. Although many aspects of sepsis management developed in rich countries are applicable in LMICs, implementation requires strong consideration of cost implications and the important differences in resources. Addressing both disease-specific and setting-specific factors is important to improve performance of ICUs in LMICs. Although critical care for severe sepsis is likely cost-effective in the LMIC setting, more detailed evaluation at both the macro- and micro-economic level is necessary. Sepsis management in resource-limited settings is a largely unexplored frontier with important opportunities for research, training, and other initiatives for improvement.

  16. Algorithms for Computing the Magnetic Field, Vector Potential, and Field Derivatives for a Thin Solenoid with Uniform Current Density

    Energy Technology Data Exchange (ETDEWEB)

    Walstrom, Peter Lowell [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-07

    A numerical algorithm for computing the field components Br and Bz and their r and z derivatives with open boundaries in cylindrical coordinates for radially thin solenoids with uniform current density is described in this note. An algorithm for computing the vector potential Aθ is also described. For the convenience of the reader, derivations of the final expressions from their defining integrals are given in detail, since their derivations are not all easily found in textbooks. Numerical calculations are based on evaluation of complete elliptic integrals using the Bulirsch algorithm cel. The (apparently) new feature of the algorithms described in this note applies to cases where the field point is outside of the bore of the solenoid and the field-point radius approaches the solenoid radius. Since the elliptic integrals of the third kind normally used in computing Bz and Aθ become infinite in this region of parameter space, fields for points with the axial coordinate z outside of the ends of the solenoid and near the solenoid radius are treated by use of elliptic integrals of the third kind of modified argument, derived by use of an addition theorem. The algorithms also avoid the numerical difficulties the textbook solutions have for points near the axis, arising from explicit factors of 1/r or 1/r^2 in some of the expressions.
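
    The note's general off-axis algorithm requires the elliptic-integral machinery; on the axis, however, the thin-solenoid field reduces to a closed form that is a convenient sanity check for any implementation. A sketch with illustrative dimensions:

    ```python
    import numpy as np

    MU0 = 4e-7 * np.pi  # vacuum permeability, T*m/A

    def bz_on_axis(z, radius, length, n_turns, current):
        """Axial field B_z(z) of a radially thin solenoid centered at z = 0,
        carrying uniform sheet current density K = N*I/L."""
        K = n_turns * current / length
        zp, zm = z + length / 2.0, z - length / 2.0
        return 0.5 * MU0 * K * (zp / np.hypot(radius, zp) - zm / np.hypot(radius, zm))

    # Center field of a long solenoid should approach mu0*n*I:
    print(bz_on_axis(0.0, radius=0.05, length=2.0, n_turns=2000, current=10.0))
    print(MU0 * (2000 / 2.0) * 10.0)  # long-solenoid limit, for comparison
    ```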

  17. ERATO - a computer program for the calculation of induced eddy-currents in three-dimensional conductive structures

    International Nuclear Information System (INIS)

    Benner, J.

    1985-10-01

    The computer code ERATO is used for the calculation of eddy currents in three-dimensional conductive structures and of their secondary magnetic field. ERATO is a revised version of the code FEDIFF, developed at IPP Garching. The calculation uses the Finite-Element-Network (FEN) method, in which the structure is represented by an equivalent electric network. In the ERATO code, the finite-element discretization, the eddy-current analysis, and the final evaluation of the results are done in separate programs, so that the eddy-current analysis, as the central step, is completely independent of any special geometry. For the finite-element discretization there are two so-called preprocessors, which treat a torus segment and a rectangular flat plate. For the final evaluation, postprocessors are used, by which the current distributions can be printed and plotted. In the report, the theoretical foundation of the FEN method is discussed, the structure and the application of the programs (preprocessors, analysis program, postprocessors, supporting programs) are shown, and two examples of calculations are presented. (orig.) [de

  18. Computer-aided design of multifrequency eddy-current tests for layered conductors with multiple property variations

    Energy Technology Data Exchange (ETDEWEB)

    Deeds, W E; Dodd, C V; Scott, G W

    1979-10-01

    Our program is part of a larger project designed to develop multifrequency eddy-current inspection techniques for multilayered conductors with parallel planar boundaries. To reduce the need to specially program each new problem, a family of programs that handles a large class of related problems with only minor editorial and interactive changes was developed. Programs for two types of cylindrical coil probes were developed: the reflection probe, which contains the driver and pickup coils and is used from one side of the specimen, and the through-transmission probe set, which places the driver and pickup coils on opposite sides of the conductor stack. The programs perform the following basic functions: (1) simulation of an ideal instrument's response to specific conductor and defect configurations, (2) control of an eddy-current instrument interfaced to a minicomputer to acquire and record actual instrument responses to test specimens, (3) construction of complex function expansions to relate instrument response to conductor and defect properties by using measured or computed responses and properties, and (4) simulation of a microcomputer on board the instrument by the interfaced minicomputer to test the analytical programming for the microcomputer. The report contains the basic equations for the computations, the main and subroutine programs, instructions for editorial changes and program execution, analyses of the main programs, file requirements, and other miscellaneous aids for the user.

  19. Multimedia messages in genetics: design, development, and evaluation of a computer-based instructional resource for secondary school students in a Tay Sachs disease carrier screening program.

    Science.gov (United States)

    Gason, Alexandra A; Aitken, MaryAnne; Delatycki, Martin B; Sheffield, Edith; Metcalfe, Sylvia A

    2004-01-01

    Tay Sachs disease is a recessively inherited neurodegenerative disorder, for which carrier screening programs exist worldwide. Education for those offered a screening test is essential in facilitating informed decision-making. In Melbourne, Australia, we have designed, developed, and evaluated a computer-based instructional resource for use in the Tay Sachs disease carrier screening program for secondary school students attending Jewish schools. The resource entitled "Genetics in the Community: Tay Sachs disease" was designed on a platform of educational learning theory. The development of the resource included formative evaluation using qualitative data analysis supported by descriptive quantitative data. The final resource was evaluated within the screening program and compared with the standard oral presentation using a questionnaire. Knowledge outcomes were measured both before and after either of the educational formats. Data from the formative evaluation were used to refine the content and functionality of the final resource. The questionnaire evaluation of 302 students over two years showed the multimedia resource to be equally effective as an oral educational presentation in facilitating participants' knowledge construction. The resource offers a large number of potential benefits, which are not limited to the Tay Sachs disease carrier screening program setting, such as delivery of a consistent educational message, short delivery time, and minimum financial and resource commitment. This article outlines the value of considering educational theory and describes the process of multimedia development providing a framework that may be of value when designing genetics multimedia resources in general.

  20. Computational Multiqubit Tunnelling in Programmable Quantum Annealers

    Science.gov (United States)

    2016-08-25

    Quantum tunnelling is a phenomenon in which a quantum state traverses energy barriers higher than the energy of the state itself. Quantum tunnelling has been hypothesized as an advantageous physical resource for optimization in quantum annealing. However, computational multiqubit tunnelling has not yet been observed. Here we show that multiqubit tunnelling plays a computational role in a currently available programmable quantum annealer. We devise a probe for tunnelling, a computational

  1. Flaws in current human training protocols for spontaneous Brain-Computer Interfaces: lessons learned from instructional design

    Directory of Open Access Journals (Sweden)

    Fabien eLotte

    2013-09-01

    Full Text Available While recent research on Brain-Computer Interfaces (BCI) has highlighted their potential for many applications, they remain barely used outside laboratories. The main reason is their lack of robustness. Indeed, with current BCI, mental state recognition is usually slow and often incorrect. Spontaneous BCI (i.e., mental imagery-based BCI) often rely on mutual learning efforts by the user and the machine, with BCI users learning to produce stable EEG patterns (spontaneous BCI control being widely acknowledged as a skill) while the computer learns to automatically recognize these EEG patterns, using signal processing. Most research so far has focused on signal processing, mostly neglecting the human in the loop. However, how well the user masters the BCI skill is also a key element explaining BCI robustness. Indeed, if the user is not able to produce stable and distinct EEG patterns, then no signal processing algorithm would be able to recognize them. Unfortunately, despite the importance of BCI training protocols, they have been scarcely studied so far, and used mostly unchanged for years. In this paper, we advocate that current human training approaches for spontaneous BCI are most likely inappropriate. We notably study the instructional design literature in order to identify the key requirements and guidelines for a successful training procedure that promotes good and efficient skill learning. This literature study highlights that current spontaneous BCI user training procedures satisfy very few of these requirements and hence are likely to be suboptimal. We therefore identify the flaws in BCI training protocols according to instructional design principles, at several levels: in the instructions provided to the user, in the tasks he/she has to perform, and in the feedback provided. For each level, we propose new research directions that are theoretically expected to address some of these flaws and to help users learn the BCI skill more efficiently.

  2. Bone density loss on computed tomography at 3-year follow-up in current compared to former male smokers

    Energy Technology Data Exchange (ETDEWEB)

    Pompe, E., E-mail: e.pompe@umcutrecht.nl [Department of Pulmonology, University Medical Center Utrecht, Utrecht (Netherlands); Bartstra, J. [Department of Radiology, University Medical Center Utrecht, Utrecht (Netherlands); Verhaar, H.J. [Department of Geriatric Medicine, University Medical Center Utrecht, Utrecht (Netherlands); Koning, H.J. de; Aalst, C.M. van der [Department of Public Health, Erasmus MC − University Medical Center Rotterdam, Rotterdam (Netherlands); Oudkerk, M. [University of Groningen, University Medical Center Groningen, Groningen, Department of Radiology (Netherlands); Vliegenthart, R. [University of Groningen, University Medical Center Groningen, Groningen, Department of Radiology (Netherlands); University of Groningen, University Medical Center Groningen, Center for Medical Imaging-North East Netherlands, Groningen (Netherlands); Lammers, J.-W.J. [Department of Pulmonology, University Medical Center Utrecht, Utrecht (Netherlands); Jong, P.A. de; Mohamed Hoesein, F.A.A. [Department of Radiology, University Medical Center Utrecht, Utrecht (Netherlands)

    2017-04-15

    Objectives: Cigarette smoking negatively affects bone quality and increases fracture risk. Little is known about the effect of smoking cessation on computed tomography (CT)-derived bone mineral density (BMD) decline in the spine. We evaluated the association of current and former smoking with BMD decline after 3-year follow-up. Methods: Male current and former smokers participating in a lung cancer screening trial who underwent baseline and 3-year follow-up CT were included. BMD was measured by manual placement of a region of interest in the first lumbar vertebra and expressed in Hounsfield Unit (HU). Multiple linear regression analysis was used to evaluate the association between pack years smoked and smoking status with BMD decline. Results: 408 participants were included with median (25th–75th percentile) age of 59.4 (55.9–63.5) years. At the start of the study, 197 (48.3%) participants were current smokers and 211 (51.7%) were former smokers, with a similar number of pack years in both groups. Former smokers had quit smoking 6 (4–8) years prior to inclusion. There was no difference in BMD between current and former smokers at baseline (109 ± 34 HU vs. 108 ± 32 HU, p = 0.96). At 3-year follow-up, current smokers had a mean BMD decline of −3 ± 13 HU (p = 0.001), while BMD in former smokers did not change as compared to baseline (1 ± 13 HU, p = 0.34). After adjustment for BMD at baseline and body mass index, current smoking was independently associated with BMD decline (−3.8 HU, p = 0.003). Age, pack years, and the presence of a fracture at baseline did not associate with BMD decline. Conclusions: Current smokers showed a more rapid BMD decline over a 3-year period compared to former smokers. This information might be important to identify subjects at risk for osteoporosis and emphasizes the importance of smoking cessation in light of BMD decline.

  4. From geospatial observations of ocean currents to causal predictors of spatio-economic activity using computer vision and machine learning

    Science.gov (United States)

    Popescu, Florin; Ayache, Stephane; Escalera, Sergio; Baró Solé, Xavier; Capponi, Cecile; Panciatici, Patrick; Guyon, Isabelle

    2016-04-01

    The big data transformation currently revolutionizing science and industry forges novel possibilities in multi-modal analysis scarcely imaginable only a decade ago. One of the important economic and industrial problems that stand to benefit from the recent expansion of data availability and computational prowess is the prediction of electricity demand and renewable energy generation. Both are correlates of human activity: spatiotemporal energy consumption patterns in society are a factor of both demand (weather dependent) and supply, which determine cost - a relation expected to strengthen along with increasing renewable energy dependence. One of the main drivers of European weather patterns is the activity of the Atlantic Ocean and in particular its dominant Northern Hemisphere current: the Gulf Stream. We choose this particular current as a test case in part due to the larger amount of relevant data and scientific literature available for refinement of analysis techniques. This data richness is due not only to its economic importance but also to its size being clearly visible in radar and infrared satellite imagery, which makes it easier to detect using Computer Vision (CV). The power of CV techniques makes basic analysis thus developed scalable to other smaller and less known, but still influential, currents, which are not just curves on a map, but complex, evolving, moving branching trees in 3D projected onto a 2D image. We investigate means of extracting, from several image modalities (including recently available Copernicus radar and earlier infrared satellites), a parameterized representation of the state of the Gulf Stream and its environment that is useful as a feature space representation in a machine learning context, in this case with the EC's H2020-sponsored 'See.4C' project, in the context of which data scientists may find novel predictors of spatiotemporal energy flow. Although automated extractors of Gulf Stream position exist, they differ in methodology

  5. Current Situation, Some Cases and Implications of the Legislation on Access and Benefit-sharing to Biological Genetic Resources in Australia

    Directory of Open Access Journals (Sweden)

    LI Yi-ding

    2017-01-01

    Full Text Available Australia, located in Oceania, is one of the most biodiverse countries in the world and is a signatory to the Convention on Biological Diversity, the International Treaty on Plant Genetic Resources for Food and Agriculture, and the Convention on International Trade in Endangered Species. The country enacted the Environmental Protection and Biodiversity Conservation Act (EPBC, 1999) and the Environmental Protection and Biodiversity Conservation Regulations (2002), while Queensland and the Northern Territory passed the Biodiscovery Act in 2004 and the Biological Resources Act in 2006, respectively. This paper first describes the current situation and characteristics of the Commonwealth and local legislation on access and benefit-sharing of biological genetic resources in Australia, and then collects and analyzes typical access and benefit-sharing cases from this country that could offer experience to China in this field. The paper concludes that China should enact specific legislation on access and benefit-sharing of biological genetic resources, on the model of the EPBC Act 1999, and establish procedural rules for access and benefit-sharing on the model of the EPBC Regulations 2002, the Queensland Biodiscovery Act 2004 and the Northern Territory Biological Resources Act 2006.

  6. Impact of the displacement current on low-frequency electromagnetic fields computed using high-resolution anatomy models

    International Nuclear Information System (INIS)

    Barchanski, A; Gersem, H de; Gjonaj, E; Weiland, T

    2005-01-01

    We present a comparison of simulated low-frequency electromagnetic fields in the human body, calculated by means of the electro-quasistatic formulation. The geometrical data in these simulations were provided by an anatomically realistic, high-resolution human body model, while the dielectric properties of the various body tissues were modelled by the parametric Cole-Cole equation. The model was examined under two different excitation sources and various spatial resolutions in a frequency range from 10 Hz to 1 MHz. An analysis of the differences in the computed fields resulting from a neglect of the permittivity was carried out. On this basis, an estimation of the impact of the displacement current on the simulated low-frequency electromagnetic fields in the human body is obtained. (note)
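
    For reference, the parametric Cole-Cole form used for the tissue spectra expresses the complex relative permittivity as eps(w) = eps_inf + sum_n d_eps_n / (1 + (j*w*tau_n)^(1-alpha_n)) + sigma_i/(j*w*eps0). A single-dispersion sketch with placeholder parameters (not a fitted tissue dataset):

    ```python
    import numpy as np

    EPS0 = 8.854e-12  # vacuum permittivity, F/m

    def cole_cole(freq_hz, eps_inf, d_eps, tau, alpha, sigma_i):
        """Complex relative permittivity: one Cole-Cole dispersion plus
        a static ionic conductivity term."""
        jw = 1j * 2 * np.pi * np.asarray(freq_hz, dtype=float)
        return eps_inf + d_eps / (1 + (jw * tau) ** (1 - alpha)) + sigma_i / (jw * EPS0)

    f = np.logspace(1, 6, 6)   # 10 Hz .. 1 MHz, the range studied above
    eps = cole_cole(f, eps_inf=4.0e3, d_eps=1.0e7, tau=0.3, alpha=0.3, sigma_i=0.2)
    # The displacement current matters where w*EPS0*Re(eps) is comparable
    # to the effective conductivity -w*EPS0*Im(eps):
    print(np.real(eps))
    print(-2 * np.pi * f * EPS0 * np.imag(eps))
    ```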

  7. A Computer-Aided Diagnosis for Evaluating Lung Nodules on Chest CT: the Current Status and Perspective

    Energy Technology Data Exchange (ETDEWEB)

    Goo, Jin Mo [Seoul National University Medical Research Center, Seoul (Korea, Republic of)

    2011-04-15

    As the detection and characterization of lung nodules are of paramount importance in thoracic radiology, various tools for making a computer-aided diagnosis (CAD) have been developed to improve the diagnostic performance of radiologists in clinical practice. Numerous studies over the years have shown that the CAD system can effectively help readers identify more nodules. Moreover, nodule malignancy and the response of malignant lung tumors to treatment can also be assessed using nodule volumetry. CAD also has the potential to objectively analyze the morphology of nodules and enhance the work flow during the assessment of follow-up studies. Therefore, understanding the current status and limitations of CAD for evaluating lung nodules is essential to effectively apply CAD in clinical practice

  8. CURRENT SITUATION OF HUMAN RESOURCES IN ROMANIAN PRE-UNIVERSITY EDUCATION IN THE CONTEXT OF E.U. INTEGRATION

    Directory of Open Access Journals (Sweden)

    Luminita Claudia CORBU

    2016-02-01

    Full Text Available The paper presents the development of human resources in the process of accession to the European Union, taking into account the standards proposed by the European Union as formulated at the level of principles and objectives in the Single European Act and the Treaty on European Union, including the Treaty of Amsterdam, and then in the sequence of documents developed by the European Commission. These documents frame the evolution of education and training in the European Union around concepts such as further education, the knowledge society, knowledge - skills - competitiveness, globalization, and discrimination and inequality. Around these concepts a strategy was formulated for training a workforce able to meet European standards and to escape regional captivity. This process of convergent evolution of the labor force for the European market, however, started from multiple realities. One of these areas, with its own legacy in terms of human resources and its own way of perceiving the specific educational objectives proposed by Europe, was Romania. Having known one of the most rigidly centralized systems, in which the labor market absorbed all graduates of vocational, high school and university education, Romania was confronted after 1989 with a massive disturbance of the labor market. This has been linked to certain inadequacies of the Romanian training system, which was itself undergoing, on the one hand, a natural transformation caused by the seclusion maintained for several years and, on the other hand, a process of adaptation to European standards.

  9. Concomitant Use of Transcranial Direct Current Stimulation and Computer-Assisted Training for the Rehabilitation of Attention in Traumatic Brain Injured Patients: Behavioral and Neuroimaging Results.

    Science.gov (United States)

    Sacco, Katiuscia; Galetto, Valentina; Dimitri, Danilo; Geda, Elisabetta; Perotti, Francesca; Zettin, Marina; Geminiani, Giuliano C

    2016-01-01

    Divided attention (DA), the ability to distribute cognitive resources among two or more simultaneous tasks, may be severely compromised after traumatic brain injury (TBI), resulting in problems with numerous activities involved with daily living. So far, no research has investigated whether the use of non-invasive brain stimulation associated with neuropsychological rehabilitation might contribute to the recovery of such cognitive function. The main purpose of this study was to assess the effectiveness of 10 transcranial direct current stimulation (tDCS) sessions combined with computer-assisted training; it was also intended to explore the neural modifications induced by the treatment. Thirty-two patients with severe TBI participated in the study: 16 were part of the experimental group, and 16 part of the control group. The treatment included 20' of tDCS, administered twice a day for 5 days. The electrodes were placed on the dorso-lateral prefrontal cortex. Their location varied across patients and it depended on each participant's specific area of damage. The control group received sham tDCS. After each tDCS session, the patient received computer-assisted cognitive training on DA for 40'. The results showed that the experimental group significantly improved in DA performance between pre- and post-treatment, showing faster reaction times (RTs) and fewer omissions. No improvement was detected between the baseline assessment (i.e., 1 month before treatment) and the pre-training assessment, or within the control group. Functional magnetic resonance imaging (fMRI) data, obtained on the experimental group during a DA task, showed post-treatment lower cerebral activations in the right superior temporal gyrus (BA 42), right and left middle frontal gyrus (BA 6), right postcentral gyrus (BA 3) and left inferior frontal gyrus (BA 9). We interpreted such neural changes as normalization of previously abnormal hyperactivations.

  10. Algorithms for Computing the Magnetic Field, Vector Potential, and Field Derivatives for Circular Current Loops in Cylindrical Coordinates

    Energy Technology Data Exchange (ETDEWEB)

    Walstrom, Peter Lowell [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-24

    A numerical algorithm for computing the field components Br and Bz and their r and z derivatives with open boundaries in cylindrical coordinates for circular current loops is described. An algorithm for computing the vector potential is also described. For the convenience of the reader, derivations of the final expressions from their defining integrals are given in detail, since their derivations (especially for the field derivatives) are not all easily found in textbooks. Numerical calculations are based on evaluation of complete elliptic integrals using the Bulirsch algorithm cel. Since cel can evaluate complete elliptic integrals of a fairly general type, in some cases the elliptic integrals can be evaluated without first reducing them to forms containing standard Legendre forms. The algorithms avoid the numerical difficulties that many of the textbook solutions have for points near the axis because of explicit factors of 1/r or 1/r^2 in some of the expressions.
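    For orientation, the textbook Legendre-form expressions that the report improves upon can be evaluated directly with SciPy; the sketch below does so. It is a minimal illustration, not the report's cel-based algorithm, and the function name loop_field and its treatment of the on-axis case are assumptions made here. Note the explicit z/r factor in Br, exactly the kind of term that causes trouble near the axis.

```python
import numpy as np
from scipy.special import ellipk, ellipe  # take the parameter m = k^2

MU0 = 4e-7 * np.pi  # vacuum permeability, T*m/A

def loop_field(a, I, r, z):
    """Br, Bz (tesla) of a circular loop of radius a (m) carrying current I (A),
    at cylindrical point (r, z), via the standard Legendre-form expressions."""
    if r == 0.0:
        # On axis Br vanishes by symmetry and Bz has a simple closed form,
        # sidestepping the 1/r singularity of the general formula.
        return 0.0, MU0 * I * a**2 / (2.0 * (a**2 + z**2) ** 1.5)
    m = 4.0 * a * r / ((a + r) ** 2 + z**2)          # m = k^2
    K, E = ellipk(m), ellipe(m)
    pre = MU0 * I / (2.0 * np.pi * np.sqrt((a + r) ** 2 + z**2))
    denom = (a - r) ** 2 + z**2
    Br = pre * (z / r) * ((a**2 + r**2 + z**2) / denom * E - K)
    Bz = pre * ((a**2 - r**2 - z**2) / denom * E + K)
    return Br, Bz

# Sanity check: at the loop centre Bz must equal mu0*I/(2a).
print(loop_field(0.1, 100.0, 0.0, 0.0))  # (0.0, ~6.28e-4 T)
```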

  11. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year's; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load are close to the planning predictions. All computing centre tiers performed their expected functions. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference, where a large number of analyses were presented. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  12. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year in which the primary emphasis of the project moves towards steady operations. Following the very successful completion of the Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in the computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month for HLT studies and physics validation. Monte Carlo production will continue throughout the year in preparation of large samples for physics and detector studies, ramping up to 50M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component, and work is ongoing to devise CMS-specific tests to be included in Service Availa...

  13. Computing meaning v.4

    CERN Document Server

    Bunt, Harry; Pulman, Stephen

    2013-01-01

    This book is a collection of papers by leading researchers in computational semantics. It presents a state-of-the-art overview of recent and current research in computational semantics, including descriptions of new methods for constructing and improving resources for semantic computation, such as WordNet, VerbNet, and semantically annotated corpora. It also presents new statistical methods in semantic computation, such as the application of distributional semantics in the compositional calculation of sentence meanings. Computing the meaning of sentences, texts, and spoken or texted dialogue i
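    As a flavour of what "distributional semantics in the compositional calculation of sentence meanings" can look like in its simplest form, the sketch below composes sentence vectors additively and compares them by cosine similarity. The toy vectors are invented for illustration; the methods described in the book are considerably more sophisticated.

```python
import numpy as np

# Toy distributional word vectors; real ones come from a trained model.
VECS = {
    "dog":   np.array([0.9, 0.1, 0.0]),
    "barks": np.array([0.2, 0.8, 0.1]),
    "cat":   np.array([0.8, 0.2, 0.1]),
    "meows": np.array([0.1, 0.7, 0.3]),
}

def sentence_vector(words):
    # Additive composition: sentence meaning as the mean of its word vectors.
    return np.mean([VECS[w] for w in words], axis=0)

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

s1 = sentence_vector(["dog", "barks"])
s2 = sentence_vector(["cat", "meows"])
print(f"similarity = {cosine(s1, s2):.3f}")  # high: near-paraphrase sentences
```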

  14. Prevention of supine hypotensive syndrome in pregnant women undergoing computed tomography - A national survey of current practice

    International Nuclear Information System (INIS)

    McMahon, Michelle A.; Fenwick, Alison; Banks, Amelia; Dineen, Robert A.

    2009-01-01

    Aim: Supine hypotensive syndrome (SHS) can occur in women in the second half of pregnancy due to compression of the aorta and inferior vena cava by the gravid uterus. This results in a decrease in cardiac output, with effects ranging from transient asymptomatic hypotension to cardiovascular collapse. SHS can be easily avoided by left lateral tilt positioning. We undertook a nationwide survey to assess awareness amongst senior computed tomography (CT) radiographers of the potential risk of SHS in this patient group, and to identify the extent to which preventative practices and protocols are in place. Methods and materials: A questionnaire was sent to superintendent CT radiographers at all acute NHS Trusts in England and Wales, examining awareness of the risk of SHS and the preventative practices and protocols currently used. Results: Completed questionnaires were received from 64% of institutions. Of respondents who scan women in this patient group, only 44% were aware of the risk of SHS. No institution had a written protocol specifying the positioning of women in this patient group. Seventy-five percent of institutions never employed oblique positioning. Eighty-five percent felt that specific guidelines from the Society of Radiographers or Royal College of Radiologists would be helpful. Conclusion: Current awareness of and practices for preventing this easily avoidable but potentially harmful condition are inadequate. Central guidance would be welcomed by a large majority of respondents.

  15. Current and planned numerical development for improving computing performance for long duration and/or low pressure transients

    Energy Technology Data Exchange (ETDEWEB)

    Faydide, B. [Commissariat a l`Energie Atomique, Grenoble (France)

    1997-07-01

    This paper presents the current and planned numerical developments for improving computing performance for Cathare applications requiring real time, such as simulators. Cathare is a thermal-hydraulic code developed by CEA (DRN), IPSN, EDF and FRAMATOME for PWR safety analysis. First, the general characteristics of the code are presented, covering physical models, numerical topics and validation strategy. Then, the current and planned applications of Cathare in the field of simulators are discussed. Some of these applications were made in the past using a simplified, fast-running version of Cathare (Cathare-Simu); the status of the numerical improvements obtained with Cathare-Simu is presented. The planned developments concern mainly the Simulator Cathare Release (SCAR) project, which deals with the use of the most recent version of Cathare inside simulators. In this frame, the numerical developments relate to speeding up the calculation process, using parallel processing, and improving code reliability on a large set of NPP transients.
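    The gain available from the parallel processing mentioned above is bounded by whatever fraction of the transient calculation stays serial. A back-of-the-envelope Amdahl's-law estimate (a generic illustration, not a figure from the paper) is sketched below.

```python
def amdahl_speedup(parallel_fraction, n_workers):
    """Amdahl's law: overall speed-up when a fraction p of the run time is
    parallelised perfectly over n workers and the remainder stays serial."""
    p = parallel_fraction
    return 1.0 / ((1.0 - p) + p / n_workers)

# If 90% of a long-duration transient parallelises, 8 processors give
# at most ~4.7x rather than the ideal 8x.
print(f"{amdahl_speedup(0.9, 8):.2f}x")  # 4.71x
```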

  16. Current and planned numerical development for improving computing performance for long duration and/or low pressure transients

    International Nuclear Information System (INIS)

    Faydide, B.

    1997-01-01

    This paper presents the current and planned numerical developments for improving computing performance for Cathare applications requiring real time, such as simulators. Cathare is a thermal-hydraulic code developed by CEA (DRN), IPSN, EDF and FRAMATOME for PWR safety analysis. First, the general characteristics of the code are presented, covering physical models, numerical topics and validation strategy. Then, the current and planned applications of Cathare in the field of simulators are discussed. Some of these applications were made in the past using a simplified, fast-running version of Cathare (Cathare-Simu); the status of the numerical improvements obtained with Cathare-Simu is presented. The planned developments concern mainly the Simulator Cathare Release (SCAR) project, which deals with the use of the most recent version of Cathare inside simulators. In this frame, the numerical developments relate to speeding up the calculation process, using parallel processing, and improving code reliability on a large set of NPP transients.

  17. Prevention of supine hypotensive syndrome in pregnant women undergoing computed tomography - A national survey of current practice

    Energy Technology Data Exchange (ETDEWEB)

    McMahon, Michelle A.; Fenwick, Alison [Department of Diagnostic Imaging, Queen's Medical Centre Campus, Nottingham University Hospitals NHS Trust, Derby Road, Nottingham, NG7 2UH (United Kingdom); Banks, Amelia [Department of Anaesthesia, City Hospital Campus, Nottingham University Hospitals NHS Trust, Hucknall Road, Nottingham, NG5 1PB (United Kingdom); Dineen, Robert A. [Department of Diagnostic Imaging, Queen's Medical Centre Campus, Nottingham University Hospitals NHS Trust, Derby Road, Nottingham, NG7 2UH (United Kingdom)], E-mail: Robert.dineen@nhs.net

    2009-05-15

    Aim: Supine hypotensive syndrome (SHS) can occur in women in the second half of pregnancy due to compression of the aorta and inferior vena cava by the gravid uterus. This results in a decrease in cardiac output, with effects ranging from transient asymptomatic hypotension to cardiovascular collapse. SHS can be easily avoided by left lateral tilt positioning. We undertook a nationwide survey to assess awareness amongst senior computed tomography (CT) radiographers of the potential risk of SHS in this patient group, and to identify the extent to which preventative practices and protocols are in place. Methods and materials: A questionnaire was sent to superintendent CT radiographers at all acute NHS Trusts in England and Wales, examining awareness of the risk of SHS and the preventative practices and protocols currently used. Results: Completed questionnaires were received from 64% of institutions. Of respondents who scan women in this patient group, only 44% were aware of the risk of SHS. No institution had a written protocol specifying the positioning of women in this patient group. Seventy-five percent of institutions never employed oblique positioning. Eighty-five percent felt that specific guidelines from the Society of Radiographers or Royal College of Radiologists would be helpful. Conclusion: Current awareness of and practices for preventing this easily avoidable but potentially harmful condition are inadequate. Central guidance would be welcomed by a large majority of respondents.

  18. Computer Engineers.

    Science.gov (United States)

    Moncarz, Roger

    2000-01-01

    Looks at computer engineers and describes their job, employment outlook, earnings, and training and qualifications. Provides a list of resources related to computer engineering careers and the computer industry. (JOW)

  19. The actual status of uranium ore resources at Eko Remaja Sector: the need of verification of resources computation and geometrical form of mineralization zone by mining test

    International Nuclear Information System (INIS)

    Johan Baratha; Muljono, D.S.; Agus Sumaryanto; Handoko Supalal

    1996-01-01

    Uranium ore resource calculation was carried out after completing all geological work steps. The estimation of ore resources started with evaluation drilling and continued with borehole logging. The logging results were presented as anomaly graphs and then processed to determine the thickness and grade of the ore. These mineralization points were correlated with one another to form mineralization zones trending N 270 degrees to N 285 degrees and dipping 70 degrees to the north. By grouping the mineralization distribution, 19 mineralization planes were constructed, containing 553 tonnes of measured U3O8. It is suggested that, before expanding the measured ore deposit area, a mining test should first be carried out on certain mineralization planes to prove the method applied to calculate the reserve. Results from the mining test could be very useful for re-evaluating all the work steps performed. (author); 4 refs; 2 tabs; 8 figs
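    The record does not reproduce the computation itself, but the arithmetic behind a figure such as the 553 t of U3O8 is essentially tonnage times grade per mineralization plane, summed over planes. The sketch below shows that bookkeeping; every number in it is hypothetical, and only the structure of the calculation is intended.

```python
# Hypothetical grade-tonnage arithmetic for tabular mineralization planes.
# All inputs (area, thickness, density, grade) are invented for illustration.
planes = [
    # (area m^2, true thickness m, bulk density t/m^3, grade % U3O8)
    (12_000, 1.5, 2.6, 0.12),
    (8_500,  0.9, 2.6, 0.20),
]

def plane_contained_u3o8(area, thickness, density, grade_pct):
    """Contained U3O8 (tonnes) = rock volume x bulk density x grade."""
    return area * thickness * density * grade_pct / 100.0

total = sum(plane_contained_u3o8(*p) for p in planes)
print(f"contained U3O8: {total:.1f} t")  # ~95.9 t for these made-up planes
```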

  20. Cloud Security in 21st Century: Current Key Issues in Service Models on Cloud Computing and how to overcome them

    OpenAIRE

    Sharma, Supern

    2011-01-01

    Cloud computing is a disruptive innovation that offers new ways to increase the capacity and capabilities of an organization's IT infrastructure. In the last few years cloud computing has taken the computing industry by storm, and many organizations are using cloud computing services to increase efficiency, decrease their IT budgets and help define the IT strategy of their business. Cloud computing offers various benefits such as scalability, elasticity, reducing IT expenditur...