WorldWideScience

Sample records for current computational resources

  1. Current status and prospects of computational resources for natural product dereplication: a review.

    Science.gov (United States)

    Mohamed, Ahmed; Nguyen, Canh Hao; Mamitsuka, Hiroshi

    2016-03-01

    Research in natural products has always enhanced drug discovery by providing new and unique chemical compounds. Recently, however, drug discovery from natural products has been slowed by the increasing chance of re-isolating known compounds. Rapid identification of previously isolated compounds in an automated manner, called dereplication, steers researchers toward novel findings, thereby reducing the time and effort for identifying new drug leads. Dereplication identifies compounds by comparing processed experimental data with those of known compounds, so diverse computational resources, such as databases and tools to process and compare compound data, are necessary. Automating the dereplication process through the integration of computational resources has always been an aspired goal of natural product researchers. To increase the utilization of current computational resources for natural products, we first provide an overview of the dereplication process, then list useful resources, categorizing them into databases, methods and software tools, and explain them from a dereplication perspective. Finally, we discuss the current challenges to automating dereplication and the proposed solutions.

  2. Current Resource Imagery Projects

    Data.gov (United States)

    Farm Service Agency, Department of Agriculture — Map showing coverage of current Resource imagery projects. High resolution/large scale Resource imagery is typically acquired for the U.S. Forest Service and other...

  3. Quantifying resource use in computations

    NARCIS (Netherlands)

    van Son, R.J.J.H.

    2009-01-01

    It is currently not possible to quantify the resources needed to perform a computation. As a consequence, it is not possible to reliably evaluate the hardware resources needed for the application of algorithms or the running of programs. This is apparent in both computer science, for instance, in

  4. Quantifying Resource Use in Computations

    CERN Document Server

    van Son, R J J H

    2009-01-01

    It is currently not possible to quantify the resources needed to perform a computation. As a consequence, it is not possible to reliably evaluate the hardware resources needed for the application of algorithms or the running of programs. This is apparent in both computer science, for instance, in cryptanalysis, and in neuroscience, for instance, comparative neuro-anatomy. A System versus Environment game formalism is proposed based on Computability Logic that allows one to define a computational work function describing the theoretical and physical resources needed to perform any purely algorithmic computation. Within this formalism, the cost of a computation is defined as the sum of information storage over the steps of the computation. The size of the computational device, e.g., the action table of a Universal Turing Machine, the number of transistors in silicon, or the number and complexity of synapses in a neural net, is explicitly included in the computational cost. The proposed cost function leads in a na...
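
    The cost definition sketched above can be written compactly; the notation here is an illustrative reconstruction, not the paper's own. For a computation of T steps on a device of size S_dev, with S(t) the information storage occupied at step t:

        C \;=\; S_{\mathrm{dev}} \;+\; \sum_{t=1}^{T} S(t)

    where S_dev is measured, per the abstract, by the action table of a Universal Turing Machine, the transistor count, or the number and complexity of synapses, so that two computations with equal step counts but unequal machinery are costed differently.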

  5. LHCb Computing Resources: 2017 requests

    CERN Document Server

    Bozzi, Concezio

    2016-01-01

    This document presents an assessment of computing resources needed by LHCb in 2017, as resulting from the accumulated experience in Run2 data taking and recent changes in the LHCb computing model parameters.

  6. Aggregated Computational Toxicology Online Resource

    Data.gov (United States)

    U.S. Environmental Protection Agency — Aggregated Computational Toxicology Online Resource (ACToR) is EPA's online aggregator of all the public sources of chemical toxicity data. ACToR aggregates data...

  7. Framework Resources Multiply Computing Power

    Science.gov (United States)

    2010-01-01

    As an early proponent of grid computing, Ames Research Center awarded Small Business Innovation Research (SBIR) funding to 3DGeo Development Inc., of Santa Clara, California, (now FusionGeo Inc., of The Woodlands, Texas) to demonstrate a virtual computer environment that linked geographically dispersed computer systems over the Internet to help solve large computational problems. By adding to an existing product, FusionGeo enabled access to resources for calculation- or data-intensive applications whenever and wherever they were needed. Commercially available as Accelerated Imaging and Modeling, the product is used by oil companies and seismic service companies, which require large processing and data storage capacities.

  8. Turning Video Resource Management into Cloud Computing

    Directory of Open Access Journals (Sweden)

    Weili Kou

    2016-07-01

    Big data makes cloud computing more and more popular in various fields. Video resources are very useful and important to education, security monitoring, and so on. However, their huge volumes, complex data types, inefficient processing performance, weak security, and long loading times pose challenges in video resource management. The Hadoop Distributed File System (HDFS) is an open-source framework which can provide cloud-based platforms and presents an opportunity for solving these problems. This paper presents a video resource management architecture based on HDFS to provide a uniform framework and a five-layer model for standardizing the current various algorithms and applications. The architecture, basic model, and key algorithms are designed for turning video resource management into cloud computing. The design was tested by establishing a simulation system prototype.

  9. LHCb Computing Resources: 2019 requests and reassessment of 2018 requests

    CERN Document Server

    Bozzi, Concezio

    2017-01-01

    This document presents the computing resources needed by LHCb in 2019 and a reassessment of the 2018 requests, as resulting from the current experience of Run2 data taking and minor changes in the LHCb computing model parameters.

  10. SOCR: Statistics Online Computational Resource

    Directory of Open Access Journals (Sweden)

    Ivo D. Dinov

    2006-10-01

    The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result, a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for interactive distribution modeling, virtual online probability experimentation, statistical data analysis, visualization and integration. Following years of experience in statistical teaching at all college levels using established licensed statistical software packages, like STATA, S-PLUS, R, SPSS, SAS, Systat, etc., we have attempted to engineer a new statistics education environment, the Statistics Online Computational Resource (SOCR). This resource performs many of the standard types of statistical analysis, much like other classical tools. In addition, it is designed in a plug-in object-oriented architecture and is completely platform independent, web-based, interactive, extensible and secure. Over the past 4 years we have tested, fine-tuned and reanalyzed the SOCR framework in many of our undergraduate and graduate probability and statistics courses and have evidence that SOCR resources build students' intuition and enhance their learning.

  11. Current Issues for Higher Education Information Resources Management.

    Science.gov (United States)

    CAUSE/EFFECT, 1995

    1995-01-01

    Current issues that are important for the future of information resources management in higher education are presented. They include: integrating planning for information resources within institution-wide strategic planning; reengineering fundamental services; change management; distributed computing support; networking; the changing communication…

  12. COMPUTATIONAL RESOURCES FOR BIOFUEL FEEDSTOCK SPECIES

    Energy Technology Data Exchange (ETDEWEB)

    Buell, Carol Robin [Michigan State University]; Childs, Kevin L [Michigan State University]

    2013-05-07

    While current production of ethanol as a biofuel relies on starch and sugar inputs, it is anticipated that sustainable production of ethanol for biofuel use will utilize lignocellulosic feedstocks. Candidate plant species to be used for lignocellulosic ethanol production include a large number of species within the Grass, Pine and Birch plant families. For these biofuel feedstock species, there are variable amounts of genome sequence resources available, ranging from complete genome sequences (e.g. sorghum, poplar) to transcriptome data sets (e.g. switchgrass, pine). These data sets are not only dispersed in location but also disparate in content. It will be essential to leverage and improve these genomic data sets for the improvement of biofuel feedstock production. The objectives of this project were to provide computational tools and resources for data-mining genome sequence/annotation and large-scale functional genomic datasets available for biofuel feedstock species. We have created a Bioenergy Feedstock Genomics Resource that provides a web-based portal or clearing house for genomic data for plant species relevant to biofuel feedstock production. Sequence data from a total of 54 plant species are included in the Bioenergy Feedstock Genomics Resource, including model plant species that permit leveraging of knowledge across taxa to biofuel feedstock species. We have generated additional computational analyses of these data, including uniform annotation, to facilitate genomic approaches to improved biofuel feedstock production. These data have been centralized in the publicly available Bioenergy Feedstock Genomics Resource (http://bfgr.plantbiology.msu.edu/).

  13. Web-Based Computing Resource Agent Publishing

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    Web-based Computing Resource Publishing is an efficient way to provide additional computing capacity for users who need more computing resources than they themselves can afford, by making use of idle computing resources on the Web. Extensibility and reliability are crucial for agent publishing. A parent-child agent framework and a primary-slave agent framework are proposed and discussed in detail.

  14. Resource Management in Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Andrei IONESCU

    2015-01-01

    Mobile cloud computing is a major research topic in Information Technology & Communications. It integrates cloud computing, mobile computing and wireless networks. While mainly built on cloud computing, it has to operate using more heterogeneous resources, with implications on how these resources are managed and used. Managing the resources of a mobile cloud is not a trivial task, involving vastly different architectures, and the process is beyond the scope of human users. Using these resources from applications at both the platform and software tiers comes with its own challenges. This paper presents different approaches in use for managing cloud resources at the infrastructure and platform levels.

  15. Adaptive computational resource allocation for sensor networks

    Institute of Scientific and Technical Information of China (English)

    WANG Dian-hong; FEI E; YAN Yu-jie

    2008-01-01

    To efficiently utilize the limited computational resources in real-time sensor networks, this paper focuses on the challenge of computational resource allocation in sensor networks and provides a solution with the methods of economics. It designs a microeconomic system in which the applications distribute their computational resource consumption across sensor networks by virtue of mobile agents. Further, it proposes a market-based computational resource allocation policy named MCRA, which satisfies the uniform consumption of computational energy in the network and the optimal division of a single computational capacity among multiple tasks. The simulation in the scenario of target tracing demonstrates that MCRA realizes an efficient allocation of computational resources according to the priority of tasks, achieves superior allocation performance and equilibrium performance compared to traditional allocation policies, and ultimately prolongs the system lifetime.
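
    The scoring idea described above lends itself to a short sketch. The following is a hypothetical weighted-sum ranking in the spirit of MCRA; the weights, normalization, and function names are invented here for illustration and are not the paper's actual formula.

        # Hypothetical node ranking; all inputs are assumed pre-normalized
        # to [0, 1], and the weights are guesses, not values from the paper.
        def node_score(cpu_level, free_memory, queue_length,
                       cpu_utilization, bandwidth,
                       weights=(0.3, 0.2, 0.2, 0.15, 0.15)):
            w_cpu, w_mem, w_q, w_u, w_bw = weights
            return (w_cpu * cpu_level
                    + w_mem * free_memory
                    + w_q * (1.0 - queue_length)      # shorter queue is better
                    + w_u * (1.0 - cpu_utilization)   # idler CPU is better
                    + w_bw * bandwidth)

        # Rank nodes so the highest-priority task gets the best node first.
        nodes = {"n1": node_score(0.9, 0.5, 0.1, 0.3, 0.8),
                 "n2": node_score(0.4, 0.9, 0.6, 0.7, 0.5)}
        best_first = sorted(nodes, key=nodes.get, reverse=True)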

  16. LHCb Computing Resources: 2018 requests and preview of 2019 requests

    CERN Document Server

    Bozzi, Concezio

    2017-01-01

    This document presents a reassessment of computing resources needed by LHCb in 2018 and a preview of computing requests for 2019, as resulting from the current experience of Run2 data taking and recent changes in the LHCb computing model parameters.

  17. Enabling opportunistic resources for CMS Computing Operations

    Energy Technology Data Exchange (ETDEWEB)

    Hufnagel, Dick [Fermilab

    2015-11-19

    With the increased pressure on computing brought by the higher energy and luminosity from the LHC in Run 2, CMS Computing Operations expects to require the ability to utilize “opportunistic” resources — resources not owned by, or a priori configured for, CMS — to meet peak demands. In addition to our dedicated resources, we look to add computing resources from non-CMS grids, cloud resources, and national supercomputing centers. CMS uses the HTCondor/glideinWMS job submission infrastructure for all its batch processing, so such resources will need to be transparently integrated into its glideinWMS pool. Bosco and Parrot wrappers are used to enable access and bring the CMS environment into these non-CMS resources. Here we describe our strategy to supplement our native capabilities with opportunistic resources and our experience so far using them.

  18. Resourceful Computing in Unstructured Environments

    Science.gov (United States)

    1991-07-31

  1. Resource management in mobile computing environments

    CERN Document Server

    Mavromoustakis, Constandinos X; Mastorakis, George

    2014-01-01

    This book reports the latest advances in the design and development of mobile computing systems, describing their applications in the context of modeling, analysis and efficient resource management. It explores the challenges of mobile computing and resource management paradigms, including research efforts and approaches recently carried out in response to them to address future open-ended issues. The book includes 26 rigorously refereed chapters written by leading international researchers, providing the readers with technical and scientific information about various aspects of mobile computing, from basic concepts to advanced findings, reporting the state-of-the-art on resource management in such environments. It is mainly intended as a reference guide for researchers and practitioners involved in the design, development and applications of mobile computing systems, seeking solutions to related issues. It also represents a useful textbook for advanced undergraduate and graduate courses, addressing special t...

  2. Efficient Resource Management in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Rushikesh Shingade

    2015-12-01

    Cloud computing is one of the most widely used technologies for providing cloud services to users, who are charged for the services they receive. Given the large number of resources involved, the performance of Cloud resource management policies is difficult to evaluate and optimize efficiently. Different simulation toolkits are available for simulating and modelling the Cloud computing environment, such as GridSim, CloudAnalyst, CloudSim, GreenCloud, CloudAuction, etc. In the proposed Efficient Resource Management in Cloud Computing (EFRE) model, CloudSim is used as a simulation toolkit that allows simulation of a DataCenter in a Cloud computing system. The CloudSim toolkit also supports the creation of multiple virtual machines (VMs) on a node of a DataCenter, where cloudlets (user requests) are assigned to virtual machines by scheduling policies. In this paper, the Time-Shared and Space-Shared allocation policies are used for scheduling the cloudlets and are compared on metrics like total execution time, number of resources, and the resource allocation algorithm. CloudSim has been used for the simulations, and the simulation results demonstrate that the resource management is effective.
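
    To make the contrast between the two policies concrete, here is a toy model in Python, not the CloudSim API: space-shared gives each cloudlet a core to itself until it finishes, while time-shared splits total capacity among all cloudlets (a uniform-share approximation that ignores cloudlets finishing early).

        def space_shared(lengths, mips, cores):
            """Cloudlets queue for cores and run to completion; returns finish times."""
            core_free = [0.0] * cores
            finish = []
            for length in lengths:                 # length in million instructions
                c = min(range(cores), key=lambda i: core_free[i])
                core_free[c] += length / mips
                finish.append(core_free[c])
            return finish

        def time_shared(lengths, mips, cores):
            """All cloudlets share total capacity equally (crude approximation)."""
            share = mips * cores / len(lengths)
            return [length / share for length in lengths]

        print(space_shared([1000, 1000, 500], mips=250, cores=2))  # [4.0, 4.0, 6.0]
        print(time_shared([1000, 1000, 500], mips=250, cores=2))   # [6.0, 6.0, 3.0]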

  3. Framework of Resource Management for Intercloud Computing

    Directory of Open Access Journals (Sweden)

    Mohammad Aazam

    2014-01-01

    There has been a very rapid increase in digital media content, due to which the media cloud is gaining importance. The cloud computing paradigm provides management of resources and helps create an extended portfolio of services. Through cloud computing, not only are services managed more efficiently, but service discovery is also made possible. To handle the rapid increase in content, the media cloud plays a very vital role. But it is not possible for standalone clouds to handle everything with increasing user demands. For scalability and better service provisioning, clouds at times have to communicate with other clouds and share their resources. This scenario is called Intercloud computing or cloud federation. The study of Intercloud computing is still in its infancy. Resource management is one of the key concerns to be addressed in Intercloud computing. Existing studies discuss this issue only in a trivial and simplistic way. In this study, we present a resource management model, keeping in view different types of services, different customer types, customer characteristics, pricing, and refunding. The presented framework was implemented using Java and NetBeans 8.0 and evaluated using the CloudSim 3.0.3 toolkit. The presented results and their discussion validate our model and its efficiency.

  4. Integration of Cloud resources in the LHCb Distributed Computing

    Science.gov (United States)

    Úbeda García, Mario; Méndez Muñoz, Víctor; Stagni, Federico; Cabarrou, Baptiste; Rauschmayr, Nathalie; Charpentier, Philippe; Closier, Joel

    2014-06-01

    This contribution describes how Cloud resources have been integrated in the LHCb Distributed Computing. LHCb is using its specific Dirac extension (LHCbDirac) as an interware for its Distributed Computing. So far, it has seamlessly integrated Grid resources and computer clusters. The cloud extension of DIRAC (VMDIRAC) allows the integration of Cloud computing infrastructures. It is able to interact with multiple types of infrastructures in commercial and institutional clouds, supported by multiple interfaces (Amazon EC2, OpenNebula, OpenStack and CloudStack); it instantiates, monitors and manages Virtual Machines running on this aggregation of Cloud resources. Moreover, specifications for institutional Cloud resources proposed by the Worldwide LHC Computing Grid (WLCG), mainly by the High Energy Physics Unix Information Exchange (HEPiX) group, have been taken into account. Several initiatives and computing resource providers in the eScience environment have already deployed IaaS in production during 2013. Keeping this in mind, the pros and cons of a cloud-based infrastructure have been studied in contrast with the current setup. As a result, this work addresses four different use cases which represent a major improvement on several levels of our infrastructure. We describe the solution implemented by LHCb for the contextualisation of the VMs based on the idea of Cloud Site. We report on operational experience of using in production several institutional Cloud resources that are thus becoming an integral part of the LHCb Distributed Computing resources. Furthermore, we describe the gradual migration of our Service Infrastructure towards a fully distributed architecture following the Service as a Service (SaaS) model.

  5. Current and future resources for functional metagenomics.

    Science.gov (United States)

    Lam, Kathy N; Cheng, Jiujun; Engel, Katja; Neufeld, Josh D; Charles, Trevor C

    2015-01-01

    Functional metagenomics is a powerful experimental approach for studying gene function, starting from the extracted DNA of mixed microbial populations. A functional approach relies on the construction and screening of metagenomic libraries: physical libraries that contain DNA cloned from environmental metagenomes. The information obtained from functional metagenomics can help in future annotations of gene function and serve as a complement to sequence-based metagenomics. In this Perspective, we begin by summarizing the technical challenges of constructing metagenomic libraries and emphasize their value as resources. We then discuss libraries constructed using the popular cloning vector, pCC1FOS, and highlight the strengths and shortcomings of this system, alongside possible strategies to maximize existing pCC1FOS-based libraries by screening in diverse hosts. Finally, we discuss the known bias of libraries constructed from human gut and marine water samples, present results that suggest bias may also occur for soil libraries, and consider factors that bias metagenomic libraries in general. We anticipate that discussion of current resources and limitations will advance tools and technologies for functional metagenomics research.

  6. Current and future resources for functional metagenomics

    Directory of Open Access Journals (Sweden)

    Kathy Nguyen Lam

    2015-10-01

    Functional metagenomics is a powerful experimental approach for studying gene function, starting from the extracted DNA of mixed microbial populations. A functional approach relies on the construction and screening of metagenomic libraries – physical libraries that contain DNA cloned from environmental metagenomes. The information obtained from functional metagenomics can help in future annotations of gene function and serve as a complement to sequence-based metagenomics. In this Perspective, we begin by summarizing the technical challenges of constructing metagenomic libraries and emphasize their value as resources. We then discuss libraries constructed using the popular cloning vector, pCC1FOS, and highlight the strengths and shortcomings of this system, alongside possible strategies to maximize existing pCC1FOS-based libraries by screening in diverse hosts. Finally, we discuss the known bias of libraries constructed from human gut and marine water samples, present results that suggest bias may also occur for soil libraries, and consider factors that bias metagenomic libraries in general. We anticipate that discussion of current resources and limitations will advance tools and technologies for functional metagenomics research.

  7. Dynamic computing resource allocation in online flood monitoring and prediction

    Science.gov (United States)

    Kuchar, S.; Podhoranyi, M.; Vavrik, R.; Portero, A.

    2016-08-01

    This paper presents tools and methodologies for dynamic allocation of high performance computing resources during operation of the Floreon+ online flood monitoring and prediction system. The resource allocation is done throughout the execution of supported simulations to meet the required service quality levels for system operation. It also ensures flexible reactions to changing weather and flood situations, as it is not economically feasible to operate online flood monitoring systems in the full performance mode during non-flood seasons. Different service quality levels are therefore described for different flooding scenarios, and the runtime manager controls them by allocating only minimal resources currently expected to meet the deadlines. Finally, an experiment covering all presented aspects of computing resource allocation in rainfall-runoff and Monte Carlo uncertainty simulation is performed for the area of the Moravian-Silesian region in the Czech Republic.
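
    A minimal sketch of the scenario-driven allocation idea follows; the level names, core counts, and deadlines are invented for illustration and are not the Floreon+ system's actual configuration.

        # Hypothetical service quality levels; values are assumptions.
        QUALITY_LEVELS = {
            "normal":  {"cores": 8,   "deadline_min": 360},  # non-flood season
            "warning": {"cores": 64,  "deadline_min": 60},
            "flood":   {"cores": 256, "deadline_min": 15},   # full performance
        }

        def allocate(alert_level):
            """Return the minimal allocation expected to meet the level's deadline."""
            return QUALITY_LEVELS[alert_level]

        print(allocate("warning"))   # {'cores': 64, 'deadline_min': 60}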

  8. Exploiting multicore compute resources in the CMS experiment

    Science.gov (United States)

    Ramírez, J. E.; Pérez-Calero Yzquierdo, A.; Hernández, J. M.; CMS Collaboration

    2016-10-01

    CMS has developed a strategy to efficiently exploit the multicore architecture of the compute resources accessible to the experiment. A coherent use of the multiple cores available in a compute node yields substantial gains in terms of resource utilization. The implemented approach makes use of the multithreading support of the event processing framework and the multicore scheduling capabilities of the resource provisioning system. Multicore slots are acquired and provisioned by means of multicore pilot agents which internally schedule and execute single and multicore payloads. Multicore scheduling and multithreaded processing are currently used in production for online event selection and prompt data reconstruction. More workflows are being adapted to run in multicore mode. This paper presents a review of the experience gained in the deployment and operation of the multicore scheduling and processing system, the current status and future plans.

  9. Optimised resource construction for verifiable quantum computation

    Science.gov (United States)

    Kashefi, Elham; Wallden, Petros

    2017-04-01

    Recent developments have brought the possibility of achieving scalable quantum networks and quantum devices closer. From the computational point of view these emerging technologies become relevant when they are no longer classically simulatable. Hence a pressing challenge is the construction of practical methods to verify the correctness of the outcome produced by universal or non-universal quantum devices. A promising approach that has been extensively explored is the scheme of verification via encryption through blind quantum computation. We present here a new construction that simplifies the required resources for any such verifiable protocol. We obtain an overhead that is linear in the size of the input (computation), while the security parameter remains independent of the size of the computation and can be made exponentially small (with a small extra cost). Furthermore our construction is generic and could be applied to any universal or non-universal scheme with a given underlying graph.

  10. Limitation of computational resource as physical principle

    CERN Document Server

    Ozhigov, Y I

    2003-01-01

    Limitation of computational resources is considered as a universal principle that, for simulation, is as fundamental as physical laws are. It claims that all experimentally verifiable implications of physical laws can be simulated by effective classical algorithms. It is demonstrated through a completely deterministic approach proposed for the simulation of biopolymer assembly. A state of a molecule during its assembly is described in terms of the reduced density matrix, permitting only limited tunneling. An assembly is treated as a sequence of elementary scatterings of simple molecules from the environment on the point of assembly. Decoherence is treated as a forced measurement of the quantum state resulting from the shortage of computational resources. All results of measurements are determined by a choice from a limited number of special options of a nonphysical nature which stay unchanged until the completion of assembly; we do not use random number generators. Observations of equal states during the ...

  11. Automating usability of ATLAS Distributed Computing resources

    CERN Document Server

    "Tupputi, S A; The ATLAS collaboration

    2013-01-01

    The automation of ATLAS Distributed Computing (ADC) operations is essential to reduce manpower costs and allow performance-enhancing actions, which improve the reliability of the system. In this perspective a crucial case is the automatic exclusion/recovery of ATLAS computing sites' storage resources, which are continuously exploited at the edge of their capabilities. It is challenging to adopt unambiguous decision criteria for storage resources which feature non-homogeneous types, sizes and roles. The recently developed Storage Area Automatic Blacklisting (SAAB) tool has provided a suitable solution, by employing an inference algorithm which processes the outcome of site-by-site SAM (Site Availability Test) SRM tests. SAAB accomplishes both the tasks of providing global monitoring as well as automatic operations on single sites.

  12. Architecturing Conflict Handling of Pervasive Computing Resources

    OpenAIRE

    Jakob, Henner; Consel, Charles; Loriant, Nicolas

    2011-01-01

    Pervasive computing environments are created to support human activities in different domains (e.g., home automation and healthcare). To do so, applications orchestrate deployed services and devices. In a realistic setting, applications are bound to conflict in their usage of shared resources, e.g., controlling doors for security and fire evacuation purposes. These conflicts can have critical effects on the physical world, putting people and assets at risk. This paper ...

  13. LHCb Computing Resource usage in 2015 (II)

    CERN Document Server

    Bozzi, Concezio

    2016-01-01

    This document reports the usage of computing resources by the LHCb collaboration during the period January 1st – December 31st 2015. The data in the following sections has been compiled from the EGI Accounting portal: https://accounting.egi.eu. For LHCb specific information, the data is taken from the DIRAC Accounting at the LHCb DIRAC Web portal: http://lhcb-portal-dirac.cern.ch.

  14. Exploiting volatile opportunistic computing resources with Lobster

    Science.gov (United States)

    Woodard, Anna; Wolf, Matthias; Mueller, Charles; Tovar, Ben; Donnelly, Patrick; Hurtado Anampa, Kenyi; Brenner, Paul; Lannon, Kevin; Hildreth, Mike; Thain, Douglas

    2015-12-01

    Analysis of high energy physics experiments using the Compact Muon Solenoid (CMS) at the Large Hadron Collider (LHC) can be limited by availability of computing resources. As a joint effort involving computer scientists and CMS physicists at Notre Dame, we have developed an opportunistic workflow management tool, Lobster, to harvest available cycles from university campus computing pools. Lobster consists of a management server, file server, and worker processes which can be submitted to any available computing resource without requiring root access. Lobster makes use of the Work Queue system to perform task management, while the CMS specific software environment is provided via CVMFS and Parrot. Data is handled via Chirp and Hadoop for local data storage and XrootD for access to the CMS wide-area data federation. An extensive set of monitoring and diagnostic tools have been developed to facilitate system optimisation. We have tested Lobster using the 20 000-core cluster at Notre Dame, achieving approximately 8-10k tasks running simultaneously, sustaining approximately 9 Gbit/s of input data and 340 Mbit/s of output data.

  15. Parallel visualization on leadership computing resources

    Energy Technology Data Exchange (ETDEWEB)

    Peterka, T; Ross, R B [Mathematics and Computer Science Division, Argonne National Laboratory, Argonne, IL 60439 (United States)]; Shen, H-W [Department of Computer Science and Engineering, Ohio State University, Columbus, OH 43210 (United States)]; Ma, K-L [Department of Computer Science, University of California at Davis, Davis, CA 95616 (United States)]; Kendall, W [Department of Electrical Engineering and Computer Science, University of Tennessee at Knoxville, Knoxville, TN 37996 (United States)]; Yu, H, E-mail: tpeterka@mcs.anl.gov [Sandia National Laboratories, California, Livermore, CA 94551 (United States)]

    2009-07-01

    Changes are needed in the way that visualization is performed, if we expect the analysis of scientific data to be effective at the petascale and beyond. By using similar techniques as those used to parallelize simulations, such as parallel I/O, load balancing, and effective use of interprocess communication, the supercomputers that compute these datasets can also serve as analysis and visualization engines for them. Our team is assessing the feasibility of performing parallel scientific visualization on some of the most powerful computational resources of the U.S. Department of Energy's National Laboratories in order to pave the way for analyzing the next generation of computational results. This paper highlights some of the conclusions of that research.

  16. A Semi-Preemptive Computational Service System with Limited Resources and Dynamic Resource Ranking

    Directory of Open Access Journals (Sweden)

    Fang-Yie Leu

    2012-03-01

    In this paper, we integrate a grid system and a wireless network to present a convenient computational service system, called the Semi-Preemptive Computational Service system (SePCS for short), which provides users with a wireless access environment and through which a user can share his/her resources with others. In the SePCS, each node is dynamically given a score based on its CPU level, available memory size, current length of waiting queue, CPU utilization and bandwidth. With the scores, resource nodes are classified into three levels. User requests, based on their time constraints, are also classified into three types. Resources of higher levels are allocated to more tightly constrained requests so as to increase the total performance of the system. To achieve this, a resource broker with the Semi-Preemptive Algorithm (SPA) is also proposed. When the resource broker cannot find suitable resources for a request of a higher type, it preempts a resource that is currently executing a lower-type request so that the higher-type request can be executed immediately. The SePCS can be applied to a Vehicular Ad Hoc Network (VANET), whose users can then exploit convenient mobile network services and wireless distributed computing. As a result, the performance of the system is higher than that of the tested schemes.
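
    The preemption rule can be sketched in a few lines of Python. Everything below, including the class name and the level/type encoding (1 = best resource level, 1 = tightest request type), is an assumption made for illustration and is not the paper's actual SPA.

        class Resource:
            def __init__(self, name, level):       # level 1..3, 1 = highest score
                self.name, self.level, self.running = name, level, None

        def submit(req_type, resources):
            """req_type 1..3, 1 = tightest constraint. Returns the resource used."""
            free = [r for r in resources if r.running is None]
            if free:
                target = min(free, key=lambda r: r.level)  # best free resource
            else:
                # semi-preemption: evict a job of looser type than the new request
                victims = [r for r in resources if r.running > req_type]
                if not victims:
                    return None                    # request must wait in the queue
                target = max(victims, key=lambda r: r.running)
            target.running = req_type
            return target

        pool = [Resource("a", 1), Resource("b", 2)]
        submit(3, pool); submit(3, pool)
        print(submit(1, pool).name)                # preempts a type-3 job -> "a"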

  17. Automating usability of ATLAS Distributed Computing resources

    Science.gov (United States)

    Tupputi, S. A.; Di Girolamo, A.; Kouba, T.; Schovancová, J.; Atlas Collaboration

    2014-06-01

    The automation of ATLAS Distributed Computing (ADC) operations is essential to reduce manpower costs and allow performance-enhancing actions, which improve the reliability of the system. In this perspective a crucial case is the automatic handling of outages of ATLAS computing sites storage resources, which are continuously exploited at the edge of their capabilities. It is challenging to adopt unambiguous decision criteria for storage resources of non-homogeneous types, sizes and roles. The recently developed Storage Area Automatic Blacklisting (SAAB) tool has provided a suitable solution, by employing an inference algorithm which processes history of storage monitoring tests outcome. SAAB accomplishes both the tasks of providing global monitoring as well as automatic operations on single sites. The implementation of the SAAB tool has been the first step in a comprehensive review of the storage areas monitoring and central management at all levels. Such review has involved the reordering and optimization of SAM tests deployment and the inclusion of SAAB results in the ATLAS Site Status Board with both dedicated metrics and views. The resulting structure allows monitoring the storage resources status with fine time-granularity and automatic actions to be taken in foreseen cases, like automatic outage handling and notifications to sites. Hence, the human actions are restricted to reporting and following up problems, where and when needed. In this work we show SAAB working principles and features. We present also the decrease of human interactions achieved within the ATLAS Computing Operation team. The automation results in a prompt reaction to failures, which leads to the optimization of resource exploitation.
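
    As an illustration of threshold-driven exclusion and recovery, here is a hypothetical sketch; the window size and thresholds are invented and are not SAAB's actual inference parameters.

        def evaluate_storage_area(test_history, window=6,
                                  exclude_below=0.25, recover_above=0.75):
            """test_history: list of booleans (test passed?), most recent last."""
            recent = test_history[-window:]
            rate = sum(recent) / len(recent)
            if rate <= exclude_below:
                return "blacklist"     # stop using this storage area
            if rate >= recover_above:
                return "whitelist"     # area is considered usable again
            return "no_change"         # leave to human follow-up

        print(evaluate_storage_area([True, False, False, False, False, False]))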

  18. The Grid Resource Broker, A Ubiquitous Grid Computing Framework

    Directory of Open Access Journals (Sweden)

    Giovanni Aloisio

    2002-01-01

    Portals to computational/data grids provide the scientific community with a friendly environment in order to solve large-scale computational problems. The Grid Resource Broker (GRB) is a grid portal that allows trusted users to create and handle computational/data grids on the fly, exploiting a simple and friendly web-based GUI. GRB provides location-transparent secure access to Globus services, automatic discovery of resources matching the user's criteria, and selection and scheduling on behalf of the user. Moreover, users are not required to learn Globus and they do not need to write specialized code or to rewrite their existing legacy codes. We describe the GRB architecture, its components and current GRB features, addressing the main differences between our approach and related work in the area.

  19. Current Cloud Computing Review and Cost Optimization by DERSP

    Directory of Open Access Journals (Sweden)

    M. Gomathy

    2014-03-01

    Cloud computing promises to deliver cost savings through the “pay as you use” paradigm. The focus is on adding computing resources when needed and releasing them when the need is serviced. Since cloud computing relies on providing computing power through multiple interconnected computers, there is a paradigm shift from one large machine to a combination of multiple smaller machine instances. In this paper, we review the current cloud computing scenario and provide a set of recommendations that can be used for designing custom applications suited for cloud deployment. We also present a comparative study on the change in cost incurred while using different combinations of machine instances for running an application on the cloud, and derive the case for optimal cost.
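
    A toy cost comparison in the spirit of the paper's study; the instance rates and relative speed-ups below are invented solely to show the arithmetic of comparing instance combinations.

        RATES = {"small": 0.05, "large": 0.20}   # $/instance-hour (assumed)
        SPEED = {"small": 1.0,  "large": 3.5}    # relative throughput (assumed)

        def job_cost(work_units, instance, count):
            hours = work_units / (SPEED[instance] * count)   # wall-clock hours
            return hours * RATES[instance] * count           # total bill

        for instance, count in [("small", 8), ("large", 2)]:
            print(instance, count, round(job_cost(1000, instance, count), 2))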

  20. Current Trends in Cloud Computing A Survey of Cloud Computing Systems

    Directory of Open Access Journals (Sweden)

    Harjit Singh

    2012-06-01

    Cloud computing, which has become an increasingly important trend, is a virtualization technology that uses the internet and central remote servers to offer the sharing of resources that include infrastructures, software, applications and business processes to the market environment to fulfill elastic demand. In today’s competitive environment, the service vitality, elasticity, choices and flexibility offered by this scalable technology are so attractive that cloud computing is increasingly becoming an integral part of the enterprise computing environment. This paper presents a survey of the current state of Cloud Computing. It includes a discussion of the evolution process of cloud computing, characteristics of the Cloud, and current technologies adopted in cloud computing. This paper also presents a comparative study of cloud computing platforms (Amazon, Google and Microsoft) and their challenges.

  1. Resource Optimization Based on Demand in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Ramakrishnan Ramanathan

    2014-10-01

    Cloud computing gives the opportunity to dynamically scale the computing resources for an application. Cloud computing consists of a large number of resources, called a resource pool. These resources are shared among cloud consumers using virtualization technology. The virtualization technologies engaged in a cloud environment provide resource consolidation and management. A cloud consists of physical and virtual resources. From the cloud provider's perspective, cloud performance depends on predicting the dynamic nature of users, user demands and application demands. From the cloud consumer's perspective, the job should be completed on time with minimum cost and limited resources. Finding an optimal resource allocation is difficult in huge systems like clusters, data centres and grids. In this study we present two types of resource allocation schemes, Commitment Allocation (CA) and Over-Commitment Allocation (OCA), at the physical and virtual resource levels. These resource allocation schemes help to identify the virtual resource utilization and physical resource availability.
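
    A sketch of the admission test that separates the two schemes; the overcommit factor is an assumed parameter for illustration, not a value from the study.

        def can_place(requested, allocated, capacity, overcommit=1.0):
            """CA uses overcommit=1.0 (never promise more than physical capacity);
            OCA admits up to overcommit * capacity, betting that virtual
            resources are rarely all fully used at the same time."""
            return allocated + requested <= capacity * overcommit

        print(can_place(4, 14, 16))                  # CA:  18 > 16  -> False
        print(can_place(4, 14, 16, overcommit=1.5))  # OCA: 18 <= 24 -> True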

  2. What's New in Software? Current Sources of Information Boost Effectiveness of Computer-Assisted Instruction.

    Science.gov (United States)

    Ellsworth, Nancy J.

    1990-01-01

    This article reviews current resources on computer-assisted instruction. Included are sources of software and hardware evaluations, advances in current technology, research, an information hotline, and inventories of available technological assistance. (DB)

  3. Optimal Joint Multiple Resource Allocation Method for Cloud Computing Environments

    CERN Document Server

    Kuribayashi, Shin-ichi

    2011-01-01

    Cloud computing is a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources. To provide cloud computing services economically, it is important to optimize resource allocation under the assumption that the required resource can be taken from a shared resource pool. In addition, to be able to provide processing ability and storage capacity, it is necessary to allocate bandwidth to access them at the same time. This paper proposes an optimal resource allocation method for cloud computing environments. First, this paper develops a resource allocation model of cloud computing environments, assuming both processing ability and bandwidth are allocated simultaneously to each service request and rented out on an hourly basis. The allocated resources are dedicated to each service request. Next, this paper proposes an optimal joint multiple resource allocation method, based on the above resource allocation model. It is demonstrated by simulation evaluation that the p...
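
    The allocation model described above admits a compact formulation; the symbols here are assumed for illustration. With shared pools P of processing ability and B of bandwidth, request i needing (p_i, b_i) for its rental period, and x_i an admission variable:

        \max_{x} \sum_i x_i
        \quad \text{s.t.} \quad
        \sum_i x_i\, p_i \le P, \qquad
        \sum_i x_i\, b_i \le B, \qquad
        x_i \in \{0, 1\}

    capturing the paper's point that processing and bandwidth must be granted simultaneously: admitting a request consumes both pools at once.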

  4. Scalable resource management in high performance computers.

    Energy Technology Data Exchange (ETDEWEB)

    Frachtenberg, E. (Eitan); Petrini, F. (Fabrizio); Fernandez Peinador, J. (Juan); Coll, S. (Salvador)

    2002-01-01

    Clusters of workstations have emerged as an important platform for building cost-effective, scalable and highly-available computers. Although many hardware solutions are available today, the largest challenge in making large-scale clusters usable lies in the system software. In this paper we present STORM, a resource management tool designed to provide scalability, low overhead and the flexibility necessary to efficiently support and analyze a wide range of job scheduling algorithms. STORM achieves these feats by closely integrating the management daemons with the low-level features that are common in state-of-the-art high-performance system area networks. The architecture of STORM is based on three main technical innovations. First, a sizable part of the scheduler runs in the thread processor located on the network interface. Second, we use hardware collectives that are highly scalable both for implementing control heartbeats and to distribute the binary of a parallel job in near-constant time, irrespective of job and machine sizes. Third, we use an I/O bypass protocol that allows fast data movements from the file system to the communication buffers in the network interface and vice versa. The experimental results show that STORM can launch a job with a binary of 12MB on a 64 processor/32 node cluster in less than 0.25 sec on an empty network, in less than 0.45 sec when all the processors are busy computing other jobs, and in less than 0.65 sec when the network is flooded with a background traffic. This paper provides experimental and analytical evidence that these results scale to a much larger number of nodes. To the best of our knowledge, STORM is at least two orders of magnitude faster than existing production schedulers in launching jobs, performing resource management tasks and gang scheduling.

  5. Computational chemistry reviews of current trends v.4

    CERN Document Server

    1999-01-01

    This volume presents a balanced blend of methodological and applied contributions. It supplements well the first three volumes of the series, revealing results of current research in computational chemistry. It also reviews the topographical features of several molecular scalar fields. A brief discussion of topographical concepts is followed by examples of their application to several branches of chemistry.The size of a basis set applied in a calculation determines the amount of computer resources necessary for a particular task. The details of a common strategy - the ab initio model potential

  6. Computing Bounds on Resource Levels for Flexible Plans

    Science.gov (United States)

    Muscettola, Nicola; Rijsman, David

    2009-01-01

    A new algorithm efficiently computes the tightest exact bound on the levels of resources induced by a flexible activity plan. Tightness of bounds is extremely important for computations involved in planning because tight bounds can save potentially exponential amounts of search (through early backtracking and detection of solutions), relative to looser bounds. The bound computed by the new algorithm, denoted the resource-level envelope, constitutes the measure of maximum and minimum consumption of resources at any time for all fixed-time schedules in the flexible plan. At each time, the envelope guarantees that there are two fixed-time instantiations, one that produces the minimum level and one that produces the maximum level. Therefore, the resource-level envelope is the tightest possible resource-level bound for a flexible plan because any tighter bound would exclude the contribution of at least one fixed-time schedule. If the resource-level envelope can be computed efficiently, one could substitute looser bounds that are currently used in the inner cores of constraint-posting scheduling algorithms, with the potential for great improvements in performance. What is needed to reduce the cost of computation is an algorithm the measure of complexity of which is no greater than a low-degree polynomial in N (where N is the number of activities). The new algorithm satisfies this need. In this algorithm, the computation of resource-level envelopes is based on a novel combination of (1) the theory of shortest paths in the temporal-constraint network for the flexible plan and (2) the theory of maximum flows for a flow network derived from the temporal and resource constraints. The measure of asymptotic complexity of the algorithm is O(N · O(maxflow(N))), where O(x) denotes an amount of computing time or a number of arithmetic operations proportional to a number of the order of x and O(maxflow(N)) is the measure of complexity (and thus of cost) of a maximum-flow...
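
    In notation assumed here (not the paper's), let S be the set of fixed-time schedules consistent with the flexible plan's temporal constraints, and let level(s, t) be the resource level at time t under schedule s. The resource-level envelope is the pair of functions

        L_{\max}(t) = \max_{s \in S} \mathrm{level}(s, t),
        \qquad
        L_{\min}(t) = \min_{s \in S} \mathrm{level}(s, t)

    and the tightness claim is that for every t both extrema are attained by actual schedules in S, so no tighter bound can be correct.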

  7. Contract on using computer resources of another

    Directory of Open Access Journals (Sweden)

    Cvetković Mihajlo

    2016-01-01

    Contractual relations involving the use of another's property are quite common. Yet, the use of the computer resources of others over the Internet, and the legal transactions arising thereof, certainly diverge from the traditional framework embodied in the special part of contract law dealing with this issue. Modern performance concepts (such as infrastructure, software or platform as high-tech services) are highly unlikely to be described by terminology derived from Roman law. The overwhelming novelty of high-tech services obscures the disadvantageous position of the contracting parties. In most cases, service providers are global multinational companies which tend to secure their own unjustified privileges and gain by providing lengthy and intricate contracts, often comprising a number of legal documents. General terms and conditions in these service provision contracts are further complicated by the 'service level agreement', rules of conduct and (non)confidentiality guarantees. Without giving the issue a second thought, users easily accept the pre-fabricated offer without reservations, unaware that such a pseudo-gratuitous contract actually conceals a highly lucrative and mutually binding agreement. The author examines the extent to which the legal provisions governing the sale of goods and services, lease, loan and commodatum may apply to 'cloud computing' contracts, and analyses the scope and advantages of contractual consumer protection, as a relatively new area in contract law. The termination of a service contract between the provider and the user features specific post-contractual obligations which are inherent to an online environment.

  8. Current perspectives in contaminant hydrology and water resources sustainability

    Science.gov (United States)

    Bradley, Paul M.

    2013-01-01

    Human society depends on liquid freshwater resources to meet drinking, sanitation and hygiene, agriculture, and industry needs. Improved resource monitoring and better understanding of the anthropogenic threats to freshwater environments are critical to efficient management of freshwater resources and ultimately to the survival and quality of life of the global human population. This book helps address the need for improved freshwater resource monitoring and threat assessment by presenting current reviews and case studies focused on the fate and transport of contaminants in the environment and on the sustainability of groundwater and surface-water resources around the world. It is intended for students and professionals working in hydrology and water resources management.

  9. Overview of water resource assessment in South Africa: Current ...

    African Journals Online (AJOL)

    is essential to planners and designers of water supply schemes and those ... Particular emphasis is given to the evolution of the computer as an ... we now call the historical firm yield. ... In their article, Strategic planning for water resources in ...

  10. THE STRATEGY OF RESOURCE MANAGEMENT BASED ON GRID COMPUTING

    Institute of Scientific and Technical Information of China (English)

    Wang Ruchuan; Han Guangfa; Wang Haiyan

    2006-01-01

    This paper analyzes the shortcomings of the traditional method, according to the resource management method of grid computing based on virtual organizations. It supports the concept of ameliorating resource management with mobile agents and gives the ameliorated resource management model. Also pointed out are the methodology of ameliorating resource management and the way to realize it in practice.

  11. Dynamic Resource Management and Job Scheduling for High Performance Computing

    OpenAIRE

    2016-01-01

    Job scheduling and resource management play an essential role in high-performance computing. Supercomputing resources are usually managed by a batch system, which is responsible for the effective mapping of jobs onto resources (i.e., compute nodes). From the system perspective, a batch system must ensure high system utilization and throughput, while from the user perspective it must ensure fast response times and fairness when allocating resources across jobs. Parallel jobs can be divide...

  12. A study of computer graphics technology in application of communication resource management

    Science.gov (United States)

    Li, Jing; Zhou, Liang; Yang, Fei

    2017-08-01

    With the development of computer technology, computer graphics technology has been widely used. In particular, the success of object-oriented technology and multimedia technology has promoted the development of graphics technology in computer software systems. Computer graphics theory and application technology have therefore become an important topic in the computer field, and computer graphics technology is being applied ever more extensively. In recent years, with the development of the social economy, and especially the rapid development of information technology, the traditional way of communication resource management can no longer effectively meet the needs of resource management. At present, communication resource management still uses the original management tools and methods for equipment management and maintenance, which has brought a lot of problems: it is very difficult for non-professionals to understand the equipment and the situation in communication resource management, resource utilization is relatively low, and managers cannot quickly and accurately grasp the resource conditions. Aiming at the above problems, this paper proposes to introduce computer graphics technology into communication resource management. The introduction of computer graphics not only makes communication resource management more vivid, but also reduces the cost of resource management and improves work efficiency.

  13. Mobile devices and computing cloud resources allocation for interactive applications

    Directory of Open Access Journals (Sweden)

    Krawczyk Henryk

    2017-06-01

    Using mobile devices such as smartphones or iPads for various interactive applications is currently very common. In the case of complex applications, e.g. chess games, the capabilities of these devices are insufficient to run the application in real time. One of the solutions is to use cloud computing. However, there is an optimization problem of mobile device and cloud resource allocation. An iterative heuristic algorithm for application distribution is proposed. The algorithm minimizes the energy cost of application execution under a constrained execution time.
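
    The optimization problem can be stated compactly; the notation is assumed here for illustration. With each task k placed on the device or in the cloud, d_k in {mobile, cloud}, energy cost E(k, d_k), and overall completion time T(d):

        \min_{d} \; \sum_{k} E(k, d_k)
        \quad \text{subject to} \quad
        T(d) \le T_{\max}

    which is the form the abstract describes: minimize the energy cost of execution under a constrained execution time, with the iterative heuristic searching over placements d.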

  14. Computational approaches to analogical reasoning current trends

    CERN Document Server

    Richard, Gilles

    2014-01-01

    Analogical reasoning is known as a powerful mode for drawing plausible conclusions and solving problems. It has been the topic of a huge number of works by philosophers, anthropologists, linguists, psychologists, and computer scientists. As such, it has been early studied in artificial intelligence, with a particular renewal of interest in the last decade. The present volume provides a structured view of current research trends on computational approaches to analogical reasoning. It starts with an overview of the field, with an extensive bibliography. The 14 collected contributions cover a large scope of issues. First, the use of analogical proportions and analogies is explained and discussed in various natural language processing problems, as well as in automated deduction. Then, different formal frameworks for handling analogies are presented, dealing with case-based reasoning, heuristic-driven theory projection, commonsense reasoning about incomplete rule bases, logical proportions induced by similarity an...

  15. NASA Center for Computational Sciences: History and Resources

    Science.gov (United States)

    2000-01-01

    The NASA Center for Computational Sciences (NCCS) has been a leading capacity computing facility, providing a production environment and support resources to address the challenges facing the Earth and space sciences research community.

  16. Analysis on the Application of Cloud Computing to the Teaching Resources Sharing Construction in Colleges and Universities

    Institute of Scientific and Technical Information of China (English)

    LIU Mi

    2015-01-01

    Cloud computing is a new computing model. The application of cloud computing to the field of higher education informatization has recently become very popular. In this paper, the concept and characteristics of cloud computing are introduced, the current situation of teaching resource sharing and construction in colleges and universities is analyzed, and finally the influence of cloud computing on the construction of teaching information resources is discussed.

  17. Computing at DESY — current setup, trends and strategic directions

    Science.gov (United States)

    Ernst, Michael

    1998-05-01

    Since the HERA experiments H1 and ZEUS started data taking in '92, the computing environment at DESY has changed dramatically. After running mainframe-centred computing for more than 20 years, DESY switched to a heterogeneous, fully distributed computing environment within only about two years in almost every corner where computing has its applications. The computing strategy was highly influenced by the needs of the user community. The collaborations are usually limited by current technology, and their ever-increasing demands are the driving force for central computing to always move close to the technology edge. While DESY's central computing has multi-decade experience in running Central Data Recording/Central Data Processing for HEP experiments, the most challenging task today is to provide clear and homogeneous concepts in the desktop area. Given that lowest-level commodity hardware draws more and more attention, combined with the financial constraints we are already facing today, we quickly need concepts for integrated support of a versatile device which has the potential to move into basically any computing area in HEP. Though commercial solutions, especially those addressing the PC management/support issues, are expected to come to market in the next 2-3 years, we need to provide suitable solutions now. Buying PCs at DESY currently at a rate of about 30/month will otherwise absorb any available manpower in central computing and still leave hundreds of unhappy people alone. Though certainly not the only area, the desktop issue is one of the most important ones where we need HEP-wide collaboration to a large extent, and right now. Taking into account that there is traditionally no room for R&D at DESY, collaboration, meaning sharing experience and development resources within the HEP community, is a predominant factor for us.

  18. Computation of current distributions using FEMLAB

    Energy Technology Data Exchange (ETDEWEB)

    Shankar, M.S.; Pullabhotla, S.R. [Vellore Institute of Technology, Tamilnadu (India); Vijayasekaran, B. [Central Electrochemical Research Institute, Tamilnadu (India); Chemical Engineering, Tennessee Technological University, Tennessee (United States); Basha, C.A.

    2009-04-15

    An efficient method for the computation of current density and surface concentration distributions in electrochemical processes is analyzed using the commercial mathematical software FEMLAB. To illustrate the utility of the software, the procedure is applied to some realistic problems encountered in electrochemical engineering, such as current distribution in a continuous moving electrode, parallel plate electrode, Hull cell, curvilinear Hull cell, thin layer galvanic cell, through-hole plating, and a recessed disc electrode. The model equations of the above cases are considered and their implementation in the software, FEMLAB, is analyzed. The technique is attractive because it involves a systematic way of coupling equations to perform case studies. (Abstract Copyright [2009], Wiley Periodicals, Inc.)
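
    FEMLAB (now COMSOL) is commercial finite-element software, so its API is not reproduced here; as a rough stand-in, the sketch below computes a primary current distribution in a 2-D rectangular cell by finite-difference relaxation of Laplace's equation, with an anode on the left wall and a cathode covering only part of the right wall. Grid size, conductivity, and potentials are invented.

    ```python
    import numpy as np

    n, kappa = 50, 10.0                  # grid points per side, conductivity (S/m)
    half = n // 2
    phi = np.zeros((n, n))               # electric potential on the unit square

    for _ in range(5000):                # Jacobi relaxation of Laplace's equation
        phi[1:-1, 1:-1] = 0.25 * (phi[:-2, 1:-1] + phi[2:, 1:-1]
                                  + phi[1:-1, :-2] + phi[1:-1, 2:])
        phi[:, 0] = 1.0                  # anode: whole left wall held at 1 V
        phi[:half, -1] = 0.0             # cathode: lower part of right wall at 0 V
        phi[half:, -1] = phi[half:, -2]  # rest of the right wall is insulating
        phi[0, :] = phi[1, :]            # insulating top wall (zero normal gradient)
        phi[-1, :] = phi[-2, :]          # insulating bottom wall

    h = 1.0 / (n - 1)
    # normal current density along the cathode, j = -kappa * dphi/dx (A/m^2)
    j_cathode = -kappa * (phi[:half, -1] - phi[:half, -2]) / h
    print(j_cathode.round(2))            # nonuniform: largest near the electrode edge
    ```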

  19. Integration of Cloud resources in the LHCb Distributed Computing

    CERN Document Server

    Ubeda Garcia, Mario; Stagni, Federico; Cabarrou, Baptiste; Rauschmayr, Nathalie; Charpentier, Philippe; Closier, Joel

    2014-01-01

    This contribution describes how Cloud resources have been integrated in the LHCb Distributed Computing. LHCb uses its specific DIRAC extension (LHCbDirac) as an interware for its Distributed Computing. So far, it has seamlessly integrated Grid resources and computer clusters. The cloud extension of DIRAC (VMDIRAC) allows the integration of Cloud computing infrastructures. It is able to interact with multiple types of infrastructures in commercial and institutional clouds, supported by multiple interfaces (Amazon EC2, OpenNebula, OpenStack and CloudStack), and it instantiates, monitors and manages Virtual Machines running on this aggregation of Cloud resources. Moreover, specifications for institutional Cloud resources proposed by the Worldwide LHC Computing Grid (WLCG), mainly by the High Energy Physics Unix Information Exchange (HEPiX) group, have been taken into account. Several initiatives and computing resource providers in the eScience environment have already deployed IaaS in production during 2013. Keepin...

  20. Using Current Resources to Implement Wellness Programming for Preschoolers

    Science.gov (United States)

    Cirignano, Sherri M.

    2013-01-01

    Currently, there is a nationwide effort to include preschool-aged children in wellness efforts for the prevention of obesity. National resources include guidelines, best practices, and tip sheets to assist in the implementation of these interventions. The Let's Move! Child Care Checklist is a resource that can be used to assess the level at…

  1. Bulgarian-Polish Language Resources (Current State and Future Development)

    Directory of Open Access Journals (Sweden)

    Ludmila Dimitrova

    2015-06-01

    Full Text Available Bulgarian-Polish Language Resources (Current State and Future Development) The paper briefly reviews the first Bulgarian-Polish digital bilingual resources: corpora and dictionaries, which are currently being developed under a bilateral collaboration between IMI-BAS and ISS-PAS: the joint research project “Semantics and contrastive linguistics with a focus on a bilingual electronic dictionary”, coordinated by L. Dimitrova (IMI-BAS) and V. Koseska (ISS-PAS).

  2. Resource estimation in high performance medical image computing.

    Science.gov (United States)

    Banalagay, Rueben; Covington, Kelsie Jade; Wilkes, D M; Landman, Bennett A

    2014-10-01

    Medical imaging analysis processes often involve the concatenation of many steps (e.g., multi-stage scripts) to integrate and realize advancements from image acquisition, image processing, and computational analysis. With the dramatic increase in data size for medical imaging studies (e.g., improved resolution, higher throughput acquisition, shared databases), interesting study designs are becoming intractable or impractical on individual workstations and servers. Modern pipeline environments provide control structures to distribute computational load in high performance computing (HPC) environments. However, high performance computing environments are often shared resources, and scheduling computation across these resources necessitates higher level modeling of resource utilization. Submission of 'jobs' requires an estimate of the CPU runtime and memory usage. The resource requirements for medical image processing algorithms are difficult to predict since the requirements can vary greatly between different machines, different execution instances, and different data inputs. Poor resource estimates can lead to wasted resources in high performance environments due to incomplete executions and extended queue wait times. Hence, resource estimation is becoming a major hurdle for medical image processing algorithms to efficiently leverage high performance computing environments. Herein, we present our implementation of a resource estimation system to overcome these difficulties and ultimately provide users with the ability to more efficiently utilize high performance computing resources.
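
    The paper's estimator itself is not given in the abstract; a minimal sketch of the general idea, assuming runtime and memory grow roughly linearly with a single input feature such as voxel count, could fit past executions and pad the prediction with a safety factor before job submission. The history values and the 25% padding below are invented.

    ```python
    # Hypothetical resource estimator: learn CPU-time and memory predictions from
    # past executions, keyed on a simple input feature (image size in megavoxels).
    import numpy as np

    history = [  # (megavoxels, observed runtime s, observed peak memory MB)
        (1.0, 62.0, 900.0), (2.1, 118.0, 1750.0),
        (4.0, 241.0, 3300.0), (8.2, 495.0, 6600.0),
    ]
    x = np.array([h[0] for h in history])
    runtime = np.array([h[1] for h in history])
    memory = np.array([h[2] for h in history])

    a, b = np.polyfit(x, runtime, 1)   # least-squares fit: runtime ~ a*x + b
    c, d = np.polyfit(x, memory, 1)    # least-squares fit: memory  ~ c*x + d

    def estimate(megavoxels, safety=1.25):
        """Return padded (runtime, memory) estimates for a batch submission."""
        return safety * (a * megavoxels + b), safety * (c * megavoxels + d)

    print(estimate(6.0))
    ```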

  3. PERFORMANCE IMPROVEMENT IN CLOUD COMPUTING USING RESOURCE CLUSTERING

    Directory of Open Access Journals (Sweden)

    G. Malathy

    2013-01-01

    Full Text Available Cloud computing is a computing paradigm in which various tasks are assigned to a combination of connections, software and services that can be accessed over the network. The computing resources and services can be efficiently delivered and utilized, making the vision of computing utility realizable. In various applications, the execution of services with a large number of tasks has to be performed with minimum inter-task communication. The applications are likely to exhibit different patterns and levels, and the distributed resources organize into various topologies for information and query dissemination. In a distributed system, resource discovery is a significant process for finding appropriate nodes. Earlier resource discovery mechanisms in cloud systems rely on recent observations. In this study, the resource usage distributions for a group of nodes with identical resource usage patterns are identified and kept as a cluster; this is named the resource clustering approach. The approach is modeled using CloudSim, a toolkit for modeling and simulating cloud computing environments, and the evaluation improves the performance of the system in the usage of the resources. Results show that resource clusters are able to provide high accuracy for resource discovery.
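
    The record does not include the clustering procedure, so the sketch below illustrates the underlying pattern with a plain k-means over per-node usage vectors: nodes with similar CPU/memory/network profiles land in one cluster, and discovery can then query one representative per cluster. The node vectors and the choice of k are invented.

    ```python
    # Toy k-means over node usage vectors (an assumption; the paper used CloudSim).
    import random

    def kmeans(points, k, iters=20):
        centers = random.sample(points, k)
        for _ in range(iters):
            clusters = [[] for _ in range(k)]
            for p in points:   # assign each node to its nearest center
                i = min(range(k),
                        key=lambda j: sum((a - b) ** 2 for a, b in zip(p, centers[j])))
                clusters[i].append(p)
            for j, cl in enumerate(clusters):
                if cl:         # recompute each center as the cluster mean
                    centers[j] = tuple(sum(c) / len(cl) for c in zip(*cl))
        return centers, clusters

    # each node described by (cpu_util, mem_util, net_util) averages
    nodes = [(0.9, 0.8, 0.2), (0.85, 0.75, 0.3), (0.1, 0.2, 0.9), (0.15, 0.25, 0.8)]
    centers, clusters = kmeans(nodes, k=2)
    print(centers)
    ```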

  4. Resource Centered Computing delivering high parallel performance

    OpenAIRE

    2014-01-01

    Modern parallel programming requires a combination of different paradigms, expertise and tuning, that correspond to the different levels in today's hierarchical architectures. To cope with the inherent difficulty, ORWL (ordered read-write locks) presents a new paradigm and toolbox centered around local or remote resources, such as data, processors or accelerators. ORWL programmers describe their computation in terms of access to these resources during critical sections. Exclu...

  5. Resource management in utility and cloud computing

    CERN Document Server

    Zhao, Han

    2013-01-01

    This SpringerBrief reviews the existing market-oriented strategies for economically managing resource allocation in distributed systems. It describes three new schemes that address cost-efficiency, user incentives, and allocation fairness with regard to different scheduling contexts. The first scheme, taking the Amazon EC2 market as a case of study, investigates the optimal resource rental planning models based on linear integer programming and stochastic optimization techniques. This model is useful to explore the interaction between the cloud infrastructure provider and the cloud resource c

  6. A Matchmaking Strategy Of Mixed Resource On Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Wisam Elshareef

    2015-08-01

    Full Text Available Today cloud computing has become a key technology for online allotment of computing resources and online storage of user data at a lower cost, where computing resources are available all the time over the Internet on a pay-per-use basis. Recently there has been a growing need for resource management strategies in a cloud computing environment that encompass both end-user satisfaction and a high job submission throughput with appropriate scheduling. One of the major and essential issues in resource management is allocating incoming tasks to suitable virtual machines (matchmaking). The main objective of this paper is to propose a matchmaking strategy between the incoming requests and the various resources in the cloud environment to satisfy user requirements and to balance the workload on resources. Load balancing is an important aspect of resource management in a cloud computing environment. This paper therefore proposes a dynamic weight active monitor (DWAM) load-balancing algorithm, which allocates incoming requests on the fly to all available virtual machines in an efficient manner in order to achieve better performance parameters such as response time, processing time and resource utilization. The feasibility of the proposed algorithm is analyzed using the Cloudsim simulator, which proves the superiority of the proposed DWAM algorithm over its counterparts in the literature. Simulation results demonstrate that the proposed algorithm dramatically improves response time and data processing time and makes better use of resources compared with the Active Monitor and VM-assign algorithms.
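
    The DWAM algorithm is described only at a high level, but its core pattern, dispatching each incoming request on the fly to the virtual machine with the lowest weighted load, can be sketched as follows. The weights, VM fields, and load-update rule are assumptions for illustration.

    ```python
    # Weighted least-loaded dispatch (a hedged reading of the DWAM idea).
    vms = [
        {"id": "vm1", "cpu": 0.6, "mem": 0.4, "jobs": 3},
        {"id": "vm2", "cpu": 0.2, "mem": 0.7, "jobs": 5},
        {"id": "vm3", "cpu": 0.3, "mem": 0.3, "jobs": 2},
    ]
    weights = {"cpu": 0.5, "mem": 0.3, "jobs": 0.2}   # invented metric weights

    def weighted_load(vm, max_jobs=10):
        return (weights["cpu"] * vm["cpu"] + weights["mem"] * vm["mem"]
                + weights["jobs"] * vm["jobs"] / max_jobs)

    def dispatch(request):
        target = min(vms, key=weighted_load)   # pick the least weighted-loaded VM
        target["jobs"] += 1                    # the monitor tracks load as requests land
        return target["id"]

    print([dispatch(r) for r in range(4)])
    ```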

  7. Research on Cloud Computing Resources Provisioning Based on Reinforcement Learning

    Directory of Open Access Journals (Sweden)

    Zhiping Peng

    2015-01-01

    Full Text Available As one of the core issues for cloud computing, resource management adopts virtualization technology to shield the underlying resource heterogeneity and complexity, which makes the massive distributed resources form a unified giant resource pool. Efficient resource provisioning can be achieved by rationally implementing resource management methods and techniques. Therefore, how to manage cloud computing resources effectively becomes a challenging research topic. By analyzing the execution progress of a user job in the cloud computing environment, we propose a novel resource provisioning scheme based on reinforcement learning and queuing theory in this study. With the introduction of the concepts of Segmentation Service Level Agreement (SSLA) and Utilization Unit Time Cost (UUTC), we viewed the resource provisioning problem in cloud computing as a sequential decision issue, and then we designed a novel optimization objective function and employed reinforcement learning to solve it. Experiment results not only demonstrated the effectiveness of the proposed scheme, but also showed it to outperform common methods in terms of resource utilization rate, SLA violation avoidance and user costs.
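
    The SSLA/UUTC formulation is not spelled out in the abstract, so the sketch below shows only the provisioning-as-sequential-decision idea with a generic tabular Q-learner: the state is a coarse load level, the actions add, keep, or remove a VM, and the reward trades an SLA penalty against VM cost. All states, rewards, and the toy environment are invented.

    ```python
    import random

    states = ["low", "medium", "high"]             # coarse observed load level
    actions = [-1, 0, +1]                          # remove a VM, keep, add a VM
    Q = {(s, a): 0.0 for s in states for a in actions}
    alpha, gamma, eps = 0.1, 0.9, 0.2              # learning rate, discount, exploration

    def reward(state, action):
        sla_penalty = {"low": 0.0, "medium": 1.0, "high": 5.0}[state]
        vm_cost = 1.0 if action > 0 else 0.0       # renting one more VM has a price
        return -(sla_penalty + vm_cost)

    def step(state, action):
        # toy environment: extra capacity pushes load down, less capacity pushes it up
        i = states.index(state) - action + random.choice([-1, 0, 1])
        return states[max(0, min(2, i))]

    state = "medium"
    for _ in range(5000):
        a = (random.choice(actions) if random.random() < eps
             else max(actions, key=lambda x: Q[(state, x)]))
        nxt = step(state, a)
        target = reward(state, a) + gamma * max(Q[(nxt, x)] for x in actions)
        Q[(state, a)] += alpha * (target - Q[(state, a)])
        state = nxt

    print({s: max(actions, key=lambda x: Q[(s, x)]) for s in states})  # learned policy
    ```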

  8. Resource Provisioning in SLA-Based Cluster Computing

    Science.gov (United States)

    Xiong, Kaiqi; Suh, Sang

    Cluster computing is excellent for parallel computation. It has become increasingly popular. In cluster computing, a service level agreement (SLA) is a set of quality of services (QoS) and a fee agreed between a customer and an application service provider. It plays an important role in an e-business application. An application service provider uses a set of cluster computing resources to support e-business applications subject to an SLA. In this paper, the QoS includes percentile response time and cluster utilization. We present an approach for resource provisioning in such an environment that minimizes the total cost of cluster computing resources used by an application service provider for an e-business application that often requires parallel computation for high service performance, availability, and reliability while satisfying a QoS and a fee negotiated between a customer and the application service provider. Simulation experiments demonstrate the applicability of the approach.
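
    As a rough illustration of the provisioning question (not the paper's model), the sketch below simulates a multi-server FCFS cluster with Poisson arrivals and exponential service, then searches for the smallest node count whose simulated 95th-percentile response time meets an assumed SLA. All rates and the SLA value are invented.

    ```python
    import random

    def simulate_p95(nodes, arrivals=2000, service=0.05):
        """Simulated 95th-percentile response time for a given cluster size."""
        random.seed(1)                   # deterministic, so sizes compare fairly
        busy_until = [0.0] * nodes
        times, t = [], 0.0
        for _ in range(arrivals):
            t += random.expovariate(30.0)                 # Poisson arrivals, 30/s
            k = min(range(nodes), key=lambda i: busy_until[i])
            start = max(t, busy_until[k])                 # wait for the freest node
            busy_until[k] = start + random.expovariate(1.0 / service)
            times.append(busy_until[k] - t)               # response = wait + service
        return sorted(times)[int(0.95 * len(times))]

    sla_p95 = 0.2   # seconds, the assumed percentile response-time SLA
    nodes = next(n for n in range(1, 32) if simulate_p95(n) <= sla_p95)
    print(nodes)    # cheapest cluster size that still honors the SLA
    ```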

  9. Resource-efficient linear optical quantum computation.

    Science.gov (United States)

    Browne, Daniel E; Rudolph, Terry

    2005-07-01

    We introduce a scheme for linear optics quantum computation that makes no use of teleported gates and requires stable interferometry over only the coherence length of the photons. We achieve a much greater degree of efficiency and a simpler implementation than previous proposals. We follow the "cluster state" measurement-based quantum computational approach, and show how cluster states may be efficiently generated from pairs of maximally polarization-entangled photons using linear optical elements. We demonstrate the universality and usefulness of generic parity measurements, as well as introducing the use of redundant encoding of qubits to enable the utilization of destructive measurements--both features of use in a more general context.

  10. Resource requirements for digital computations on electrooptical systems.

    Science.gov (United States)

    Eshaghian, M M; Panda, D K; Kumar, V K

    1991-03-10

    In this paper we study the resource requirements of electrooptical organizations in performing digital computing tasks. We define a generic model of parallel computation using optical interconnects, called the optical model of computation (OMC). In this model, computation is performed in digital electronics and communication is performed using free space optics. Using this model we derive relationships between information transfer and computational resources in solving a given problem. To illustrate our results, we concentrate on a computationally intensive operation, 2-D digital image convolution. Irrespective of the input/output scheme and the order of computation, we show a lower bound of Ω(nw) on the optical volume required for convolving a w x w kernel with an n x n image, if the input bits are given to the system only once.

  11. Resource requirements for digital computations on electrooptical systems

    Science.gov (United States)

    Eshaghian, Mary M.; Panda, Dhabaleswar K.; Kumar, V. K. Prasanna

    1991-03-01

    The resource requirements of electrooptical organizations in performing digital computing tasks are studied via a generic model of parallel computation using optical interconnects, called the 'optical model of computation' (OMC). In this model, computation is performed in digital electronics and communication is performed using free space optics. Relationships between information transfer and computational resources in solving a given problem are derived. A computationally intensive operation, two-dimensional digital image convolution is undertaken. Irrespective of the input/output scheme and the order of computation, a lower bound of Omega(nw) is obtained on the optical volume required for convolving a w x w kernel with an n x n image, if the input bits are given to the system only once.

  12. Data-centric computing on distributed resources

    NARCIS (Netherlands)

    Cushing, R.S.

    2015-01-01

    Distributed computing has always been a challenge due to the NP-completeness of finding optimal underlying management routines. The advent of big data increases the dimensionality of the problem, whereby data partitionability, processing complexity and locality play a crucial role in the effectiveness

  13. Allocation Strategies of Virtual Resources in Cloud-Computing Networks

    Directory of Open Access Journals (Sweden)

    D.Giridhar Kumar

    2014-11-01

    Full Text Available In distributed computing, cloud computing facilitates a pay-per-use model according to user demand and requirements. A collection of virtual machines, including both computational and storage resources, forms the cloud. In cloud computing, the main objective is to provide efficient access to remote and geographically distributed resources. The cloud faces many challenges, one of them being the scheduling/allocation problem. Scheduling refers to a set of policies to control the order of work to be performed by a computer system. A good scheduler adapts its allocation strategy according to the changing environment and the type of task. In this paper we examine FCFS and Round Robin scheduling, in addition to a linear integer programming approach to resource allocation.
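
    As a point of reference for the two classical policies named above (the integer-programming formulation is not given in the record), a minimal sketch of FCFS and Round Robin dispatch might look like this; the task and VM names are placeholders.

    ```python
    from collections import deque
    from itertools import cycle

    vms = ["vm-a", "vm-b", "vm-c"]

    def fcfs(tasks):
        """Serve tasks strictly in arrival order while free VMs last."""
        queue, schedule, free = deque(tasks), [], list(vms)
        while queue and free:
            schedule.append((queue.popleft(), free.pop(0)))
        return schedule, list(queue)   # leftover tasks wait for a VM to free up

    def round_robin(tasks):
        """Spread tasks evenly across VMs in rotation."""
        return list(zip(tasks, cycle(vms)))

    print(fcfs(["t1", "t2", "t3", "t4"]))
    print(round_robin(["t1", "t2", "t3", "t4"]))
    ```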

  14. Index of current water-resources activities in Ohio, 1985

    Science.gov (United States)

    Eberle, Michael

    1985-01-01

    This report summarizes the U.S. Geological Survey's Water Resources Division's program in Ohio in 1985. The work of the Ohio District is carried out through the District office in Columbus and a field office in New Philadelphia. Collection of the basic data needed for continuing determination and evaluation of the quantity, quality, and use of Ohio's water resources is the responsibility of the District's Hydrologic Surveillance Section. The Hydrologic Investigations Section conducts analytical and interpretive water-resource appraisals describing the occurrence, availability, and the physical, chemical, and biological characteristics of surface and groundwater. In addition to introductory material describing the structure of the Ohio District, information is presented on current projects, sites at which basic surface- and groundwater data are collected, and reports on Ohio's water resources published by the U.S. Geological Survey and cooperating agencies. (USGS)

  15. An Efficient Algorithm for Resource Allocation in Parallel and Distributed Computing Systems

    Directory of Open Access Journals (Sweden)

    S.F. El-Zoghdy

    2013-03-01

    Full Text Available Resource allocation in heterogeneous parallel and distributed computing systems is the process of allocating user tasks to processing elements for execution such that some performance objective is optimized. In this paper, a new resource allocation algorithm for the computing grid environment is proposed. It takes into account the heterogeneity of the computational resources, and it resolves the single-point-of-failure problem from which many of the current algorithms suffer. In this algorithm, any site manager receives two kinds of tasks, namely remote tasks arriving from its associated local grid manager, and local tasks submitted directly to the site manager by local users in its domain. It allocates the grid workload based on the resource occupation ratio and the communication cost. The grid overall mean task response time is considered as the main performance metric to be minimized. The simulation results show that the proposed resource allocation algorithm improves the grid overall mean task response time.
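
    The record states the selection rule, preferring sites by resource occupation ratio and communication cost, without giving its exact form, so the following sketch simply scores each site by a weighted sum of the two quantities. The 50/50 weights, the latency normalization, and the site fields are assumptions.

    ```python
    # Score sites by occupation ratio plus (normalized) communication cost; pick the lowest.
    sites = [
        {"name": "site1", "busy_cpus": 40, "cpus": 64,  "latency_ms": 5},
        {"name": "site2", "busy_cpus": 10, "cpus": 32,  "latency_ms": 40},
        {"name": "site3", "busy_cpus": 90, "cpus": 128, "latency_ms": 2},
    ]

    def score(site, w_occ=0.5, w_comm=0.5, max_latency=100.0):
        occupation = site["busy_cpus"] / site["cpus"]     # resource occupation ratio
        comm = site["latency_ms"] / max_latency           # normalized comm. cost
        return w_occ * occupation + w_comm * comm

    best = min(sites, key=score)
    print(best["name"])
    ```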

  16. A global resource for computational chemistry

    OpenAIRE

    2004-01-01

    Describes the creation and curation of the ca. 200,000 molecules and calculations deposited in this collection (WWMM). A modular distributable system has been built for high-throughput computation of molecular structures and properties. It has been used to process 250K compounds from the NCI database and to make the results searchabl...

  17. Genomic resources in fruit plants: an assessment of current status.

    Science.gov (United States)

    Rai, Manoj K; Shekhawat, N S

    2015-01-01

    The availability of many genomic resources such as genome sequences, functional genomics resources including microarrays and RNA-seq, sufficient numbers of molecular markers, expressed sequence tags (ESTs) and high-density genetic maps is causing a rapid acceleration of genetics and genomic research in many fruit plants. This is leading to an increase in our knowledge of the genes that are linked to many horticulturally and agronomically important traits. Recently, some progress has also been made on the identification and functional analysis of miRNAs in some fruit plants. This is one of the most active research fields in plant sciences. The last decade has witnessed the development of genomic resources in many fruit plants such as apple, banana, citrus, grapes, papaya, pears and strawberry; however, many of them are still not being exploited. Furthermore, owing to the lack of resources, infrastructure and research facilities in many lesser-developed countries, the development of genomic resources in many underutilized or less-studied fruit crops, which grow in these countries, is limited. Thus, research emphasis should be given to those fruit crops for which genomic resources are relatively scarce. The development of genomic databases of these less-studied fruit crops will enable biotechnologists to identify target genes that underlie key horticultural and agronomical traits. This review presents an overview of the current status of the development of genomic resources in fruit plants, with the main emphasis on genome sequencing, EST resources, functional genomics resources including microarray and RNA-seq, identification of quantitative trait loci and construction of genetic maps, as well as efforts made on the identification and functional analysis of miRNAs in fruit plants.

  18. Cloud Scheduler: a resource manager for distributed compute clouds

    CERN Document Server

    Armstrong, P; Bishop, A; Charbonneau, A; Desmarais, R; Fransham, K; Hill, N; Gable, I; Gaudet, S; Goliath, S; Impey, R; Leavett-Brown, C; Ouellete, J; Paterson, M; Pritchet, C; Penfold-Brown, D; Podaima, W; Schade, D; Sobie, R J

    2010-01-01

    The availability of Infrastructure-as-a-Service (IaaS) computing clouds gives researchers access to a large set of new resources for running complex scientific applications. However, exploiting cloud resources for large numbers of jobs requires significant effort and expertise. In order to make it simple and transparent for researchers to deploy their applications, we have developed a virtual machine resource manager (Cloud Scheduler) for distributed compute clouds. Cloud Scheduler boots and manages the user-customized virtual machines in response to a user's job submission. We describe the motivation and design of the Cloud Scheduler and present results on its use on both science and commercial clouds.

  19. Computer Usage as Instructional Resources for Vocational Training in Nigeria

    Science.gov (United States)

    Oguzor, Nkasiobi Silas

    2011-01-01

    The use of computers has become the driving force in the delivery of instruction of today's vocational education and training (VET) in Nigeria. Though computers have become an increasingly accessible resource for educators to use in their teaching activities, most teachers are still unable to integrate it in their teaching and learning processes.…

  20. Performance Evaluation of Resource Management in Cloud Computing Environments.

    Science.gov (United States)

    Batista, Bruno Guazzelli; Estrella, Julio Cezar; Ferreira, Carlos Henrique Gomes; Filho, Dionisio Machado Leite; Nakamura, Luis Hideo Vasconcelos; Reiff-Marganiec, Stephan; Santana, Marcos José; Santana, Regina Helena Carlucci

    2015-01-01

    Cloud computing is a computational model in which resource providers can offer on-demand services to clients in a transparent way. However, to be able to guarantee quality of service without limiting the number of accepted requests, providers must be able to dynamically manage the available resources so that they can be optimized. This dynamic resource management is not a trivial task, since it involves meeting several challenges related to workload modeling, virtualization, performance modeling, deployment and monitoring of applications on virtualized resources. This paper carries out a performance evaluation of a module for resource management in a cloud environment that includes handling available resources during execution time and ensuring the quality of service defined in the service level agreement. An analysis was conducted of different resource configurations to define which dimension of resource scaling has a real influence on client requests. The results were used to model and implement a simulated cloud system, in which the allocated resource can be changed on-the-fly, with a corresponding change in price. In this way, the proposed module seeks to satisfy both the client by ensuring quality of service, and the provider by ensuring the best use of resources at a fair price.

  1. Argonne Laboratory Computing Resource Center - FY2004 Report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R.

    2005-04-14

    In the spring of 2002, Argonne National Laboratory founded the Laboratory Computing Resource Center, and in April 2003 LCRC began full operations with Argonne's first teraflops computing cluster. The LCRC's driving mission is to enable and promote computational science and engineering across the Laboratory, primarily by operating computing facilities and supporting application use and development. This report describes the scientific activities, computing facilities, and usage in the first eighteen months of LCRC operation. In this short time LCRC has had broad impact on programs across the Laboratory. The LCRC computing facility, Jazz, is available to the entire Laboratory community. In addition, the LCRC staff provides training in high-performance computing and guidance on application usage, code porting, and algorithm development. All Argonne personnel and collaborators are encouraged to take advantage of this computing resource and to provide input into the vision and plans for computing and computational analysis at Argonne. Steering for LCRC comes from the Computational Science Advisory Committee, composed of computing experts from many Laboratory divisions. The CSAC Allocations Committee makes decisions on individual project allocations for Jazz.

  2. Shared resource control between human and computer

    Science.gov (United States)

    Hendler, James; Wilson, Reid

    1989-01-01

    The advantages of an AI system that actively monitors human control of a shared resource (such as a telerobotic manipulator) are presented. A system is described in which a simple AI planning program gains efficiency by monitoring human actions and recognizing when the actions cause a change in the system's assumed state of the world. This enables the planner to recognize when an interaction occurs between human actions and system goals, and it allows maintenance of up-to-date knowledge of the state of the world. The system can thus inform the operator when a human action would undo a goal achieved by the system, or when an action would render a system goal unachievable, and it efficiently replans the establishment of goals after human intervention.

  3. Current Computer Network Security Issues/Threats

    National Research Council Canada - National Science Library

    Ammar Yassir; Alaa A K Ismaeel

    2016-01-01

    Computer network security has been a subject of concern for a long period. Many efforts have been made to address the existing and emerging threats such as viruses and Trojan among others without any significant success...

  4. National Resource for Computation in Chemistry (NRCC). Attached scientific processors for chemical computations: a report to the chemistry community

    Energy Technology Data Exchange (ETDEWEB)

    Ostlund, N.S.

    1980-01-01

    The demands of chemists for computational resources are well known and have been amply documented. The best and most cost-effective means of providing these resources is still open to discussion, however. This report surveys the field of attached scientific processors (array processors) and attempts to indicate their present and possible future use in computational chemistry. Array processors have the possibility of providing very cost-effective computation. This report attempts to provide information that will assist chemists who might be considering the use of an array processor for their computations. It describes the general ideas and concepts involved in using array processors, the commercial products that are available, and the experiences reported by those currently using them. In surveying the field of array processors, the author makes certain recommendations regarding their use in computational chemistry. 5 figures, 1 table (RWR)

  5. Computer Resources Handbook for Flight Critical Systems.

    Science.gov (United States)

    1985-01-01


  6. Computational thermodynamics in electric current metallurgy

    DEFF Research Database (Denmark)

    Bhowmik, Arghya; Qin, R.S.

    2015-01-01

    A priori derivation for the extra free energy caused by the passing electric current in metal is presented. The analytical expression and its discrete format in support of the numerical calculation of thermodynamics in electric current metallurgy have been developed. This enables the calculation of electric current distribution, current-induced temperature distribution, and the free energy sequence of various phase transitions in multiphase materials. The work is particularly suitable for the study of magnetic materials that contain various magnetic phases; the latter has not been considered in the literature. The method has been validated against the analytical solution of current distribution and experimental observation of microstructure evolution. It provides a basis for the design, prediction and implementation of electric current metallurgy. The applicability of the theory is discussed in the derivations.

  7. CPT White Paper on Tier-1 Computing Resource Needs

    CERN Document Server

    CERN. Geneva. CPT Project

    2006-01-01

    In the summer of 2005, CMS, like the other LHC experiments, published a Computing Technical Design Report (C-TDR) for the LHCC, which describes the CMS computing model as a distributed system of Tier-0, Tier-1, and Tier-2 regional computing centers, plus the CERN analysis facility, the CMS-CAF. The C-TDR contains information on resource needs for the different computing tiers, derived from a set of input assumptions and desiderata on how to achieve high throughput and a robust computing environment. At the CERN Computing Resources Review Board meeting in October 2005, the funding agencies agreed on a Memorandum of Understanding (MoU) describing the worldwide collaboration on LHC computing (WLCG). In preparation for this meeting the LCG project had put together information from countries regarding their pledges for computing resources at Tier-1 and Tier-2 centers. These pledges include the amount of CPU power, disk storage, tape storage library space, and network connectivity for each of the LHC experime...

  8. Resource pre-allocation algorithms for low-energy task scheduling of cloud computing

    Institute of Scientific and Technical Information of China (English)

    Xiaolong Xu; Lingling Cao; Xinheng Wang

    2016-01-01

    In order to lower the power consumption and improve the coefficient of resource utilization of current cloud computing systems, this paper proposes two resource pre-allocation algorithms based on the “shut down the redundant, turn on the demanded” strategy. Firstly, a green cloud computing model is presented, abstracting the task scheduling problem to the virtual machine deployment issue with virtualization technology. Secondly, the future workloads of the system need to be predicted: a cubic exponential smoothing algorithm based on the conservative control (CESCC) strategy is proposed, combined with the current state and resource distribution of the system, in order to calculate the demand of resources for the next period of task requests. Then, a multi-objective constrained optimization model of power consumption and a low-energy resource allocation algorithm based on probabilistic matching (RA-PM) are proposed. In order to reduce the power consumption further, a resource allocation algorithm based on improved simulated annealing (RA-ISA) is designed. Experimental results show that the prediction and conservative control strategy make resource pre-allocation catch up with demands, and improve the efficiency of real-time response and the stability of the system. Both RA-PM and RA-ISA can activate fewer hosts, achieve better load balance among the set of highly applicable hosts, maximize the utilization of resources, and greatly reduce the power consumption of cloud computing systems.
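
    Only the smoothing half of CESCC is sketched here (the conservative-control correction is not detailed in the abstract), using Brown's classical third-order ("cubic") exponential smoothing formulas to forecast the next period's workload. The smoothing constant and the load series are invented.

    ```python
    # Brown's triple ("cubic") exponential smoothing: smooth three times, then
    # extrapolate with the recovered quadratic trend coefficients a, b, c.
    def triple_smoothing_forecast(series, alpha=0.3, m=1):
        s1 = s2 = s3 = series[0]
        for x in series[1:]:
            s1 = alpha * x + (1 - alpha) * s1
            s2 = alpha * s1 + (1 - alpha) * s2
            s3 = alpha * s2 + (1 - alpha) * s3
        a = 3 * s1 - 3 * s2 + s3
        b = (alpha / (2 * (1 - alpha) ** 2)) * (
            (6 - 5 * alpha) * s1 - (10 - 8 * alpha) * s2 + (4 - 3 * alpha) * s3)
        c = (alpha ** 2 / (1 - alpha) ** 2) * (s1 - 2 * s2 + s3)
        return a + b * m + 0.5 * c * m * m   # forecast m periods ahead

    load = [120, 130, 128, 142, 155, 160, 171, 185]   # requests per interval (invented)
    print(triple_smoothing_forecast(load, m=1))        # demand estimate for next period
    ```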

  9. Application-adaptive resource scheduling in a computational grid

    Institute of Scientific and Technical Information of China (English)

    LUAN Cui-ju; SONG Guang-hua; ZHENG Yao

    2006-01-01

    Selecting appropriate resources for running a job efficiently is one of the common objectives in a computational grid. Resource scheduling should consider the specific characteristics of the application, and decide the metrics to be used accordingly. This paper presents a distributed resource scheduling framework mainly consisting of a job scheduler and a local scheduler. In order to meet the requirements of different applications, we adopt HGSA, a Heuristic-based Greedy Scheduling Algorithm, to schedule jobs in the grid, where the heuristic knowledge is the metric weights of the computing resources and the metric workload impact factors. The metric weight is used to control the effect of the metric on the application. For different applications, only the metric weights and the metric workload impact factors need to be changed, while the scheduling algorithm remains the same. Experimental results are presented to demonstrate the adaptability of the HGSA.
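
    HGSA itself is not listed in the record, so the sketch below gives a hedged reading of its core idea: score each resource by a weighted sum of its metrics, give each job the best-scoring resource greedily, and reflect the added workload through an impact adjustment. All metric names, weights, and the 0.1 impact step are assumptions.

    ```python
    # Greedy scheduling driven by per-application metric weights.
    resources = [
        {"name": "r1", "cpu_speed": 0.9, "free_mem": 0.4, "bandwidth": 0.7, "load": 0.6},
        {"name": "r2", "cpu_speed": 0.5, "free_mem": 0.9, "bandwidth": 0.4, "load": 0.2},
    ]

    # compute-bound jobs stress CPU; a data-bound application would swap in other weights
    weights = {"cpu_speed": 0.6, "free_mem": 0.1, "bandwidth": 0.1, "load": -0.2}

    def utility(res):
        return sum(w * res[m] for m, w in weights.items())

    def greedy_schedule(jobs):
        plan = []
        for job in jobs:                      # greedily give each job the best resource
            best = max(resources, key=utility)
            plan.append((job, best["name"]))
            best["load"] += 0.1               # workload impact: account for the new job
        return plan

    print(greedy_schedule(["j1", "j2", "j3"]))
    ```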

  10. EST analysis pipeline: use of distributed computing resources.

    Science.gov (United States)

    González, Francisco Javier; Vizcaíno, Juan Antonio

    2011-01-01

    This chapter describes how a pipeline for the analysis of expressed sequence tag (EST) data can be implemented, based on our previous experience generating ESTs from Trichoderma spp. We focus on key steps in the workflow, such as the processing of raw data from the sequencers, the clustering of ESTs, and the functional annotation of the sequences using BLAST, InterProScan, and BLAST2GO. Some of the steps require the use of intensive computing power. Since these resources are not available to small research groups or institutes without bioinformatics support, an alternative will be described: the use of distributed computing resources (local grids and Amazon EC2).

  11. Using OSG Computing Resources with (iLC)Dirac

    CERN Document Server

    Sailer, Andre

    2017-01-01

    CPU cycles for small experiments and projects can be scarce, so making use of all available resources, whether dedicated or opportunistic, is mandatory. While enabling uniform access to the LCG computing elements (ARC, CREAM), the DIRAC grid interware was not able to use OSG computing elements (GlobusCE, HTCondor-CE) without dedicated support at the grid site through so-called 'SiteDirectors', which submit directly to the local batch system. This in turn requires additional dedicated effort for small experiments at the grid site. Adding interfaces to the OSG CEs through the respective grid middleware therefore allows accessing them within the DIRAC software without additional site-specific infrastructure. This enables greater use of opportunistic resources for experiments and projects without dedicated clusters or an established computing infrastructure with the DIRAC software. To allow sending jobs to HTCondor-CE and legacy Globus computing elements inside DIRAC, the required wrapper classes were develo...

  12. Load Balancing in Local Computational Grids within Resource Allocation Process

    Directory of Open Access Journals (Sweden)

    Rouhollah Golmohammadi

    2012-11-01

    Full Text Available A suitable resource allocation method in computational grids should schedule resources in a way that meets the requirements of both the users and the resource providers; i.e., the maximum number of tasks should be completed within their time and budget constraints, and the received load should be distributed equally between resources. This is a decision-making problem in which the scheduler selects one resource from all available ones. Because different properties of the resources affect this decision, the process is a multi-criteria decision-making problem. The goal of this decision-making process is balancing the load and completing the tasks within their defined constraints. The proposed algorithm is an analytic hierarchy process based Resource Allocation (ARA) method. This method estimates a preference value for each resource and then selects the appropriate resource based on the computed values. The simulations show the ARA method decreases the task failure rate by at least 48% and increases the balance factor by more than 3.4%.
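
    The ARA preference values build on the analytic hierarchy process; a minimal sketch of that step, using the common column-normalization approximation of the principal eigenvector to turn a pairwise comparison matrix into criterion weights, is shown below. The 3x3 judgment matrix and the resource ratings are invented examples.

    ```python
    import numpy as np

    # pairwise judgments: entry [i][j] says how much criterion i matters vs. criterion j
    A = np.array([
        [1.0,   3.0, 5.0],   # CPU speed
        [1/3.,  1.0, 2.0],   # free memory
        [1/5.,  1/2., 1.0],  # current load
    ])
    # normalize each column, then average across rows: approximate priority vector
    weights = (A / A.sum(axis=0)).mean(axis=1)
    print(dict(zip(["cpu", "mem", "load"], weights.round(3))))

    # each resource's preference is its criterion ratings combined with these weights
    ratings = {"siteA": [0.8, 0.5, 0.9], "siteB": [0.6, 0.9, 0.4]}
    print(max(ratings, key=lambda r: np.dot(weights, ratings[r])))
    ```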

  13. Grid Computing: A Collaborative Approach in Distributed Environment for Achieving Parallel Performance and Better Resource Utilization

    Directory of Open Access Journals (Sweden)

    Sashi Tarun

    2011-01-01

    Full Text Available From the very beginning, various measures have been taken or considered for better utilization of the limited resources available in a computer system, because much of the time the system is idle and unable to exploit its resources and capabilities as a whole, causing low performance. Parallel computing can work efficiently where operations are handled by multiple processors independently, without any other processing capabilities: all processing units work in parallel and increase the system throughput without any resource allocation problem among the different processing units. But this is limited to, and effective within, a single machine. In today's computing world, maintaining and establishing a high-speed computational work environment in a distributed scenario is a challenging task, because in such an environment operations depend not on a single resource but on interaction with other resources in a vast network architecture. Current resource management systems can only work smoothly if they apply these resources within their clusters or local organizations, or distributed among the many users who need processing power; for a vast distributed environment, performing various operational activities is difficult because data is not physically maintained in a centralized location but is geographically dispersed across multiple remote computer systems. Computers in a distributed environment have to depend on multiple resources for their task completion. Effective performance with high availability of resources for each computer in this fast distributed computational environment is the major concern. To solve this problem a new approach, called the “Grid Computing” environment, was coined. A grid uses middleware to coordinate disparate resources across a network, allowing users to function as a virtual whole and making computing fast. In this paper I want to

  14. A Survey on Resource Allocation Strategies in Cloud Computing

    Directory of Open Access Journals (Sweden)

    V.Vinothina

    2012-06-01

    Full Text Available Cloud computing has become a new-age technology with huge potential in enterprises and markets. Clouds can make it possible to access applications and associated data from anywhere. Companies are able to rent resources from the cloud for storage and other computational purposes so that their infrastructure cost can be reduced significantly. Further, they can make use of company-wide access to applications based on a pay-as-you-go model, so there is no need to obtain licenses for individual products. However, one of the major pitfalls in cloud computing is related to optimizing the resources being allocated. Because of the uniqueness of the model, resource allocation is performed with the objective of minimizing the associated costs. The other challenges of resource allocation are meeting customer demands and application requirements. In this paper, various resource allocation strategies and their challenges are discussed in detail. It is believed that this paper will benefit both cloud users and researchers in overcoming the challenges faced.

  15. GridFactory - Distributed computing on ephemeral resources

    DEFF Research Database (Denmark)

    Orellana, Frederik; Niinimaki, Marko

    2011-01-01

    A novel batch system for high throughput computing is presented. The system is specifically designed to leverage virtualization and web technology to facilitate deployment on cloud and other ephemeral resources. In particular, it implements a security model suited for forming collaborations...

  16. Quantum Computing: Selected Internet Resources for Librarians, Researchers, and the Casually Curious

    OpenAIRE

    Cirasella, Jill

    2009-01-01

    This article is an annotated selection of the most important and informative Internet resources for learning about quantum computing, finding quantum computing literature, and tracking quantum computing news.

  17. Active resources concept of computation for enterprise software

    Directory of Open Access Journals (Sweden)

    Koryl Maciej

    2017-06-01

    Full Text Available Traditional computational models for enterprise software are still to a great extent centralized. However, the rapid growth of modern computation techniques and frameworks means that contemporary software is becoming more and more distributed. Toward the development of a new, complete and coherent solution for distributed enterprise software construction, a synthesis of three well-grounded concepts is proposed: the Domain-Driven Design technique of software engineering, the REST architectural style, and the actor model of computation. As a result a new resources-based framework arises which, after its first cases of use, seems to be useful and worthy of further research.
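
    To make the synthesis concrete, here is a loose sketch (an assumption, not the paper's framework) of an "active resource": a domain object modeled as an actor that owns its state and is touched only through REST-like messages processed one at a time from a mailbox.

    ```python
    import queue
    import threading

    class ActiveResource(threading.Thread):
        """A domain resource as an actor: private state, message-driven access."""
        def __init__(self, name):
            super().__init__(daemon=True)
            self.name, self.state = name, {}
            self.mailbox = queue.Queue()
            self.start()

        def run(self):
            while True:                          # actor loop: one message at a time
                verb, key, value, reply = self.mailbox.get()
                if verb == "PUT":
                    self.state[key] = value
                    reply.put("200 OK")
                elif verb == "GET":
                    reply.put(self.state.get(key, "404 Not Found"))

        def request(self, verb, key, value=None):
            reply = queue.Queue()                # REST-like request/response pair
            self.mailbox.put((verb, key, value, reply))
            return reply.get()

    order = ActiveResource("order-42")
    print(order.request("PUT", "status", "shipped"))
    print(order.request("GET", "status"))
    ```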

  18. Computing Resource And Work Allocations Using Social Profiles

    Directory of Open Access Journals (Sweden)

    Peter Lavin

    2013-01-01

    Full Text Available If several distributed and disparate computer resources exist, many of which have been created for different and diverse reasons, and several large-scale computing challenges also exist with similar diversity in their backgrounds, then one problem which arises in trying to assemble enough of these resources to address such challenges is the need to align and accommodate the different motivations and objectives which may lie behind the existence of both the resources and the challenges. Software agents are offered as a mainstream technology for modelling the types of collaborations and relationships needed to do this. As an initial step towards forming such relationships, agents need a mechanism to consider social and economic backgrounds. This paper explores addressing social and economic differences using a combination of textual descriptions known as social profiles and search engine technology, both of which are integrated into an agent technology.

  19. [Social and health resources in Catalonia. Current situation].

    Science.gov (United States)

    Bullich-Marín, Ingrid; Sánchez-Ferrín, Pau; Cabanes-Duran, Concepció; Salvà-Casanovas, Antoni

    2017-03-20

    The network of social and health care has advanced since its inception; new services have been created and some resources have been adapted within the framework of the respective health plans. This article presents the current situation of the different social and health resources in Catalonia, as well as the main changes that have occurred in recent years, more specifically in the period of the Health Plan 2011-2015. This period is characterised by an adaptation of the social and health network to the context of chronic care, in which the development of intermediate care resources has become the most relevant aspect. There is also a need to create a single long-term care sector in which health care quality is guaranteed. Moreover, in this period, integrated care across levels is promoted in the health system through greater coordination between the different levels of care. The social and health network, owing to its trajectory and expertise, plays a key role in the quality of care for people with social and medical needs. Copyright © 2017 SEGG. Publicado por Elsevier España, S.L.U. All rights reserved.

  20. Current Status on Resource and Recycling Technology for Rare Earths

    Science.gov (United States)

    Takeda, Osamu; Okabe, Toru H.

    2014-06-01

    The development of recycling technologies for rare earths is essential for resource security and supply stability, because high-quality rare earth mines are concentrated in China and the demand for rare earth metals such as neodymium and dysprosium, used as raw materials in permanent (neodymium) magnets, is expected to increase rapidly in the near future. It is also important to establish a recycling-based society from the perspective of conserving finite and valuable mineral resources and reducing the environmental load associated with mining and smelting. In this article, the current status of rare earth resources as well as that of recycling technology for the magnets is reviewed. The importance of establishing an efficient recycling process for rare earths is discussed on the basis of the characteristics of the rare earth supply chain, and the technological bases of the recycling processes for the magnet are introduced. Further, some fundamental research on the development of new recycling processes based on pyrometallurgical processing is introduced, and the features of the recycling processes are evaluated.

  1. Using Puppet to contextualize computing resources for ATLAS analysis on Google Compute Engine

    Science.gov (United States)

    Öhman, Henrik; Panitkin, Sergey; Hendrix, Valerie; Atlas Collaboration

    2014-06-01

    With the advent of commercial as well as institutional and national clouds, new opportunities for on-demand computing resources become available to the HEP community. The new cloud technologies also come with new challenges, one of which is the contextualization of computing resources with regard to the requirements of the user and his experiment. In particular, on Google's new cloud platform, Google Compute Engine (GCE), upload of users' virtual machine images is not possible. This precludes the application of ready-to-use technologies like CernVM and forces users to build and contextualize their own VM images from scratch. We investigate the use of Puppet to facilitate contextualization of cloud resources on GCE, with particular regard to ease of configuration and dynamic resource scaling.

  2. Dynamic integration of remote cloud resources into local computing clusters

    Energy Technology Data Exchange (ETDEWEB)

    Fleig, Georg; Erli, Guenther; Giffels, Manuel; Hauth, Thomas; Quast, Guenter; Schnepf, Matthias [Institut fuer Experimentelle Kernphysik, Karlsruher Institut fuer Technologie (Germany)

    2016-07-01

    In modern high-energy physics (HEP) experiments enormous amounts of data are analyzed and simulated. Traditionally dedicated HEP computing centers are built or extended to meet this steadily increasing demand for computing resources. Nowadays it is more reasonable and more flexible to utilize computing power at remote data centers providing regular cloud services to users as they can be operated in a more efficient manner. This approach uses virtualization and allows the HEP community to run virtual machines containing a dedicated operating system and transparent access to the required software stack on almost any cloud site. The dynamic management of virtual machines depending on the demand for computing power is essential for cost efficient operation and sharing of resources with other communities. For this purpose the EKP developed the on-demand cloud manager ROCED for dynamic instantiation and integration of virtualized worker nodes into the institute's computing cluster. This contribution will report on the concept of our cloud manager and the implementation utilizing a remote OpenStack cloud site and a shared HPC center (bwForCluster located in Freiburg).

  3. Common accounting system for monitoring the ATLAS Distributed Computing resources

    CERN Document Server

    Karavakis, E; The ATLAS collaboration; Campana, S; Gayazov, S; Jezequel, S; Saiz, P; Sargsyan, L; Schovancova, J; Ueda, I

    2014-01-01

    This paper covers in detail a variety of accounting tools used to monitor the utilisation of the available computational and storage resources within the ATLAS Distributed Computing during the first three years of Large Hadron Collider data taking. The Experiment Dashboard provides a set of common accounting tools that combine monitoring information originating from many different information sources; either generic or ATLAS specific. This set of tools provides quality and scalable solutions that are flexible enough to support the constantly evolving requirements of the ATLAS user community.

  4. Research on message resource optimization in computer supported collaborative design

    Institute of Scientific and Technical Information of China (English)

    张敬谊; 张申生; 陈纯; 王波

    2004-01-01

    An adaptive mechanism is presented to reduce bandwidth usage and to optimize the use of computing resources of the heterogeneous computer mixes utilized in CSCD, in order to reach the goal of collaborative design in distributed-synchronous mode. The mechanism is realized on a C/S architecture based on operation information sharing. Firstly, messages are aggregated into packets on the client. Secondly, an outgoing-message weight priority queue with a traffic adjusting technique is cached on the server. Thirdly, an incoming-message queue is cached on the client. Finally, the results of implementing the proposed scheme in a simple collaborative design environment are presented.
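
    The mechanism is only outlined above, so the following sketch illustrates two of its ingredients in isolation: client-side aggregation of small operations into packets, and a server-side outgoing queue ordered by message weight. The batch size, weight rule, and message fields are assumptions.

    ```python
    import heapq
    import itertools

    counter = itertools.count()     # tie-breaker keeps insertion order per weight
    outgoing = []                   # server-side outgoing-message priority queue

    def enqueue(message, weight):
        # heapq pops the smallest entry first; negate so larger weights go out first
        heapq.heappush(outgoing, (-weight, next(counter), message))

    def aggregate(ops, max_batch=3):
        """Client side: pack small operation records into one packet."""
        for i in range(0, len(ops), max_batch):
            yield {"packet": ops[i:i + max_batch]}

    for packet in aggregate(["move", "resize", "rotate", "color"]):
        enqueue(packet, weight=len(packet["packet"]))   # e.g., bigger batches first

    while outgoing:
        _, _, msg = heapq.heappop(outgoing)
        print(msg)
    ```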

  5. Current Trends in Computer-Based Education in Medicine

    Science.gov (United States)

    Farquhar, Barbara B.; Votaw, Robert G.

    1978-01-01

    Important current trends in the use of computer technology to enhance medical education are reported in the areas of simulation and assessment of clinical competence, curriculum integration, financial support, and means of exchanging views and scientific information. (RAO)

  6. The NASA's Long-Term Global Solar Energy Resource: Current Solar Resource Variability and Future Improvements

    Science.gov (United States)

    Stackhouse, P. W.; Cox, S. J.; Zhang, T.; Chandler, W.; Westberg, D.; Hoell, J. M.

    2011-12-01

    Considering the likelihood of global climate change and the global competition for energy resources, there is an increasing need to provide improved global Earth surface solar resource information. The improved long-term records are needed to better understand and quantify potential shifts in the solar resource with anticipated changes in climatic weather patterns. As part of the World Climate Research Programme's (WCRP) Global Energy and Water Cycle Experiment (GEWEX), NASA has an active Surface Radiation Budget project that has produced long-term global gridded estimates of the surface solar fluxes. These fluxes have been processed and made available to the solar energy community over the years through NASA's Surface meteorology and Solar Energy web site (SSE). This web site provides solar resource and accompanying meteorological variables specifically tailored to the renewable energy community spanning a 22 year period. The web application has been improved over time with usage growing nearly exponentially over the last few years. This paper presents the global and regional variability of the solar resource from the current data available at the SSE web application. The variability is compared for large different spatial scales and compared to other data sets where appropriate. We assess the interannual variability compared against surface sites and other satellite based data sets. These comparisons quantify the limits of usefulness of this data set. For instance, we find long-term linear trends that are dominated by satellite based artifacts in some areas, but agree well with surface measurements in others. Nevertheless, the extremes of solar variability are quantified and show agreement with surface observations good enough for most feasibility studies of solar energy systems. This presentation also contains a description of work currently on going to replace the current solar resource information available on SSE with a completely reprocessed version. The

  7. The current status of research on resources recycling in Korea

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Seung-Hee; Kuh, Sung-Eun; Kim, Dong-Su [Ewha Womans University, Seoul (Korea)

    1999-03-31

    The current domestic research status for resources recycling has been reviewed by surveying the technical and review papers reported in several academic journals. The surveyed articles were classified into several categories, including recycling fields according to the kinds of recyclable materials, applied recycling technologies, organizations where the research was conducted, and references by publication year and region. The survey showed that the recycling of metallurgical waste is being studied most actively; fly ash recycling is also actively investigated. Among recycling technologies, chemical methods are more widely applied than physical ones. As for research-conducting organizations, academic institutes have been more active in recycling research than national/private research institutes and industries. In the reference survey, English-language articles and articles published between 1991 and 1995 are cited most often. (author). 6 refs., 7 tabs., 8 figs.

  8. Multicriteria Resource Brokering in Cloud Computing for Streaming Service

    Directory of Open Access Journals (Sweden)

    Chih-Lun Chou

    2015-01-01

    By leveraging cloud computing such as Infrastructure as a Service (IaaS), outsourcing the computing resources used to support operations, including servers, storage, and networking components, is quite beneficial for various providers of Internet applications. With this increasing trend, resource allocation that both assures QoS via Service Level Agreements (SLAs) and avoids overprovisioning in order to reduce cost becomes a crucial priority and challenge in the design and operation of complex service-based platforms such as streaming services. On the other hand, providers of IaaS are also concerned with their profit performance and energy consumption while offering these virtualized resources. In this paper, considering both service-oriented and infrastructure-oriented criteria, we treat this resource allocation problem as a multicriteria decision-making problem and propose an effective trade-off approach based on a goal programming model. To validate its effectiveness, a cloud architecture for streaming applications is addressed and extensive analysis is performed for the related criteria. The results of numerical simulations show that the proposed approach strikes a commendable balance between these conflicting criteria and achieves high cost efficiency.
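
    A toy sketch of the goal-programming trade-off for a streaming service: scan candidate VM counts and penalize weighted deviations from a QoS goal and a cost goal. All numbers and names are invented, not the paper's model.

        # Weighted goal programming by exhaustive scan over VM counts.
        def goal_programming_choice(demand, capacity_per_vm, cost_per_vm,
                                    qos_goal=0.99, cost_goal=100.0,
                                    w_qos=0.7, w_cost=0.3, max_vms=50):
            best = None
            for n in range(1, max_vms + 1):
                served = min(demand, n * capacity_per_vm) / demand  # fraction of demand met
                cost = n * cost_per_vm
                # deviations below the QoS goal and above the cost goal are penalized
                d_qos = max(0.0, qos_goal - served)
                d_cost = max(0.0, cost - cost_goal) / cost_goal
                score = w_qos * d_qos + w_cost * d_cost
                if best is None or score < best[0]:
                    best = (score, n)
            return best[1]

        print(goal_programming_choice(demand=1000, capacity_per_vm=120, cost_per_vm=15))  # -> 8

    With these weights, a small QoS shortfall is accepted in exchange for one fewer VM; shifting weight toward w_qos pushes the answer back to full coverage.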

  9. Energy based Efficient Resource Scheduling in Green Computing

    Directory of Open Access Journals (Sweden)

    B.Vasumathi,

    2015-11-01

    Cloud computing is an evolving area of efficient utilization of computing resources. Data centers hosting Cloud applications consume massive quantities of energy, contributing to high operating expenditures and carbon footprints. Hence, Green Cloud computing solutions are needed not only to save energy for the environment but also to reduce operating costs. In this paper, we focus on the development of an energy-based resource scheduling framework and present an algorithm that considers the synergy between various data center infrastructures (i.e., software, hardware, etc.) and performance. In particular, this paper proposes (a) architectural principles for energy-efficient management of Clouds and (b) energy-efficient resource allocation strategies and a scheduling algorithm considering Quality of Service (QoS) expectations. The performance of the proposed algorithm has been evaluated against existing energy-based scheduling algorithms. The experimental results demonstrate that this approach is effective in minimizing the cost and energy consumption of Cloud applications, thus moving towards the achievement of Green Clouds.
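
    One plausible reading of such an energy-aware allocation rule, sketched under an assumed linear power model with invented parameters (the paper does not specify its model): place each VM on the active host with the smallest power increase, waking an idle host only when necessary.

        # Energy-aware best-fit placement sketch with a QoS capacity guard.
        def power(util, p_idle, p_max):
            return 0.0 if util == 0 else p_idle + (p_max - p_idle) * util

        def place(vm_load, hosts):
            """hosts: list of dicts with keys 'util', 'cap', 'p_idle', 'p_max'."""
            best, best_delta = None, float("inf")
            for h in hosts:
                new_util = h["util"] + vm_load / h["cap"]
                if new_util > 1.0:          # QoS guard: never overcommit a host
                    continue
                delta = (power(new_util, h["p_idle"], h["p_max"])
                         - power(h["util"], h["p_idle"], h["p_max"]))
                if delta < best_delta:
                    best, best_delta = h, delta
            if best is not None:
                best["util"] += vm_load / best["cap"]
            return best

        hosts = [{"util": 0.5, "cap": 100.0, "p_idle": 100.0, "p_max": 250.0},
                 {"util": 0.0, "cap": 100.0, "p_idle": 80.0,  "p_max": 200.0}]
        print(place(20.0, hosts) is hosts[0])   # prefers the already-active host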

  10. Using Puppet to contextualize computing resources for ATLAS analysis on Google Compute Engine

    CERN Document Server

    Öhman, H; The ATLAS collaboration; Hendrix, V

    2014-01-01

    With the advent of commercial as well as institutional and national clouds, new opportunities for on-demand computing resources become available to the HEP community. With the new cloud technologies come new challenges, one of which is the contextualization of cloud resources with regard to the requirements of the user and their experiment. In particular, on Google's cloud platform, Google Compute Engine (GCE), upload of users' virtual machine images is not possible, which precludes the application of ready-to-use technologies like CernVM and forces users to build and contextualize their own VM images from scratch. We investigate the use of Puppet to facilitate contextualization of cloud resources on GCE, with particular regard to ease of configuration, dynamic resource scaling, and a high degree of scalability.

  11. Pre-allocation Strategies of Computational Resources in Cloud Computing using Adaptive Resonance Theory-2

    CERN Document Server

    Nair, T R Gopalakrishnan

    2012-01-01

    One of the major challenges of cloud computing is the management of request-response coupling and of optimal allocation strategies for computational resources across the various types of service requests. In normal situations, the intelligence required to classify the nature and order of a request using standard methods is insufficient, because requests arrive in random fashion, are meant for multiple resources, and carry different priority orders and varieties. Hence, it becomes essential to identify the trends of the different request streams in every category by automatic classification and to organize pre-allocation strategies in a predictive way. This calls for designs of intelligent modes of interaction between client requests and the cloud computing resource manager. This paper discusses a corresponding scheme using Adaptive Resonance Theory-2.

  12. The current crisis in human resources for health in Africa

    African Journals Online (AJOL)

    resource and training policies, weak institutions, and inappropriate structures [1]. Dimensions of the human resource crisis: … medical personnel are often misused for management … redefinition of functions, reforms in the staffing standards, …

  13. The current state of water resources of Transcarpathia

    Directory of Open Access Journals (Sweden)

    V. І. Nikolaichuk

    2015-07-01

    Throughout their existence, humans have used the water of rivers, lakes and underground sources not only for water supply but also for dumping polluted water and waste. Significant urbanization, the concentration of urban industrial enterprises and transport, increases in mining, the expansion of drainage and irrigation reclamation, plowing of river channels, and the creation of a large number of landfills have resulted in significant, and in some regions critical, depletion and contamination of the surface and ground waters. Because of this disastrous situation, society is growing more and more concerned about the state of the environment, and the public has become increasingly interested in the state of the soil cover, air, water resources, and biotic diversity. The Transcarpathian region (Zakarpattya) is situated in the heart of Europe, bordered by four Central European countries (Poland, Slovakia, Hungary and Romania) and two regions of Ukraine (Lviv and Ivano-Frankivsk regions). It is one of the richest regions of Ukraine in terms of water resources, and its territory is permeated by a dense network of rivers: in total, 9,429 rivers with a combined length of 19,866 km flow through the region, among them the Tysa, Borzhava, Latoryca and Uzh, each over 100 km long. Twenty-five cities and urban settlements of the area are substantially provided with centralized intake of underground drinking water; rural areas have virtually no centralized water supply and rely mainly on domestic wells or boreholes. Predicted resources of underground drinking water in the region are equal to 1,109,300 m3/day. The use of fresh water in 2014 per capita amounted to 23,769 m3, 15% less than in 2009. The main pollutants of surface water bodies are the facilities of utility companies in the region. Analysis of studies of surface water quality in the Transcarpathian region in 2014 shows that water quality meets the …

  14. [Resource activation in clinical psychology and psychotherapy: review of theoretical issues and current research].

    Science.gov (United States)

    Groß, L J; Stemmler, M; de Zwaan, M

    2012-08-01

    This review summarises theoretical issues and current research on working with clients' resources and strengths in clinical psychology and psychotherapy. Resource activation is considered as an important common factor in psychotherapy. In general, resource activation means an explicit focus on resources, strengths and potentials of the clients. After defining the term resources, considerations with regard to therapeutic attitude, principles of resource activation, approaches to resource diagnostics and different research strategies are presented. Current research focuses especially on the relation between resource activation and process variables in out-patient treatment.

  15. Cost-Benefit Analysis of Computer Resources for Machine Learning

    Science.gov (United States)

    Champion, Richard A.

    2007-01-01

    Machine learning describes pattern-recognition algorithms - in this case, probabilistic neural networks (PNNs). These can be computationally intensive, in part because of the nonlinear optimizer, a numerical process that calibrates the PNN by minimizing a sum of squared errors. This report suggests efficiencies that are expressed as cost and benefit. The cost is computer time needed to calibrate the PNN, and the benefit is goodness-of-fit, how well the PNN learns the pattern in the data. There may be a point of diminishing returns where a further expenditure of computer resources does not produce additional benefits. Sampling is suggested as a cost-reduction strategy. One consideration is how many points to select for calibration and another is the geometric distribution of the points. The data points may be nonuniformly distributed across space, so that sampling at some locations provides additional benefit while sampling at other locations does not. A stratified sampling strategy can be designed to select more points in regions where they reduce the calibration error and fewer points in regions where they do not. Goodness-of-fit tests ensure that the sampling does not introduce bias. This approach is illustrated by statistical experiments for computing correlations between measures of roadless area and population density for the San Francisco Bay Area. The alternative to training efficiencies is to rely on high-performance computer systems. These may require specialized programming and algorithms that are optimized for parallel performance.
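
    A sketch of the stratified-sampling strategy described above: spend more calibration points in strata where the current fit is poor. The strata names, error scores, and proportional allocation rule are illustrative assumptions, not the report's data.

        # Allocate a calibration budget across strata in proportion to error.
        import random

        def stratified_sample(strata, budget):
            """strata: {name: (points, current_error)}; allocate points proportional to error."""
            total_err = sum(err for _, err in strata.values()) or 1.0
            sample = []
            for name, (points, err) in strata.items():
                k = min(len(points), round(budget * err / total_err))
                sample.extend(random.sample(points, k))
            return sample

        strata = {
            "urban":    (list(range(100)), 0.9),   # high error -> sample densely
            "suburban": (list(range(100)), 0.4),
            "roadless": (list(range(100)), 0.1),   # low error -> few points suffice
        }
        print(len(stratified_sample(strata, budget=60)))   # ~60 points, weighted by error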

  16. Computer Simulation and Digital Resources for Plastic Surgery Psychomotor Education.

    Science.gov (United States)

    Diaz-Siso, J Rodrigo; Plana, Natalie M; Stranix, John T; Cutting, Court B; McCarthy, Joseph G; Flores, Roberto L

    2016-10-01

    Contemporary plastic surgery residents are increasingly challenged to learn a greater number of complex surgical techniques within a limited period. Surgical simulation and digital education resources have the potential to address some limitations of the traditional training model, and have been shown to accelerate knowledge and skills acquisition. Although animal, cadaver, and bench models are widely used for skills and procedure-specific training, digital simulation has not been fully embraced within plastic surgery. Digital educational resources may play a future role in a multistage strategy for skills and procedures training. The authors present two virtual surgical simulators addressing procedural cognition for cleft repair and craniofacial surgery. Furthermore, the authors describe how partnerships among surgical educators, industry, and philanthropy can be a successful strategy for the development and maintenance of digital simulators and educational resources relevant to plastic surgery training. It is our responsibility as surgical educators not only to create these resources, but to demonstrate their utility for enhanced trainee knowledge and technical skills development. Currently available digital resources should be evaluated in partnership with plastic surgery educational societies to guide trainees and practitioners toward effective digital content.

  17. MADLVF: An Energy Efficient Resource Utilization Approach for Cloud Computing

    Directory of Open Access Journals (Sweden)

    J.K. Verma

    2014-06-01

    Recent decades have witnessed steep growth in demand for computational power, largely due to the shift from the industrial age to the Information and Communication Technology (ICT) age brought about by the digital revolution. This trend in demand led to the establishment of large-scale data centers at geographically distant locations. These data centers consume large amounts of electrical energy, resulting in very high operating costs and large carbon dioxide (CO2) emissions due to resource underutilization. We propose the MADLVF algorithm to overcome problems such as resource underutilization, high energy consumption, and large CO2 emissions. Further, we present a comparative study between the proposed algorithm and MADRS algorithms, showing that the proposed methodology outperforms the existing one in terms of energy consumption and the number of VM migrations.
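
    The abstract does not expand the MADLVF acronym; assuming its "MAD" component is the familiar median-absolute-deviation overload test from the energy-aware consolidation literature, a minimal sketch might look like this (the safety constant and utilization history are invented):

        # MAD-based host overload detection: the threshold adapts to how
        # stable the recent utilization history is.
        import statistics

        def mad_threshold(history, safety=2.5):
            med = statistics.median(history)
            mad = statistics.median(abs(u - med) for u in history)
            return min(1.0, 1.0 - safety * mad)  # tighter threshold when load is volatile

        def is_overloaded(history, current_util):
            return current_util > mad_threshold(history)

        print(is_overloaded([0.52, 0.55, 0.50, 0.58, 0.54], 0.93))  # False for this stable host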

  18. Resources

    Science.gov (United States)

    (Alphabetical index of patient-education resource pages by health topic, e.g., Alzheimer's, arthritis, asthma, cancer.)

  19. Enabling Grid Computing resources within the KM3NeT computing model

    Directory of Open Access Journals (Sweden)

    Filippidis Christos

    2016-01-01

    KM3NeT is a future European deep-sea research infrastructure hosting a new generation neutrino detectors that – located at the bottom of the Mediterranean Sea – will open a new window on the universe and answer fundamental questions both in particle physics and astrophysics. International collaborative scientific experiments, like KM3NeT, are generating datasets which are increasing exponentially in both complexity and volume, making their analysis, archival, and sharing one of the grand challenges of the 21st century. These experiments, in their majority, adopt computing models consisting of different Tiers with several computing centres and providing a specific set of services for the different steps of data processing such as detector calibration, simulation and data filtering, reconstruction and analysis. The computing requirements are extremely demanding and, usually, span from serial to multi-parallel or GPU-optimized jobs. The collaborative nature of these experiments demands very frequent WAN data transfers and data sharing among individuals and groups. In order to support the aforementioned demanding computing requirements we enabled Grid Computing resources, operated by EGI, within the KM3NeT computing model. In this study we describe our first advances in this field and the method for the KM3NeT users to utilize the EGI computing resources in a simulation-driven use-case.

  20. Enabling Grid Computing resources within the KM3NeT computing model

    Science.gov (United States)

    Filippidis, Christos

    2016-04-01

    KM3NeT is a future European deep-sea research infrastructure hosting a new generation neutrino detectors that - located at the bottom of the Mediterranean Sea - will open a new window on the universe and answer fundamental questions both in particle physics and astrophysics. International collaborative scientific experiments, like KM3NeT, are generating datasets which are increasing exponentially in both complexity and volume, making their analysis, archival, and sharing one of the grand challenges of the 21st century. These experiments, in their majority, adopt computing models consisting of different Tiers with several computing centres and providing a specific set of services for the different steps of data processing such as detector calibration, simulation and data filtering, reconstruction and analysis. The computing requirements are extremely demanding and, usually, span from serial to multi-parallel or GPU-optimized jobs. The collaborative nature of these experiments demands very frequent WAN data transfers and data sharing among individuals and groups. In order to support the aforementioned demanding computing requirements we enabled Grid Computing resources, operated by EGI, within the KM3NeT computing model. In this study we describe our first advances in this field and the method for the KM3NeT users to utilize the EGI computing resources in a simulation-driven use-case.

  1. Littoral drift computations on mutual wave and current influence

    NARCIS (Netherlands)

    Bijker, E.W.

    1971-01-01

    At the 11th Conference on Coastal Engineering in London in 1968, the author presented a method for computing littoral drift starting from the longshore current velocity as generated by the waves, with the assumption that the material is stirred up by the waves. In this paper, measurements in a m…

  2. Current Picture for China’s Mineral Resource Availability

    Institute of Scientific and Technical Information of China (English)

    陈志

    2008-01-01

    Economic growth and structural change have caused China to consume an increasingly immense amount of mineral resources. This article presents a fundamental picture of the mineral resource shortages facing China through an in-depth analysis of mineral reserves, demand, supply, and structure. We believe that China will continue to face shortages of certain representative mineral resources in the foreseeable future. As a result, China has to rely on imports of such minerals to meet surging domestic demand.

  3. Quantum Computing: Selected Internet Resources for Librarians, Researchers, and the Casually Curious

    Science.gov (United States)

    Cirasella, Jill

    2009-01-01

    This article presents an annotated selection of the most important and informative Internet resources for learning about quantum computing, finding quantum computing literature, and tracking quantum computing news. All of the quantum computing resources described in this article are freely available, English-language web sites that fall into one…

  4. Quantum Computing: Selected Internet Resources for Librarians, Researchers, and the Casually Curious

    Science.gov (United States)

    Cirasella, Jill

    2009-01-01

    This article presents an annotated selection of the most important and informative Internet resources for learning about quantum computing, finding quantum computing literature, and tracking quantum computing news. All of the quantum computing resources described in this article are freely available, English-language web sites that fall into one…

  5. An Optimal Solution of Resource Provisioning Cost in Cloud Computing Environments

    Directory of Open Access Journals (Sweden)

    Arun Pandian

    2013-03-01

    In cloud computing, providing optimal resources to users becomes more and more important. Cloud computing users can access a pool of computing resources through the Internet, and cloud providers charge for these resources based on usage. The provided resource plans are reservation and on-demand. Computing resources are provisioned through a cloud resource provisioning model, in which resource cost is high because optimizing the cost under uncertainty is difficult: the uncertainty of the provisioning cost comprises on-demand cost, reservation cost, and expending cost. This makes it difficult to achieve an optimal resource provisioning cost in cloud computing. Stochastic Integer Programming is therefore applied to obtain the optimal resource provisioning cost, and Two-Stage Stochastic Integer Programming with recourse is applied to handle the complexity of optimization under uncertainty. The stochastic program is restated as a Deterministic Equivalent Formulation, which solves over the probability distribution of all scenarios to reduce the on-demand cost. Benders Decomposition is applied to break the resource optimization problem into multiple subproblems, reducing the on-demand and reservation costs, and Sample Average Approximation is applied to reduce the number of scenarios in the optimization problem, reducing the reservation and expending costs.
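
    A compact sketch of the two-stage idea: stage 1 fixes a reservation level before demand is known; stage 2 buys on-demand capacity per scenario. Exhaustive search stands in for the Benders/SAA machinery, and the prices and scenarios are invented.

        # Minimize reservation cost plus expected on-demand cost over scenarios.
        def expected_cost(reserved, scenarios, c_reserve=5.0, c_ondemand=12.0):
            cost = reserved * c_reserve                      # stage-1 commitment
            for demand, prob in scenarios:
                cost += prob * max(0, demand - reserved) * c_ondemand  # stage-2 recourse
            return cost

        scenarios = [(80, 0.3), (100, 0.5), (150, 0.2)]      # (demand, probability)
        best_r = min(range(0, 200), key=lambda r: expected_cost(r, scenarios))
        print(best_r, expected_cost(best_r, scenarios))      # reserve 100, pay on-demand for the tail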

  6. A resource-sharing model based on a repeated game in fog computing

    Directory of Open Access Journals (Sweden)

    Yan Sun

    2017-03-01

    With the rapid development of cloud computing techniques, the number of users is undergoing exponential growth. It is difficult for traditional data centers to perform many tasks in real time because of the limited bandwidth of resources. The concept of fog computing is proposed to support traditional cloud computing and to provide cloud services. In fog computing, the resource pool is composed of sporadic distributed resources that are more flexible and movable than a traditional data center. In this paper, we propose a fog computing structure and present a crowd-funding algorithm to integrate spare resources in the network. Furthermore, to encourage more resource owners to share their resources with the resource pool and to supervise the resource supporters as they actively perform their tasks, we propose an incentive mechanism in our algorithm. Simulation results show that our proposed incentive mechanism can effectively reduce the SLA violation rate and accelerate the completion of tasks.

  7. A resource-sharing model based on a repeated game in fog computing.

    Science.gov (United States)

    Sun, Yan; Zhang, Nan

    2017-03-01

    With the rapid development of cloud computing techniques, the number of users is undergoing exponential growth. It is difficult for traditional data centers to perform many tasks in real time because of the limited bandwidth of resources. The concept of fog computing is proposed to support traditional cloud computing and to provide cloud services. In fog computing, the resource pool is composed of sporadic distributed resources that are more flexible and movable than a traditional data center. In this paper, we propose a fog computing structure and present a crowd-funding algorithm to integrate spare resources in the network. Furthermore, to encourage more resource owners to share their resources with the resource pool and to supervise the resource supporters as they actively perform their tasks, we propose an incentive mechanism in our algorithm. Simulation results show that our proposed incentive mechanism can effectively reduce the SLA violation rate and accelerate the completion of tasks.

  8. Dynamic resource allocation scheme for distributed heterogeneous computer systems

    Science.gov (United States)

    Liu, Howard T. (Inventor); Silvester, John A. (Inventor)

    1991-01-01

    This invention relates to resource allocation in computer systems and, more particularly, to a method and associated apparatus for shortening response time and improving the efficiency of a heterogeneous distributed networked computer system by reallocating jobs queued up for busy nodes to idle or less busy nodes. In accordance with the algorithm (SIDA for short), load-sharing is initiated by the server device in a manner such that extra overhead is not imposed on the system during heavily loaded conditions. The algorithm employed in the present invention uses a dual-mode, server-initiated approach. Jobs are transferred from heavily burdened nodes (i.e., over a high threshold limit) to lightly burdened nodes at the initiation of the receiving node when: (1) a job finishes at a node which is burdened below a pre-established threshold level, or (2) a node is idle for a period of time as established by a wakeup timer at the node. The invention uses a combination of the local queue length and the local service-rate ratio at each node as the workload indicator.
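
    A behavioral sketch of that dual-mode transfer rule, with invented thresholds and a simplified workload indicator (the patent combines queue length and service-rate ratio; here a single quotient stands in for both):

        # Server-initiated load sharing: a lightly loaded node pulls work
        # from the most burdened node, so busy nodes incur no extra overhead.
        HIGH, LOW = 8, 2   # queue-length thresholds (illustrative)

        def workload(node):
            # stand-in indicator: pending work normalized by service rate
            return len(node["queue"]) / node["service_rate"]

        def maybe_pull(idle_node, nodes):
            """Called when idle_node finishes a job below LOW, or its wakeup timer fires."""
            if len(idle_node["queue"]) >= LOW:
                return
            donor = max(nodes, key=workload)
            if len(donor["queue"]) > HIGH:
                idle_node["queue"].append(donor["queue"].pop())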

  9. A Review of Computer Science Resources for Learning and Teaching with K-12 Computing Curricula: An Australian Case Study

    Science.gov (United States)

    Falkner, Katrina; Vivian, Rebecca

    2015-01-01

    To support teachers to implement Computer Science curricula into classrooms from the very first year of school, teachers, schools and organisations seek quality curriculum resources to support implementation and teacher professional development. Until now, many Computer Science resources and outreach initiatives have targeted K-12 school-age…

  10. A Review of Computer Science Resources for Learning and Teaching with K-12 Computing Curricula: An Australian Case Study

    Science.gov (United States)

    Falkner, Katrina; Vivian, Rebecca

    2015-01-01

    To support teachers to implement Computer Science curricula into classrooms from the very first year of school, teachers, schools and organisations seek quality curriculum resources to support implementation and teacher professional development. Until now, many Computer Science resources and outreach initiatives have targeted K-12 school-age…

  11. A Dynamic Resource Allocation Method for Parallel DataProcessing in Cloud Computing

    Directory of Open Access Journals (Sweden)

    V. V. Kumar

    2012-01-01

    Problem statement: One of the Cloud services, Infrastructure as a Service (IaaS), provides compute resources on demand for various applications such as parallel data processing. The computing resources offered in the cloud are extremely dynamic and probably heterogeneous. Nephele is the first data processing framework to explicitly exploit the dynamic resource allocation offered by today's IaaS clouds for both task scheduling and execution. Particular tasks of a processing job can be assigned to different types of virtual machines which are automatically instantiated and terminated during job execution. However, current algorithms do not consider resource overload or underutilization during job execution. In this study, we focus on increasing the efficacy of the scheduling algorithm for real-time cloud computing services. Approach: Our algorithm uses the turnaround-time utility efficiently by differentiating it into a gain function and a loss function for a single task. The algorithm also assigns high priority to tasks with early completion and lower priority to abortion/deadline issues of real-time tasks. Results: The algorithm has been implemented with both preemptive and non-preemptive methods. The experimental results show that it outperforms the existing utility-based scheduling algorithms, and its performance is compared under both preemptive and non-preemptive scheduling. Conclusion: Hence, a novel turnaround-time utility scheduling approach is proposed that focuses on both the high-priority and the low-priority tasks that arrive for scheduling.
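
    A minimal sketch of the gain/loss split of turnaround-time utility, under assumed linear shapes (the abstract does not give the exact curves):

        # Early completion accrues a gain; missing the deadline accrues a loss.
        def task_utility(finish, deadline, gain_rate=1.0, loss_rate=2.0):
            if finish <= deadline:
                return gain_rate * (deadline - finish)   # reward early completion
            return -loss_rate * (finish - deadline)      # penalize missed deadlines

        # schedule next the ready task with the highest prospective utility
        def pick_next(tasks, now):
            return max(tasks, key=lambda t: task_utility(now + t["runtime"], t["deadline"]))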

  12. Monitoring of computing resource utilization of the ATLAS experiment

    CERN Document Server

    Rousseau, D; The ATLAS collaboration; Vukotic, I; Aidel, O; Schaffer, RD; Albrand, S

    2012-01-01

    Due to the good performance of the LHC accelerator, the ATLAS experiment has seen higher than anticipated levels for both the event rate and the average number of interactions per bunch crossing. In order to respond to these changing requirements, the current and future usage of CPU, memory and disk resources has to be monitored, understood and acted upon. This requires data collection at a fairly fine level of granularity: the performance of each object written and each algorithm run, as well as a dozen per-job variables, are gathered for the different processing steps of Monte Carlo generation and simulation and the reconstruction of both data and Monte Carlo. We present a system to collect and visualize the data from both the online Tier-0 system and distributed grid production jobs. Around 40 GB of performance data are expected from up to 200k jobs per day, thus making performance optimization of the underlying Oracle database of utmost importance.

  13. Monitoring of computing resource utilization of the ATLAS experiment

    Science.gov (United States)

    Rousseau, David; Dimitrov, Gancho; Vukotic, Ilija; Aidel, Osman; Schaffer, Rd; Albrand, Solveig

    2012-12-01

    Due to the good performance of the LHC accelerator, the ATLAS experiment has seen higher than anticipated levels for both the event rate and the average number of interactions per bunch crossing. In order to respond to these changing requirements, the current and future usage of CPU, memory and disk resources has to be monitored, understood and acted upon. This requires data collection at a fairly fine level of granularity: the performance of each object written and each algorithm run, as well as a dozen per-job variables, are gathered for the different processing steps of Monte Carlo generation and simulation and the reconstruction of both data and Monte Carlo. We present a system to collect and visualize the data from both the online Tier-0 system and distributed grid production jobs. Around 40 GB of performance data are expected from up to 200k jobs per day, thus making performance optimization of the underlying Oracle database of utmost importance.

  14. The Current State of China's Freshwater Resources and Related Suggestions

    Institute of Scientific and Technical Information of China (English)

    Wu Ruijin

    2001-01-01

    China has many lakes, marshlands and rivers. Due to their uneven geographical distribution and varied degrees of salinity, their exploitable freshwater resources are limited. In the wake of the high-speed growth of the national economy in recent years, human infringement upon their natural settings has become increasingly intense, leading to the degeneration of China's lacustrine ecosystems and the degradation of their surrounding environments. Lakes are shrinking and becoming more saline; in arid and semi-arid inland areas, some have even disappeared. In addition, lake water pollution and eutrophication in densely populated areas are getting worse, resulting in serious water shortages in some places. Silt deposition in lake basins, water-surface shrinkage caused by hectic and irrational reclamation for farmland, the prevalence of flooding and water-logging calamities, and ecosystem depletion caused by predatory exploitation of fishery resources have all become restrictive factors in regional sustainable development. The author suggests measures for the protection and sustainable exploitation of limnetic settings in China.

  15. Application of Selective Algorithm for Effective Resource Provisioning in Cloud Computing Environment

    OpenAIRE

    Katyal, Mayanka; Mishra, Atul

    2014-01-01

    Modern-day continued demand for resource-hungry services and applications in the IT sector has led to the development of cloud computing. A cloud computing environment involves high-cost infrastructure on one hand and needs large-scale computational resources on the other. These resources need to be provisioned (allocated and scheduled) to end users in the most efficient manner, so that the tremendous capabilities of the cloud are utilized effectively and efficiently. In this paper we discuss a selecti...

  16. The Mechanism of Resource Dissemination and Resource Discovery for Computational Grid (计算网格的资源分发和发现机制)

    Institute of Scientific and Technical Information of China (English)

    武秀川; 鞠九滨

    2003-01-01

    A computational Grid is a large-scale distributed computing environment. The resource management of a computational Grid discovers, locates, and allocates resources for users within the grid environment when they request them; resources may also cooperate in order to finish a large computation. These tasks are accomplished by the mechanisms of resource dissemination and resource discovery within the grid system's resource management. In this paper, some problems concerning resource dissemination and resource discovery are discussed and analyzed, and future work is proposed.

  17. Book Review: Current Issues in International Human Resource Management and Strategy Research

    DEFF Research Database (Denmark)

    Gretzinger, Susanne

    2009-01-01

    The article reviews the book "Current Issues in International Human Resource Management and Strategy Research," edited by Marion Festing and Susanne Royer.

  18. Book Review: Current Issues in International Human Resource Management and Strategy Research

    DEFF Research Database (Denmark)

    Gretzinger, Susanne

    2009-01-01

    The article reviews the book "Current Issues in International Human Resource Management and Strategy Research," edited by Marion Festing and Susanne Royer.

  19. Distributed Computation Resources for Earth System Grid Federation (ESGF)

    Science.gov (United States)

    Duffy, D.; Doutriaux, C.; Williams, D. N.

    2014-12-01

    The Intergovernmental Panel on Climate Change (IPCC), prompted by the United Nations General Assembly, published a series of papers in its Fifth Assessment Report (AR5) on the processes, impacts, and mitigation of climate change in 2013. The science used in these reports was generated by an international group of domain experts who studied various scenarios of climate change through the use of highly complex computer models simulating the Earth's climate over long periods of time. The resulting data, approximately five petabytes in total, are stored in a distributed data grid known as the Earth System Grid Federation (ESGF). Through the ESGF, consumers of the data can find and download data with limited capabilities for server-side processing. The Sixth Assessment Report (AR6) is already in the planning stages and is estimated to create as much as two orders of magnitude more data than the AR5 distributed archive. It is clear that the data analysis capabilities currently in use will be inadequate for the necessary science to be done with AR6 data: the data will just be too big. A major paradigm shift must occur, from downloading data to local systems, to moving the analysis routines to the data and performing these computations on distributed platforms. In preparation for this need, the ESGF has started a Compute Working Team (CWT) to create solutions that allow users to perform distributed, high-performance data analytics on the AR6 data. The team will be designing and developing a general Application Programming Interface (API) to enable highly parallel, server-side processing throughout the ESGF data grid. This API will be integrated with multiple analysis and visualization tools, such as the Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT), netCDF Operator (NCO), and others. This presentation will provide an update on the ESGF CWT's overall approach toward enabling the necessary storage-proximal computation …

  20. An Improved Constraint Based Resource Scheduling Approach Using Job Grouping Strategy in Grid Computing

    Directory of Open Access Journals (Sweden)

    Payal Singhal

    2013-01-01

    Grid computing is a collection of distributed resources interconnected by networks to provide a unified virtual computing resource view to the user. Grid computing has the important responsibility of resource management, with techniques that allow the user to optimize job completion time and achieve good throughput; designing and implementing an efficient scheduler is a significant undertaking. In this paper, a constraint-based job and resource scheduling algorithm is proposed. Four constraints are taken into account for grouping the jobs: resource memory, job memory, job MI (million instructions), and L2 cache. Our implementation reduces processing time by adding the fourth constraint, the L2 cache of the resource, before groups are allocated to resources for parallel computing. The L2 cache is part of a computer's processor; it is a smaller, extremely fast memory that increases the computer's performance. Using more constraints of the resource and job can increase efficiency further. The work has been done in MATLAB using the Parallel Computing Toolbox. All the constraints are calculated using different functions in MATLAB, and jobs are allocated to resources accordingly. The resource memory, cache, job memory size, and job MI are the key factors used to group jobs according to the available capability of the selected resource. Processing time is taken into account to analyze the feasibility of the algorithms.
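
    The paper's implementation is in MATLAB; a simplified Python rendering of the four-constraint grouping rule follows. Units, limits, and field names are illustrative assumptions.

        # Pack jobs into a group until the selected resource's memory, MI
        # budget, or L2-cache allowance would be exceeded, then start a new group.
        def group_jobs(jobs, resource):
            groups, current = [], {"mem": 0, "mi": 0, "cache": 0, "jobs": []}
            for job in jobs:
                fits = (current["mem"] + job["mem"] <= resource["mem"] and
                        current["mi"] + job["mi"] <= resource["mi"] and
                        current["cache"] + job["cache"] <= resource["l2_cache"])
                if not fits and current["jobs"]:
                    groups.append(current["jobs"])       # flush the full group
                    current = {"mem": 0, "mi": 0, "cache": 0, "jobs": []}
                # a job larger than the resource still forms its own group
                current["mem"] += job["mem"]; current["mi"] += job["mi"]
                current["cache"] += job["cache"]; current["jobs"].append(job)
            if current["jobs"]:
                groups.append(current["jobs"])
            return groups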

  1. wolfPAC: building a high-performance distributed computing network for phylogenetic analysis using 'obsolete' computational resources.

    Science.gov (United States)

    Reeves, Patrick A; Friedman, Philip H; Richards, Christopher M

    2005-01-01

    wolfPAC is an AppleScript-based software package that facilitates the use of numerous, remotely located Macintosh computers to perform computationally-intensive phylogenetic analyses using the popular application PAUP* (Phylogenetic Analysis Using Parsimony). It has been designed to utilise readily available, inexpensive processors and to encourage sharing of computational resources within the worldwide phylogenetics community.

  2. Energy current loss instability model on a computer

    Science.gov (United States)

    Edighoffer, John A.

    1995-04-01

    The computer program called Energy Stability in a Recirculating Accelerator (ESRA) Free Electron Laser (FEL) has been written to model bunches of particles in longitudinal phase space traversing a recirculating accelerator, along with the associated rf changes and aperture current losses. This energy-current loss instability was first seen by Los Alamos's FEL group in their energy-recovery experiments. The code addresses these stability issues and determines the transport, noise, feedback, and other parameters for which these FEL systems are stable or unstable. Two representative systems are modeled: one, the Novosibirsk high-power FEL racetrack microtron for photochemical research; the other, the proposed CEBAF UV FEL system. Both of these systems are stable with prudent choices of parameters.

  3. Computational structures technology at Grumman: Current practice/future needs

    Science.gov (United States)

    Pifko, Allan B.; Eidinoff, Harvey

    1992-05-01

    The current practice for the design analysis of new airframe structural systems is to construct a master finite element model of the vehicle in order to develop internal load distributions. The inputs to this model include the geometry, taken directly from CADAM and CATIA structural layouts, and aerodynamic load and mass distribution computer models. This master model is sufficiently detailed to define major load paths and to compute dynamic mode shapes and structural frequencies, but not detailed enough to define local stress gradients and notch stresses. The master model is then used to perform structural optimization studies that provide minimum weights for major structural members. The post-processed output from the master model's load, stress, and strain analysis is then used by structural analysts to perform detailed stress analysis of local regions in order to design local structure with all its required details. This local analysis consists of hand stress analysis and life prediction analysis with the assistance of manuals, design charts, computer stress and structural-life analysis, and sometimes finite element or boundary element analysis. The resulting design is verified by fatigue tests.

  4. SLA-Oriented Resource Provisioning for Cloud Computing: Challenges, Architecture, and Solutions

    CERN Document Server

    Buyya, Rajkumar; Calheiros, Rodrigo N

    2012-01-01

    Cloud computing systems promise to offer subscription-oriented, enterprise-quality computing services to users worldwide. With the increased demand for delivering services to a large number of users, they need to offer differentiated services and meet users' quality expectations. Existing resource management systems in data centers do not yet support Service Level Agreement (SLA)-oriented resource allocation, and thus need to be enhanced to realize cloud computing and utility computing. In addition, no work has been done to collectively incorporate customer-driven service management, computational risk management, and autonomic resource management into a market-based resource management system that targets the rapidly changing enterprise requirements of Cloud computing. This paper presents the vision, challenges, and architectural elements of SLA-oriented resource management. The proposed architecture supports the integration of market-based provisioning policies and virtualisation technologies for flexible alloc...

  5. The Relative Effectiveness of Computer-Based and Traditional Resources for Education in Anatomy

    Science.gov (United States)

    Khot, Zaid; Quinlan, Kaitlyn; Norman, Geoffrey R.; Wainman, Bruce

    2013-01-01

    There is increasing use of computer-based resources to teach anatomy, although no study has compared computer-based learning to traditional methods. In this study, we examine the effectiveness of three formats of anatomy learning: (1) a virtual reality (VR) computer-based module, (2) a static computer-based module providing Key Views (KV), (3) a plastic…

  6. Power-Aware Resource Reconfiguration Using Genetic Algorithm in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Li Deng

    2016-01-01

    Cloud computing enables scalable computation based on virtualization technology. However, current resource reallocation solutions seldom consider the stability of the virtual machine (VM) placement pattern, and varied application workloads lead to frequent resource reconfiguration requirements due to the repeated appearance of hot nodes. In this paper, several algorithms for VM placement (multiobjective genetic algorithm (MOGA), power-aware multiobjective genetic algorithm (pMOGA), and enhanced power-aware multiobjective genetic algorithm (EpMOGA)) are presented to improve the stability of the VM placement pattern with less migration overhead. Energy consumption is also considered. A type-matching controller is designed to improve the evolution process, and nondominated sorting genetic algorithm II (NSGA-II) is used to select new generations during evolution. Our simulation results demonstrate that these algorithms all provide resource reallocation solutions with long node stabilization times. pMOGA and EpMOGA also better balance stabilization and energy efficiency by adding the number of active nodes as one of the optimization objectives. The type-matching controller makes EpMOGA superior to pMOGA.
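
    A sketch of the two objectives such a placement GA could evaluate, with the placement representation and the active-node energy proxy as assumptions; a full implementation would feed these tuples into an NSGA-II loop.

        # Evaluate a candidate VM placement on the paper's two axes:
        # stability (few migrations) and energy (few active nodes).
        def objectives(placement, previous):
            """placement/previous: {vm: node}; returns (migrations, active_nodes) to minimize."""
            migrations = sum(1 for vm, node in placement.items()
                             if previous.get(vm) not in (None, node))
            active_nodes = len(set(placement.values()))
            return migrations, active_nodes

        def dominates(a, b):
            """Pareto dominance for NSGA-II-style selection (minimization)."""
            return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    The type-matching controller described in the abstract would further constrain which nodes a VM of a given type may move to before the objectives are evaluated.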

  7. Current Situation of Introduction and Use of African Crop Germplasm Resources and Recommendations

    Institute of Scientific and Technical Information of China (English)

    Zili DING; Minghua YAO; Chunhai JIAO

    2015-01-01

    Africa is the origin center of many crops. It is rich in original ecological resources, especially special resources that are excellent materials for breeding research. With the acceleration of commercial seed use in the agriculture of African countries, some original ecological resources are disappearing. Drawing on the experience of introducing African varieties in recent years, this paper analyzes the current situation of the introduction and use of African crop germplasm resources. Finally, it offers recommendations for rescuing and taking full advantage of excellent African resources, solving the difficult problems restricting crop breeding, enriching China's crop germplasm bank, and improving Chinese and African crop breeding levels and innovation ability.

  8. INJECT AN ELASTIC GRID COMPUTING TECHNIQUES TO OPTIMAL RESOURCE MANAGEMENT TECHNIQUE OPERATIONS

    Directory of Open Access Journals (Sweden)

    R. Surendran

    2013-01-01

    Resource sharing on the Internet has evolved into the dynamic technique of grid computing: resource sharing across large-scale, high-performance computing networks worldwide. Existing systems offer only limited innovation in the resource management process. In the proposed work, grid computing is treated as Internet-based computing for Optimal Resource Management Technique Operations (ORMTO). ORMTO comprise an elastic scheduling algorithm, finding the best grid node for a task prediction, fault-tolerant resource selection, perfect resource co-allocation, grid-balanced resource matchmaking, agent-based grid services, and wireless mobility resource access. We survey the various resource management techniques based on performance measurement factors such as time complexity, space complexity, and energy complexity to find the ORMTO for grid computing. The objectives of ORMTO are to provide efficient automatic resource co-allocation for a user who submits a job without grid knowledge; to design a grid service (portal) that selects the best fault-tolerant resource for a given task in a fast, secure, and efficient manner; and to provide an enhanced grid balancing system for multitasking via hybrid-topology-based grid ranking. Good Quality of Service (QoS) parameters play an important role in all resource management techniques, and the proposed ORMTO system uses a greater number of QoS parameters to enhance existing techniques.

  9. Current Computational Challenges for CMC Processes, Properties, and Structures

    Science.gov (United States)

    DiCarlo, James

    2008-01-01

    In comparison to current state-of-the-art metallic alloys, ceramic matrix composites (CMC) offer a variety of performance advantages, such as higher temperature capability (greater than the approx. 2100 F capability of the best metallic alloys), lower density (approx. 30-50% of metal density), and lower thermal expansion. In comparison to other competing high-temperature materials, CMC are also capable of providing significantly better static and dynamic toughness than un-reinforced monolithic ceramics and significantly better environmental resistance than carbon-fiber-reinforced composites. Because of these advantages, NASA, the Air Force, and other U.S. government agencies and industries are currently seeking to implement these advanced materials into hot-section components of gas turbine engines for both propulsion and power generation. For applications such as these, CMC are expected to provide many important performance benefits, such as reduced component cooling air requirements, simpler component design, reduced weight, improved fuel efficiency, reduced emissions, higher blade frequencies, reduced blade clearances, and higher thrust. Although much progress has been made recently in the development of CMC constituent materials and fabrication processes, major challenges still remain for the implementation of these advanced composite materials into viable engine components. The objective of this presentation is to briefly review some of the challenges related to the need for physics-based computational approaches that allow CMC fabricators and designers to model (1) CMC processes for fiber architecture formation and matrix infiltration, (2) CMC properties of high technical interest such as multidirectional creep, thermal conductivity, matrix cracking stress, damage accumulation, and degradation effects in aggressive environments, and (3) CMC component lifetimes when all of these effects are interacting in a complex stress and service …

  10. Effective Computer Resource Management: Keeping the Tail from Wagging the Dog.

    Science.gov (United States)

    Sampson, James P., Jr.

    1982-01-01

    Predicts that student services will be increasingly influenced by computer technology. Suggests this resource be managed effectively to minimize potential problems and prevent a mechanistic and impersonal environment. Urges student personnel workers to assume active responsibility for planning, evaluating, and operating computer resources. (JAC)

  11. Economic-based Distributed Resource Management and Scheduling for Grid Computing

    CERN Document Server

    Buyya, R

    2002-01-01

    Computational Grids, emerging as an infrastructure for next-generation computing, enable the sharing, selection, and aggregation of geographically distributed resources for solving large-scale problems in science, engineering, and commerce. The resources in the Grid are heterogeneous and geographically distributed, with varying availability, a variety of usage and cost policies for diverse users at different times, and priorities and goals that vary with time. The management of resources and application scheduling in such a large and distributed environment is a complex task. This thesis proposes a distributed computational economy as an effective metaphor for the management of resources and application scheduling. It proposes an architectural framework that supports resource trading and quality-of-service-based scheduling. It enables the regulation of supply and demand for resources, provides an incentive for resource owners to participate in the Grid, and motivates the users to trade off bet...

  12. Wide-Area Computing: Resource Sharing on a Large Scale

    Science.gov (United States)

    1999-01-01


  13. Cloud Computing and Information Technology Resource Cost Management for SMEs

    DEFF Research Database (Denmark)

    Kuada, Eric; Adanu, Kwame; Olesen, Henning

    2013-01-01

    This paper analyzes the decision-making problem confronting SMEs considering the adoption of cloud computing as an alternative to in-house provision of computing services. The economics of choosing between in-house computing and a cloud alternative is analyzed by comparing the total economic costs of …

  14. Professional Computer Education Organizations--A Resource for Administrators.

    Science.gov (United States)

    Ricketts, Dick

    Professional computer education organizations serve a valuable function by generating, collecting, and disseminating information concerning the role of the computer in education. This report touches briefly on the reasons for the rapid and successful development of professional computer education organizations. A number of attributes of effective…

  15. Study on Cloud Computing Resource Scheduling Strategy Based on the Ant Colony Optimization Algorithm

    Directory of Open Access Journals (Sweden)

    Lingna He

    2012-09-01

    In order to move beyond traditional Internet software usage patterns and enterprise management modes, this paper considers the new business computing mode of cloud computing, in which the resource scheduling strategy is a key technology. Based on a study of the cloud computing system structure and mode of operation, the work scheduling and resource allocation problems are investigated using the ant colony algorithm, and the implementation of cloud resource scheduling is analyzed and designed in detail. Simulation experiments in the CloudSim environment show that the algorithm has better scheduling performance and load balance than general algorithms.
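
    As an illustration of the approach, a minimal ant-colony sketch for mapping tasks to VMs follows; the parameters, the makespan objective, and the speed heuristic are conventional ACO choices assumed here, not taken from the paper.

        # Ant colony optimization for task-to-VM assignment: desirability mixes
        # pheromone with a heuristic (VM speed); pheromone evaporates and is
        # reinforced along the best assignment found so far.
        import random

        def aco_schedule(tasks, vm_speeds, ants=20, iters=50, alpha=1.0, beta=2.0, rho=0.1):
            n, m = len(tasks), len(vm_speeds)
            tau = [[1.0] * m for _ in range(n)]                 # pheromone per (task, vm)
            best, best_span = None, float("inf")
            for _ in range(iters):
                for _ in range(ants):
                    load = [0.0] * m
                    assign = []
                    for t in range(n):
                        weights = [(tau[t][v] ** alpha) * (vm_speeds[v] ** beta) for v in range(m)]
                        v = random.choices(range(m), weights=weights)[0]
                        load[v] += tasks[t] / vm_speeds[v]
                        assign.append(v)
                    makespan = max(load)
                    if makespan < best_span:
                        best, best_span = assign, makespan
                tau = [[(1 - rho) * p for p in row] for row in tau]   # evaporation
                for t, v in enumerate(best):                          # reinforce the best tour
                    tau[t][v] += 1.0 / best_span
            return best, best_span

        print(aco_schedule([4, 8, 2, 6], [1.0, 2.0]))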

  16. CURRENT WAYS TO HARVEST ENERGY USING A COMPUTER MOUSE

    Directory of Open Access Journals (Sweden)

    Frantisek Horvat

    2014-02-01

    This paper deals with the idea of an energy harvesting (EH) system that uses the mechanical energy from finger presses on the buttons of a computer mouse by means of a piezomaterial (PVF2). The piezomaterial is placed in the mouse at the interface between the button and the body. This paper reviews the parameters of the PVF2 piezomaterial and tests its possible implementation in EH systems utilizing these types of mechanical interactions. The paper tests the viability of two EH concepts: a battery management system and a semi-autonomous system. A statistical estimate of button operations is performed for various computer activities, showing that an average of up to 3,300 mouse clicks per hour is produced in gaming applications, representing a tip frequency of 0.91 Hz on the PVF2 member. This frequency is tested on the PVF2 system, and an assessment of the two EH systems is reviewed. The results show that fully autonomous systems are not suitable for capturing low-frequency mechanical interactions, due to the parameters of current piezomaterials and the resulting very long startup phase. However, a hybrid EH system that uses available power to initiate the circuit and eliminate the startup phase may be explored in future studies.

  17. Development of a Computer-Based Resource for Inclusion Science Classrooms

    Science.gov (United States)

    Olsen, J. K.; Slater, T.

    2005-12-01

    Current instructional issues necessitate that educators start with the curriculum and determine how educational technology can assist students in achieving positive learning goals, functionally supplementing classroom instruction. Technology projects incorporating principles of situated learning have been shown to provide an effective framework for learning, and computer technology has been shown to facilitate learning among special-needs students. Students with learning disabilities may benefit from assistive technology, but these resources are not always utilized during classroom instruction: technology is only effective if teachers view it as an integral part of the learning process. The materials currently under development are in the domain of earth and space science, part of the Arizona 5-8 Science Content Standards. The concern of this study is to determine a means of assisting inclusive education that is both feasible and effective in ensuring successful science learning outcomes for all students, whether regular education or special needs.

  18. Forecasting Performance in Organizations: An Application of Current-Value Human Resources Accounting. Final Report.

    Science.gov (United States)

    Pecorella, Patricia A.; And Others

    A methodology to describe current-value human resources accounting (HRA) was developed to aid management in decision making and provide information about the effects of organizational policies and practices on the value of the organizations' human resources. A two-phase activity was designed to investigate the nature of the relationship between…

  19. Relational Computing Using HPC Resources: Services and Optimizations

    OpenAIRE

    2015-01-01

    Computational epidemiology involves processing, analysing and managing large volumes of data. Such massive datasets cannot be handled efficiently by using traditional standalone database management systems, owing to their limitation in the degree of computational efficiency and bandwidth to scale to large volumes of data. In this thesis, we address management and processing of large volumes of data for modeling, simulation and analysis in epidemiological studies. Traditionally, compute intens...

  20. Integration and Exposure of Large Scale Computational Resources Across the Earth System Grid Federation (ESGF)

    Science.gov (United States)

    Duffy, D.; Maxwell, T. P.; Doutriaux, C.; Williams, D. N.; Chaudhary, A.; Ames, S.

    2015-12-01

    As the size of remote sensing observations and model output data grows, the volume of the data has become overwhelming, even to many scientific experts. As societies are forced to better understand, mitigate, and adapt to climate change, the combination of Earth observation data and global climate model projections is crucial not only to scientists but also to policy makers, downstream applications, and even the public. Scientific progress on understanding climate is critically dependent on the availability of a reliable infrastructure that promotes data access, management, and provenance. The Earth System Grid Federation (ESGF) has created such an environment for the Intergovernmental Panel on Climate Change (IPCC). ESGF provides a federated global cyberinfrastructure for data access and management of model outputs generated for the IPCC Assessment Reports (AR). The current generation of the ESGF federated grid allows consumers of the data to find and download data, with limited capabilities for server-side processing. Since the amount of data for future ARs is expected to grow dramatically, ESGF is working on integrating server-side analytics throughout the federation. The ESGF Compute Working Team (CWT) has created a Web Processing Service (WPS) Application Programming Interface (API) to enable access to scalable computational resources. The API is the exposure point to high performance computing resources across the federation. Specifically, the API allows users to execute simple operations, such as maximum, minimum, average, and anomalies, on ESGF data without having to download the data. These operations are executed at the ESGF data node site with access to large amounts of parallel computing capabilities. This presentation will highlight the WPS API, describe its capabilities, provide implementation details, and discuss future developments.
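
    A server-side reduction of the kind described (e.g., an average computed at the data node) would be invoked through a standard OGC WPS Execute request. The sketch below is purely illustrative: the host, operation identifier, and input encoding are placeholders, not the documented ESGF CWT interface.

```python
import requests  # standard OGC WPS 1.0.0 key-value-pair Execute request

# Hypothetical endpoint and operation name; the real ESGF CWT identifiers differ.
endpoint = "https://esgf-node.example.org/wps"
params = {
    "service": "WPS",
    "version": "1.0.0",
    "request": "Execute",
    "identifier": "average",                      # server-side reduction to run
    "datainputs": "variable=tas;domain=global",   # placeholder input encoding
}

response = requests.get(endpoint, params=params, timeout=60)
print(response.status_code)  # the WPS response is an XML status/result document
```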

  1. Science and Technology Resources on the Internet: Computer Security.

    Science.gov (United States)

    Kinkus, Jane F.

    2002-01-01

    Discusses issues related to computer security, including confidentiality, integrity, and authentication or availability; and presents a selected list of Web sites that cover the basic issues of computer security under subject headings that include ethics, privacy, kids, antivirus, policies, cryptography, operating system security, and biometrics.…

  2. Crowd-Funding: A New Resource Cooperation Mode for Mobile Cloud Computing.

    Science.gov (United States)

    Zhang, Nan; Yang, Xiaolong; Zhang, Min; Sun, Yan

    2016-01-01

    Mobile cloud computing, which integrates cloud computing techniques into the mobile environment, is regarded as one of the enabler technologies for 5G mobile wireless networks. There are many sporadic spare resources distributed within various devices in the networks, which can be used to support mobile cloud applications. However, these devices, each with only a few spare resources, cannot support some resource-intensive mobile applications alone. If some of them cooperate with each other and share their resources, then they can support many applications. In this paper, we propose a resource cooperative provision mode referred to as "Crowd-funding", which is designed to aggregate the distributed devices together as the resource provider of mobile applications. Moreover, to facilitate high-efficiency resource management via dynamic resource allocation, different resource providers should be selected to form a stable resource coalition for different requirements. Thus, considering different requirements, we propose two different resource aggregation models for coalition formation. Finally, we allocate the revenues based on each cooperator's contribution, according to the concept of the "Shapley value", to enable a more impartial revenue share among the cooperators. It is shown that a dynamic and flexible resource-management method can be developed based on the proposed Crowd-funding model, relying on the spare resources in the network.
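
    The Shapley value underlying such a revenue split can be computed directly from its permutation definition. Below is a minimal sketch, assuming a toy characteristic function v over provider coalitions; the paper's actual utility model is richer.

```python
from itertools import permutations

def shapley_values(players, v):
    """Average marginal contribution of each player over all join orders."""
    n = len(players)
    value = {p: 0.0 for p in players}
    perms = list(permutations(players))
    for order in perms:
        coalition = set()
        for p in order:
            before = v(frozenset(coalition))
            coalition.add(p)
            value[p] += v(frozenset(coalition)) - before
    return {p: value[p] / len(perms) for p in players}

# Toy characteristic function: revenue earned by a coalition of providers,
# superadditive so cooperation pays off (illustrative numbers only).
revenue = {frozenset(): 0, frozenset("A"): 2, frozenset("B"): 3,
           frozenset("AB"): 7}
print(shapley_values(["A", "B"],
                     lambda s: revenue[frozenset("".join(sorted(s)))]))
# {'A': 3.0, 'B': 4.0} -- each gets its stand-alone value plus half the surplus
```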

  3. RNAontheBENCH: computational and empirical resources for benchmarking RNAseq quantification and differential expression methods

    KAUST Repository

    Germain, Pierre-Luc

    2016-06-20

    RNA sequencing (RNAseq) has become the method of choice for transcriptome analysis, yet no consensus exists as to the most appropriate pipeline for its analysis, with current benchmarks suffering from important limitations. Here, we address these challenges through a rich benchmarking resource harnessing (i) two RNAseq datasets including ERCC ExFold spike-ins; (ii) Nanostring measurements of a panel of 150 genes on the same samples; (iii) a set of internal, genetically-determined controls; (iv) a reanalysis of the SEQC dataset; and (v) a focus on relative quantification (i.e. across-samples). We use this resource to compare different approaches to each step of RNAseq analysis, from alignment to differential expression testing. We show that methods providing the best absolute quantification do not necessarily provide good relative quantification across samples, that count-based methods are superior for gene-level relative quantification, and that the new generation of pseudo-alignment-based software performs as well as established methods, at a fraction of the computing time. We also assess the impact of library type and size on quantification and differential expression analysis. Finally, we have created an R package and a web platform to enable the simple and streamlined application of this resource to the benchmarking of future methods.

  4. RNAontheBENCH: computational and empirical resources for benchmarking RNAseq quantification and differential expression methods.

    Science.gov (United States)

    Germain, Pierre-Luc; Vitriolo, Alessandro; Adamo, Antonio; Laise, Pasquale; Das, Vivek; Testa, Giuseppe

    2016-06-20

    RNA sequencing (RNAseq) has become the method of choice for transcriptome analysis, yet no consensus exists as to the most appropriate pipeline for its analysis, with current benchmarks suffering from important limitations. Here, we address these challenges through a rich benchmarking resource harnessing (i) two RNAseq datasets including ERCC ExFold spike-ins; (ii) Nanostring measurements of a panel of 150 genes on the same samples; (iii) a set of internal, genetically-determined controls; (iv) a reanalysis of the SEQC dataset; and (v) a focus on relative quantification (i.e. across-samples). We use this resource to compare different approaches to each step of RNAseq analysis, from alignment to differential expression testing. We show that methods providing the best absolute quantification do not necessarily provide good relative quantification across samples, that count-based methods are superior for gene-level relative quantification, and that the new generation of pseudo-alignment-based software performs as well as established methods, at a fraction of the computing time. We also assess the impact of library type and size on quantification and differential expression analysis. Finally, we have created an R package and a web platform to enable the simple and streamlined application of this resource to the benchmarking of future methods.

  5. P300 brain computer interface: current challenges and emerging trends

    Directory of Open Access Journals (Sweden)

    Reza eFazel-Rezai

    2012-07-01

    A brain-computer interface (BCI) enables communication without movement, based on brain signals measured with electroencephalography (EEG). BCIs usually rely on one of three types of signals: the P300 and other components of the event-related potential (ERP), the steady-state visual evoked potential (SSVEP), or event-related desynchronization (ERD). Although P300 BCIs were introduced over twenty years ago, the past few years have seen a strong increase in P300 BCI research. This closed-loop BCI approach relies on the P300 and other ERP components elicited by an oddball paradigm presented to the subject. In this paper, we review the current status of P300 BCI technology and then discuss new directions: paradigms for eliciting P300s; signal processing methods; applications; and hybrid BCIs. We conclude that P300 BCIs are quite promising, as several emerging directions have not yet been fully explored and could lead to improvements in bit rate, reliability, usability, and flexibility.

  6. Computed tomography: acquisition process, technology and current state

    Directory of Open Access Journals (Sweden)

    Óscar Javier Espitia Mendoza

    2016-02-01

    Computed tomography is a noninvasive scanning technique widely applied in areas such as medicine, industry, and geology. This technique allows the three-dimensional reconstruction of the internal structure of an object which is illuminated with an X-ray source. The reconstruction is formed from two-dimensional cross-sectional images of the object. Each cross-section is obtained from measurements of physical phenomena, such as attenuation, dispersion, and diffraction of X-rays, resulting from their interaction with the object. In general, measurement acquisition is performed with methods based on any of these phenomena and according to various architectures classified into generations. Furthermore, in response to the need to simulate acquisition systems for CT, software dedicated to this task has been developed. The objective of this work is to determine the current state of CT techniques; to this end, a review of methods, of the different architectures used for acquisition, and of some applications is presented, together with simulation results. The main contributions of this work are the detailed description of acquisition methods and the presentation of possible trends of the technique.

  7. Quantum computing with incoherent resources and quantum jumps.

    Science.gov (United States)

    Santos, M F; Cunha, M Terra; Chaves, R; Carvalho, A R R

    2012-04-27

    Spontaneous emission and the inelastic scattering of photons are two natural processes usually associated with decoherence and the reduction in the capacity to process quantum information. Here we show that, when suitably detected, these photons are sufficient to build all the fundamental blocks needed to perform quantum computation in the emitting qubits while protecting them from deleterious dissipative effects. We exemplify this by showing how to efficiently prepare graph states for the implementation of measurement-based quantum computation.

  8. iTools: a framework for classification, categorization and integration of computational biology resources.

    Science.gov (United States)

    Dinov, Ivo D; Rubin, Daniel; Lorensen, William; Dugan, Jonathan; Ma, Jeff; Murphy, Shawn; Kirschner, Beth; Bug, William; Sherman, Michael; Floratos, Aris; Kennedy, David; Jagadish, H V; Schmidt, Jeanette; Athey, Brian; Califano, Andrea; Musen, Mark; Altman, Russ; Kikinis, Ron; Kohane, Isaac; Delp, Scott; Parker, D Stott; Toga, Arthur W

    2008-05-28

    The advancement of the computational biology field hinges on progress in three fundamental directions--the development of new computational algorithms, the availability of informatics resource management infrastructures and the capability of tools to interoperate and synergize. There is an explosion in algorithms and tools for computational biology, which makes it difficult for biologists to find, compare and integrate such resources. We describe a new infrastructure, iTools, for managing the query, traversal and comparison of diverse computational biology resources. Specifically, iTools stores information about three types of resources--data, software tools and web-services. The iTools design, implementation and resource meta-data content reflect the broad research, computational, applied and scientific expertise available at the seven National Centers for Biomedical Computing. iTools provides a system for classification, categorization and integration of different computational biology resources across space-and-time scales, biomedical problems, computational infrastructures and mathematical foundations. A large number of resources are already iTools-accessible to the community and this infrastructure is rapidly growing. iTools includes human and machine interfaces to its resource meta-data repository. Investigators or computer programs may utilize these interfaces to search, compare, expand, revise and mine meta-data descriptions of existent computational biology resources. We propose two ways to browse and display the iTools dynamic collection of resources. The first one is based on an ontology of computational biology resources, and the second one is derived from hyperbolic projections of manifolds or complex structures onto planar discs. iTools is an open source project both in terms of the source code development as well as its meta-data content. iTools employs a decentralized, portable, scalable and lightweight framework for long-term resource management

  9. iTools: a framework for classification, categorization and integration of computational biology resources.

    Directory of Open Access Journals (Sweden)

    Ivo D Dinov

    The advancement of the computational biology field hinges on progress in three fundamental directions--the development of new computational algorithms, the availability of informatics resource management infrastructures and the capability of tools to interoperate and synergize. There is an explosion in algorithms and tools for computational biology, which makes it difficult for biologists to find, compare and integrate such resources. We describe a new infrastructure, iTools, for managing the query, traversal and comparison of diverse computational biology resources. Specifically, iTools stores information about three types of resources--data, software tools and web-services. The iTools design, implementation and resource meta-data content reflect the broad research, computational, applied and scientific expertise available at the seven National Centers for Biomedical Computing. iTools provides a system for classification, categorization and integration of different computational biology resources across space-and-time scales, biomedical problems, computational infrastructures and mathematical foundations. A large number of resources are already iTools-accessible to the community and this infrastructure is rapidly growing. iTools includes human and machine interfaces to its resource meta-data repository. Investigators or computer programs may utilize these interfaces to search, compare, expand, revise and mine meta-data descriptions of existent computational biology resources. We propose two ways to browse and display the iTools dynamic collection of resources. The first one is based on an ontology of computational biology resources, and the second one is derived from hyperbolic projections of manifolds or complex structures onto planar discs. iTools is an open source project both in terms of the source code development as well as its meta-data content. iTools employs a decentralized, portable, scalable and lightweight framework for long

  10. Optimal Computing Resource Management Based on Utility Maximization in Mobile Crowdsourcing

    Directory of Open Access Journals (Sweden)

    Haoyu Meng

    2017-01-01

    Mobile crowdsourcing, as an emerging service paradigm, enables a computing resource requestor (CRR) to outsource computation tasks to computing resource providers (CRPs). Considering the importance of pricing as an essential incentive to coordinate the real-time interaction between the CRR and CRPs, in this paper we propose an optimal real-time pricing strategy for computing resource management in mobile crowdsourcing. Firstly, we analytically model the behaviors of the CRR and CRPs in the form of carefully selected utility and cost functions, based on concepts from microeconomics. Secondly, we propose a distributed algorithm based on the exchange of control messages, which carry information on computing resource demand/supply and real-time prices. We show that there exist real-time prices that can align individual optimality with systemic optimality. Finally, we also take into account the interaction among CRPs and formulate computing resource management as a game whose Nash equilibrium is achievable via best response. Simulation results demonstrate that the proposed distributed algorithm can benefit both the CRR and the CRPs. The coordinator in mobile crowdsourcing can thus use the optimal real-time pricing strategy to manage computing resources for the benefit of the overall system.
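
    The message exchange described (providers report supply, the requestor reports demand, and the price moves to balance them) has the flavor of a tâtonnement/dual update. A minimal sketch follows, under assumed quadratic utility and cost functions rather than the paper's exact ones:

```python
# Iterative price adjustment: the coordinator raises the price when computing
# demand exceeds supply and lowers it otherwise, until the market clears.
a = 10.0                 # assumed CRR utility parameter: U(d) = a*d - d^2/2 - p*d
costs = [1.0, 2.0, 4.0]  # assumed CRP cost parameters: C_i(s) = c_i*s^2/2

def demand(p):      # CRR's best response: maximize U(d)  ->  d = a - p
    return max(a - p, 0.0)

def supply(p, c):   # CRP i's best response: maximize p*s - C_i(s)  ->  s = p/c
    return p / c

p, step = 1.0, 0.1
for _ in range(500):
    excess = demand(p) - sum(supply(p, c) for c in costs)
    p += step * excess           # price follows excess demand
print(f"clearing price ~ {p:.3f}, demand ~ {demand(p):.3f}")
```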

  11. Assessing attitudes toward computers and the use of Internet resources among undergraduate microbiology students

    Science.gov (United States)

    Anderson, Delia Marie Castro

    Computer literacy and use have become commonplace in our colleges and universities. In an environment that demands the use of technology, educators should be knowledgeable of the components that make up the overall computer attitude of students and be willing to investigate the processes and techniques of effective teaching and learning that can take place with computer technology. The purpose of this study is twofold. First, it investigates the relationship between computer attitudes and gender, ethnicity, and computer experience. Second, it addresses the question of whether, and to what extent, students' attitudes toward computers change over a 16-week period in an undergraduate microbiology course that supplements the traditional lecture with computer-driven assignments. Multiple regression analyses, using data from the Computer Attitudes Scale (Loyd & Loyd, 1985), showed that, in the experimental group, no significant relationships were found between computer anxiety and gender or ethnicity, or between computer confidence and gender or ethnicity. However, students who had used computers the longest (p = .001) and who were self-taught (p = .046) had the lowest computer anxiety levels. Likewise, students who had used computers the longest (p = .001) and who were self-taught (p = .041) had the highest confidence levels. No significant relationships between computer liking, usefulness, or the use of Internet resources and gender, ethnicity, or computer experience were found. Dependent t-tests were performed to determine whether computer attitude scores (pretest and posttest) increased over a 16-week period for students who had been exposed to computer-driven assignments and other Internet resources. Results showed that students in the experimental group were less anxious about working with computers and considered computers to be more useful. In the control group, no significant changes in computer anxiety, confidence, liking, or usefulness were noted. Overall, students in

  12. Research on the digital education resources of sharing pattern in independent colleges based on cloud computing environment

    Science.gov (United States)

    Xiong, Ting; He, Zhiwen

    2017-06-01

    Cloud computing, first proposed by Google in the United States, is an Internet-centered approach that provides a standard, open network sharing service. With the rapid development of higher education in China, the educational resources provided by colleges and universities fall far short of actual teaching needs. Cloud computing, which uses Internet technology to provide shared services, has therefore become an important means of sharing digital education resources in current higher education. Based on the cloud computing environment, this paper analyzes the existing problems in sharing digital educational resources among the independent colleges of Jiangxi Province. Considering cloud computing's characteristics of mass storage, efficient operation and low cost, the authors explore the design of a sharing model for the digital educational resources of higher education in independent colleges. Finally, the design of the shared model was put into practical application.

  13. Justification of Filter Selection for Robot Balancing in Conditions of Limited Computational Resources

    Science.gov (United States)

    Momot, M. V.; Politsinskaia, E. V.; Sushko, A. V.; Semerenko, I. A.

    2016-08-01

    The paper considers the problem of selecting a mathematical filter for balancing a wheeled robot under conditions of limited computational resources. A solution based on a complementary filter is proposed.
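
    A complementary filter of the kind proposed fuses a gyroscope rate (reliable at high frequency) with an accelerometer tilt estimate (reliable at low frequency) using only a couple of multiply-adds per step, which is what makes it attractive on constrained hardware. A minimal sketch, with the 0.98 blend factor chosen for illustration:

```python
import math

def complementary_filter(angle, gyro_rate, ax, az, dt, alpha=0.98):
    """One update step: integrate the gyro, then correct the drift with
    the low-frequency tilt angle derived from the accelerometer."""
    accel_angle = math.atan2(ax, az)               # tilt from the gravity vector
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle

# Example: 100 Hz loop, robot held at a constant 0.1 rad tilt, gyro silent.
angle = 0.0
for _ in range(200):
    angle = complementary_filter(angle, gyro_rate=0.0,
                                 ax=math.sin(0.1), az=math.cos(0.1), dt=0.01)
print(f"estimated tilt ~ {angle:.3f} rad")  # converges toward 0.1
```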

  14. Relaxed resource advance reservation policy in grid computing

    Institute of Scientific and Technical Information of China (English)

    XIAO Peng; HU Zhi-gang

    2009-01-01

    The advance reservation technique has been widely applied in many grid systems to provide end-to-end quality of service (QoS). However, it results in a low resource utilization rate and a high rejection rate when the reservation rate is high. To mitigate these negative effects of advance reservation, a relaxed advance reservation policy is proposed, which allows new reservation requests that overlap existing reservations to be accepted under certain conditions. Both the benefits and the risks of the proposed policy are presented theoretically. The experimental results show that the policy achieves a higher resource utilization rate and a lower rejection rate than the conventional reservation policy and backfilling technique. In addition, the policy adapts better when grid systems face a high reservation rate.
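
    The core of such a policy is the admission test: an overlapping request is still accepted if, at every instant of the overlap, the summed demand stays within capacity. A minimal sketch of that test follows; the paper's actual acceptance condition may add further terms.

```python
def accept(new, existing, capacity):
    """new and existing reservations are (start, end, demand) tuples.
    Accept if total demand never exceeds capacity during the new slot."""
    events = sorted({new[0], new[1]} |
                    {t for r in existing for t in (r[0], r[1])})
    for t in events:
        if new[0] <= t < new[1]:                     # instants inside the new slot
            load = new[2] + sum(r[2] for r in existing if r[0] <= t < r[1])
            if load > capacity:
                return False
    return True

booked = [(0, 10, 4), (5, 15, 3)]
print(accept((8, 12, 3), booked, capacity=10))   # True: peak load 10 <= 10
print(accept((8, 12, 4), booked, capacity=10))   # False: peak load 11 > 10
```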

  15. Research on Digital Agricultural Information Resources Sharing Plan Based on Cloud Computing

    OpenAIRE

    2011-01-01

    In order to provide agricultural workers with a customized, visual, multi-perspective and multi-level active service, we conduct research on a digital agricultural information resources sharing plan based on cloud computing, to integrate and publish digital agricultural information resources efficiently and in a timely manner. Based on cloud computing and virtualization technology, w...

  16. Regional research exploitation of the LHC a case-study of the required computing resources

    CERN Document Server

    Almehed, S; Eerola, Paule Anna Mari; Mjörnmark, U; Smirnova, O G; Zacharatou-Jarlskog, C; Åkesson, T

    2002-01-01

    A simulation study to evaluate the required computing resources for research exploitation of the Large Hadron Collider (LHC) has been performed. The evaluation was done as a case study, assuming the existence of a Nordic regional centre and using the requirements for performing a specific physics analysis as a yardstick. Other input parameters were: assumptions for the distribution of researchers at the institutions involved, an analysis model, and two different functional structures of the computing resources.

  17. Efficient Qos Based Resource Scheduling Using PAPRIKA Method for Cloud Computing

    Directory of Open Access Journals (Sweden)

    Hilda Lawrance

    2013-03-01

    Cloud computing is increasingly being used in enterprises and business markets to serve demanding jobs. The performance of resource scheduling in cloud computing is important due to the increase in the number of users, services and types of services. Resource scheduling is influenced by many factors such as CPU speed, memory and bandwidth, and can therefore be modeled as a multi-criteria decision making problem. This study proposes an efficient QoS-based resource scheduling algorithm using potentially all pairwise rankings of all possible alternatives (PAPRIKA). The tasks are arranged based on the QoS parameters, and the resources are allocated to the appropriate tasks based on the PAPRIKA method and user satisfaction. The scheduling algorithm was simulated with the CloudSim tool package. The experiments show that the algorithm reduces task completion time and improves the resource utilization rate.

  18. Economic models for management of resources in peer-to-peer and grid computing

    Science.gov (United States)

    Buyya, Rajkumar; Stockinger, Heinz; Giddy, Jonathan; Abramson, David

    2001-07-01

    The accelerated development of Peer-to-Peer (P2P) and Grid computing has positioned them as promising next generation computing platforms. They enable the creation of Virtual Enterprises (VE) for sharing resources distributed across the world. However, resource management, application development and usage models in these environments are a complex undertaking. This is due to the geographic distribution of resources that are owned by different organizations or peers. The owners of these resources have different usage or access policies and cost models, and varying loads and availability. In order to address complex resource management issues, we have proposed a computational economy framework for resource allocation and for regulating supply and demand in Grid computing environments. The framework provides mechanisms for optimizing resource provider and consumer objective functions through trading and brokering services. In a real-world market, there exist various economic models for setting the price of goods based on supply and demand and their value to the user. They include the commodity market, posted price, tenders and auctions. In this paper, we discuss the use of these models for interaction between Grid components in deciding resource value, and the infrastructure necessary to realize them. In addition to the normal services offered by Grid computing systems, we need an infrastructure to support interaction protocols, allocation mechanisms, currency, secure banking, and enforcement services. Furthermore, we demonstrate the usage of some of these economic models in resource brokering through Nimrod/G deadline- and cost-based scheduling for two different optimization strategies on the World Wide Grid (WWG) testbed, which contains peer-to-peer resources located on five continents: Asia, Australia, Europe, North America, and South America.
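
    Of the pricing models listed, an auction is the simplest to make concrete: consumers bid for a resource slot, the highest bidder wins, and in the second-price (Vickrey) variant pays the second-highest bid, which keeps truthful bidding optimal. A minimal sketch, illustrative only and not the Nimrod/G mechanism:

```python
def second_price_auction(bids):
    """bids: {consumer: price offered for the resource slot}.
    Returns (winner, price paid) under Vickrey second-price rules."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner, _ = ranked[0]
    price = ranked[1][1] if len(ranked) > 1 else ranked[0][1]
    return winner, price

bids = {"jobA": 0.12, "jobB": 0.30, "jobC": 0.25}   # $/CPU-hour offers
print(second_price_auction(bids))                    # ('jobB', 0.25)
```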

  19. System Impacts from Interconnection of Distributed Resources: Current Status and Identification of Needs for Further Development

    Energy Technology Data Exchange (ETDEWEB)

    Basso, T. S.

    2009-01-01

    This report documents and evaluates system impacts from the interconnection of distributed resources to transmission and distribution systems, including a focus on renewable distributed resource technologies. The report also identifies system impact-resolution approaches and actions, including extensions of existing approaches. Lastly, the report documents the current challenges and examines what is needed to gain a clearer understanding of what to pursue to better avoid or address system impact issues.

  20. Multi-Programmatic and Institutional Computing Capacity Resource Attachment 2 Statement of Work

    Energy Technology Data Exchange (ETDEWEB)

    Seager, M

    2002-04-15

    Lawrence Livermore National Laboratory (LLNL) has identified high-performance computing as a critical competency necessary to meet the goals of LLNL's scientific and engineering programs. Leadership in scientific computing demands the availability of a stable, powerful, well-balanced computational infrastructure, and it requires research directed at advanced architectures, enabling numerical methods and computer science. To encourage all programs to benefit from the huge investment being made by the Advanced Simulation and Computing Program (ASCI) at LLNL, and to provide a mechanism to facilitate multi-programmatic leveraging of resources and access to high-performance equipment by researchers, M&IC was created. The Livermore Computing (LC) Center, part of the Computations Directorate Integrated Computing and Communications (ICC) Department, can be viewed as composed of two facilities, one open and one secure. This acquisition is focused on the M&IC resources in the Open Computing Facility (OCF). For the M&IC program, recent efforts and expenditures have focused on enhancing capacity and stabilizing the TeraCluster 2000 (TC2K) resource. Capacity is a measure of the ability to process a varied workload from many scientists simultaneously. Capability represents the ability to deliver a very large system to run scientific calculations at large scale. In this procurement action, we intend to significantly increase the capability of the M&IC resource to address multiple teraFLOP/s problems, as well as increasing the capacity to do many 100 gigaFLOP/s calculations.

  1. Computers and Resource-Based History Teaching: A UK Perspective.

    Science.gov (United States)

    Spaeth, Donald A.; Cameron, Sonja

    2000-01-01

    Presents an overview of developments in computer-aided history teaching for higher education in the United Kingdom and the United States. Explains that these developments have focused on providing students with access to primary sources to enhance their understanding of historical methods and content. (CMK)

  2. The current state of the creation and modernization of national geodetic and cartographic resources in Poland

    Directory of Open Access Journals (Sweden)

    Doskocz Adam

    2016-01-01

    All official data are currently integrated and harmonized in a spatial reference system. This paper outlines the national geodetic and cartographic resources in Poland. The national geodetic and cartographic resources are an important part of the spatial information infrastructure in the European Community. They also provide reference data for other resources of the Spatial Data Infrastructure (SDI), including: main and detailed geodetic control networks, base maps, land and buildings registries, geodetic registries of utilities, and topographic maps. This paper presents methods of producing digital map data and technical standards for field surveys; in addition, the paper presents some aspects of building global and regional SDI.

  3. General-purpose computer networks and resource sharing in ERDA. Volume 3. Remote resource-sharing experience and findings

    Energy Technology Data Exchange (ETDEWEB)

    1977-07-15

    The investigation focused on heterogeneous networks in which a variety of dissimilar computers and operating systems were interconnected nationwide. Homogeneous networks, such as MFE net and SACNET, were not considered since they could not be used for general purpose resource sharing. Issues of privacy and security are of concern in any network activity. However, consideration of privacy and security of sensitive data arise to a much lesser degree in unclassified scientific research than in areas involving personal or proprietary information. Therefore, the existing mechanisms at individual sites for protecting sensitive data were relied on, and no new protection mechanisms to prevent infringement of privacy and security were attempted. Further development of ERDA networking will need to incorporate additional mechanisms to prevent infringement of privacy. The investigation itself furnishes an excellent example of computational resource sharing through a heterogeneous network. More than twenty persons, representing seven ERDA computing sites, made extensive use of both ERDA and non-ERDA computers in coordinating, compiling, and formatting the data which constitute the bulk of this report. Volume 3 analyzes the benefits and barriers encountered in actual resource sharing experience, and provides case histories of typical applications.

  5. Adaptive workflow scheduling in grid computing based on dynamic resource availability

    Directory of Open Access Journals (Sweden)

    Ritu Garg

    2015-06-01

    Grid computing enables large-scale resource sharing and collaboration for solving advanced science and engineering applications. Central to grid computing is the scheduling of application tasks to the resources. Various strategies have been proposed, including static and dynamic strategies. The former schedules tasks to resources before the actual execution time, while the latter schedules them at execution time. Static scheduling performs better, but it is not suitable for the dynamic grid environment. The lack of dedicated resources and the variation in their availability at run time make this scheduling a great challenge. In this study, we propose an adaptive approach, based on rescheduling, for assigning workflow tasks (dependent tasks) to dynamic grid resources. It deals with the heterogeneous dynamic grid environment, where fluctuations in the availability of computing nodes and link bandwidth are inevitable due to local load or load from other users. The proposed adaptive workflow scheduling (AWS) approach involves initial static scheduling, resource monitoring, and rescheduling, with the aim of achieving the minimum execution time for the workflow application. The approach differs from other techniques in the literature in that it considers changes in resource (host and link) availability and the impact of existing load on the grid resources. The simulation results using randomly generated task graphs and task graphs corresponding to real-world problems (GE and FFT) demonstrate that the proposed algorithm is able to deal with fluctuations in resource availability and provides overall optimal performance.
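
    The rescheduling core reduces to a monitor-and-remap loop: keep the static plan while observed resource speeds stay close to the assumed ones, and remap unfinished tasks when the deviation crosses a threshold. A minimal sketch, where a greedy earliest-finish remapping (ignoring task dependencies) stands in for the paper's full AWS heuristic:

```python
def remap(unfinished, resources):
    """Greedy earliest-finish-time assignment of remaining task workloads
    (in operations) to resources with currently observed speeds (ops/s)."""
    finish = {r: 0.0 for r in resources}            # available time per resource
    plan = {}
    for task, work in sorted(unfinished.items(), key=lambda kv: -kv[1]):
        best = min(resources, key=lambda r: finish[r] + work / resources[r])
        finish[best] += work / resources[best]
        plan[task] = best
    return plan

assumed  = {"r1": 100.0, "r2": 100.0}
observed = {"r1": 100.0, "r2": 40.0}                # r2 slowed by local load
THRESHOLD = 0.3                                     # 30% deviation triggers remap
if any(abs(observed[r] - assumed[r]) / assumed[r] > THRESHOLD for r in assumed):
    print(remap({"t1": 900, "t2": 600, "t3": 300}, observed))
```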

  6. Computer simulation of transport driven current in tokamaks

    Energy Technology Data Exchange (ETDEWEB)

    Nunan, W.J.; Dawson, J.M. (University of California at Los Angeles, Department of Physics, 405 Hilgard Avenue, Los Angeles, California 90024-1547 (United States))

    1994-09-19

    We have investigated transport driven current in tokamaks via 2+1/2 dimensional, electromagnetic, particle-in-cell simulations. These have demonstrated a steady increase of toroidal current in centrally fueled plasmas. Neoclassical theory predicts that the bootstrap current vanishes at large aspect ratio, but we see equal or greater current growth in straight cylindrical plasmas. These results indicate that a centrally fueled and heated tokamak may sustain its toroidal current, even without the "seed current" which the neoclassical bootstrap theory requires.

  7. Quantum Computing Resource Estimate of Molecular Energy Simulation

    CERN Document Server

    Whitfield, James D; Aspuru-Guzik, Alán

    2010-01-01

    Over the last century, ingenious physical and mathematical insights paired with rapidly advancing technology have allowed the field of quantum chemistry to advance dramatically. However, efficient methods for the exact simulation of quantum systems on classical computers do not exist. The present paper reports an extension of one of the authors' previous work [Aspuru-Guzik et al., Science 309, p. 1704 (2005)] where it was shown that the chemical Hamiltonian can be efficiently simulated using a quantum computer. In particular, we report in detail how a set of molecular integrals can be used to create a quantum circuit that allows the energy of a molecular system with fixed nuclear geometry to be extracted using the phase estimation algorithm proposed by Abrams and Lloyd [Phys. Rev. Lett. 83, p. 5165 (1999)]. We extend several known results related to this idea and present numerical examples of the state preparation procedure required in the algorithm. With future quantum devices in mind, we provide a compl...

  8. MCPLOTS. A particle physics resource based on volunteer computing

    Energy Technology Data Exchange (ETDEWEB)

    Karneyeu, A. [Joint Inst. for Nuclear Research, Moscow (Russian Federation); Mijovic, L. [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Irfu/SPP, CEA-Saclay, Gif-sur-Yvette (France); Prestel, S. [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Lund Univ. (Sweden). Dept. of Astronomy and Theoretical Physics; Skands, P.Z. [European Organization for Nuclear Research (CERN), Geneva (Switzerland)

    2013-07-15

    The mcplots.cern.ch web site (MCPLOTS) provides a simple online repository of plots made with high-energy-physics event generators, comparing them to a wide variety of experimental data. The repository is based on the HEPDATA online database of experimental results and on the RIVET Monte Carlo analysis tool. The repository is continually updated and relies on computing power donated by volunteers, via the LHC@HOME 2.0 platform.

  9. MCPLOTS: a particle physics resource based on volunteer computing

    CERN Document Server

    Karneyeu, A; Prestel, S; Skands, P Z

    2014-01-01

    The mcplots.cern.ch web site (MCPLOTS) provides a simple online repository of plots made with high-energy-physics event generators, comparing them to a wide variety of experimental data. The repository is based on the HEPDATA online database of experimental results and on the RIVET Monte Carlo analysis tool. The repository is continually updated and relies on computing power donated by volunteers, via the LHC@HOME platform.

  10. Computer Simulation of Transport Driven Current in Tokamaks

    Science.gov (United States)

    Nunan, William Joseph, III

    1995-01-01

    Plasma transport phenomena can drive large currents parallel to an externally applied magnetic field. The Bootstrap Current Theory accounts for the effect of Banana Diffusion on toroidal current, but the effect is not confined to that transport regime, or even to toroidal geometry. Our electromagnetic particle simulations have demonstrated that Maxwellian plasmas in static toroidal and vertical fields spontaneously develop significant toroidal current, even in the absence of the "seed current" which the Bootstrap Theory requires. Other simulations, in both cylindrical and toroidal geometries, and without any externally imposed electric field, show that if the plasma column is centrally fueled, then an initial toroidal current grows steadily, apparently due to a dynamo effect. The straight cylinder does not exhibit kink instabilities because k_z = 0 in this 2+1/2 dimensional model. When the plasma is fueled at the edge rather than the center, the effect is diminished. Fueling at an intermediate radius should produce a level of current drive in between these two limits, because the key to the current drive seems to be the amount of total poloidal flux which the plasma crosses in the process of escaping. In a reactor, injected (cold) fuel ions must reach the center and be heated up in order to burn; therefore, central fueling is needed anyway, and the resulting influx of cold plasma and outflux of hot plasma drives the toroidal current. Our simulations indicate that central fueling, coupled with the central heating due to fusion reactions, may provide all of the required toroidal current. The Neoclassical Theory predicts that the Bootstrap Current approaches zero as the aspect ratio approaches infinity; however, in straight cylindrical plasma simulations, axial current increases over time at nearly the same rate as in the toroidal case. These results indicate that a centrally fueled and heated tokamak may sustain its own toroidal current, even in the absence of

  11. A Resource Scheduling Strategy in Cloud Computing Based on Multi-agent Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Wuxue Jiang

    2013-11-01

    Resource scheduling strategies in cloud computing are used either to improve system operating efficiency or to improve user satisfaction. This paper presents an integrated scheduling strategy considering both resource credibility and user satisfaction. It takes user satisfaction as the objective function, treats resource credibility as a component of user satisfaction, and realizes optimal scheduling using a genetic algorithm. We subsequently integrate this scheduling strategy into agents and propose a cloud computing system architecture based on multi-agents. The numerical results show that this scheduling strategy improves not only the system operating efficiency but also user satisfaction.
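
    A chromosome in such a scheme is simply a task-to-resource assignment vector, with fitness given by the satisfaction function. A minimal sketch follows, under an assumed fitness that weighs makespan against resource credibility; the paper's exact formulation differs.

```python
import random

TASKS, RESOURCES = 8, 3
work  = [4, 7, 3, 9, 5, 2, 8, 6]           # task sizes
speed = [2.0, 1.5, 1.0]                    # resource speeds
cred  = [0.9, 0.7, 0.8]                    # assumed resource credibility scores

def fitness(chrom):
    loads = [0.0] * RESOURCES
    for t, r in enumerate(chrom):
        loads[r] += work[t] / speed[r]
    avg_cred = sum(cred[r] for r in chrom) / TASKS
    return avg_cred / (1.0 + max(loads))   # higher credibility, lower makespan

def evolve(pop_size=30, gens=100, mut=0.1):
    pop = [[random.randrange(RESOURCES) for _ in range(TASKS)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]                 # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, TASKS)           # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < mut:                  # mutation
                child[random.randrange(TASKS)] = random.randrange(RESOURCES)
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(best, f"fitness={fitness(best):.3f}")
```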

  12. Mobile Cloud Computing: Resource Discovery, Session Connectivity and Other Open Issues

    NARCIS (Netherlands)

    Schüring, Markus; Karagiannis, Georgios

    2011-01-01

    Cloud computing can be considered as a model that provides network access to a shared pool of resources, such as storage and computing power, which can be rapidly provisioned and released with minimal management effort. This paper describes a research activity in the area of mobile cloud co

  13. Young Children's Exploration of Semiotic Resources during Unofficial Computer Activities in the Classroom

    Science.gov (United States)

    Bjorkvall, Anders; Engblom, Charlotte

    2010-01-01

    The article describes and discusses the learning potential of unofficial techno-literacy activities in the classroom with regards to Swedish 7-8-year-olds' exploration of semiotic resources when interacting with computers. In classroom contexts where every child works with his or her own computer, such activities tend to take up a substantial…

  14. The portability of computer-related educational resources : summary and directions for further research

    NARCIS (Netherlands)

    De Diana, Italo; Collis, Betty A.

    1990-01-01

    In this Special Issue of the Journal of Research on Computing in Education, the portability of computer-related educational resources has been examined by a number of researchers and practitioners, reflecting various backgrounds, cultures, and experiences. A first iteration of a general model of fac

  15. Orchestrating the XO Computer with Digital and Conventional Resources to Teach Mathematics

    Science.gov (United States)

    Díaz, A.; Nussbaum, M.; Varela, I.

    2015-01-01

    Recent research has suggested that simply providing each child with a computer does not lead to an improvement in learning. Given that dozens of countries across the world are purchasing computers for their students, we ask which elements are necessary to improve learning when introducing digital resources into the classroom. Understood the…

  17. A Comparative Study on Resource Allocation Policies in Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Bhavani B H

    2015-11-01

    Cloud computing is one of the latest models for sharing a pool of resources, such as CPU, memory, network bandwidth and hard drive space, over the Internet. These resources are requested by the cloud user and are used on a rented basis, just like electricity, water or LPG. When requests are made by the cloud user, allocation has to be done by the cloud service provider. With the limited amount of resources available, resource allocation becomes a challenging task for the cloud service provider, as the resources have to be virtualized and allocated. These resources can be allocated dynamically or statically, depending on the type of request made by the cloud user and on the application. In this paper, a survey of both static and dynamic allocation techniques is made, and the two classes of techniques are compared.

  18. Positron Computed Tomography: Current State, Clinical Results and Future Trends

    Science.gov (United States)

    Schelbert, H. R.; Phelps, M. E.; Kuhl, D. E.

    1980-09-01

    An overview is presented of positron computed tomography: its advantages over single photon emission tomography, its use in metabolic studies of the heart and chemical investigation of the brain, and future trends. (ACR)

  19. Parallel Computational Fluid Dynamics: Current Status and Future Requirements

    Science.gov (United States)

    Simon, Horst D.; VanDalsem, William R.; Dagum, Leonardo; Kutler, Paul (Technical Monitor)

    1994-01-01

    One of the key objectives of the Applied Research Branch in the Numerical Aerodynamic Simulation (NAS) Systems Division at NASA Ames Research Center is the accelerated introduction of highly parallel machines into a full operational environment. In this report we discuss the performance results obtained from the implementation of some computational fluid dynamics (CFD) applications on the Connection Machine CM-2 and the Intel iPSC/860. We summarize some of the experience gained so far with the parallel testbed machines at the NAS Applied Research Branch. We then discuss the long-term computational requirements for accomplishing some of the grand challenge problems in computational aerosciences. We argue that only massively parallel machines will be able to meet these grand challenge requirements, and we outline the computer science and algorithm research challenges ahead.

  20. Ubiquitous Wireless Computing: Current Research Progress, Challenging, and Future Directions

    OpenAIRE

    Elyas, Palantei

    2014-01-01

    The aggressive research activities and numerous studies focusing on ubiquitous mobile computing carried out during the last two decades have produced tremendous outcomes, applied across broad areas of modern life. In the near future, this computing technology is highly likely to emerge as the dominant method of connecting objects to the global ICT infrastructure, the Internet. This talk mainly discusses several R&D achievements performed during the last five yea...

  1. Using High Performance Computing to Support Water Resource Planning

    Energy Technology Data Exchange (ETDEWEB)

    Groves, David G. [RAND Corporation, Santa Monica, CA (United States); Lembert, Robert J. [RAND Corporation, Santa Monica, CA (United States); May, Deborah W. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Leek, James R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Syme, James [RAND Corporation, Santa Monica, CA (United States)

    2015-10-22

    In recent years, decision support modeling has embraced deliberation-with-analysis, an iterative process in which decisionmakers come together with experts to evaluate a complex problem and alternative solutions in a scientifically rigorous and transparent manner. Simulation modeling supports decisionmaking throughout this process; visualizations enable decisionmakers to assess how proposed strategies stand up over time in uncertain conditions. But running these simulation models on standard computers can be slow. This, in turn, can slow the entire decisionmaking process, interrupting the valuable interaction between decisionmakers and analytics.

  2. Canada ocean energy atlas phase 1 : potential tidal current energy resources analysis background

    Energy Technology Data Exchange (ETDEWEB)

    Tarbotton, M.; Larson, M. [Triton Consultants Ltd., Vancouver, BC (Canada)

    2006-05-15

    This report was prepared as a background document for a preliminary tidal current resource inventory of Canadian waters. Energy calculations in the study were based on preliminary estimates of known tidal flows. The inventory was based on nautical charts, Canadian sailing directions, tide and tidal current constituent data, and numerical tidal modelling data. A finite element harmonic tidal model was used to provide tidal height and current velocity data for a varying number of tidal constituents. The study identified several major tidal current power resources throughout Canada. It was concluded that modelling studies should concentrate on the Minas Basin in Nova Scotia; Georgia and Johnstone Straits in British Columbia; and Hudson Strait and Ungava Bay. Modelling studies should provide estimates of extractable energy as well as initial assessments of the environmental impacts of tidal energy extraction in all 3 regions. 3 refs., 8 tabs., 16 figs.
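
    First-cut energy numbers behind such an inventory come from the kinetic power density of a free stream, P = 0.5 * rho * A * v^3. A minimal sketch with illustrative channel values (not the report's figures):

```python
# Kinetic power available in a tidal stream: P = 0.5 * rho * A * v^3.
rho = 1025.0        # seawater density, kg/m^3
area = 500.0        # swept cross-section, m^2 (illustrative)
speed = 2.5         # current speed, m/s (illustrative)

power_watts = 0.5 * rho * area * speed ** 3
print(f"available power ~ {power_watts / 1e6:.2f} MW")  # ~4.0 MW before losses
```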

  3. ARMS: An Agent-Based Resource Management System for Grid Computing

    Directory of Open Access Journals (Sweden)

    Junwei Cao

    2002-01-01

    Resource management is an important component of a grid computing infrastructure. The scalability and adaptability of such systems are two key challenges that must be addressed. In this work an agent-based resource management system, ARMS, is implemented for grid computing. ARMS utilises the performance prediction techniques of the PACE toolkit to provide quantitative data regarding the performance of complex applications running on a local grid resource. At the meta-level, a hierarchy of homogeneous agents is used to provide a scalable and adaptable abstraction of the system architecture. Each agent is able to cooperate with other agents and thereby provide service advertisement and discovery for the scheduling of applications that need to utilise grid resources. A case study with corresponding experimental results is included to demonstrate the efficiency of the resource management and scheduling system.

  4. PDBparam: Online Resource for Computing Structural Parameters of Proteins.

    Science.gov (United States)

    Nagarajan, R; Archana, A; Thangakani, A Mary; Jemimah, S; Velmurugan, D; Gromiha, M Michael

    2016-01-01

    Understanding the structure-function relationship in proteins is a longstanding goal in molecular and computational biology. The development of structure-based parameters has helped to relate the structure with the function of a protein. Although several structural features have been reported in the literature, no single server can calculate a wide-ranging set of structure-based features from protein three-dimensional structures. In this work, we have developed a web-based tool, PDBparam, for computing more than 50 structure-based features for any given protein structure. These features are classified into four major categories: (i) interresidue interactions, which include short-, medium-, and long-range interactions, contact order, long-range order, total contact distance, contact number, and multiple contact index, (ii) secondary structure propensities such as α-helical propensity, β-sheet propensity, and propensity of amino acids to exist at various positions of α-helix and amino acid compositions in high B-value regions, (iii) physicochemical properties containing ionic interactions, hydrogen bond interactions, hydrophobic interactions, disulfide interactions, aromatic interactions, surrounding hydrophobicity, and buriedness, and (iv) identification of binding site residues in protein-protein, protein-nucleic acid, and protein-ligand complexes. The server can be freely accessed at http://www.iitm.ac.in/bioinfo/pdbparam/. We suggest the use of PDBparam as an effective tool for analyzing protein structures.
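
    Many of the listed interresidue features reduce to simple loops over a contact map. As an example, relative contact order, the average sequence separation of contacting residues normalized by chain length, can be sketched as follows (a CA-CA distance cutoff of 8 Å is assumed; PDBparam's exact definitions may differ):

```python
import math

def relative_contact_order(ca_coords, cutoff=8.0):
    """ca_coords: list of (x, y, z) CA positions along the chain.
    RCO = (1 / (L * N)) * sum of |i - j| over all N contacting pairs."""
    n = len(ca_coords)
    total_sep, n_contacts = 0, 0
    for i in range(n):
        for j in range(i + 1, n):
            if math.dist(ca_coords[i], ca_coords[j]) <= cutoff:
                total_sep += j - i
                n_contacts += 1
    return total_sep / (n * n_contacts) if n_contacts else 0.0

# Toy chain: a straight 10-residue backbone spaced 3.8 A apart.
chain = [(3.8 * i, 0.0, 0.0) for i in range(10)]
print(f"RCO = {relative_contact_order(chain):.3f}")
```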

  5. Getting the Most from Distributed Resources With an Analytics Platform for ATLAS Computing Services

    CERN Document Server

    Vukotic, Ilija; The ATLAS collaboration; Bryant, Lincoln

    2016-01-01

    To meet a sharply increasing demand for computing resources for LHC Run 2, ATLAS distributed computing systems reach far and wide to gather CPU resources and storage capacity to execute an evolving ecosystem of production and analysis workflow tools. Indeed more than a hundred computing sites from the Worldwide LHC Computing Grid, plus many “opportunistic” facilities at HPC centers, universities, national laboratories, and public clouds, combine to meet these requirements. These resources have characteristics (such as local queuing availability, proximity to data sources and target destinations, network latency and bandwidth capacity, etc.) affecting the overall processing efficiency and throughput. To quantitatively understand and in some instances predict behavior, we have developed a platform to aggregate, index (for user queries), and analyze the more important information streams affecting performance. These data streams come from the ATLAS production system (PanDA), the distributed data management s...

  6. ATLAS Tier-2 at the Compute Resource Center GoeGrid in Göttingen

    CERN Document Server

    Meyer, J; The ATLAS collaboration; Weber, P

    2010-01-01

    GoeGrid is a grid resource center located in Göttingen, Germany. The resources are commonly used, funded, and maintained by communities doing research in the fields of grid development, computer science, biomedicine, high energy physics, theoretical physics, astrophysics, and the humanities. For the high energy physics community, GoeGrid serves as a Tier-2 center for the ATLAS experiment as part of the worldwide LHC computing grid (WLCG). The status and performance of the Tier-2 center will be presented, with a focus on the interdisciplinary setup and administration of the cluster. Given the various requirements of the different communities on the hardware and software setup, the challenge of the common operation of the cluster will be detailed. The benefits are an efficient use of computing and manpower resources. Further interdisciplinary projects include commonly organized courses for students of all fields to support education in grid computing.

  7. The current state of computing in building design and practise

    DEFF Research Database (Denmark)

    Sørensen, Lars Schiøtt; Andersen, Tom

    1996-01-01

    The paper outlines a general survey of computer use in the Danish AEC sector, including a detailed study of the use of knowledge-based systems. It is concluded that the use of AI-based technology is next to nothing, simply because of a lack of awareness of such technology.

  8. Argonne's Laboratory Computing Resource Center 2009 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R. B. (CLS-CI)

    2011-05-13

    Now in its seventh year of operation, the Laboratory Computing Resource Center (LCRC) continues to be an integral component of science and engineering research at Argonne, supporting a diverse portfolio of projects for the U.S. Department of Energy and other sponsors. The LCRC's ongoing mission is to enable and promote computational science and engineering across the Laboratory, primarily by operating computing facilities and supporting high-performance computing application use and development. This report describes scientific activities carried out with LCRC resources in 2009 and their broad impact on programs across the Laboratory. The LCRC computing facility, Jazz, is available to the entire Laboratory community. In addition, the LCRC staff provides training in high-performance computing and guidance on application usage, code porting, and algorithm development. All Argonne personnel and collaborators are encouraged to take advantage of this computing resource and to provide input into the vision and plans for computing and computational analysis at Argonne. The LCRC Allocations Committee makes decisions on individual project allocations for Jazz. Committee members are appointed by the Associate Laboratory Directors and span a range of computational disciplines. The 350-node LCRC cluster, Jazz, began production service in April 2003 and has been a research workhorse ever since. Hosting a wealth of software tools and applications and achieving high availability year after year, Jazz has enabled researchers to achieve project milestones and breakthroughs. Over the years, many projects have achieved results that would have been unobtainable without such a computing resource. In fiscal year 2009, there were 49 active projects representing a wide cross-section of Laboratory research and almost all research divisions.

  9. Using Digital Resources for the ECE Curriculum in China: Current Needs and Future Development

    Directory of Open Access Journals (Sweden)

    Jing Zhou

    2009-12-01

    Full Text Available Using digital resources is an important development in the Early Childhood Education (ECE) curriculum in China. Guo and Wang (2005) report that 98% of urban ECE programs have computers with Internet connections, in addition to other technical facilities, which are used daily in ECE classrooms. However, the lack of curriculum-related digital resources and of a network to share them makes it difficult for teachers to share these resources for teaching. Further, this development of digital resources for ECE should consider Chinese cultures of learning (Jin & Cortazzi, 2006) in order to meet the needs of Chinese learners and maximize the learning effect. This paper focuses on the major features of digital resources in Chinese ECE and a framework for developing the content through examining existing digital resources and materials. Methods of inquiry and evaluation include the use of focus groups of kindergarten teachers in different provinces in China. The findings (Chen & Zhou, 2009; Zhou & Chen, 2009) indicate that an ECE digital resource should have features of individualization, interaction, sharing and sociability in networking in Chinese educational contexts, for supporting teaching design, practice, evaluation and reflection. An effective framework of ECE digital resources is recommended to contain three key parts: (1) a Teacher Planning System to support teachers' information searches, classified according to themes, subjects or types of activities; (2) a Children's Learning System to offer interactive learning at school or home following the classroom curriculum; and (3) a Family Support System to involve parents in their children's learning and development.

  10. Software Defined Resource Orchestration System for Multitask Application in Heterogeneous Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Qi Qi

    2016-01-01

    Full Text Available The mobile cloud computing (MCC) that combines mobile computing and the cloud concept takes the wireless access network as the transmission medium and uses mobile devices as the client. When offloading a complicated multitask application to the MCC environment, each task executes individually in terms of its own computation, storage, and bandwidth requirements. Due to user mobility, the provided resources exhibit different performance metrics that may affect the destination choice. Nevertheless, these heterogeneous MCC resources lack integrated management and can hardly cooperate with each other. Thus, how to choose the appropriate offload destination and orchestrate the resources for multiple tasks is a challenging problem. This paper realizes programmable resource provisioning for heterogeneous energy-constrained computing environments, where a software-defined controller is responsible for resource orchestration, offload, and migration. The resource orchestration is formulated as a multiobjective optimization problem over the metrics of energy consumption, cost, and availability. Finally, a particle swarm algorithm is used to obtain approximate optimal solutions. Simulation results show that the solutions for all of the studied cases can almost reach the Pareto optimum and surpass the comparative algorithm in approximation, coverage, and execution time.
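
    The abstract names a particle swarm search over energy, cost, and availability but gives no formulation. Below is a minimal sketch of that idea, assuming a weighted-sum scalarization and synthetic per-destination metrics; the names and weights are illustrative, not the paper's actual algorithm (which searches for Pareto-optimal solutions).

```python
import numpy as np

# Hypothetical setup: n_tasks tasks, each assigned one of n_dest offload
# destinations. The three metrics from the abstract (energy, cost,
# availability) are scalarized with fixed weights for simplicity.
rng = np.random.default_rng(0)
n_tasks, n_dest, n_particles, iters = 8, 5, 30, 100

energy = rng.uniform(1, 5, (n_tasks, n_dest))  # assumed energy per task/dest
cost = rng.uniform(1, 3, (n_tasks, n_dest))    # assumed monetary cost
avail = rng.uniform(0.8, 1.0, n_dest)          # assumed destination availability

def fitness(x):
    """Weighted sum (lower is better); continuous positions are rounded
    to a destination index per task."""
    dest = np.clip(np.rint(x), 0, n_dest - 1).astype(int)
    idx = np.arange(n_tasks)
    return (0.5 * energy[idx, dest].sum()
            + 0.3 * cost[idx, dest].sum()
            + 0.2 * (1 - avail[dest]).sum())

pos = rng.uniform(0, n_dest - 1, (n_particles, n_tasks))
vel = np.zeros_like(pos)
pbest, pbest_f = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_f.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0, n_dest - 1)
    f = np.array([fitness(p) for p in pos])
    better = f < pbest_f
    pbest[better], pbest_f[better] = pos[better], f[better]
    gbest = pbest[pbest_f.argmin()].copy()

print("best assignment:", np.rint(gbest).astype(int), "fitness:", pbest_f.min())
```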

  11. A Novel Approach for Resource Discovery using Random Projection on Cloud Computing Environments

    Directory of Open Access Journals (Sweden)

    M.N.Faruk

    2013-04-01

    Full Text Available Cloud computing offers different types of utilities to the IT industry. Generally, resources are scattered across clouds, and the ability to find which resources are available in which cloud must be provided; this is an important criterion in distributed systems. This paper investigates the problem of locating resources that are multivariate in nature, and of locating the relevant dimensions of resources available in the same cloud. It applies a random projection on each cloud to discover the available resources at each iteration; the outcome of each iteration is recorded in a collision matrix, and all discovered elements are updated in the management fabric. The paper also describes the feasibility of discovering the different types of resources available in each cloud.
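
    The paper's exact projection scheme is not given; as a sketch under that caveat, a Johnson-Lindenstrauss-style Gaussian random projection can compare resource descriptors across clouds, with close projected pairs feeding the "collision matrix" the abstract mentions. All dimensions and thresholds here are illustrative.

```python
import numpy as np

# Project high-dimensional resource descriptors from two clouds through a
# shared Gaussian random matrix; descriptors that land close together in
# the low-dimensional space are candidate matches ("collisions").
rng = np.random.default_rng(1)
d, k = 64, 8                               # original / projected dimensions
R = rng.normal(0, 1 / np.sqrt(k), (d, k))  # shared projection matrix

cloud_a = rng.random((10, d))  # descriptors of resources in cloud A
cloud_b = rng.random((12, d))  # descriptors of resources in cloud B

pa, pb = cloud_a @ R, cloud_b @ R
dists = np.linalg.norm(pa[:, None, :] - pb[None, :, :], axis=-1)
# Mark the closest 5% of cross-cloud pairs as collisions (illustrative rule).
collision = dists < np.quantile(dists, 0.05)
print("candidate matches (i in A, j in B):")
print(np.argwhere(collision))
```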

  12. PhoenixCloud: Provisioning Resources for Heterogeneous Workloads in Cloud Computing

    CERN Document Server

    Zhan, Jianfeng; Shi, Weisong; Gong, Shimin; Zang, Xiutao

    2010-01-01

    As more and more service providers choose Cloud platforms, which are provided by third-party resource providers, resource providers need to provision resources for heterogeneous workloads in different Cloud scenarios. Taking into account the dramatic differences between heterogeneous workloads, can we coordinately provision resources for heterogeneous workloads in Cloud computing? In this paper we focus on this important issue, which has been investigated by little previous work. Our contributions are threefold: (1) we propose a coordinated resource provisioning solution for heterogeneous workloads in two typical Cloud scenarios: first, a large organization operates a private Cloud for two heterogeneous workloads; second, a large organization or two service providers running heterogeneous workloads revert to a public Cloud; (2) we build an agile system PhoenixCloud that enables a resource provider to create coordinated runtime environments on demand for heterogeneous workloads when they are consolidated on a C...

  13. Case study of an application of computer mapping in oil-shale resource mapping

    Energy Technology Data Exchange (ETDEWEB)

    Davis, F.G.F. Jr.; Smith, J.W.

    1979-01-01

    The Laramie Energy Technology Center, U.S. Department of Energy, is responsible for evaluating the resources of potential oil and the deposit characteristics of oil shales of the Green River Formation in Colorado, Utah, and Wyoming. While the total oil shale resource represents perhaps 2 trillion barrels of oil, only parts of this total are suitable for any particular development process. To evaluate the resource according to deposit characteristics, a computer system for making resource calculations and geological maps has been established. The system generates resource tables where the calculations have been performed over user-defined geological intervals. The system also has the capability of making area calculations and generating resource maps of geological quality. The graphics package that generates the maps uses corehole assay data and digitized map data. The generated maps may include the following features: selected drainages, towns, political boundaries, township and section surveys, and corehole locations. The maps are then generated according to user-defined scales.

  14. Computation of charged current neutrino-Te reactions cross sections

    Science.gov (United States)

    Tsakstara, V.; Kosmas, T. S.; Sinatkas, J.

    2016-08-01

    Neutrino-nucleus reactions, involving both neutral-current (NC) and charged-current (CC) interactions, are important probes in modern neutrino physics searches. In the present work, we study the concrete CC reactions ¹³⁰Te(ν_ℓ, ℓ⁻)¹³⁰I and ¹³⁰Te(ν̄_ℓ, ℓ⁺)¹³⁰Sb, which are of current experimental interest for the CUORE and COBRA experiments operating at the Gran Sasso underground laboratory in Italy. The nuclear wave functions for the required initial and final nuclear states are derived by employing the proton-neutron (p-n) quasi-particle random phase approximation (QRPA), which has been previously tested in our neutral-current ν-nucleus studies for Te isotopes.

  15. Current Cloud Computing Security Concerns from Consumer Perspective

    Institute of Scientific and Technical Information of China (English)

    Hafiz Gulfam Ahmad; Zeeshan Ahmad

    2013-01-01

    In recent years cloud computing has been the subject of extensive research in the emerging field of information technology and has become a promising business. The reason behind this widespread interest is its ability to increase the capacity and capability of enterprises, with no investment in new infrastructure, no software license requirement, and no need for training. Security concerns are the main limiting factor in the growth of this new-born technology. The security responsibilities of both the provider and the consumer greatly differ between cloud service models. In this paper we discuss a variety of security risks, authentication issues, trust, and legal and regulatory issues in the cloud environment from the consumer perspective. Early research focused only on the technical and business consequences of cloud computing and ignored the consumer perspective. Therefore, this paper discusses consumer security and privacy preferences.

  16. Categorization of Computing Education Resources into the ACM Computing Classification System

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Yinlin [Virginia Polytechnic Institute and State University (Virginia Tech); Bogen, Paul Logasa [ORNL; Fox, Dr. Edward A. [Virginia Polytechnic Institute and State University (Virginia Tech); Hsieh, Dr. Haowei [University of Iowa; Cassel, Dr. Lillian N. [Villanova University

    2012-01-01

    The Ensemble Portal harvests resources from multiple heterogeneous federated collections. Managing these dynamically growing collections requires an automatic mechanism to categorize records into the corresponding topics. We propose an approach that uses existing ACM DL metadata to build classifiers for harvested resources in the Ensemble project. We also present our experience of utilizing the Amazon Mechanical Turk platform to build ground-truth training data sets from Ensemble collections.
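
    The classifiers themselves are not described in the abstract; a minimal sketch of the idea, training a text classifier on (hypothetical) ACM DL metadata with scikit-learn, could look like this.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy records standing in for ACM DL metadata: (title/abstract text,
# ACM CCS top-level category). Real training data would come from the DL.
records = [
    ("sorting algorithms and complexity analysis", "Theory of computation"),
    ("tcp congestion control in wide area networks", "Networks"),
    ("convolutional networks for image recognition", "Computing methodologies"),
    ("query optimization in relational databases", "Information systems"),
]
texts, labels = zip(*records)

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                    LogisticRegression(max_iter=1000))
clf.fit(texts, labels)

# Categorize a newly harvested resource by its metadata text.
print(clf.predict(["lecture notes on routing protocols"])[0])
```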

  17. Argonne's Laboratory Computing Resource Center : 2005 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R. B.; Coghlan, S. C; Kaushik, D. K.; Riley, K. R.; Valdes, J. V.; Pieper, G. P.

    2007-06-30

    Argonne National Laboratory founded the Laboratory Computing Resource Center in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. The first goal of the LCRC was to deploy a mid-range supercomputing facility to support the unmet computational needs of the Laboratory. To this end, in September 2002, the Laboratory purchased a 350-node computing cluster from Linux NetworX. This cluster, named 'Jazz', achieved over a teraflop of computing power (10^12 floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the fifty fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2005, there were 62 active projects on Jazz involving over 320 scientists and engineers. These projects represent a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to improve the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure

  18. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office: Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources such as the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. [Figure 3: Number of events per month (data)] In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  19. ATLAS Tier-2 at the Compute Resource Center GoeGrid in Goettingen

    CERN Document Server

    Meyer, J; The ATLAS collaboration; Weber, P

    2011-01-01

    GoeGrid is a grid resource center located in Göttingen, Germany. The resources are commonly used, funded, and maintained by communities doing research in the fields of grid development, computer science, biomedicine, high energy physics, theoretical physics, astrophysics, and the humanities. For the high energy physics community, GoeGrid serves as a Tier-2 center for the ATLAS experiment as part of the world-wide LHC computing grid (WLCG). The status and performance of the Tier-2 center is presented with a focus on the interdisciplinary setup and administration of the cluster. Given the various requirements of the different communities on the hardware and software setup, the challenge of the common operation of the cluster is detailed. The benefits are an efficient use of computer and manpower resources.

  20. EGI-EUDAT integration activity - Pair data and high-throughput computing resources together

    Science.gov (United States)

    Scardaci, Diego; Viljoen, Matthew; Vitlacil, Dejan; Fiameni, Giuseppe; Chen, Yin; Sipos, Gergely; Ferrari, Tiziana

    2016-04-01

    EGI (www.egi.eu) is a publicly funded e-infrastructure put together to give scientists access to more than 530,000 logical CPUs, 200 PB of disk capacity and 300 PB of tape storage to drive research and innovation in Europe. The infrastructure provides both high throughput computing and cloud compute/storage capabilities. Resources are provided by about 350 resource centres which are distributed across 56 countries in Europe, the Asia-Pacific region, Canada and Latin America. EUDAT (www.eudat.eu) is a collaborative Pan-European infrastructure providing research data services, training and consultancy for researchers, research communities, research infrastructures and data centres. EUDAT's vision is to enable European researchers and practitioners from any research discipline to preserve, find, access, and process data in a trusted environment, as part of a Collaborative Data Infrastructure (CDI) conceived as a network of collaborating, cooperating centres, combining the richness of numerous community-specific data repositories with the permanence and persistence of some of Europe's largest scientific data centres. EGI and EUDAT, in the context of their flagship projects, EGI-Engage and EUDAT2020, started in March 2015 a collaboration to harmonise the two infrastructures, including technical interoperability, authentication, authorisation and identity management, policy and operations. The main objective of this work is to provide end-users with a seamless access to an integrated infrastructure offering both EGI and EUDAT services and, then, pairing data and high-throughput computing resources together. To define the roadmap of this collaboration, EGI and EUDAT selected a set of relevant user communities, already collaborating with both infrastructures, which could bring requirements and help to assign the right priorities to each of them. In this way, from the beginning, this activity has been really driven by the end users. The identified user communities are

  1. Argonne's Laboratory computing resource center : 2006 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R. B.; Kaushik, D. K.; Riley, K. R.; Valdes, J. V.; Drugan, C. D.; Pieper, G. P.

    2007-05-31

    Argonne National Laboratory founded the Laboratory Computing Resource Center (LCRC) in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. In September 2002 the LCRC deployed a 350-node computing cluster from Linux NetworX to address Laboratory needs for mid-range supercomputing. This cluster, named 'Jazz', achieved over a teraflop of computing power (10^12 floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the 50 fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2006, there were 76 active projects on Jazz involving over 380 scientists and engineers. These projects represent a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to foster growth in the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure providers to offer more scientific data management capabilities, expanding Argonne staff

  2. ADAPTIVE MULTI-TENANCY POLICY FOR ENHANCING SERVICE LEVEL AGREEMENT THROUGH RESOURCE ALLOCATION IN CLOUD COMPUTING

    Directory of Open Access Journals (Sweden)

    Masnida Hussin

    2016-07-01

    Full Text Available The appearance of infinite computing resources that are available on demand and fast enough to adapt to load surges makes Cloud computing a favourable service infrastructure in the IT market. A core feature of Cloud service infrastructures is the Service Level Agreement (SLA) that leads to seamless service at high quality for clients. One of the challenges in the Cloud is providing heterogeneous computing services for clients. With the increasing number of clients/tenants in the Cloud, unsatisfied agreements are becoming a critical factor. In this paper, we present an adaptive resource allocation policy which attempts to improve accountability in Cloud SLAs while aiming to enhance system performance. Specifically, our allocation incorporates dynamic matching of SLA rules to deal with diverse processing requirements from tenants. Explicitly, it reduces processing overheads while achieving better service agreement. Simulation experiments prove the efficacy of our allocation policy in satisfying tenants and help improve computing reliability.

  3. Mandates, needs, equitable resources, and current research in English language teacher education: The case of Turkey

    Directory of Open Access Journals (Sweden)

    Saban Cepik

    2014-02-01

    Full Text Available Improving the quality of English language teacher education (ELTE) programs has become a major point of consideration; however, such programmatic evaluations are markedly rare. This study utilizes both numeric and interpretive qualitative data in a blended research design. The study addresses, vis-à-vis current research in related fields: What is the current situation of the Turkish ELTE programs in terms of curriculum strength and faculty resources? How do the program directors and teacher candidates envision the situation of their programs in terms of curriculum strength and faculty resources? Data included 45 ELTE curricula, interviews with 24 program directors and pre-service teachers, documents, and test scores. Findings revealed several significant associations between school type (public/private) and rank (low/high) and the number of faculty with expertise in critical areas in the field. Qualitative critical evaluations suggest both perceptual matches and mismatches between program directors and teacher candidates regarding programmatic strengths and weaknesses.

  4. A survey on resource allocation in high performance distributed computing systems

    Energy Technology Data Exchange (ETDEWEB)

    Hussain, Hameed; Malik, Saif Ur Rehman; Hameed, Abdul; Khan, Samee Ullah; Bickler, Gage; Min-Allah, Nasro; Qureshi, Muhammad Bilal; Zhang, Limin; Yongji, Wang; Ghani, Nasir; Kolodziej, Joanna; Zomaya, Albert Y.; Xu, Cheng-Zhong; Balaji, Pavan; Vishnu, Abhinav; Pinel, Fredric; Pecero, Johnatan E.; Kliazovich, Dzmitry; Bouvry, Pascal; Li, Hongxiang; Wang, Lizhe; Chen, Dan; Rayes, Ammar

    2013-11-01

    An efficient resource allocation is a fundamental requirement in high performance computing (HPC) systems. Many projects are dedicated to large-scale distributed computing systems that have designed and developed resource allocation mechanisms with a variety of architectures and services. In this study, we report a comprehensive survey describing resource allocation in various HPC systems. The aim of the work is to aggregate the existing solutions for HPC under a joint framework and to provide a thorough analysis of the characteristics of the resource management and allocation strategies. Resource allocation mechanisms and strategies play a vital role in the performance improvement of all HPC classifications. Therefore, a comprehensive discussion of widely used resource allocation strategies deployed in HPC environments is required, which is one of the motivations of this survey. Moreover, we have classified HPC systems into three broad categories, namely: (a) cluster, (b) grid, and (c) cloud systems, and define the characteristics of each class by extracting sets of common attributes. All of the aforementioned systems are cataloged into pure software and hybrid/hardware solutions. The system classification is used to identify the approaches followed by the implementation of existing resource allocation strategies that are widely presented in the literature.

  5. Human resources for mental health care: current situation and strategies for action.

    Science.gov (United States)

    Kakuma, Ritsuko; Minas, Harry; van Ginneken, Nadja; Dal Poz, Mario R; Desiraju, Keshav; Morris, Jodi E; Saxena, Shekhar; Scheffler, Richard M

    2011-11-05

    A challenge faced by many countries is to provide adequate human resources for delivery of essential mental health interventions. The overwhelming worldwide shortage of human resources for mental health, particularly in low-income and middle-income countries, is well established. Here, we review the current state of human resources for mental health, needs, and strategies for action. At present, human resources for mental health in countries of low and middle income show a serious shortfall that is likely to grow unless effective steps are taken. Evidence suggests that mental health care can be delivered effectively in primary health-care settings, through community-based programmes and task-shifting approaches. Non-specialist health professionals, lay workers, affected individuals, and caregivers with brief training and appropriate supervision by mental health specialists are able to detect, diagnose, treat, and monitor individuals with mental disorders and reduce caregiver burden. We also discuss scale-up costs, human resources management, and leadership for mental health, particularly within the context of low-income and middle-income countries.

  6. Computer-assisted Orthopaedic Surgery: Current State and Future Perspective

    Directory of Open Access Journals (Sweden)

    Guoyan Zheng

    2015-12-01

    Full Text Available Introduced about two decades ago, computer-assisted orthopaedic surgery (CAOS) has emerged as a new and independent area, due to the importance of treatment of musculoskeletal diseases in orthopaedics and traumatology, increasing availability of different imaging modalities, and advances in analytics and navigation tools. The aim of this paper is to present the basic elements of CAOS devices and to review state-of-the-art examples of different imaging modalities used to create the virtual representations, of different position tracking devices for navigation systems, of different surgical robots, of different methods for registration and referencing, and of CAOS modules that have been realized for different surgical procedures. Future perspectives will also be outlined.

  7. Brain-computer interfaces current trends and applications

    CERN Document Server

    Azar, Ahmad

    2015-01-01

    The success of a BCI system depends as much on the system itself as on the user's ability to produce distinctive EEG activity. BCI systems can be divided into two groups according to the placement of the electrodes used to detect and measure neurons firing in the brain: invasive systems, in which electrodes inserted directly into the cortex are used for single-cell or multi-unit recording, or placed on the surface of the cortex (or dura) for electrocorticography (ECoG); and noninvasive systems, in which electrodes placed on the scalp are used for electroencephalography (EEG) or magnetoencephalography (MEG) to detect neuron activity. The book is divided into three parts. The first part covers the basic concepts and an overview of Brain-Computer Interfaces. The second part describes new theoretical developments of BCI systems. The third part covers views on real applications of BCI systems.

  8. Rational use of cognitive resources: levels of analysis between the computational and the algorithmic.

    Science.gov (United States)

    Griffiths, Thomas L; Lieder, Falk; Goodman, Noah D

    2015-04-01

    Marr's levels of analysis (computational, algorithmic, and implementation) have served cognitive science well over the last 30 years. But the recent increase in the popularity of the computational level raises a new challenge: How do we begin to relate models at different levels of analysis? We propose that it is possible to define levels of analysis that lie between the computational and the algorithmic, providing a way to build a bridge between computational- and algorithmic-level models. The key idea is to push the notion of rationality, often used in defining computational-level models, deeper toward the algorithmic level. We offer a simple recipe for reverse-engineering the mind's cognitive strategies by deriving optimal algorithms for a series of increasingly more realistic abstract computational architectures, which we call "resource-rational analysis."

  9. A High-Resolution, Wave and Current Resource Assessment of Japan: The Web GIS Dataset

    CERN Document Server

    Webb, Adrean; Fujimoto, Wataru; Horiuchi, Kazutoshi; Kiyomatsu, Keiji; Matsuda, Kazuhiro; Miyazawa, Yasumasa; Varlamov, Sergey; Yoshikawa, Jun

    2016-01-01

    The University of Tokyo and JAMSTEC have conducted state-of-the-art wave and current resource assessments to assist with generator site identification and construction in Japan. These assessments are publicly available and accessible via a web GIS service designed by WebBrain that utilizes TDS and GeoServer software with Leaflet libraries. The web GIS dataset contains statistical analyses of wave power, ocean and tidal current power, ocean temperature power, and other basic physical variables. The data (2D maps, time charts, depth profiles, etc.) are accessed through interactive browser sessions and downloadable files.

  10. Computational electromagnetics and model-based inversion a modern paradigm for eddy-current nondestructive evaluation

    CERN Document Server

    Sabbagh, Harold A; Sabbagh, Elias H; Aldrin, John C; Knopp, Jeremy S

    2013-01-01

    Computational Electromagnetics and Model-Based Inversion: A Modern Paradigm for Eddy Current Nondestructive Evaluation describes the natural marriage of the computer to eddy-current NDE. Three distinct topics are emphasized in the book: (a) fundamental mathematical principles of volume-integral equations as a subset of computational electromagnetics, (b) mathematical algorithms applied to signal-processing and inverse scattering problems, and (c) applications of these two topics to problems in which real and model data are used. By showing how mathematics and the computer can solve problems more effectively than current analog practices, this book defines the modern technology of eddy-current NDE. This book will be useful to advanced students and practitioners in the fields of computational electromagnetics, electromagnetic inverse-scattering theory, nondestructive evaluation, materials evaluation and biomedical imaging. Users of eddy-current NDE technology in industries as varied as nuclear power, aerospace,...
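
    The abstract names volume-integral equations as the book's mathematical core without stating them; one standard form, given here as an assumption for orientation, couples the field in the flaw region to an anomalous current:

```latex
% Assumed standard volume-integral formulation for eddy-current NDE:
% E^{(i)} is the incident field, G the dyadic Green's function of the
% host (conductivity sigma_0), and J_a the anomalous current in the
% flaw region V_f.
\[
  \mathbf{E}(\mathbf{r}) = \mathbf{E}^{(i)}(\mathbf{r})
    + \int_{V_f} \overline{\overline{\mathbf{G}}}(\mathbf{r}, \mathbf{r}')
      \cdot \mathbf{J}_a(\mathbf{r}')\, dV',
  \qquad
  \mathbf{J}_a(\mathbf{r}) = \big[\sigma(\mathbf{r}) - \sigma_0\big]\,
    \mathbf{E}(\mathbf{r}) .
\]
```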

  11. Computer-Aided Design of a Direct Current Electromagnet

    Directory of Open Access Journals (Sweden)

    Iancu Tătucu

    2009-10-01

    Full Text Available The paper presents the mathematical model and the simulation of a direct current electromagnet used for the transport of steel ingots. For the simulation of any device, one must have a mathematical model able to describe as accurately as possible the phenomena that take place. As the processes occurring in an electromagnet are of an electromagnetic nature, the model of the electromagnetic potentials was used, and the simulation was performed with the help of the specialised software ANSYS.

  12. Dynamic provisioning of local and remote compute resources with OpenStack

    Science.gov (United States)

    Giffels, M.; Hauth, T.; Polgart, F.; Quast, G.

    2015-12-01

    Modern high-energy physics experiments rely on the extensive usage of computing resources, both for the reconstruction of measured events as well as for Monte-Carlo simulation. The Institut für Experimentelle Kernphysik (EKP) at KIT is participating in both the CMS and Belle experiments with computing and storage resources. In the upcoming years, these requirements are expected to increase due to the growing amount of recorded data and the rise in complexity of the simulated events. It is therefore essential to increase the available computing capabilities by tapping into all resource pools. At the EKP institute, powerful desktop machines are available to users. Due to the multi-core nature of modern CPUs, vast amounts of CPU time are not utilized by common desktop usage patterns. Other important providers of compute capabilities are classical HPC data centers at universities or national research centers. Due to the shared nature of these installations, the standardized software stack required by HEP applications cannot be installed. A viable way to overcome this constraint and offer a standardized software environment in a transparent manner is the usage of virtualization technologies. The OpenStack project has become a widely adopted solution to virtualize hardware and offer additional services like storage and virtual machine management. This contribution will report on the incorporation of the institute's desktop machines into a private OpenStack Cloud. The additional compute resources provisioned via the virtual machines have been used for Monte-Carlo simulation and data analysis. Furthermore, a concept to integrate shared, remote HPC centers into regular HEP job workflows will be presented. In this approach, local and remote resources are merged to form a uniform, virtual compute cluster with a single point of entry for the user. Evaluations of the performance and stability of this setup and operational experiences will be discussed.
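
    As a concrete illustration of the provisioning step, the following openstacksdk sketch boots a worker VM; the cloud entry, image, flavor, and network names are hypothetical placeholders for the institute's actual setup.

```python
import openstack

# Connect using a clouds.yaml entry; "ekp-cloud" is a placeholder name.
conn = openstack.connect(cloud="ekp-cloud")

image = conn.image.find_image("hep-worker-image")  # assumed image name
flavor = conn.compute.find_flavor("m1.large")      # assumed flavor name
network = conn.network.find_network("private")     # assumed network name

# Boot a VM that can join the batch system once active.
server = conn.compute.create_server(
    name="batch-worker-01",
    image_id=image.id,
    flavor_id=flavor.id,
    networks=[{"uuid": network.id}],
)
server = conn.compute.wait_for_server(server)  # block until ACTIVE
print(server.name, server.status)
```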

  13. Current experience with computed tomographic cystography and blunt trauma.

    Science.gov (United States)

    Deck, A J; Shaves, S; Talner, L; Porter, J R

    2001-12-01

    We present our experience with computed tomographic (CT) cystography for the diagnosis of bladder rupture in patients with blunt abdominal and pelvic trauma and compare the results of CT cystography to operative exploration. We identified all blunt trauma patients diagnosed with bladder rupture from January 1992 to September 1998. We also reviewed the radiology computerized information system (RIS) for all CT cystograms performed for the evaluation of blunt trauma during the same time period. The medical records and pertinent radiographs of the patients with bladder rupture who underwent CT cystography as part of their admission evaluation were reviewed. Operative findings were compared to radiographic findings. Altogether, 316 patients had CT cystograms as part of an initial evaluation for blunt trauma. Of these patients, 44 had an ultimate diagnosis of bladder rupture; 42 patients had CT cystograms indicating bladder rupture. A total of 28 patients underwent formal bladder exploration; 23 (82%) had operative findings that exactly (i.e., presence and type of rupture) matched the CT cystogram interpretation. The overall sensitivity and specificity of CT cystography for detection of bladder rupture were 95% and 100%, respectively. For intraperitoneal rupture, the sensitivity and specificity were 78% and 99%, respectively. CT cystography provides an expedient evaluation for bladder rupture caused by blunt trauma and has an accuracy comparable to that reported for plain film cystography. We recommend CT cystography over plain film cystography for patients undergoing CT evaluation for other blunt trauma-related injuries.

  14. Improving the Distribution of Resource in a Grid Computing Network Services

    Directory of Open Access Journals (Sweden)

    Najmeh fillolahe

    2016-03-01

    Full Text Available In this study, the computational grid environment and a queuing-theory-based algorithm are examined for the distribution of resources in a computational grid in which the resources are connected to each other in a star topology. Using the concepts of a queue system and of how subtasks are distributed, this algorithm balances the workload when distributing the existing resources while executing tasks in the shortest time. In the first phase of the algorithm, computation of the time consumed by tasks and subtasks shows that the grid system generally reduces the average response time. In the second phase, due to the lack of load balance between resources and the imbalance in the distribution of subtasks between them, the tasks' response time increases in the long term even though workload balance is established. In the third phase, in addition to establishing workload balance, the average response time is also reduced. Thus, by using this algorithm, two important factors, efficiency and load balance, are enhanced as far as possible. The distribution of subtasks in the grid environment and the allocation of resources to them are implemented by considering these two factors.
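
    The abstract does not spell out the queueing model; a plausible reading, sketched here as an assumption, treats each resource as an M/M/1 queue and dispatches the next subtask to the resource with the smallest expected response time W = 1/(mu - lambda).

```python
# Each grid resource modeled as an M/M/1 queue: mu is its service rate,
# lam its current arrival rate; expected response time is 1 / (mu - lam).
# All rates below are illustrative.
resources = {
    "node-a": {"mu": 10.0, "lam": 6.0},
    "node-b": {"mu": 8.0, "lam": 2.0},
    "node-c": {"mu": 12.0, "lam": 11.0},
}

def expected_response(mu: float, lam: float) -> float:
    if lam >= mu:              # unstable queue: effectively infinite wait
        return float("inf")
    return 1.0 / (mu - lam)

best = min(resources,
           key=lambda r: expected_response(resources[r]["mu"],
                                           resources[r]["lam"]))
print("dispatch next subtask to:", best)  # node-b: W = 1/(8-2) ~ 0.17
```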

  15. Developing Online Learning Resources: Big Data, Social Networks, and Cloud Computing to Support Pervasive Knowledge

    Science.gov (United States)

    Anshari, Muhammad; Alas, Yabit; Guan, Lim Sei

    2016-01-01

    Utilizing online learning resources (OLR) from multiple channels in learning activities promises extended benefits, moving from traditional learning-centred approaches to collaborative learning-centred approaches that emphasise pervasive learning anywhere and anytime. While compiling big data, cloud computing, and the semantic web into OLR offers a broader spectrum of…

  16. Scheduling real-time indivisible loads with special resource allocation requirements on cluster computing

    Directory of Open Access Journals (Sweden)

    Abeer Hamdy

    2010-10-01

    Full Text Available The paper presents a heuristic algorithm to schedule real-time indivisible loads, represented as a directed sequential task graph, on a computing cluster. One of the cluster nodes has some special resources (denoted the special node) that may be needed by one of the indivisible loads.

  17. Impact of remote sensing upon the planning, management and development of water resources. Summary of computers and computer growth trends for hydrologic modeling and the input of ERTS image data processing load

    Science.gov (United States)

    Castruccio, P. A.; Loats, H. L., Jr.

    1975-01-01

    An analysis of current computer usage by major water resources users was made to determine the trends of usage and costs for the principal hydrologic users/models. The laws and empirical relationships governing the growth of the data processing loads were described and applied to project the future data loads. Data loads for ERTS CCT image processing were computed and projected through the 1985 era. The analysis shows significant impact due to the utilization and processing of ERTS CCT data.

  18. Study on the Current Situation and Protection Countermeasures of Wild Plant Resources in Xishuangbanna National Nature Reserve

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    [Objective] The aim was to study the current situation and protection countermeasures of wild plant resources in Xishuangbanna National Nature Reserve. [Method] The current situation of wild plant resources in Xishuangbanna National Nature Reserve was researched by means of route survey, sample plot survey and literature survey, and then the main impact factors of wild plant resources were analyzed by using participatory rural appraisal and problem tree analysis, finally protection countermeasures were put ...

  19. Towards Self Configured Multi-Agent Resource Allocation Framework for Cloud Computing Environments

    Directory of Open Access Journals (Sweden)

    M.N.Faruk

    2014-05-01

    Full Text Available Virtualization and Cloud computing environments promise numerous features, such as improved flexibility and stabilized energy efficiency, with minimal operating costs for the IT industry. However, highly unpredictable workloads create demands that require quality-of-service assurance while promising efficient resource utilization. To avoid breaching SLAs (Service-Level Agreements) or leaving resources underutilized, resource allocations in a virtual environment must be adjusted continuously during execution for dynamic application workloads. In this work, we describe a hybrid, self-configured resource allocation model for cloud environments based on dynamic application workload models. We present a comprehensive setup of a representative enterprise application, the new Virtenterprise_Cloudapp benchmark, deployed on a dynamic virtualized cloud platform.

  20. Including Alternative Resources in State Renewable Portfolio Standards: Current Design and Implementation Experience

    Energy Technology Data Exchange (ETDEWEB)

    Heeter, J.; Bird, L.

    2012-11-01

    Currently, 29 states, the District of Columbia, and Puerto Rico have instituted a renewable portfolio standard (RPS). An RPS sets a minimum threshold for how much renewable energy must be generated in a given year. Each state policy is unique, varying in percentage targets, timetables, and eligible resources. This paper examines state experience with implementing renewable portfolio standards that include energy efficiency, thermal resources, and non-renewable energy and explores compliance experience, costs, and how states evaluate, measure, and verify energy efficiency and convert thermal energy. It aims to gain insights from the experience of states for possible federal clean energy policy as well as to share experience and lessons for state RPS implementation.

  1. Allocating Tactical High-Performance Computer (HPC) Resources to Offloaded Computation in Battlefield Scenarios

    Science.gov (United States)

    2013-12-01

    Offloading solutions such as Cuckoo, MAUI, COMET, and ThinkAir offload applications via Wi-Fi or 3G networks to servers or cloud resources.

  2. Analysis of Security Techniques in Future Cloud Computing vs. Current Cloud Computing: A Survey Paper

    Directory of Open Access Journals (Sweden)

    Beny Nugraha

    2016-08-01

    Full Text Available Cloud computing is one of the network technologies developing most rapidly today, because it can dynamically increase the flexibility and capability of computing processes without large investments in new infrastructure; improving the security of cloud computing networks is therefore essential. This study examines the security techniques present in current cloud computing and in a future cloud computing architecture, NEBULA. These security techniques are compared with respect to their ability to handle security attacks that may occur in cloud computing. The method used in this study is attack-centric: the characteristics of each security attack are analyzed, and the security mechanisms for handling it are then investigated. Four security attacks are examined in this study; by understanding how a security attack works, the security mechanism that can counter it can also be identified. The study finds that NEBULA provides the highest level of security. NEBULA introduces three new techniques: Proof of Consent (PoC), Proof of Path (PoP), and the ICING cryptographic technique. These three techniques, combined with onion routing, can counter the security attacks analyzed in this study.

  3. Dynamic scheduling model of computing resource based on MAS cooperation mechanism

    Institute of Scientific and Technical Information of China (English)

    JIANG WeiJin; ZHANG LianMei; WANG Pu

    2009-01-01

    Allocation of grid resources aims at improving resource utility and grid application performance. Currently, the algorithms proposed for this purpose do not fit well the autonomic, dynamic, distributive and heterogeneous features of the grid environment. According to the MAS (multi-agent system) cooperation mechanism and market bidding game rules, a model of grid resource allocation based on a market economy is introduced to reveal the relationship between supply and demand. This model makes good use of the learning and negotiating ability of consumers' agents and takes full consideration of consumer behavior, thus rendering the request for and allocation of resources by consumers rational and valid. In the meantime, the utility function of the consumer is given; the existence and uniqueness of the Nash equilibrium point in the resource allocation game and the Nash equilibrium solution are discussed. A dynamic game algorithm for allocating grid resources is designed. Experimental results demonstrate that this algorithm effectively diminishes unnecessary latency and significantly improves the smoothness of response time, the throughput ratio and resource utility, thus making the supply and demand of the whole grid resource reasonable and the overall grid load balanced.
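
    The utility function and equilibrium are stated but not reproduced in the abstract; as a reference point only, a standard proportional-share bidding model (an assumption, not necessarily the paper's exact form) reads:

```latex
% Proportional-share bidding (illustrative): consumer i bids b_i for a
% resource of capacity C, receives a share proportional to its bid, and
% pays the bid itself.
\[
  x_i = \frac{b_i}{\sum_j b_j}\, C, \qquad
  u_i(b_i, b_{-i}) = V_i(x_i) - b_i .
\]
% At a Nash equilibrium no consumer can gain by changing its bid alone:
\[
  \frac{\partial u_i}{\partial b_i} = 0
  \quad\Longleftrightarrow\quad
  V_i'(x_i)\,\frac{C \sum_{j \neq i} b_j}{\big(\sum_j b_j\big)^2} = 1 .
\]
```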

  4. Human resource aspects of antiretroviral treatment delivery models: current practices and recommendations.

    Science.gov (United States)

    Assefa, Yibeltal; Van Damme, Wim; Hermann, Katharina

    2010-01-01

    PURPOSE OF REVIEW: To illustrate and critically assess what is currently being published on the human resources for health dimension of antiretroviral therapy (ART) delivery models. The use of human resources for health can have an effect on two crucial aspects of successful ART programmes, namely the scale-up capacity and the long-term retention in care. Task shifting as the delegation of tasks from higher qualified to lower qualified cadres has become a widespread practice in ART delivery models in low-income countries in recent years. It is increasingly shown to effectively reduce the workload for scarce medical doctors without compromising the quality of care. At the same time, it becomes clear that task shifting can only be successful when accompanied by intensive training, supervision and support from existing health system structures. Although a number of recent publications have focussed on task shifting in ART delivery models, there is a lack of accessible information on the link between task shifting and patient outcomes. Current ART delivery models do not focus sufficiently on retention in care as arguably one of the most important issues for the long-term success of ART programmes. There is a need for context-specific re-designing of current ART delivery models in order to increase access to ART and improve long-term retention.

  5. An Extensible Scientific Computing Resources Integration Framework Based on Grid Service

    Science.gov (United States)

    Cui, Binge; Chen, Xin; Song, Pingjian; Liu, Rongjie

    Scientific computing resources (e.g., components, dynamically linkable libraries, etc.) are very valuable assets for scientific research. However, due to historical reasons, most computing resources can't be shared with other people. The emergence of Grid computing provides a turning point to solve this problem. Legacy applications can be abstracted and encapsulated into Grid services, and they may be found and invoked on the Web using SOAP messages. The Grid service is loosely coupled with the external JAR or DLL, which builds a bridge from users to computing resources. We defined an XML schema to describe the functions and interfaces of the applications. This information can be acquired by users by invoking the “getCapabilities” operation of the Grid service. We also proposed the concept of a class pool to eliminate memory leaks when invoking external JARs using reflection. The experiment shows that the class pool not only avoids PermGen space waste and Tomcat server exceptions, but also significantly improves application speed. The integration framework has been implemented successfully in a real project.
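
    The paper's class pool targets Java reflection under Tomcat; the same caching idea, sketched here language-agnostically in Python with importlib, keeps one loaded class per (module, class) key so repeated service invocations do not re-load it.

```python
import importlib

# Cache classes loaded reflectively so repeated Grid-service invocations
# reuse one loaded class object instead of re-loading it each time (in
# the paper's Java setting, repeated loading exhausted PermGen space).
_class_pool: dict[tuple[str, str], type] = {}

def load_class(module_name: str, class_name: str) -> type:
    key = (module_name, class_name)
    if key not in _class_pool:
        module = importlib.import_module(module_name)
        _class_pool[key] = getattr(module, class_name)
    return _class_pool[key]

# Every lookup after the first resolves to the same cached class object.
OrderedDict = load_class("collections", "OrderedDict")
assert load_class("collections", "OrderedDict") is OrderedDict
```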

  6. Measuring the impact of computer resource quality on the software development process and product

    Science.gov (United States)

    Mcgarry, Frank; Valett, Jon; Hall, Dana

    1985-01-01

    The availability and quality of computer resources during the software development process was speculated to have measurable, significant impact on the efficiency of the development process and the quality of the resulting product. Environment components such as the types of tools, machine responsiveness, and quantity of direct access storage may play a major role in the effort to produce the product and in its subsequent quality as measured by factors such as reliability and ease of maintenance. During the past six years, the NASA Goddard Space Flight Center has conducted experiments with software projects in an attempt to better understand the impact of software development methodologies, environments, and general technologies on the software process and product. Data was extracted and examined from nearly 50 software development projects. All were related to support of satellite flight dynamics ground-based computations. The relationship between computer resources and the software development process and product as exemplified by the subject NASA data was examined. Based upon the results, a number of computer resource-related implications are provided.

  7. The Indus basin in the framework of current and future water resources management

    Science.gov (United States)

    Laghari, A. N.; Vanham, D.; Rauch, W.

    2012-04-01

    The Indus basin is one of the regions in the world that is faced with major challenges for its water sector, due to population growth, rapid urbanisation and industrialisation, environmental degradation, unregulated utilization of the resources, inefficient water use and poverty, all aggravated by climate change. The Indus Basin is shared by 4 countries - Pakistan, India, Afghanistan and China. With a current population of 237 million people which is projected to increase to 319 million in 2025 and 383 million in 2050, already today water resources are abstracted almost entirely (more than 95% for irrigation). Climate change will result in increased water availability in the short term. However in the long term water availability will decrease. Some current aspects in the basin need to be re-evaluated. During the past decades water abstractions - and especially groundwater extractions - have augmented continuously to support a rice-wheat system where rice is grown during the kharif (wet, summer) season (as well as sugar cane, cotton, maize and other crops) and wheat during the rabi (dry, winter) season. However, the sustainability of this system in its current form is questionable. Additional water for domestic and industrial purposes is required for the future and should be made available by a reduction in irrigation requirements. This paper gives a comprehensive listing and description of available options for current and future sustainable water resources management (WRM) within the basin. Sustainable WRM practices include both water supply management and water demand management options. Water supply management options include: (1) reservoir management as the basin is characterised by a strong seasonal behaviour in water availability (monsoon and meltwater) and water demands; (2) water quality conservation and investment in wastewater infrastructure; (3) the use of alternative water resources like the recycling of wastewater and desalination; (4) land use

  8. Maize provitamin A carotenoids, current resources and future metabolic engineering challenges.

    Directory of Open Access Journals (Sweden)

    Eleanore T Wurtzel

    2012-02-01

    Full Text Available Vitamin A deficiency is a serious global health problem that can be alleviated by improved nutrition. Development of cereal crops with increased provitamin A carotenoids can provide a sustainable solution to eliminating vitamin A deficiency worldwide. Maize is a model for cereals and a major staple carbohydrate source. Here, we discuss maize carotenogenesis with regard to pathway regulation, available resources, and current knowledge for improving carotenoid content and levels of provitamin A carotenoids in edible maize endosperm. This knowledge will be applied to improve the nutritional composition of related Poaceae crops. We discuss opportunities and challenges for optimizing provitamin A carotenoid biofortification of cereal food crops.

  9. The Indus basin in the framework of current and future water resources management

    Directory of Open Access Journals (Sweden)

    A. N. Laghari

    2012-04-01

    Full Text Available The Indus basin is one of the regions in the world that is faced with major challenges for its water sector, due to population growth, rapid urbanisation and industrialisation, environmental degradation, unregulated utilization of the resources, inefficient water use and poverty, all aggravated by climate change. The Indus Basin is shared by 4 countries – Pakistan, India, Afghanistan and China. With a current population of 237 million people which is projected to increase to 319 million in 2025 and 383 million in 2050, already today water resources are abstracted almost entirely (more than 95% for irrigation). Climate change will result in increased water availability in the short term. However in the long term water availability will decrease. Some current aspects in the basin need to be re-evaluated. During the past decades water abstractions – and especially groundwater extractions – have augmented continuously to support a rice-wheat system where rice is grown during the kharif (wet, summer) season (as well as sugar cane, cotton, maize and other crops) and wheat during the rabi (dry, winter) season. However, the sustainability of this system in its current form is questionable. Additional water for domestic and industrial purposes is required for the future and should be made available by a reduction in irrigation requirements. This paper gives a comprehensive listing and description of available options for current and future sustainable water resources management (WRM) within the basin. Sustainable WRM practices include both water supply management and water demand management options. Water supply management options include: (1) reservoir management as the basin is characterised by a strong seasonal behaviour in water availability (monsoon and meltwater) and water demands; (2) water quality conservation and investment in wastewater infrastructure; (3) the use of alternative water resources like the recycling of wastewater and desalination; (4

  10. An open-source computational and data resource to analyze digital maps of immunopeptidomes

    Energy Technology Data Exchange (ETDEWEB)

    Caron, Etienne; Espona, Lucia; Kowalewski, Daniel J.; Schuster, Heiko; Ternette, Nicola; Alpizar, Adan; Schittenhelm, Ralf B.; Ramarathinam, Sri Harsha; Lindestam-Arlehamn, Cecilia S.; Koh, Ching Chiek; Gillet, Ludovic; Rabsteyn, Armin; Navarro, Pedro; Kim, Sangtae; Lam, Henry; Sturm, Theo; Marcilla, Miguel; Sette, Alessandro; Campbell, David; Deutsch, Eric W.; Moritz, Robert L.; Purcell, Anthony; Rammensee, Hans-Georg; Stevanovic, Stevan; Aebersold, Ruedi

    2015-07-08

    We present a novel proteomics-based workflow and an open source data and computational resource for reproducibly identifying and quantifying HLA-associated peptides at high-throughput. The provided resources support the generation of HLA allele-specific peptide assay libraries consisting of consensus fragment ion spectra and the analysis of quantitative digital maps of HLA peptidomes generated by SWATH mass spectrometry (MS). This is the first community-based study towards the development of a robust platform for the reproducible and quantitative measurement of HLA peptidomes, an essential step towards the design of efficient immunotherapies.

  11. IMPROVING FAULT TOLERANT RESOURCE OPTIMIZED AWARE JOB SCHEDULING FOR GRID COMPUTING

    Directory of Open Access Journals (Sweden)

    K. Nirmala Devi

    2014-01-01

    Full Text Available Workflow brokers of existing Grid Scheduling Systems lack a cooperation mechanism, which causes inefficient scheduling of applications across distributed resources and also worsens the utilization of various resources, including network bandwidth and computational cycles. Furthermore, considering the literature, all of these existing brokering systems primarily evolved around centralized hierarchical or client/server models. In such models, vital responsibilities such as resource discovery are delegated to centralized server machines, and thus they are associated with well-known disadvantages regarding single points of failure, scalability and network congestion at links leading to the server. To overcome these issues, we implement a new approach for decentralized cooperative workflow scheduling in a dynamically distributed resource-sharing environment of Grids. The various actors in the system, namely the users who belong to multiple control domains, workflow brokers and resources, work together to enable a single cooperative resource-sharing environment. This approach, however, ignored the fact that each grid site may have its own fault-tolerance strategy, because each site is itself an autonomous domain. For instance, if a grid site handles the job check-pointing mechanism, each computation node must have the ability to periodically transmit the transient state of the job execution to the server. When a job fails, it will migrate to another computational node and resume from the last stored checkpoint. A Glowworm Swarm Optimization (GSO) for job scheduling is used to address the issue of heterogeneity in the fault tolerance of computational grids, but a Weighted GSO, which overcomes the position-update imperfections of the general GSO, performs more efficiently, as shown in the comparison analysis. This system supports four kinds of fault-tolerance mechanisms, including the job migration, job retry, check-pointing and
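
    For reference, the standard glowworm swarm optimization updates (Krishnanand and Ghose) are shown below; the paper's Weighted GSO modifies the position update, in a form the abstract does not reproduce.

```latex
% Luciferin update for glowworm i, with decay rho, gain gamma, and
% objective value J evaluated at the glowworm's position:
\[
  \ell_i(t) = (1 - \rho)\,\ell_i(t-1) + \gamma\, J\!\big(x_i(t)\big)
\]
% Movement toward a brighter neighbour j, with step size s:
\[
  x_i(t+1) = x_i(t) + s\,\frac{x_j(t) - x_i(t)}{\lVert x_j(t) - x_i(t) \rVert}
\]
```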

  12. Tracking the Flow of Resources in Electronic Waste - The Case of End-of-Life Computer Hard Disk Drives.

    Science.gov (United States)

    Habib, Komal; Parajuly, Keshav; Wenzel, Henrik

    2015-10-20

    Recovery of resources, in particular metals, from waste flows is widely seen as a prioritized option to reduce their potential supply constraints in the future. The current waste electrical and electronic equipment (WEEE) treatment system is focused on bulk metals, while the recycling rate of specialty metals, such as rare earths, is negligible compared to their increasing use in modern products, such as electronics. This study investigates the challenges in recovering these resources in the existing WEEE treatment system, illustrated by following the material flows of resources in a conventional WEEE treatment plant in Denmark. Computer hard disk drives (HDDs) containing neodymium-iron-boron (NdFeB) magnets were selected as the case product for this experiment. The resulting output fractions were tracked until their final treatment in order to estimate the recovery potential of rare earth elements (REEs) and other resources contained in HDDs. The results show that out of the 244 kg of HDDs treated, 212 kg, comprising mainly aluminum and steel, can be finally recovered from the metallurgic process. The results further demonstrate the complete loss of REEs in the existing shredding-based WEEE treatment processes. Dismantling and separate processing of NdFeB magnets from their end-use products can be a preferable option over shredding. However, it remains a technological and logistic challenge for the existing system.
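
    The reported mass balance implies a bulk recovery rate that is easy to check; a short computation using only the figures quoted above:

```python
# Mass-based recovery rate implied by the figures reported above.
treated_kg = 244     # HDDs entering the shredding-based treatment
recovered_kg = 212   # mainly aluminum and steel, after metallurgical processing
print(f"bulk-metal recovery: {recovered_kg / treated_kg:.1%}")  # -> 86.9%
# REE recovery through the same shredding route is reported as a complete loss.
```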

  13. The ATLAS Computing Agora: a resource web site for citizen science projects

    CERN Document Server

    Bourdarios, Claire; The ATLAS collaboration

    2016-01-01

    The ATLAS collaboration has recently set up a number of citizen science projects which have a strong IT component and could not have been envisaged without the growth of general public computing resources and network connectivity: event simulation through volunteer computing, algorithm improvement via Machine Learning challenges, event display analysis on citizen science platforms, use of open data, etc. Most of the interactions with volunteers are handled through message boards, but specific outreach material was also developed, giving enhanced visibility to the ATLAS software and computing techniques, challenges and community. In this talk the ATLAS Computing Agora (ACA) web platform will be presented, as well as some of the specific material developed for the projects.

  14. Current Control and Performance Evaluation of Converter Interfaced Distribution Resources in Grid Connected Mode

    Directory of Open Access Journals (Sweden)

    SINGH Alka

    2012-10-01

    Full Text Available Use of distributed resources is growing in developing countries like India and in developed nations too. The increased acceptance of such resources is mainly due to their modularity, increased reliability, good power quality and environmentally friendly operation. These are currently being interfaced to the existing systems using voltage source converters (VSCs). The control of such distributed resources is significantly different from that of conventional power systems, mainly because VSCs have no inertia, unlike synchronous generators. This paper deals with the Matlab modeling and control design of one such distributed source feeding a common load. A grid connected supply is also available. The control algorithm is developed for real and reactive power sharing of the load between the distributed source and the grid. The developed control scheme is tested for linear (R-L) loads as well as nonlinear loads. With suitable modifications, the control algorithm can be extended to several distributed resources connected in parallel.

  15. The domestic resource gap and current transaction deficit in Indonesia in 2010-2014

    Directory of Open Access Journals (Sweden)

    Anhulaila M. Palampanga

    2017-05-01

    Full Text Available The purpose of this study is to determine the relationship between domestic financial resource gaps and the current account balance in Indonesia, using data from 2010 to 2014. Gaps in the domestic economy are classified into three types: (1) the gap between domestic absorption and national income (GNP), (2) the gap between gross national savings and investment, and (3) the private sector gap (private saving minus private investment) together with the public sector gap (tax revenue minus government spending). Using an open-economy framework, the study shows that (1) the gap between domestic absorption and GNP, (2) the gap between gross national saving and gross national investment, and (3) the gaps in the private and government sectors resulted in the current account deficit in Indonesia over the 2010-2014 period.

  16. Critical phenomena in communication/computation networks with various topologies and suboptimal to optimal resource allocation

    Science.gov (United States)

    Cogoni, Marco; Busonera, Giovanni; Anedda, Paolo; Zanetti, Gianluigi

    2015-01-01

    We generalize previous studies on critical phenomena in communication networks [1,2] by adding computational capabilities to the nodes. In our model, a set of tasks with random origin, destination and computational structure is distributed on a computational network, modeled as a graph. By varying the temperature of a Metropolis Monte Carlo, we explore the global latency for an optimal to suboptimal resource assignment at a given time instant. By computing the two-point correlation function for the local overload, we study the behavior of the correlation distance (both for links and nodes) while approaching the congested phase: a transition from a peaked to a spread g(r) is seen above a critical (Monte Carlo) temperature Tc. The average latency trend of the system is predicted by averaging over several network traffic realizations while maintaining spatially detailed information for each node: a sharp decrease in performance is found above Tc independently of the workload. The globally optimized computational resource allocation and network routing defines a baseline for a future comparison of the transition behavior with existing routing strategies [3,4] for different network topologies.
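
    The optimizer described above is a standard Metropolis Monte Carlo over assignments; the sketch below shows the acceptance rule at temperature T, with all function names (latency, propose_move) hypothetical stand-ins rather than the authors' code.

```python
import math
import random

def metropolis_assign(latency, assignment, propose_move, T, steps=10_000):
    # Metropolis Monte Carlo over task-to-node assignments: low T drives
    # the system towards the optimal allocation, high T leaves it suboptimal.
    cost = latency(assignment)
    for _ in range(steps):
        candidate = propose_move(assignment)
        delta = latency(candidate) - cost
        # Accept every downhill move; uphill moves with Boltzmann probability.
        if delta <= 0 or random.random() < math.exp(-delta / T):
            assignment, cost = candidate, cost + delta
    return assignment, cost
```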

  17. The Usage of Informal Computer Based Communication in the Context of Organization’s Technological Resources

    OpenAIRE

    Steponas Jonušauskas; Agota Giedrė Raišienė

    2011-01-01

    The purpose of the article is to theoretically and practically analyze the features of informal computer-based communication in the context of an organization's technological resources. Methodology—meta-analysis, survey and descriptive analysis. Findings. According to scientists, the functions of informal communication cover sharing of work-related information, coordination of team activities, spread of organizational culture and feelings of interdependence and affinity. Also, informal communication wid...

  18. On state-dependant sampling for nonlinear controlled systems sharing limited computational resources

    OpenAIRE

    Alamir, Mazen

    2007-01-01

    21 pages. Submitted to the journal "IEEE Transactions on Automatic Control"; International audience; In this paper, a framework for dynamic monitoring of sampling periods for nonlinear controlled systems is proposed. This framework is particularly adapted to the context of controlled systems sharing limited computational resources. The proposed scheme can be used in a cascaded structure with any feedback scheduling design. Illustrative examples are given to assess the efficiency of the proposed fram...

  19. Collocational Relations in Japanese Language Textbooks and Computer-Assisted Language Learning Resources

    Directory of Open Access Journals (Sweden)

    Irena SRDANOVIĆ

    2011-05-01

    Full Text Available In this paper, we explore the presence of collocational relations in computer-assisted language learning systems and other language resources for the Japanese language, on the one hand, and in Japanese language learning textbooks and wordlists, on the other. After introducing the importance of learning collocational relations in a foreign language, we examine their coverage in various learners' resources for the Japanese language. We concentrate in particular on a few collocations at the beginner's level, where we demonstrate their treatment across the various resources. Special attention is paid to what are referred to as unpredictable collocations, which carry a bigger foreign-language learning burden than predictable ones.

  20. A 20-Year High-Resolution Wave Resource Assessment of Japan with Wave-Current Interactions

    Science.gov (United States)

    Webb, A.; Waseda, T.; Kiyomatsu, K.

    2016-02-01

    Energy harvested from surface ocean waves and tidal currents has the potential to be a significant source of green energy, particularly for countries with extensive coastlines such as Japan. As part of a larger marine renewable energy project*, The University of Tokyo (in cooperation with JAMSTEC) has conducted a state-of-the-art wave resource assessment (with uncertainty estimates) to assist with wave generator site identification and construction in Japan. This assessment will be publicly available and is based on a large-scale NOAA WAVEWATCH III (version 4.18) simulation using NCEP and JAMSTEC forcings. It includes several key components to improve model skill: a 20-year simulation to reduce aleatory uncertainty, a four-nested-layer approach to resolve a 1 km shoreline, and finite-depth and current effects included in all wave power density calculations. This latter component is particularly important for regions near strong currents such as the Kuroshio. Here, we will analyze the different wave power density equations, discuss the model setup, and present results from the 20-year assessment (with a focus on the role of wave-current interactions). Time permitting, a comparison will also be made with simulations using JMA MSM 5 km winds. *New Energy and Industrial Technology Development Organization (NEDO): "Research on the Framework and Infrastructure of Marine Renewable Energy; an Energy Potential Assessment"
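
    For reference, the baseline quantity behind such an assessment is the deep-water wave power density per unit crest length; the finite-depth and current effects mentioned above modify this standard expression, and the paper's exact formulation may differ.

```latex
% Deep-water wave power density per metre of wave crest (standard form;
% finite-depth and current effects modify this baseline):
P = \frac{\rho g^{2}}{64\pi}\, H_{m0}^{2}\, T_e ,
% where \rho is the seawater density, g the gravitational acceleration,
% H_{m0} the significant wave height and T_e the wave energy period.
```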

  1. A survey and taxonomy on energy efficient resource allocation techniques for cloud computing systems

    Energy Technology Data Exchange (ETDEWEB)

    Hameed, Abdul; Khoshkbarforoushha, Alireza; Ranjan, Rajiv; Jayaraman, Prem Prakash; Kolodziej, Joanna; Balaji, Pavan; Zeadally, Sherali; Malluhi, Qutaibah Marwan; Tziritas, Nikos; Vishnu, Abhinav; Khan, Samee U.; Zomaya, Albert

    2014-06-06

    In a cloud computing paradigm, energy-efficient allocation of different virtualized ICT resources (servers, storage disks, networks, and the like) is a complex problem due to the presence of heterogeneous application workloads (e.g., content delivery networks, MapReduce, web applications, and the like) having contentious allocation requirements in terms of ICT resource capacities (e.g., network bandwidth, processing speed, response time, etc.). Several recent papers have tried to address the issue of improving energy efficiency in allocating cloud resources to applications, with varying degrees of success. However, to the best of our knowledge there is no published literature on this subject that clearly articulates the research problem and provides a research taxonomy for succinct classification of existing techniques. Hence, the main aim of this paper is to identify the open challenges associated with energy-efficient resource allocation. In this regard, the study first outlines the problem and the existing hardware- and software-based techniques available for this purpose. Furthermore, techniques already presented in the literature are summarized based on the energy-efficiency research dimension taxonomy. The advantages and disadvantages of the existing techniques are comprehensively analyzed against the proposed research dimension taxonomy, namely: resource adaption policy, objective function, allocation method, allocation operation, and interoperability.

  2. Optimizing qubit resources for quantum chemistry simulations in second quantization on a quantum computer

    Science.gov (United States)

    Moll, Nikolaj; Fuhrer, Andreas; Staar, Peter; Tavernelli, Ivano

    2016-07-01

    Quantum chemistry simulations on a quantum computer suffer from the overhead needed for encoding the Fermionic problem in a system of qubits. By exploiting the block diagonality of a Fermionic Hamiltonian, we show that the number of required qubits can be reduced while the number of terms in the Hamiltonian will increase. All operations for this reduction can be performed in operator space. The scheme is conceived as a pre-computational step that would be performed prior to the actual quantum simulation. We apply this scheme to reduce the number of qubits necessary to simulate both the Hamiltonian of the two-site Fermi-Hubbard model and the hydrogen molecule. Both quantum systems can then be simulated with a two-qubit quantum computer. Despite the increase in the number of Hamiltonian terms, the scheme still remains a useful tool to reduce the dimensionality of specific quantum systems for quantum simulators with a limited number of resources.
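
    The core idea, exploiting a conserved symmetry to restrict the Hamiltonian to one block, can be illustrated generically; the sketch below projects a matrix Hamiltonian onto a symmetry sector with NumPy and is only a schematic analogue of the paper's operator-space reduction.

```python
import numpy as np

def project_to_sector(H, S, eigval):
    # Diagonalize the conserved symmetry operator S (e.g. particle number)
    # and keep only the eigenvectors belonging to the chosen sector.
    vals, vecs = np.linalg.eigh(S)
    V = vecs[:, np.isclose(vals, eigval)]
    # The reduced Hamiltonian acts on the smaller block only.
    return V.conj().T @ H @ V

# Demo: two-qubit particle-number operator diag(0, 1, 1, 2); the one-particle
# sector of a number-conserving H is a 2x2 block, i.e. one qubit's worth.
N = np.diag([0.0, 1.0, 1.0, 2.0])
H = np.diag([0.0, -1.0, -1.0, 0.5])   # toy number-conserving Hamiltonian
print(project_to_sector(H, N, 1.0))
```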

  3. Incorporating core hysteresis properties in three-dimensional computations of transformer inrush current forces

    Science.gov (United States)

    Adly, A. A.; Hanafy, H. H.

    2009-04-01

    It is well known that transformer inrush currents depend upon the core properties, residual flux, switching instant, and the overall circuit parameters. Large transient inrush currents introduce abnormal electromagnetic forces which may destroy the transformer windings. This paper presents an approach through which core hysteresis may be incorporated in three-dimensional computations of transformer inrush current forces. Details of the approach, measurements, and simulations for a shell-type transformer are given in the paper.

  4. Exploring the current application of professional competencies in human resource management in the South African context

    Directory of Open Access Journals (Sweden)

    Nico Schutte

    2015-03-01

    Full Text Available Orientation: Human resource (HR) practitioners have an important role to play in the sustainability and competitiveness of organisations. Yet their strategic contribution and the value they add remain unrecognised. Research purpose: The main objective of this research was to explore the extent to which HR practitioners are currently allowed to display HR competencies in the workplace, and whether any significant differences exist between perceived HR competencies, based on the respondents' demographic characteristics. Motivation for the study: Limited empirical research exists on the extent to which HR practitioners are allowed to display key competencies in the South African workplace. Research approach, design and method: A quantitative research approach was followed. A Human Resource Management Professional Competence Questionnaire was administered to HR practitioners and managers (N = 481). Main findings: The results showed that HR competencies are poorly applied in selected South African workplaces. The competencies indicated as having the poorest application were talent management, HR metrics, HR business knowledge, and innovation. The white ethnic group experienced a poorer application of all human resource management (HRM) competencies compared to the black African ethnic group. Practical/managerial implications: The findings highlight the need for management to evaluate the current application of HR practices in the workplace and also the extent to which HR professionals are involved as strategic business partners. Contribution/value-add: This research highlights the need for the current application of HR competencies in South African workplaces to be improved.

  5. Provable Data Possession of Resource-constrained Mobile Devices in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Jian Yang

    2011-07-01

    Full Text Available Benefiting from cloud storage services, users can save the cost of buying expensive storage and application servers, as well as of deploying and maintaining applications. Meanwhile, they lose physical control of their data, so effective methods are needed to verify the correctness of the data stored at cloud servers; these are the research issues that Provable Data Possession (PDP) addresses. The most important features of PDP are: (1) support for public verification an unlimited number of times; (2) support for dynamic data updates; (3) efficiency in storage space and computation. In mobile cloud computing, mobile end-users also need the PDP service. However, the computing workloads and storage burden placed on the client by existing PDP schemes are too heavy for direct use by resource-constrained mobile devices. To solve this problem, integrating trusted computing technology, this paper proposes a novel public PDP scheme in which a trusted third-party agent (TPA) takes over most of the calculations from the mobile end-users. By using bilinear signatures and a Merkle hash tree (MHT), the scheme aggregates the verification tokens of the data file into one small signature to reduce the communication and storage burden. The MHT also helps to support dynamic data updates. In our framework, the mobile terminal devices only need to generate some secret keys and random numbers with the help of trusted platform module (TPM) chips, and the required computing workload and storage space fit mobile devices. Our scheme realizes a provably secure storage service for resource-constrained mobile devices in mobile cloud computing.
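
    A minimal illustration of the MHT aggregation idea: the verifier authenticates a single Merkle root, while block-level proofs stay logarithmic in size. This is the textbook construction, not the paper's full protocol (which adds bilinear signatures and the TPA).

```python
import hashlib

def merkle_root(blocks):
    # Leaf hashes of the file blocks.
    level = [hashlib.sha256(b).digest() for b in blocks]
    # Pairwise hashing up to the single root node.
    while len(level) > 1:
        if len(level) % 2:              # duplicate the last node if odd
            level.append(level[-1])
        level = [hashlib.sha256(level[i] + level[i + 1]).digest()
                 for i in range(0, len(level), 2)]
    return level[0]

# The verifier authenticates this single root; per-block proofs are
# logarithmic-size hash paths, keeping client-side state small.
print(merkle_root([b"block0", b"block1", b"block2"]).hex())
```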

  6. A novel agent based autonomous and service composition framework for cost optimization of resource provisioning in cloud computing

    Directory of Open Access Journals (Sweden)

    Aarti Singh

    2017-01-01

    Full Text Available A cloud computing environment offers a simplified, centralized platform of resources for use when needed at low cost. One of the key functionalities of this type of computing is to allocate resources on individual demand. However, with the expanding requirements of cloud users, the need for efficient resource allocation is also emerging. The main role of the service provider is to effectively distribute and share the resources, which otherwise would result in resource wastage. In addition to the user getting the appropriate service according to the request, the cost of the respective resource is also optimized. In order to surmount these shortcomings and perform optimized resource allocation, this research proposes a new Agent based Automated Service Composition (A2SC) algorithm comprising request processing and automated service composition phases, which is not only responsible for searching comprehensive services but also considers reducing the cost of virtual machines which are consumed by on-demand services only.

  7. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  8. Exploring Graphics Processing Unit (GPU Resource Sharing Efficiency for High Performance Computing

    Directory of Open Access Journals (Sweden)

    Teng Li

    2013-11-01

    Full Text Available The increasing incorporation of Graphics Processing Units (GPUs) as accelerators has been one of the forefront High Performance Computing (HPC) trends and provides unprecedented performance; however, the prevalent adoption of the Single-Program Multiple-Data (SPMD) programming model brings with it challenges of resource underutilization. In other words, under SPMD, every CPU needs GPU capability available to it. However, since CPUs generally outnumber GPUs, the asymmetric resource distribution gives rise to overall computing resource underutilization. In this paper, we propose to efficiently share the GPU under SPMD and formally define a series of GPU sharing scenarios. We provide performance-modeling analysis for each sharing scenario with accurate experimental validation. With the modeling basis, we further conduct experimental studies to explore potential GPU sharing efficiency improvements from multiple perspectives. Both further theoretical and experimental GPU sharing performance analysis and results are presented. Our results not only demonstrate the significant performance gain for SPMD programs with the proposed efficient GPU sharing, but also the further improved sharing efficiency with the optimization techniques based on our accurate modeling.

  9. Pedagogical Utilization and Assessment of the Statistic Online Computational Resource in Introductory Probability and Statistics Courses.

    Science.gov (United States)

    Dinov, Ivo D; Sanchez, Juana; Christou, Nicolas

    2008-01-01

    Technology-based instruction represents a recent pedagogical paradigm rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational and computational tools with classical pedagogical instruments. Consequently, teaching with technology typically involves utilizing a variety of IT and multimedia resources for online learning, course management, electronic course materials, and novel tools for communication, engagement, experimentation, critical thinking and assessment. The NSF-funded Statistics Online Computational Resource (SOCR) provides a number of interactive tools for enhancing instruction in various undergraduate and graduate courses in probability and statistics. These resources include online instructional materials, statistical calculators, interactive graphical user interfaces, computational and simulation applets, and tools for data analysis and visualization. The tools provided as part of SOCR include conceptual simulations and statistical computing interfaces, which are designed to bridge between the introductory and the more advanced computational and applied probability and statistics courses. In this manuscript, we describe our designs for utilizing SOCR technology in instruction in a recent study. In addition, we present the results on the effectiveness of using SOCR tools at two different course intensity levels on three outcome measures: exam scores, student satisfaction and choice of technology to complete assignments. Learning styles assessment was completed at baseline. We used three very different designs for three different undergraduate classes. Each course included a treatment group, using the SOCR resources, and a control group, using classical instruction techniques. Our findings include marginal effects of the SOCR treatment per individual

  10. An Architecture of IoT Service Delegation and Resource Allocation Based on Collaboration between Fog and Cloud Computing

    Directory of Open Access Journals (Sweden)

    Aymen Abdullah Alsaffar

    2016-01-01

    Full Text Available Despite the wide utilization of cloud computing (e.g., services, applications, and resources), some services, applications, and smart devices are not able to fully benefit from this attractive cloud computing paradigm due to the following issues: (1) smart devices might be lacking in capacity (e.g., processing, memory, storage, battery, and resource allocation), (2) they might be lacking in network resources, and (3) the high network latency to a centralized server in the cloud might not be efficient for delay-sensitive applications, services, and resource allocation requests. Fog computing is a promising paradigm that can extend cloud resources to the edge of the network, solving the abovementioned issues. As a result, in this work we propose an architecture for IoT service delegation and resource allocation based on collaboration between fog and cloud computing. We provide a new algorithm, the decision rules of a linearized decision tree based on three conditions (service size, completion time, and VM capacity), for managing and delegating user requests in order to balance the workload. Moreover, we propose an algorithm to allocate resources to meet service level agreement (SLA) and quality of service (QoS) requirements, as well as optimizing big data distribution in fog and cloud computing. Our simulation results show that our proposed approach can efficiently balance the workload, improve resource allocation, optimize big data distribution, and achieve better performance than other existing methods.
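
    A toy version of decision rules keyed on the same three conditions can make the delegation step concrete; the thresholds and field names below are hypothetical, not taken from the paper.

```python
def delegate(request, vm_free_capacity_mb,
             size_threshold_mb=50.0, deadline_threshold_s=0.5):
    # Rule 1: services that do not fit in fog VMs must go to the cloud.
    if request["size_mb"] > vm_free_capacity_mb:
        return "cloud"
    # Rule 2: delay-sensitive requests stay in the fog to avoid cloud latency.
    if request["deadline_s"] < deadline_threshold_s:
        return "fog"
    # Rule 3: bulky but delay-tolerant services use central cloud resources.
    if request["size_mb"] > size_threshold_mb:
        return "cloud"
    return "fog"

print(delegate({"size_mb": 10.0, "deadline_s": 0.2}, vm_free_capacity_mb=100.0))
```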

  11. A Safety Resource Allocation Mechanism against Connection Fault for Vehicular Cloud Computing

    Directory of Open Access Journals (Sweden)

    Tianpeng Ye

    2016-01-01

    Full Text Available The Intelligent Transportation System (ITS) is becoming an important component of the smart city toward safer roads, better traffic control, and on-demand services, by utilizing and processing the information collected from sensors of vehicles and roadside infrastructure. In ITS, Vehicular Cloud Computing (VCC) is a novel technology balancing the requirements of complex services and the limited capability of on-board computers. However, the behaviors of the vehicles in VCC are dynamic, random, and complex. Thus, one of the key safety issues is the frequent disconnection between the vehicle and the Vehicular Cloud (VC) when this vehicle is computing for a service. More importantly, connection faults seriously disturb the normal services of VCC and impact the safety operations of the transportation system. In this paper, a safety resource allocation mechanism against connection faults in VCC is proposed, using a modified workflow with prediction capability. We first propose a probability model for vehicle movement which satisfies the high-dynamics and real-time requirements of VCC. We then propose a Prediction-based Reliability Maximization Algorithm (PRMA) to realize safety resource allocation for VCC. The evaluation shows that our mechanism can improve the reliability and guarantee the real-time performance of the VCC.

  12. The impact of antiretroviral therapy in resource-limited settings and current HIV therapeutics.

    Science.gov (United States)

    Kumarasamy, N

    2016-04-01

    Four million people of the global total of 35 million with HIV infection are from South-East Asia. ART is currently utilized by 15 million people and has led to a dramatic decline in the mortality rate, including those in low- and middle-income countries. A reduction in sexually transmitted HIV and in comorbidities including tuberculosis has also followed. Current recommendations for the initiation of antiretroviral therapy in people who are HIV+ are essentially to initiate ART irrespective of CD4 cell count and clinical stage. The frequency of HIV testing should be culturally specific and based on the HIV incidence in different key populations but phasing in viral load technology in LMIC is an urgent priority and this needs resources and capacity. With the availability of simplified potent ART regimens, persons with HIV now live longer. The recent WHO treatment guidelines recommending routine HIV testing and earlier initiation of treatment should be the stepping stone for ending the AIDS epidemic and to meet the UNAIDS mission of 90*90*90.

  13. Context-aware computing-based reducing cost of service method in resource discovery and interaction

    Institute of Scientific and Technical Information of China (English)

    TANG Shan-cheng; HOU Yi-bin

    2004-01-01

    Reducing the cost of service is an important goal for resource discovery and interaction technologies. The shortcomings of the transhipment method and the hibernation method are that they increase the overall cost of service and slow down resource discovery, respectively. To overcome these shortcomings, a context-aware computing-based method is developed. This method first analyzes how devices use resource discovery and interaction technologies in order to identify types of context related to reducing the cost of service; it then chooses effective measures, such as stopping broadcasts and hibernating, to reduce the cost of service according to the information supplied by the context, rather than the transhipment method's simple hibernations. The results of experiments indicate that under the worst conditions this method overcomes the shortcomings of the transhipment method, makes the "poor" devices hibernate longer than under the hibernation method to reduce the cost of service more effectively, and discovers resources faster than the hibernation method; under the best conditions it is far better than the hibernation method in all aspects.

  14. Resources and Approaches for Teaching Quantitative and Computational Skills in the Geosciences and Allied Fields

    Science.gov (United States)

    Orr, C. H.; Mcfadden, R. R.; Manduca, C. A.; Kempler, L. A.

    2016-12-01

    Teaching with data, simulations, and models in the geosciences can increase many facets of student success in the classroom and in the workforce. Teaching undergraduates about programming and improving students' quantitative and computational skills expands their perception of Geoscience beyond field-based studies. Processing data and developing quantitative models are critically important for Geoscience students. Students need to be able to perform calculations, analyze data, create numerical models and visualizations, and more deeply understand complex systems—all essential aspects of modern science. These skills require students to have comfort and skill with languages and tools such as MATLAB. To achieve comfort and skill, computational and quantitative thinking must build over a 4-year degree program across courses and disciplines. However, in courses focused on Geoscience content it can be challenging to get students comfortable with using computational methods to answer Geoscience questions. To help bridge this gap, we have partnered with MathWorks to develop two workshops focused on collecting and developing strategies and resources to help faculty teach students to incorporate data, simulations, and models into the curriculum at the course and program levels. We brought together faculty members from the sciences, including Geoscience and allied fields, who teach computation and quantitative thinking skills using MATLAB, to build a resource collection for teaching. These materials and the outcomes of the workshops are freely available on our website. The workshop outcomes include a collection of teaching activities, essays, and course descriptions that can help faculty incorporate computational skills at the course or program level. The teaching activities include in-class assignments, problem sets, labs, projects, and toolboxes. These activities range from programming assignments to creating and using models. The outcomes also include workshop

  15. The role of the computer in science fair projects: Current status and potential

    Energy Technology Data Exchange (ETDEWEB)

    Trainor, M.S.

    1991-01-01

    The need for more students to enter the field of science is acute in the nation, and science fair projects provide a motivational mechanism to entice students into pursuing scientific careers. Computers play a major role in science today. Because computers are a major source of entertainment for our children, one would expect them to play a significant role in many science fair projects. This study investigated current and potential uses of computers in science fair projects and incorporated an informal case study of scientists, teachers, and students involved in science fair projects from a highly scientific community. Interviews, a survey, and observations were conducted. Results indicated that most projects either do not use or inadequately use computers and that a significant potential for more effective use of computers for science fair projects exists.

  16. Interactive Whiteboards and Computer Games at Highschool Level: Digital Resources for Enhancing Reflection in Teaching and Learning

    DEFF Research Database (Denmark)

    Sorensen, Elsebeth Korsgaard; Poulsen, Mathias; Houmann, Rita

    The general potential of computer games for teaching and learning is becoming widely recognized. In particular, within the application contexts of primary and lower secondary education, the relevance and value of computer games seem more accepted, and the possibility and willingness to incorporate computer games as a possible resource at the level of other educational resources seem more frequent. For some reason, however, applying computer games in processes of teaching and learning at the high school level seems an almost non-existent event. This paper reports on a study of incorporating

  17. Method to Reduce the Computational Intensity of Offshore Wind Energy Resource Assessments Using Cokriging

    Science.gov (United States)

    Dvorak, M. J.; Boucher, A.; Jacobson, M. Z.

    2009-12-01

    Wind energy represents the fastest growing renewable energy resource, sustaining double-digit growth for the past 10 years with approximately 94,000 MW installed by the end of 2007. Although winds over the ocean are generally stronger and often located closer to large urban electric load centers, offshore wind turbines represent about 1% of installed capacity. In order to evaluate the economic potential of an offshore wind resource, wind resource assessments typically involve running large mesoscale model simulations, validated with sparse in-situ meteorological station data. These simulations are computationally expensive, limiting their temporal coverage. Although a wealth of other wind data does exist (e.g. QuikSCAT satellite, SAR satellite, radar/SODAR wind profiler, and radiosonde), these data are often ignored or interpolated trivially because of their widely varying spatial and temporal resolution. A spatio-temporal cokriging approach with non-parametric covariances was developed to interpolate these empirical data and compare them with previously validated surface winds output by the PSU/NCAR MM5 for coastal California. The spatio-temporal covariance model is assumed to be the product of a spatial and a temporal covariance component. The temporal covariance is derived from in-situ wind speed measurements at 10-minute intervals measured by offshore buoys, and variograms are calculated non-parametrically using an FFT. Spatial covariance tables are created using MM5 or QuikSCAT data with a similar 2D FFT method. The cokriging system was initially validated by predicting “missing” hours of PSU/NCAR MM5 data and has displayed reasonable skill. QuikSCAT satellite winds were also substituted for MM5 data when calculating the spatial covariance, with the goal of reducing the computer time needed to accurately predict a wind energy resource.
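
    The non-parametric temporal covariance estimate described above can be obtained with an FFT in a few lines; the sketch below computes an autocovariance via the Wiener-Khinchin relation on a synthetic wind series and is a generic illustration, not the study's code.

```python
import numpy as np

def fft_autocovariance(x):
    # Wiener-Khinchin: the autocovariance is the inverse FFT of the
    # power spectrum; zero-padding to 2n avoids circular wrap-around.
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    n = len(x)
    f = np.fft.rfft(x, 2 * n)
    return np.fft.irfft(f * np.conj(f))[:n] / n

wind = np.random.default_rng(0).weibull(2.0, 1024)  # synthetic 10-min series
print(fft_autocovariance(wind)[:5])                  # lags 0..4
```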

  18. Elastic Extension of a CMS Computing Centre Resources on External Clouds

    Science.gov (United States)

    Codispoti, G.; Di Maria, R.; Aiftimiei, C.; Bonacorsi, D.; Calligola, P.; Ciaschini, V.; Costantini, A.; Dal Pra, S.; DeGirolamo, D.; Grandi, C.; Michelotto, D.; Panella, M.; Peco, G.; Sapunenko, V.; Sgaravatto, M.; Taneja, S.; Zizzi, G.

    2016-10-01

    After the successful LHC data taking in Run-I and in view of the future runs, the LHC experiments are facing new challenges in the design and operation of the computing facilities. The computing infrastructure for Run-II is dimensioned to cope at most with the average amount of data recorded. The usage peaks, as already observed in Run-I, may however originate large backlogs, thus delaying the completion of the data reconstruction and ultimately the data availability for physics analysis. In order to cope with the production peaks, CMS - along the lines followed by other LHC experiments - is exploring the opportunity to access Cloud resources provided by external partners or commercial providers. Specific use cases have already been explored and successfully exploited during Long Shutdown 1 (LS1) and the first part of Run 2. In this work we present the proof of concept of the elastic extension of a CMS site, specifically the Bologna Tier-3, on an external OpenStack infrastructure. We focus on the “Cloud Bursting” of a CMS Grid site using a newly designed LSF configuration that allows the dynamic registration of new worker nodes to LSF. In this approach, the dynamically added worker nodes instantiated on the OpenStack infrastructure are transparently accessed by the LHC Grid tools and at the same time serve as an extension of the farm for local usage. The amount of resources allocated can thus be elastically modeled to cope with the needs of the CMS experiment and local users. Moreover, a direct access/integration of OpenStack resources into the CMS workload management system is explored. In this paper we present this approach, report on the performance of the on-demand allocated resources, and discuss the lessons learned and the next steps.

  19. Future Performance Trend Indicators: A Current Value Approach to Human Resources Accounting. Report V: The Value Attribution Process. Technical Report.

    Science.gov (United States)

    Lapointe, Jean B.; And Others

    The development of future performance trend indicators is based on the current value approach to human resource accounting. The value attribution portion of the current value approach is used to estimate the dollar value of observed changes in the state of the human organization. The procedure for value attribution includes: prediction of changes…

  20. Public-Resource Computing: A new paradigm for computing and science

    OpenAIRE

    2006-01-01

    This article explores the concept of Public-Resource Computing, an idea that has been developed with great success over the past few years in the scientific community and that consists of harnessing the computing resources available in the millions of PCs around the world connected to the Internet. The SETI@home project, the most successful representative of this concept, is discussed, and the BOINC platform (Ber...

  1. The model of localized business community economic development under limited financial resources: computer model and experiment

    Directory of Open Access Journals (Sweden)

    Berg Dmitry

    2016-01-01

    Full Text Available Globalization processes now affect and are affected by most organizations, different types of resources, and the natural environment. One of the main restrictions initiated by these processes is a financial one: money turnover in global markets leads to its concentration in certain financial centers, and local business communities suffer from a lack of money. This work discusses the advantages of introducing a complementary currency into a local economy. Computer simulation with the engineered program model and a real economic experiment showed that the complementary currency does not compete with the traditional currency; furthermore, it acts in compliance with it, providing conditions for sustainable business community development.

  2. [Current status of mangrove germplasm resources and key techniques for mangrove seedling propagation in China].

    Science.gov (United States)

    Hu, Hong-You; Chen, Shun-Yang; Wang, Wen-Qing; Dong, Ke-Zuan; Lin, Guang-Hui

    2012-04-01

    Mangrove germplasm and nursery operation are the foundations of all mangrove ecological restoration projects. Based on the existing literature and our own experience, and by using cluster analysis and other methods, this paper assessed the current status of mangrove germplasm resources and the key techniques for mangrove seedling propagation in China. In China, the mangrove communities can be divided into 4 types, including the low-temperature-tolerant widespread type, widespread type, thermophilic widespread type, and tropical type, and the mangrove distribution sites can be divided into 5 regions, i.e., the eastern Hainan coast, Beibuwan Gulf coast, Pearl River estuary and eastern Guangdong coast, southern Fujian and Taiwan coast, and eastern Fujian and southern Zhejiang coast. The mangroves in the Beibuwan Gulf coast region take up 75.3% of the total mangrove germplasm resources in the country. At present, the percentage of mangrove species used for seedling propagation in China is estimated at 52.6%, most of which are viviparous species. The six key steps in mangrove nursery operation include the selection of proper seedling propagation methods, the collection and storage of seeds or propagules, the ways of raising seedlings, the management of water and salinity, the control of diseases and pests, and the prevention of cold damage during winter. The structure, functions, and applications of the present five types of mangrove nurseries, including the dry land nursery, mangrove tidal nursery, mudflat nursery, Jiwei pond nursery, and Spartina mudflat nursery, were also analyzed, which could provide guidance for the integrated management of mangrove ecological restoration engineering.

  3. Uniform physical theory of diffraction equivalent edge currents for implementation in general computer codes

    DEFF Research Database (Denmark)

    Johansen, Peter Meincke

    1996-01-01

    New uniform closed-form expressions for physical theory of diffraction equivalent edge currents are derived for truncated incremental wedge strips. In contrast to previously reported expressions, the new expressions are well-behaved for all directions of incidence and observation and take a finite value for zero strip length. Consequently, the new equivalent edge currents are, to the knowledge of the author, the first that are well-suited for implementation in general computer codes.

  4. The Usage of Informal Computer Based Communication in the Context of Organization’s Technological Resources

    Directory of Open Access Journals (Sweden)

    Steponas Jonušauskas

    2011-12-01

    Full Text Available The purpose of the article is to theoretically and practically analyze the features of informal computer-based communication in the context of an organization's technological resources. Methodology—meta-analysis, survey and descriptive analysis. Findings. According to scientists, the functions of informal communication cover sharing of work-related information, coordination of team activities, spread of organizational culture and feelings of interdependence and affinity. Also, informal communication widens individuals' recognition of reality, creates a general context of environment between talkers, and strengthens interpersonal attraction. For these reasons, informal communication is desirable and even necessary in organizations, because it helps to ensure efficient functioning of the enterprise. However, communicating through electronic channels suppresses informal connections or directs them outside of the organization. So, electronic communication is not beneficial for developing ties in an informal organizational network. The empirical research showed that a significant part of the courts' administration staff is prone to use the technological resources of their office for informal communication. Representatives of courts administration choose friends for computer-based communication much more often than colleagues (72% and 63%, respectively). 93% of the research respondents use an additional e-mail box serviced by commercial providers for non-work communication. The high intensity of informal electronic communication with friends and familiars shows that workers of court administration are used to meeting their psycho-emotional needs outside the workplace. The survey confirmed the conclusion of the theoretical analysis: computer-based communication is not beneficial for developing informal contacts between workers. In order for informal communication to carry out its functions and for the technological resources of the organization to be used effectively, staff

  5. The Usage of Informal Computer Based Communication in the Context of Organization’s Technological Resources

    Directory of Open Access Journals (Sweden)

    Agota Giedrė Raišienė

    2013-08-01

    Full Text Available The purpose of the article is to theoretically and practically analyze the features of informal computer-based communication in the context of an organization's technological resources. Methodology—meta-analysis, survey and descriptive analysis. Findings. According to scientists, the functions of informal communication cover sharing of work-related information, coordination of team activities, spread of organizational culture and feelings of interdependence and affinity. Also, informal communication widens individuals' recognition of reality, creates a general context of environment between talkers, and strengthens interpersonal attraction. For these reasons, informal communication is desirable and even necessary in organizations, because it helps to ensure efficient functioning of the enterprise. However, communicating through electronic channels suppresses informal connections or directs them outside of the organization. So, electronic communication is not beneficial for developing ties in an informal organizational network. The empirical research showed that a significant part of the courts' administration staff is prone to use the technological resources of their office for informal communication. Representatives of courts administration choose friends for computer-based communication much more often than colleagues (72% and 63%, respectively). 93% of the research respondents use an additional e-mail box serviced by commercial providers for non-work communication. The high intensity of informal electronic communication with friends and familiars shows that workers of court administration are used to meeting their psycho-emotional needs outside the workplace. The survey confirmed the conclusion of the theoretical analysis: computer-based communication is not beneficial for developing informal contacts between workers. In order for informal communication to carry out its functions and for the technological resources of the organization to be used effectively, staff

  6. Current Problems in Developing the Natural Resource Potential of the Russian Exclave in the Baltic

    Science.gov (United States)

    Fedorov, Gennady M.; Gritsenko, Vladimir A.; Dedkov, Viktor P.; Zotov, Sergey I.; Chernyshkov, Pavel P.

    2016-01-01

    The compact Kaliningrad region boasts relatively favourable environmental conditions and a remarkable diversity of natural resources. This article seeks to compare the natural resources of the exclave and other Russian regions. The authors examine recent statistical data to estimate the region's natural and resource potential, analyse its…

  7. Resource Efficient Hardware Architecture for Fast Computation of Running Max/Min Filters

    Directory of Open Access Journals (Sweden)

    Cesar Torres-Huitzil

    2013-01-01

    Full Text Available Running max/min filters on rectangular kernels are widely used in many digital signal and image processing applications. Filtering with a k×k kernel requires k²−1 comparisons per sample for a direct implementation; thus, performance scales expensively with the kernel size k. Faster computation can be achieved by kernel decomposition and by using constant-time one-dimensional algorithms on custom hardware. This paper presents a hardware architecture for real-time computation of running max/min filters based on the van Herk/Gil-Werman (HGW) algorithm. The proposed architecture design uses fewer computation and memory resources than previously reported architectures when targeted to Field Programmable Gate Array (FPGA) devices. Implementation results show that the architecture is able to compute max/min filters on 1024×1024 images with up to 255×255 kernels in around 8.4 milliseconds, 120 frames per second, at a clock frequency of 250 MHz. The implementation is highly scalable with the kernel size, with a good performance/area tradeoff suitable for embedded applications. The applicability of the architecture is shown for local adaptive image thresholding.
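
    The constant-time behaviour of the HGW algorithm comes from combining per-segment prefix and suffix maxima; below is a software sketch of the 1D running max (border handling by edge replication is an assumption here, since conventions vary).

```python
def van_herk_max(x, k):
    n = len(x)
    m = -(-(n + k - 1) // k) * k           # pad to a multiple of k, >= n+k-1
    xp = x + [x[-1]] * (m - n)             # edge-replication padding (assumed)
    pre, suf = xp[:], xp[:]
    for i in range(1, m):                  # prefix maxima inside k-segments
        if i % k:
            pre[i] = max(pre[i - 1], xp[i])
    for i in range(m - 2, -1, -1):         # suffix maxima inside k-segments
        if (i + 1) % k:
            suf[i] = max(suf[i + 1], xp[i])
    # Any window x[i:i+k] spans at most two segments: one final max each.
    return [max(suf[i], pre[i + k - 1]) for i in range(n)]

print(van_herk_max([3, 1, 4, 1, 5, 9, 2, 6], 3))  # -> [4, 4, 5, 9, 9, 9, 6, 6]
```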

  8. Integrating resource efficiency and EU State aid. An evaluation of resource efficiency considerations in the current EU State aid framework

    Energy Technology Data Exchange (ETDEWEB)

    Bennink, D.; Faber, J.; Smit, M. [CE Delft, Delft (Netherlands); Goba, V. [SIA Estonian, Latvian and Lithuanian Environment ELLE, Tallinn (Estonia); Miller, K.; Williams, E. [AEA Technology plc, London (United Kingdom)

    2012-10-15

    This study, for the European Commission, analyses the issues that need to be addressed in the revision of the EU State aid framework to ensure that they do not hinder environmental, resource efficiency and sustainable development goals. In some cases, State aid can be considered an environmentally harmful subsidy (EHS). The study analyses (1) the extent to which the Environmental Aid Guidelines (EAG) need to be changed to take into account recent European environmental policy developments; (2) existing and potential resource efficiency considerations in a) the Regional Aid Guidelines; b) the Research, Development and Innovation (RDI) Guidelines and c) the Agriculture and Forestry Guidelines; assesses cases and schemes using these guidelines to identify whether resource efficiency considerations are taken into account. The study also considers the social, environmental and economic impacts of these cases and schemes. It develops recommendations for the review of the EAG and a number of horizontal guidelines. One of the conclusions of the analysis is that the way in which multiple objectives and impacts are balanced, when deciding to approve state aid, is unclear. Also, EU member states are not required to provide information on certain types of (estimated) impacts. To guarantee that multiple objectives and impacts are sufficiently balanced, it is recommended that the State aid framework prescribes that applicants identify social, economic and environmental objectives and impacts and describe how these are taken into account in the procedure of balancing multiple (conflicting) objectives. Objectives and impacts should be quantified as much as possible, for example by making use of the method of external cost calculation laid down in 'the Handbook on estimation of external costs in the transport Sector'. The results of the study are used by the European Commission as an input for evaluating and improving the EU State aid framework.

  9. 3D computation of non-linear eddy currents: Variational method and superconducting cubic bulk

    Science.gov (United States)

    Pardo, Enric; Kapolka, Milan

    2017-09-01

    Computing the electric eddy currents in non-linear materials, such as superconductors, is not straightforward. The design of superconducting magnets and power applications needs electromagnetic computer modeling, which is in many cases a three-dimensional (3D) problem. Since 3D problems require long computing times, novel time-efficient modeling tools are highly desirable. This article presents a novel computational modeling method based on a variational principle. The self-programmed implementation uses an original minimization method, which divides the sample into sectors. This speeds up the computations with no loss of accuracy, while enabling efficient parallelization. This method could also be applied to model transients in linear materials or networks of non-linear electrical elements. As an example, we analyze the magnetization currents of a cubic superconductor. This 3D situation remains unknown, in spite of the fact that it is often met in material characterization and bulk applications. We found that below the penetration field, and in part of the sample, current flux lines are not rectangular and significantly bend in the direction parallel to the applied field. In conclusion, the presented numerical method is able to time-efficiently solve fully 3D situations without loss of accuracy.

  10. Computation of magnetic fields within source regions of ionospheric and magnetospheric currents

    DEFF Research Database (Denmark)

    Engels, U.; Olsen, Nils

    1998-01-01

    A general method of computing the magnetic effect caused by a predetermined three-dimensional external current density is presented. It takes advantage of the representation of solenoidal vector fields in terms of toroidal and poloidal modes expressed by two independent series of spherical harmonics...
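
    The decomposition referred to is the Mie (toroidal-poloidal) representation; in a commonly used notation (assumed here, not necessarily the paper's):

```latex
% Mie representation of a solenoidal current density \mathbf{j}:
\mathbf{j} = \nabla\times\left(T\,\mathbf{r}\right)
           + \nabla\times\nabla\times\left(P\,\mathbf{r}\right),
% with the scalars T and P expanded in spherical harmonics, e.g.
% T(r,\theta,\varphi) = \sum_{n,m} t_n^m(r)\, Y_n^m(\theta,\varphi).
```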

  11. Electrical safety in spinal cord stimulation: current density analysis by computer modeling

    NARCIS (Netherlands)

    Wesselink, W.A.; Holsheimer, J.

    1995-01-01

    The possibility of tissue damage in spinal cord stimulation was investigated in a computer modeling study. A decrease of the electrode area in monopolar stimulation resulted in an increase of the current density at the electrode surface. When comparing the modeling results with experimental data

  12. Harmonic Analysis of Currents and Voltages Obtained in the Result of Computational Experiment

    Directory of Open Access Journals (Sweden)

    I. V. Novash

    2011-01-01

    Full Text Available The paper considers a methodology for carrying out a harmonic analysis of current and voltage numerical values obtained as the result of a computational experiment and saved in an external data file. The harmonic analysis has been carried out in the Mathcad mathematical package environment.
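
    An equivalent of such a harmonic analysis in NumPy terms: load the sampled waveform, take an FFT, and pick the bins at multiples of the fundamental. The function below is a generic sketch (the Mathcad worksheet itself is not reproduced).

```python
import numpy as np

def harmonic_amplitudes(samples, fs, f0, n_harmonics=10):
    # One-sided FFT amplitudes at (approximate) multiples of the fundamental.
    n = len(samples)
    spectrum = np.fft.rfft(samples) / n
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    amps = []
    for h in range(1, n_harmonics + 1):
        k = int(np.argmin(np.abs(freqs - h * f0)))  # nearest FFT bin
        amps.append(2.0 * np.abs(spectrum[k]))
    return amps

# Example: 50 Hz fundamental with a 20% fifth harmonic, sampled at 10 kHz.
t = np.arange(0, 0.2, 1e-4)
i_t = np.sin(2 * np.pi * 50 * t) + 0.2 * np.sin(2 * np.pi * 250 * t)
print([round(a, 3) for a in harmonic_amplitudes(i_t, 1e4, 50, 5)])
```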

  13. Topics in Current Science Research: Closing the Achievement Gap for Under Resourced Students of Color

    Science.gov (United States)

    Loya Villalpando, Alvaro; Daal, Miguel; Phipps, Arran; Speller, Danielle; Sadoulet, Bernard; Winheld, Rachel; Cryogenic Dark Matter Search Collaboration

    2015-04-01

    Topics in Current Science Research (TCSR) is a five-week summer course offered at the University of California, Berkeley through a collaboration between the Level Playing Field Institute's Summer Math and Science Honors Academy (SMASH) Program and the Cryogenic Dark Matter Search (CDMS) group at UC Berkeley. SMASH is an academic enrichment program geared towards under-resourced, high school students of color. The goals of the course are to expand the students' conception of STEM, to teach the students that science is a method of inquiry and not just a collection of facts that are taught in school, and to expose the scholars to critical thinking within a scientific setting. The course's curriculum engages the scholars in hands-on scientific research, project proposal writing, and presentation of their scientific work to their peers as well as to a panel of UC Berkeley scientists. In this talk, we describe the course and the impact it has had on previous scholars, we discuss how the course's pedagogy has evolved over the past 10 years to enhance students' perception and understanding of science, and we present previous participants' reflections and feedback about the course and its success in providing high school students a genuine research experience at the university level.

  14. Monitoring of Computing Resource Use of Active Software Releases in ATLAS

    CERN Document Server

    Limosani, Antonio; The ATLAS collaboration

    2016-01-01

    The LHC is the world's most powerful particle accelerator, colliding protons at a centre of mass energy of 13 TeV. As the energy and frequency of collisions have grown in the search for new physics, so too has the demand for computing resources needed for event reconstruction. We will report on the evolution of resource usage in terms of CPU and RAM in key ATLAS offline reconstruction workflows at the Tier0 at CERN and on the WLCG. Monitoring of workflows is achieved using the ATLAS PerfMon package, which is the standard ATLAS performance monitoring system running inside Athena jobs. Systematic daily monitoring has recently been expanded to include all workflows beginning at Monte Carlo generation through to end user physics analysis, beyond that of event reconstruction. Moreover, the move to a multiprocessor mode in production jobs has facilitated the use of tools, such as "MemoryMonitor", to measure the memory shared across processors in jobs. Resource consumption is broken down into software domains and displayed...

  15. Monitoring of Computing Resource Use of Active Software Releases at ATLAS

    CERN Document Server

    Limosani, Antonio; The ATLAS collaboration

    2017-01-01

    The LHC is the world's most powerful particle accelerator, colliding protons at a centre-of-mass energy of 13 TeV. As the energy and frequency of collisions have grown in the search for new physics, so too has the demand for the computing resources needed for event reconstruction. We will report on the evolution of resource usage in terms of CPU and RAM in key ATLAS offline reconstruction workflows at the Tier0 at CERN and on the WLCG. Monitoring of workflows is achieved using the ATLAS PerfMon package, which is the standard ATLAS performance monitoring system running inside Athena jobs. Systematic daily monitoring has recently been expanded to include all workflows beginning at Monte Carlo generation through to end user physics analysis, beyond that of event reconstruction. Moreover, the move to a multiprocessor mode in production jobs has facilitated the use of tools, such as "MemoryMonitor", to measure the memory shared across processors in jobs. Resource consumption is broken down into software domains and displayed...

  16. Internet resources for dentistry: computer, Internet, reference, and sites for enhancing personal productivity of the dental professional.

    Science.gov (United States)

    Guest, G F

    2000-08-15

    At the onset of the new millennium the Internet has become the new standard means of distributing information. In the last two to three years there has been an explosion of e-commerce, with hundreds of new web sites being created every minute. For most corporate entities, a web site is as essential as the phone book listing used to be. Twenty years ago, technologists directed how computer-based systems were utilized. Now it is the end users of personal computers who have gained expertise and drive the functionality of software applications. The computer, initially invented for mathematical functions, has transitioned from this role to an integrated communications device that provides the portal to the digital world. The Web needs to be used by healthcare professionals, not only for professional activities, but also for instant access to information and services "just when they need it." This will facilitate the longitudinal use of information as society continues to gain better information access skills. With the demand for current "just in time" information and the standards established by Internet protocols, reference sources of information may be maintained in dynamic fashion. News services have been available through the Internet for several years, but now reference materials such as online journals and digital textbooks have become available and have the potential to change the traditional publishing industry. The pace of change should make us consider Will Rogers' advice, "It isn't good enough to be moving in the right direction. If you are not moving fast enough, you can still get run over!" The intent of this article is to complement previous articles on Internet Resources published in this journal, by presenting information about web sites that present information on computer and Internet technologies, reference materials, news information, and information that lets us improve personal productivity. Neither the author, nor the Journal endorses any of the

  17. Water Resource Impacts Embedded in the Western US Electrical Energy Trade; Current Patterns and Adaptation to Future Drought

    Science.gov (United States)

    Adams, E. A.; Herron, S.; Qiu, Y.; Tidwell, V. C.; Ruddell, B. L.

    2013-12-01

    Water resources are a key element in the global coupled natural-human (CNH) system, because they are tightly coupled with the world's social, environmental, and economic subsystems, and because water resources are under increasing pressure worldwide. A fundamental adaptive tool used especially by cities to overcome local water resource scarcity is the outsourcing of water resource impacts through substitutionary economic trade. This is generally understood as the indirect component of a water footprint, and as 'virtual water' trade. This work employs generalized CNH methods to reveal the trade in water resource impacts embedded in electrical energy within the Western US power grid, and utilizes a general equilibrium economic trade model combined with drought and demand growth constraints to estimate the future status of this trade. Trade in embedded water resource impacts currently increases total water used for electricity production in the Western US and shifts water use to more water-limited States. Extreme drought and large increases in electrical energy demand increase the need for embedded water resource impact trade, while motivating a shift to more water-efficient generation technologies and more water-abundant generating locations. Cities are the largest users of electrical energy, and in the 21st Century will outsource a larger fraction of their water resource impacts through trade. This trade exposes cities to risks associated with disruption of long-distance transmission and distant hydrological droughts.

  18. Computation of reduced energy input current stimuli for neuron phase models.

    Science.gov (United States)

    Anyalebechi, Jason; Koelling, Melinda E; Miller, Damon A

    2014-01-01

    A regularly spiking neuron can be studied using a phase model. The effect of an input stimulus current on the phase time derivative is captured by a phase response curve. This paper adapts a technique that was previously applied to conductance-based models to discover optimal input stimulus currents for phase models. First, the neuron phase response θ(t) due to an input stimulus current i(t) is computed using a phase model. The resulting θ(t) is taken to be a reference phase r(t). Second, an optimal input stimulus current i(*)(t) is computed to minimize a weighted sum of the square-integral 'energy' of i(*)(t) and the tracking error between the reference phase r(t) and the phase response due to i(*)(t). The balance between the conflicting requirements of energy and tracking error minimization is controlled by a single parameter. The generated optimal current i(*)(t) is then compared to the input current i(t) which was used to generate the reference phase r(t). This technique was applied to two neuron phase models; in each case, the current i(*)(t) generates a phase response similar to the reference phase r(t), and the optimal current i(*)(t) has a lower 'energy' than the square-integral of i(t). For constant i(t), the optimal current i(*)(t) need not be constant in time. In fact, i(*)(t) is large (possibly even larger than i(t)) for regions where the phase response curve indicates a stronger sensitivity to the input stimulus current, and smaller in regions of reduced sensitivity.
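
    A minimal sketch of the energy/tracking trade-off described above, using an assumed sinusoidal phase response curve and illustrative constants rather than the models from the paper:

        import numpy as np
        from scipy.optimize import minimize

        # Phase dynamics: dtheta/dt = omega + Z(theta) * i(t); all names
        # and numbers here are illustrative assumptions.
        omega = 2 * np.pi          # intrinsic frequency (rad/s)
        Z = lambda th: np.sin(th)  # assumed phase response curve
        N, dt = 60, 0.01           # coarse time grid for the sketch
        alpha = 0.5                # weight on the 'energy' of i(t)

        def simulate(i):
            # Forward Euler integration of the phase equation.
            th = np.zeros(N + 1)
            for k in range(N):
                th[k + 1] = th[k] + dt * (omega + Z(th[k]) * i[k])
            return th

        r = simulate(np.full(N, 1.0))  # reference phase from constant input

        def cost(i):
            th = simulate(i)
            track = np.sum((th - r) ** 2) * dt   # tracking error
            energy = np.sum(i ** 2) * dt         # square-integral 'energy'
            return alpha * energy + track

        res = minimize(cost, np.full(N, 1.0), method="L-BFGS-B")
        print("energy of i*:", np.sum(res.x ** 2) * dt,
              "vs reference:", np.sum(np.full(N, 1.0) ** 2) * dt)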

  19. Integral Images: Efficient Algorithms for Their Computation and Storage in Resource-Constrained Embedded Vision Systems

    Directory of Open Access Journals (Sweden)

    Shoaib Ehsan

    2015-07-01

    Full Text Available The integral image, an intermediate image representation, has found extensive use in multi-scale local feature detection algorithms, such as Speeded-Up Robust Features (SURF), allowing fast computation of rectangular features at constant speed, independent of filter size. For resource-constrained real-time embedded vision systems, computation and storage of integral image presents several design challenges due to strict timing and hardware limitations. Although calculation of the integral image only consists of simple addition operations, the total number of operations is large owing to the generally large size of image data. Recursive equations allow substantial decrease in the number of operations but require calculation in a serial fashion. This paper presents two new hardware algorithms that are based on the decomposition of these recursive equations, allowing calculation of up to four integral image values in a row-parallel way without significantly increasing the number of operations. An efficient design strategy is also proposed for a parallel integral image computation unit to reduce the size of the required internal memory (nearly 35% for common HD video). Addressing the storage problem of integral image in embedded vision systems, the paper presents two algorithms which allow substantial decrease (at least 44.44%) in the memory requirements. Finally, the paper provides a case study that highlights the utility of the proposed architectures in embedded vision systems.
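
    The recursive equations mentioned above are short enough to sketch directly; the following illustrative Python version computes the integral image serially and uses it for constant-time rectangle sums (a reference implementation, not the row-parallel hardware design of the paper):

        import numpy as np

        def integral_image(img):
            # Standard recursions: s accumulates pixel values along each
            # row; ii then accumulates s down each column.
            h, w = img.shape
            s = np.zeros((h, w))
            ii = np.zeros((h, w))
            for y in range(h):
                for x in range(w):
                    s[y, x] = (s[y, x - 1] if x > 0 else 0) + img[y, x]
                    ii[y, x] = (ii[y - 1, x] if y > 0 else 0) + s[y, x]
            return ii

        def box_sum(ii, top, left, bottom, right):
            # Sum of any rectangle from four lookups, independent of size.
            a = ii[top - 1, left - 1] if top > 0 and left > 0 else 0
            b = ii[top - 1, right] if top > 0 else 0
            c = ii[bottom, left - 1] if left > 0 else 0
            return ii[bottom, right] - b - c + a

        img = np.arange(16.0).reshape(4, 4)
        ii = integral_image(img)
        assert box_sum(ii, 1, 1, 3, 3) == img[1:4, 1:4].sum()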

  20. Integral Images: Efficient Algorithms for Their Computation and Storage in Resource-Constrained Embedded Vision Systems.

    Science.gov (United States)

    Ehsan, Shoaib; Clark, Adrian F; Naveed ur Rehman; McDonald-Maier, Klaus D

    2015-07-10

    The integral image, an intermediate image representation, has found extensive use in multi-scale local feature detection algorithms, such as Speeded-Up Robust Features (SURF), allowing fast computation of rectangular features at constant speed, independent of filter size. For resource-constrained real-time embedded vision systems, computation and storage of integral image presents several design challenges due to strict timing and hardware limitations. Although calculation of the integral image only consists of simple addition operations, the total number of operations is large owing to the generally large size of image data. Recursive equations allow substantial decrease in the number of operations but require calculation in a serial fashion. This paper presents two new hardware algorithms that are based on the decomposition of these recursive equations, allowing calculation of up to four integral image values in a row-parallel way without significantly increasing the number of operations. An efficient design strategy is also proposed for a parallel integral image computation unit to reduce the size of the required internal memory (nearly 35% for common HD video). Addressing the storage problem of integral image in embedded vision systems, the paper presents two algorithms which allow substantial decrease (at least 44.44%) in the memory requirements. Finally, the paper provides a case study that highlights the utility of the proposed architectures in embedded vision systems.

  1. Research of cloud computing resource scheduling model

    Institute of Scientific and Technical Information of China (English)

    刘赛; 李绪蓉; 万麟瑞; 陈韬

    2013-01-01

    In the cloud computing environment, resource scheduling management is one of the key technologies. This paper describes a cloud computing resource scheduling model and explains the relationships between entities in the resource scheduling process of cloud computing environments. According to the resource properties of the physical servers, a scheduling model that comprehensively considers the load on cloud computing resources is established, and manual plus automatic virtual machine migration is used to balance the load of the physical servers in the cloud computing environment. The experimental results show that this resource scheduling model not only achieves resource load balancing but also improves the degree of virtualization and elasticity of the resource pool. Finally, future research directions are discussed.

  2. Counting on COUNTER: The Current State of E-Resource Usage Data in Libraries

    Science.gov (United States)

    Welker, Josh

    2012-01-01

    Any librarian who has managed electronic resources has experienced the--for want of words--"joy" of gathering and analyzing usage statistics. Such statistics are important for evaluating the effectiveness of resources and for making important budgeting decisions. Unfortunately, the data are usually tedious to collect, inconsistently organized, of…

  3. Information Resources Construction of Digital Library Based on Cloud Computing

    Institute of Scientific and Technical Information of China (English)

    欧裕美

    2014-01-01

    This paper introduces the current status of information resources construction in digital libraries, expounds the massive information storage technology of cloud computing, discusses the changes that cloud computing brings to the construction of digital library information resources, and probes into the problems faced by cloud-computing-based information resources construction in digital libraries.

  4. Computer Aided Prediction Model Design of Human Resources Planning

    Institute of Scientific and Technical Information of China (English)

    俞明; 余浩洋

    2013-01-01

    Starting from the current situation of human resource planning, and drawing on the content and steps of the planning process, this paper designs a computer-aided prediction model and explains its basic structure and mathematical model. A worked example of human resources planning is designed, in which total and classified planning are carried out; the planning results are analyzed and solution strategies are proposed.

  5. 17 CFR 270.2a-4 - Definition of “current net asset value” for use in computing periodically the current price of...

    Science.gov (United States)

    2010-04-01

    17 CFR, Commodity and Securities Exchanges (2010-04-01 edition), Rules and Regulations, Investment Company Act of 1940, § 270.2a-4: Definition of “current net asset value” for use in computing periodically the current price of redeemable security.
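
    Rule 2a-4 ties the price of a redeemable security to the fund's current net asset value. As a purely illustrative sketch of the underlying arithmetic (not the regulatory definition, which adds valuation and timing conditions):

        # Illustrative arithmetic only -- not the regulatory definition in
        # 17 CFR 270.2a-4, which imposes conditions on valuation and timing.
        def nav_per_share(total_assets, total_liabilities, shares_outstanding):
            """Current net asset value per share of a redeemable security."""
            return (total_assets - total_liabilities) / shares_outstanding

        print(nav_per_share(125_000_000.0, 5_000_000.0, 10_000_000))  # 12.0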

  6. Impact of currents on surface flux computations and their feedback on dynamics at regional scales

    Science.gov (United States)

    Olita, A.; Iermano, I.; Fazioli, L.; Ribotti, A.; Tedesco, C.; Pessini, F.; Sorgente, R.

    2015-08-01

    A twin numerical experiment was conducted in the seas around the island of Sardinia (Western Mediterranean) to assess the impact, at regional and coastal scales, of the use of relative winds (i.e., taking into account ocean surface currents) in the computation of heat and momentum fluxes through standard (Fairall et al., 2003) bulk formulas. The Regional Ocean Modelling System (ROMS) was implemented at 3 km resolution in order to resolve well the mesoscale processes, which are known to have a large influence on the dynamics of the area. Small changes (a few percentage points) in terms of spatially averaged fluxes correspond to quite large differences in such quantities (about 15 %) in spatial terms and in terms of kinetics (more than 20 %). As a consequence, wind power input P is also reduced by ~ 14 % on average. Quantitative validation with satellite SST suggests that such a modification of the fluxes improves the model solution especially on the western side of the domain, where mesoscale activity (as suggested by eddy kinetic energy) is stronger. Surface currents change in both their stable and fluctuating parts. In particular, the path and intensity of the Algerian Current and of the Western Sardinia Current (WSC) are impacted by the modification of the fluxes. Both total and eddy kinetic energies of the surface current field are reduced in the experiment where fluxes took the surface currents into account. The main dynamical correction is observed in the SW area, where the different location and strength of the eddies influence the path and intensity of the WSC. Our results suggest that, even at local scales and in temperate regions, it would be preferable to take such a contribution into account in flux computations. The modification of the original code, essentially cost-free in terms of numerical computation, improves the model response in terms of surface fluxes (SST validated) and it also likely improves the dynamics as suggested by qualitative comparison with
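
    A minimal sketch of the core idea, computing the momentum flux from the relative wind (wind minus surface current) rather than the absolute wind; the constant air density and drag coefficient below are assumptions, not the Fairall et al. (2003) parameterization used in the study:

        import numpy as np

        rho_air, C_d = 1.22, 1.3e-3   # illustrative constants

        def wind_stress(u_wind, v_wind, u_cur=0.0, v_cur=0.0):
            # Relative wind: subtract the ocean surface current.
            du, dv = u_wind - u_cur, v_wind - v_cur
            speed = np.hypot(du, dv)
            taux = rho_air * C_d * speed * du
            tauy = rho_air * C_d * speed * dv
            return taux, tauy

        # A 1 m/s current aligned with a 10 m/s wind lowers the stress:
        print(wind_stress(10.0, 0.0))            # absolute wind
        print(wind_stress(10.0, 0.0, 1.0, 0.0))  # relative wind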

  7. Application of a Resource Theory for Magic States to Fault-Tolerant Quantum Computing.

    Science.gov (United States)

    Howard, Mark; Campbell, Earl

    2017-03-03

    Motivated by their necessity for most fault-tolerant quantum computation schemes, we formulate a resource theory for magic states. First, we show that robustness of magic is a well-behaved magic monotone that operationally quantifies the classical simulation overhead for a Gottesman-Knill-type scheme using ancillary magic states. Our framework subsequently finds immediate application in the task of synthesizing non-Clifford gates using magic states. When magic states are interspersed with Clifford gates, Pauli measurements, and stabilizer ancillas (the most general synthesis scenario), the class of synthesizable unitaries is hard to characterize. Our techniques can place nontrivial lower bounds on the number of magic states required for implementing a given target unitary. Guided by these results, we have found new and optimal examples of such synthesis.
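
    As a hedged illustration of the monotone itself: robustness of magic is an l1-minimization over stabilizer states, which for a single qubit can be posed as a small linear program (a toy, far simpler than the multi-qubit setting of the paper):

        import numpy as np
        from scipy.optimize import linprog

        # Single-qubit stabilizer states as Bloch vectors.
        S = np.array([[0, 0, 1], [0, 0, -1], [1, 0, 0],
                      [-1, 0, 0], [0, 1, 0], [0, -1, 0]], dtype=float)

        def robustness(bloch):
            # Decompose bloch = sum_i x_i * S_i with sum_i x_i = 1 and
            # minimal sum_i |x_i|; split x = p - n with p, n >= 0.
            k = len(S)
            c = np.ones(2 * k)                         # minimize sum(p + n)
            A_eq = np.vstack([np.hstack([S.T, -S.T]),  # match Bloch components
                              np.hstack([np.ones(k), -np.ones(k)])])
            b_eq = np.append(bloch, 1.0)               # affine combination
            res = linprog(c, A_eq=A_eq, b_eq=b_eq,
                          bounds=[(0, None)] * (2 * k))
            return res.fun

        print(robustness(np.array([0.0, 0.0, 1.0])))  # stabilizer state: 1.0
        print(robustness(np.ones(3) / np.sqrt(3)))    # a magic state: > 1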

  8. Using multiple metaphors and multimodalities as a semiotic resource when teaching year 2 students computational strategies

    Science.gov (United States)

    Mildenhall, Paula; Sherriff, Barbara

    2017-06-01

    Recent research indicates that using multimodal learning experiences can be effective in teaching mathematics. Using a social semiotic lens within a participationist framework, this paper reports on a professional learning collaboration with a primary school teacher designed to explore the use of metaphors and modalities in mathematics instruction. This video case study was conducted in a year 2 classroom over two terms, with the focus on building children's understanding of computational strategies. The findings revealed that the teacher was able to successfully plan both multimodal and multiple metaphor learning experiences that acted as semiotic resources to support the children's understanding of abstract mathematics. The study also led to implications for teaching when using multiple metaphors and multimodalities.

  9. The Current Status of Germplum Database: a Tool for Characterization of Plum Genetic Resources in Romania

    Directory of Open Access Journals (Sweden)

    Monica Harta

    2016-11-01

    Full Text Available In Romania, Prunus genetic resources are kept in collections of varieties, populations and biotypes, mainly located in research and development institutes or fruit growing stations and, in recent years, by some private enterprises. Creating the experimental model for the Germplum database, based on phenotypic descriptors and SSR molecular marker analysis, is an important and topical objective for the efficient characterization of genetic resources and also for establishing a public-private partnership for the effective management of plum germplasm resources in Romania. The technical development of the Germplum database was completed and data will be added continuously after the characterization of each new accession.

  10. Benefit-cost analysis of DOE's Current Federal Program to increase hydrothermal resource utilization. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1981-12-10

    The impact of DOE's Current Federal Program on the commercialization of hydrothermal resources between 1980 and 2000 is analyzed. The hydrothermal resources of the United States and the types of DOE activities used to stimulate the development of these resources for both electric power and direct heat use are described briefly. The No Federal Program and the Current Federal Program are then described in terms of funding levels and the resultant market penetration estimates through 2000. These market penetration estimates are also compared to other geothermal utilization forecasts. The direct benefits of the Current Federal Program are next presented for electric power and direct heat use applications. An analysis of the external impacts associated with the additional hydrothermal resource development resulting from the Current Federal Program is also provided. Included are environmental effects, national security/balance-of-payments improvements, socioeconomic impacts and materials requirements. A summary of the analysis integrating the direct benefits, external impacts and DOE program costs concludes the report.

  11. On-line current feed and computer aided control tactics for automatic balancing head

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    In the designed automatic balancing head, a non-contact induction transformer is used to deliver driving energy, solving the problem of on-line current feeding and control. Computer-controlled automatic balancing experiments with phase-magnitude control tactics were performed on a flexible rotor system. The results of the experiments prove that the energy feeding method and the control tactics are effective in the automatic balancing head for vibration control.

  12. User-Computer Interface Technology: An Assessment of the Current State of the Art

    Science.gov (United States)

    1988-06-01

    a model of the current state of the world. A similar effort is underway at the Computing Research Laboratory at New Mexico State University. One ... process by validating or modifying requirements. Rapid prototyping techniques such as storyboarding offer relatively low-cost approaches to bringing the ... Research Laboratory, New Mexico State University. Bannon, Liam J. (1986). Issues in design: some notes. In D. A. Norman and S. W. Draper (Eds.), User

  13. Efficient Nash Equilibrium Resource Allocation Based on Game Theory Mechanism in Cloud Computing by Using Auction.

    Science.gov (United States)

    Nezarat, Amin; Dastghaibifard, G H

    2015-01-01

    One of the most complex issues in the cloud computing environment is the problem of resource allocation: on the one hand, the cloud provider expects the greatest profitability; on the other hand, users expect to have the best resources at their disposal given their budget and time constraints. In most previous work, heuristic and evolutionary approaches have been used to solve this problem. Nevertheless, since the nature of this environment is economic, using economic methods can decrease response time and reduce the complexity of the problem. In this paper, an auction-based method is proposed which determines the auction winner by applying a game-theoretic mechanism and holding a repeated game with incomplete information in a non-cooperative environment. In this method, users calculate a suitable price bid with their objective function over several rounds and repetitions and send it to the auctioneer, and the auctioneer chooses the winning player based on the suggested utility function. In the proposed method, the end point of the game is the Nash equilibrium point, where players are no longer inclined to alter their bid for the resource and the final bid also satisfies the auctioneer's utility function. To prove the convexity of the response space, the Lagrange method is used; the proposed model is simulated in CloudSim and the results are compared with previous work. It is concluded that this method converges to a response in a shorter time, produces the fewest service level agreement violations and provides the most utility to the provider.
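
    The paper's repeated incomplete-information auction is more elaborate than can be shown here; as a minimal illustration of bids converging to a Nash equilibrium in an auction-style allocation, the toy proportional-share (Kelly) mechanism below lets each user repeatedly best-respond to the others' bids (all valuations are made up):

        import numpy as np

        # One divisible resource: user i bids b_i, receives the fraction
        # b_i / sum(b), and has utility v_i * fraction - b_i. Each round,
        # every user plays the closed-form best response to the others'
        # bids; the fixed point is a Nash equilibrium.
        v = np.array([10.0, 6.0, 4.0])     # private valuations (illustrative)
        b = np.ones_like(v)                # initial bids

        for _ in range(100):               # repeated game
            for i in range(len(v)):
                others = b.sum() - b[i]
                b[i] = max(np.sqrt(v[i] * others) - others, 0.0)

        print("equilibrium bids:", b.round(3))
        print("allocated shares:", (b / b.sum()).round(3))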

  14. Current Status and Development Trend of Cloud Computing

    Institute of Scientific and Technical Information of China (English)

    孙洪建

    2012-01-01

    Cloud computing, as a brand-new conceptual industry, receives increasing attention from governments and enterprises around the world. Society not only pays great attention to its rapid development but also invests substantial human, financial and physical resources in it. Industry experts place high expectations on the revolution in computer application modes brought by cloud computing; some even call it the fourth industrial revolution, following the information technology revolution. This paper briefly summarizes the current state of cloud computing development, reviews the bottlenecks at its present stage of development and analyzes their causes, and looks ahead to future development and application trends.

  15. Invasive alien plants and water resources in South Africa: current understanding, predictive ability and research challenges

    CSIR Research Space (South Africa)

    Gorgens, AHM

    2004-01-01

    Full Text Available Predictions that invasive alien plants would use significant amounts of water were a major factor in the establishment of South Africa's Working for Water programme, which aims to protect water resources by clearing these plants. The predictions...

  16. Decision Making for Natural Resources and Watershed Management: Current Thinking and Approaches

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This document overviews representative research related to natural resources andwatershed management, and gives directions to readers regarding publicly available...

  17. Report: EPA’s Distribution of Superfund Human Resources Does Not Support Current Regional Workload

    Science.gov (United States)

    Report #17-P-0397, September 19, 2017. Due to insufficient human resources to cover all Superfund site work, some regions have had to slow down or discontinue their efforts to protect human health and the environment.

  18. Managing Carbon Regulatory Risk in Utility Resource Planning: Current Practices in the Western United States

    Energy Technology Data Exchange (ETDEWEB)

    Barbose, Galen; Wiser, Ryan; Phadke, Amol; Goldman, Charles

    2008-05-16

    Concerns about global climate change have substantially increased the likelihood that future policy will seek to minimize carbon dioxide emissions. As such, even today, electric utilities are making resource planning and investment decisions that consider the possible implications of these future carbon regulations. In this article, we examine the manner in which utilities assess the financial risks associated with future carbon regulations within their long-term resource plans. We base our analysis on a review of the most recent resource plans filed by fifteen electric utilities in the Western United States. Virtually all of these utilities made some effort to quantitatively evaluate the potential cost of future carbon regulations when analyzing alternate supply- and demand-side resource options for meeting customer load. Even without Federal climate regulation in the U.S., the prospect of that regulation is already having an impact on utility decision-making and resource choices. That said, the methods and assumptions used by utilities to analyze carbon regulatory risk, and the impact of that analysis on their choice of a particular resource strategy, vary considerably, revealing a number of opportunities for analytic improvement. Though our review focuses on a subset of U.S. electric utilities, this work holds implications for all electric utilities and energy policymakers who are seeking to minimize the compliance costs associated with future carbon regulations.

  19. Managing Carbon Regulatory Risk in Utility Resource Planning: Current Practices in the Western United States

    Energy Technology Data Exchange (ETDEWEB)

    Barbose, Galen; Wiser, Ryan; Phadke, Amol; Goldman, Charles

    2008-07-11

    Concerns about global climate change have substantially increased the likelihood that future policy will seek to minimize carbon dioxide emissions. As such, even today, electric utilities are making resource planning and investment decisions that consider the possible implications of these future carbon regulations. In this article, we examine the manner in which utilities assess the financial risks associated with future carbon regulations within their long-term resource plans. We base our analysis on a review of the most recent resource plans filed by fifteen electric utilities in the Western United States. Virtually all of these utilities made some effort to quantitatively evaluate the potential cost of future carbon regulations when analyzing alternate supply- and demand-side resource options for meeting customer load. Even without Federal climate regulation in the U.S., the prospect of that regulation is already having an impact on utility decision-making and resource choices. That said, the methods and assumptions used by utilities to analyze carbon regulatory risk, and the impact of that analysis on their choice of a particular resource strategy, vary considerably, revealing a number of opportunities for analytic improvement. Though our review focuses on a subset of U.S. electric utilities, this work holds implications for all electric utilities and energy policymakers who are seeking to minimize the compliance costs associated with future carbon regulations.

  20. Problems of the provision of financial resources to the agrarian sector under current economic conditions of management

    Directory of Open Access Journals (Sweden)

    Grischuk Nadiya Viktorivna

    2016-12-01

    Full Text Available Research in financial science on the provision of financial resources is far from exhausted and needs further study; it constantly acquires new features and vectors of development that call for examination under present-day conditions. The study of the state of provision of financial resources to the agrarian sector of the economy, with emphasis on its main segment, loan and attracted financial resources, is therefore topical today. The article considers the essence and sources of agricultural enterprises' financial resources, the problems associated with the formation and use of financial resources in the modern world, and the problems arising in improving the process of raising funds for agricultural enterprises. It is revealed that an effective tool for attracting financial resources is the issue of convertible bonds and the introduction of agricultural receipts. It is shown that, in an unstable environment, further development of the system of agrarian relations must be carried out on the basis of government programs and legal regulation that take into account not only the existing state of affairs in the market of agroindustrial products but also the economic provision of enterprises in the national agrarian sector.

  1. Functional requirements of computer systems for the U.S. Geological Survey, Water Resources Division, 1988-97

    Science.gov (United States)

    Hathaway, R.M.; McNellis, J.M.

    1989-01-01

    Investigating the occurrence, quantity, quality, distribution, and movement of the Nation's water resources is the principal mission of the U.S. Geological Survey's Water Resources Division. Reports of these investigations are published and available to the public. To accomplish this mission, the Division requires substantial computer technology to process, store, and analyze data from more than 57,000 hydrologic sites. The Division's computer resources are organized through the Distributed Information System Program Office that manages the nationwide network of computers. The contract that provides the major computer components for the Water Resources Division's Distributed Information System expires in 1991. Five work groups were organized to collect the information needed to procure a new generation of computer systems for the U.S. Geological Survey, Water Resources Division. Each group was assigned a major Division activity and asked to describe its functional requirements of computer systems for the next decade. The work groups and major activities are: (1) hydrologic information; (2) hydrologic applications; (3) geographic information systems; (4) reports and electronic publishing; and (5) administrative. The work groups identified 42 functions and described their functional requirements for 1988, 1992, and 1997. A few new functions, such as Decision Support Systems and Executive Information Systems, were identified, but most are the same as performed today. Although the number of functions will remain about the same, steady growth in the size, complexity, and frequency of many functions is predicted for the next decade. No compensating increase in the Division's staff is anticipated during this period. To handle the increased workload and perform these functions, new approaches will be developed that use advanced computer technology. The advanced technology is required in a unified, tightly coupled system that will support all functions simultaneously.

  2. IMPROVING RESOURCE UTILIZATION USING QoS BASED LOAD BALANCING ALGORITHM FOR MULTIPLE WORKFLOWS IN IAAS CLOUD COMPUTING ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    L. Shakkeera

    2013-06-01

    Full Text Available Cloud computing is the extension of parallel computing, distributed computing and grid computing. It provides secure, quick, convenient data storage and net computing services through the internet. The services are available to users in a pay-per-use, on-demand model. The main aim of using resources from the cloud is to reduce cost and to increase performance in terms of request response time. Thus, optimizing resource usage through an efficient load-balancing strategy is crucial. The main aim of this paper is to develop and implement an optimized load-balancing algorithm in an IaaS virtual cloud environment that utilizes the virtual cloud resources efficiently. It minimizes the cost of the applications by effectively using cloud resources and identifies the virtual cloud resources that are suitable for the applications. The web application is created with many modules. These modules are treated as tasks, and these tasks are submitted to the load-balancing server. The server, which hosts our load-balancing policies, redirects the tasks to the corresponding virtual machines created by the KVM virtual machine manager, as per the load-balancing algorithm. If the size of the database inside a machine exceeds its limit, the load-balancing algorithm uses the other virtual machines for further incoming requests. The load-balancing strategy is evaluated for various QoS performance metrics, such as cost, average execution time, throughput, CPU usage, disk space, memory usage, network transmission and reception rate, resource utilization rate and scheduling success rate, across a range of numbers of virtual machines, and it improves the scalability among resources.
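
    A minimal sketch of QoS-weighted dispatch in the spirit described above (metric names, weights, and numbers are illustrative assumptions, not the paper's algorithm):

        # Pick the VM with the best weighted score across a few QoS metrics,
        # then account for the load added by the dispatched task.
        vms = {
            "vm1": {"load": 0.70, "cost": 0.10, "mem_free": 0.4},
            "vm2": {"load": 0.30, "cost": 0.15, "mem_free": 0.7},
            "vm3": {"load": 0.55, "cost": 0.05, "mem_free": 0.2},
        }
        # Lower score is better; the negative weight rewards free memory.
        weights = {"load": 0.5, "cost": 0.3, "mem_free": -0.2}

        def pick_vm(vms, weights):
            score = lambda m: sum(weights[k] * m[k] for k in weights)
            return min(vms, key=lambda name: score(vms[name]))

        for task in ["t1", "t2", "t3"]:
            target = pick_vm(vms, weights)
            vms[target]["load"] += 0.05
            vms[target]["mem_free"] -= 0.05
            print(task, "->", target)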

  3. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  4. Impact of currents on surface fluxes computation and their feedback on coastal dynamics

    Directory of Open Access Journals (Sweden)

    A. Olita

    2015-01-01

    Full Text Available A twin numerical experiment was conducted in the seas of Sardinia (Western Mediterranean) to assess the impact, at coastal scales, of the use of relative winds (i.e. taking into account ocean surface currents) in the computation of heat and momentum fluxes through bulk formulas. The model, the Regional Ocean Modeling System (ROMS), was implemented at 2 km resolution in order to resolve (sub-)mesoscale dynamics well. Small changes (1–2%) in terms of spatially averaged fluxes correspond to quite large spatial differences in such quantities (up to 15–20%) and to comparably significant differences in terms of mean velocities of the surface currents. The wind power input P of the wind stress to the ocean surface is also reduced by 15%, especially where surface currents are stronger. Quantitative validation with satellite SST suggests that such a modification of the fluxes improves the model solution especially in areas of cyclonic circulation, where the heat flux correction predominates over the dynamical correction. Surface currents change above all in their fluctuating part, while the stable part of the flow changes mainly in magnitude and less in its path. Both total and eddy kinetic energies of the surface current field are reduced in the experiment where fluxes take surface currents into account. Dynamically, the largest correction is observed in the SW area, where anticyclonic eddies approach the continental slope. This reduction also impacts the vertical dynamics, and specifically the local upwelling, which is diminished both in spatial extent and in magnitude. The simulations suggest that, even at local scales and in temperate regions, it is preferable to take such a component into account in flux computations. The results also confirm the tight relationship between local coastal upwelling and eddy-slope interactions in the area.

  5. A Critical Assessment of the Resource Depletion Potential of Current and Future Lithium-Ion Batteries

    Directory of Open Access Journals (Sweden)

    Jens F. Peters

    2016-12-01

    Full Text Available Resource depletion aspects are repeatedly used as an argument for a shift towards new battery technologies. However, whether serious shortages due to the increased demand for traction and stationary batteries can actually be expected is subject to an ongoing discussion. In order to identify the principal drivers of resource depletion for battery production, we assess different lithium-ion battery types and a new lithium-free battery technology (sodium-ion) under this aspect, applying different assessment methodologies. The findings show that very different results are obtained with existing impact assessment methodologies, which hinders clear interpretation. While cobalt, nickel and copper can generally be considered as critical metals, the magnitude of their depletion impacts in comparison with that of other battery materials like lithium, aluminum or manganese differs substantially. A high importance is also found for indirect resource depletion effects caused by the co-extraction of metals from mixed ores. Remarkably, the resource depletion potential per kg of produced battery is driven only partially by the electrode materials and thus depends comparably little on the battery chemistry itself. One of the key drivers for resource depletion seems to be the metals (and co-products) in electronic parts required for the battery management system, a component rather independent from the actual battery chemistry. However, when assessing the batteries on a capacity basis (per kWh of storage capacity), a high energy density also turns out to be relevant, since it reduces the mass of battery required for providing one kWh, and thus the associated resource depletion impacts.

  6. Cross stratum resources protection in fog-computing-based radio over fiber networks for 5G services

    Science.gov (United States)

    Guo, Shaoyong; Shao, Sujie; Wang, Yao; Yang, Hui

    2017-09-01

    In order to meet the requirements of the internet of things (IoT) and 5G, the cloud radio access network is a paradigm that converges all base station computational resources into a cloud baseband unit (BBU) pool, while the distributed radio frequency signals are collected by remote radio heads (RRH). A precondition for centralized processing in the BBU pool is an interconnection fronthaul network with high capacity and low delay. However, the interaction between RRH and BBU, and resource scheduling among BBUs in the cloud, have become more complex and frequent. A cloud radio over fiber network has already been proposed in our previous work. In order to overcome the complexity and latency, in this paper we first present a novel cross stratum resources protection (CSRP) architecture in fog-computing-based radio over fiber networks (F-RoFN) for 5G services. Additionally, a cross stratum protection (CSP) scheme considering network survivability is introduced in the proposed architecture. The CSRP with the CSP scheme can effectively pull remote processing resources locally to implement cooperative radio resource management, enhance the responsiveness and resilience to dynamic end-to-end 5G service demands, and globally optimize optical network, wireless and fog resources. The feasibility and efficiency of the proposed architecture with the CSP scheme are verified on our software defined networking testbed in terms of service latency, transmission success rate, resource occupation rate and blocking probability.

  7. Building an application for computing the resource requests such as disk, CPU, and tape and studying the time evolution of computing model

    CERN Document Server

    Noormandipour, Mohammad Reza

    2017-01-01

    The goal of this project was to build an application to calculate the computing resources needed by the LHCb experiment for data processing and analysis, and to predict their evolution in future years. The source code was developed in the Python programming language and the application was built and developed in CERN GitLab. This application will facilitate the calculation of the resources required by LHCb in both qualitative and quantitative terms. The granularity of the computations is improved to a weekly basis, in contrast with the yearly basis used so far. The LHCb computing model will benefit from the new possibilities and options added, as the new predictions and calculations are aimed at giving more realistic and accurate estimates.

  8. Resource Scheduling Strategy Considering SLA and QoS under Cloud Computing

    Institute of Scientific and Technical Information of China (English)

    朱倩

    2016-01-01

    Since cloud computing does not require users to pay attention to the underlying system implementation, it can be regarded as a currently popular service-based form of distributed computing. Efficient resource allocation can reduce the excessive waste of resources and increase user satisfaction by reducing cost, thereby improving system performance. This paper uses virtualization technology in a cloud computing platform to achieve accurate prediction of system performance requirements and, from a service point of view, discusses a resource scheduling strategy based on Virtual Machines (VM) that takes the Service Level Agreement (SLA) and Quality of Service (QoS) into account in a cloud environment. Simulation results show that the scheduling strategy is an effective means of improving the utilization of system resources and has practical value.

  9. Water Resources and Agricultural Water Use in the North China Plain: Current Status and Management Options

    Science.gov (United States)

    Serious water deficits with deteriorating environmental quality are threatening agricultural sustainability in the North China Plain (NCP). This paper addresses spatial and temporal availability of water resources in the NCP, and identifies the effects of soil management, irrigation and crop genetic...

  10. Evolutionary algorithms and other metaheuristics in water resources: Current status, research challenges and future directions

    NARCIS (Netherlands)

    Maier, H.R.; Kapelan, Z.; Kasprzyk, J.; Kollat, J.; Matott, L.S.; Cunha, M.C.; Dandy, G.C.; Gibbs, M.S.; Keedwell, E.; Marchi, A.; Ostfeld, A.; Savic, D.; Solomatine, D.P.; Vrugt, J.A.; Zecchin, A.C.; Minsker, B.S.; Barbour, E.J.; Kuczera, G.; Pasha, F.; Castelletti, A.; Giuliani, M.; Reed, P.M.

    2014-01-01

    The development and application of evolutionary algorithms (EAs) and other metaheuristics for the optimisation of water resources systems has been an active research field for over two decades. Research to date has emphasized algorithmic improvements and individual applications in specific areas (e.

  11. "Amazing Space": Creating Educational Resources from Current Scientific Research Results from the Hubble Space Telescope.

    Science.gov (United States)

    Christian, C. A.; Eisenhamer, B.; Eisenhamer, Jonathan; Teays, Terry

    2001-01-01

    Introduces the Amazing Space program which is designed to enhance student mathematics, science, and technology skills using recent data and results from the National Aeronautics and Space Administration's (NASA) Hubble Space Telescope mission. Explains the process of designing multi-media resources in a five-week summer workshop that partners…

  12. Evolutionary algorithms and other metaheuristics in water resources: Current status, research challenges and future directions

    NARCIS (Netherlands)

    Maier, H.R.; Kapelan, Z.; Kasprzyk, J.; Kollat, J.; Matott, L.S.; Cunha, M.C.; Dandy, G.C.; Gibbs, M.S.; Keedwell, E.; Marchi, A.; Ostfeld, A.; Savic, D.; Solomatine, D.P.; Vrugt, J.A.; Zecchin, A.C.; Minsker, B.S.; Barbour, E.J.; Kuczera, G.; Pasha, F.; Castelletti, A.; Giuliani, M.; Reed, P.M.

    2014-01-01

    The development and application of evolutionary algorithms (EAs) and other metaheuristics for the optimisation of water resources systems has been an active research field for over two decades. Research to date has emphasized algorithmic improvements and individual applications in specific areas

  13. Social media as an open-learning resource in medical education: current perspectives.

    Science.gov (United States)

    Sutherland, S; Jalali, A

    2017-01-01

    Numerous studies evaluate the use of social media as an open-learning resource in education, but there is little published empirical evidence that such open-learning resources produce educative outcomes, particularly with regard to student performance. This study undertook a systematic review of the published literature in medical education to determine the state of the evidence regarding empirical studies that conduct an evaluation or research on social media and open-learning resources. The authors searched MEDLINE, ERIC, Embase, PubMed, Scopus, and Google Scholar from 2012 to 2017. This search used keywords related to social media, medical education, research, and evaluation, and was restricted to peer-reviewed, English-language articles only. To meet the inclusion criteria, manuscripts had to employ evaluative methods and undertake empirical research. Empirical work designed to evaluate the impact of social media as an open-learning resource in medical education is limited, as only 13 studies met the inclusion criteria. The majority of these studies used undergraduate medical education as the backdrop to investigate open-learning resources, such as Facebook, Twitter, and YouTube. YouTube appears to have little educational value due to the unsupervised nature of content added on a daily basis. Overall, extant reviews have demonstrated that we know a considerable amount about social media use, although to date its impacts remain unclear. There is a paucity of outcome-based, empirical studies assessing the impact of social media in medical education. The few empirical studies identified tend to focus on evaluating the affective outcomes of social media and medical education as opposed to understanding any linkages between social media and performance outcomes. Given the potential for social media use in medical education, more empirical evaluative studies are required to determine its educational value.

  14. Cost and Performance-Based Resource Selection Scheme for Asynchronous Replicated System in Utility-Based Computing Environment

    Directory of Open Access Journals (Sweden)

    Wan Nor Shuhadah Wan Nik

    2017-04-01

    Full Text Available A resource selection problem for asynchronous replicated systems in a utility-based computing environment is addressed in this paper. The need for special attention to this problem lies in the fact that most existing replication schemes in this computing system either implicitly support synchronous replication and/or only consider read-only jobs. The problem is undoubtedly complex to solve, as two main issues must be addressed simultaneously: (1) the difficulty of predicting the performance of the resources in terms of job response time, and (2) the need for an efficient mechanism to measure the trade-off between performance and the monetary cost incurred on resources, so that minimum cost is preserved while providing low job response time. Therefore, a simple yet efficient algorithm that deals with the complexity of the resource selection problem in utility-based computing systems is proposed in this paper. The problem is formulated as a Multi Criteria Decision Making (MCDM) problem. The advantages of the algorithm are twofold. On the one hand, it hides the complexity of the resource selection process without neglecting important components that affect job response time; the difficulty of estimating job response time is captured by representing it in terms of different QoS criteria levels at each resource. On the other hand, this representation further relaxes the complexity of measuring the trade-offs between performance and the monetary cost incurred on resources. The experiments proved that our proposed resource selection scheme achieves an appealing result, with good system performance and low monetary cost compared to existing algorithms.
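
    A minimal sketch of a weighted-sum MCDM selection of the kind described, trading QoS levels off against monetary cost after normalization (all criteria, weights, and figures are illustrative, not the paper's scheme):

        import numpy as np

        # Columns: expected response time (s), reliability, cost ($/hour).
        resources = np.array([[12.0, 0.95, 0.40],
                              [ 8.0, 0.90, 0.70],
                              [20.0, 0.99, 0.25]])
        benefit = np.array([False, True, False])  # higher-is-better flags
        weights = np.array([0.5, 0.2, 0.3])

        # Min-max normalization so criteria on different scales compare.
        lo, hi = resources.min(axis=0), resources.max(axis=0)
        norm = (resources - lo) / (hi - lo)
        norm[:, ~benefit] = 1.0 - norm[:, ~benefit]  # flip cost-type criteria

        scores = norm @ weights
        print("scores:", scores.round(3), "-> pick resource", scores.argmax())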

  15. Analysis and Application of Cloud Computing Integration of Heterogeneous Resources

    Institute of Scientific and Technical Information of China (English)

    吴金龙

    2012-01-01

    Cloud computing is an Internet-based computing model with broad public participation. In view of the hardware and software status of the Information Disaster Recovery Shanghai Center of the State Grid Corporation, as well as the practical problems faced in the disaster recovery business, this paper proposes an application scheme that takes architecture as the technical means and specifications as the management principle to comprehensively address the integration of heterogeneous resources. On the basis of introducing the construction of a cloud computing heterogeneous resource integration layer, it describes the key issues of the resource model, the resource access specification and the interface to the operation and maintenance management system, and then designs and builds a cloud computing resource management platform that now comprehensively manages the minicomputers, servers and storage devices currently owned, achieving significant economic and management benefits. Practice shows that optimizing and integrating hardware and software resources is only one part of information integration; hardware and software integration on its own is of limited value, and only when it is organically combined with application software integration and dynamic resource deployment, each supporting the other, can maximum benefits be obtained.

  16. Advanced computer techniques for inverse modeling of electric current in cardiac tissue

    Energy Technology Data Exchange (ETDEWEB)

    Hutchinson, S.A.; Romero, L.A.; Diegert, C.F.

    1996-08-01

    For many years, ECGs and vector cardiograms have been the tools of choice for non-invasive diagnosis of cardiac conduction problems, such as found in reentrant tachycardia or Wolff-Parkinson-White (WPW) syndrome. Through skillful analysis of these skin-surface measurements of cardiac generated electric currents, a physician can deduce the general location of heart conduction irregularities. Using a combination of high-fidelity geometry modeling, advanced mathematical algorithms and massively parallel computing, Sandia's approach would provide much more accurate information and thus allow the physician to pinpoint the source of an arrhythmia or abnormal conduction pathway.

  17. Validating Vegetable Production Unit (VPU) Plants, Protocols, Procedures and Requirements (P3R) using Currently Existing Flight Resources

    Science.gov (United States)

    Bingham, Gail; Bates, Scott; Bugbee, Bruce; Garland, Jay; Podolski, Igor; Levinskikh, Rita; Sychev, Vladimir; Gushin, Vadim

    2009-01-01

    Validating Vegetable Production Unit (VPU) Plants, Protocols, Procedures and Requirements (P3R) Using Currently Existing Flight Resources (Lada-VPU-P3R) is a study to advance the technology required for plant growth in microgravity and to research related food safety issues. Lada-VPU-P3R also investigates the non-nutritional value to the flight crew of developing plants on-orbit. The Lada-VPU-P3R uses the Lada hardware on the ISS and falls under a cooperative agreement between National Aeronautics and Space Administration (NASA) and the Russian Federal Space Association (FSA). Research Summary: Validating Vegetable Production Unit (VPU) Plants, Protocols, Procedures and Requirements (P3R) Using Currently Existing Flight Resources (Lada-VPU-P3R) will optimize hardware and

  18. Computer-aided diagnosis in radiological imaging: current status and future challenges

    Science.gov (United States)

    Doi, Kunio

    2009-10-01

    Computer-aided diagnosis (CAD) has become one of the major research subjects in medical imaging and diagnostic radiology. Many different types of CAD schemes are being developed for detection and/or characterization of various lesions in medical imaging, including conventional projection radiography, CT, MRI, and ultrasound imaging. Commercial systems for detection of breast lesions on mammograms have been developed and have received FDA approval for clinical use. CAD may be defined as a diagnosis made by a physician who takes into account the computer output as a "second opinion". The purpose of CAD is to improve the quality and productivity of physicians in their interpretation of radiologic images. The quality of their work can be improved in terms of the accuracy and consistency of their radiologic diagnoses. In addition, the productivity of radiologists is expected to be improved by a reduction in the time required for their image readings. The computer output is derived from quantitative analysis of radiologic images by use of various methods and techniques in computer vision, artificial intelligence, and artificial neural networks (ANNs). The computer output may indicate a number of important parameters, for example, the locations of potential lesions such as lung cancer and breast cancer, the likelihood of malignancy of detected lesions, and the likelihood of various diseases based on differential diagnosis in a given image and clinical parameters. In this review article, the basic concept of CAD is first defined, and the current status of CAD research is then described. In addition, the potential of CAD in the future is discussed and predicted.

  19. How to Make the Best Use of Limited Computer Resources in French Primary Schools.

    Science.gov (United States)

    Parmentier, Christophe

    1988-01-01

    Discusses computer science developments in French primary schools and describes strategies for using computers in the classroom most efficiently. Highlights include the use of computer networks; software; artificial intelligence and expert systems; computer-assisted learning (CAL) and intelligent CAL; computer peripherals; simulation; and teaching…

  20. Current Status and Future Perspective of Nuclear Energy Human Resource Development

    Science.gov (United States)

    Yamamoto, Shinji

    In recent years, expectations for nuclear energy have been increasing in Japan because of its role and responsibility as a key power source, the contribution it can make to a global nuclear renaissance, the need for energy security, and the importance of combating global warming. Ensuring and fostering good human resources is essential if the nuclear industry is to maintain itself and expand its scale. There are obstacles, however, to doing so: a declining birth rate, job-hunting problems, the wave of retirements in 2007, the declining popularity of engineering departments and particularly nuclear-related subjects, a weakening of nuclear education, and deteriorating research facilities and equipment. While nuclear-related academic, industrial and governmental parties share this recognition and are cooperating and collaborating, all organizations are likewise expected to continue their own wholehearted efforts at human resource development.

  1. Computer programs for the acquisition and analysis of eddy-current array probe data

    Energy Technology Data Exchange (ETDEWEB)

    Pate, J.R.; Dodd, C.V.

    1996-07-01

    The objective of the Improved Eddy-Current ISI (in-service inspection) for Steam Generator Tubing program is to upgrade and validate eddy-current inspections, including probes, instrumentation, and data processing techniques for ISI of new, used, and repaired steam generator tubes; to improve defect detection, classification and characterization as affected by diameter and thickness variations, denting, probe wobble, tube sheet, tube supports, copper and sludge deposits, even when defect types and other variables occur in combination; and to transfer this advanced technology to NRC's mobile NDE laboratory and staff. This report documents computer programs that were developed for acquisition of eddy-current data from specially designed 16-coil array probes. Complete code as well as instructions for use are provided.

  2. Nanocrystalline material in toroidal cores for current transformer: analytical study and computational simulations

    Directory of Open Access Journals (Sweden)

    Benedito Antonio Luciano

    2005-12-01

    Full Text Available Based on electrical and magnetic properties such as saturation magnetization, initial permeability, and coercivity, this work presents some considerations about possible applications of nanocrystalline alloys in toroidal cores for current transformers. It discusses how the magnetic characteristics of the core material affect the performance of the current transformer. From the magnetic characterization and from computational simulations using the finite element method (FEM), it has been verified that, at the typical CT operating value of flux density, the properties of nanocrystalline alloys reinforce the hypothesis that using these materials in measurement CT cores can reduce the ratio and phase errors and improve the accuracy class.
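
    The mechanism described here can be illustrated with a toy phasor model: in a single-branch CT equivalent circuit, the magnetizing current drawn by the core subtracts from the referred primary current, producing the ratio and phase errors the paper analyzes. The sketch below is a hedged illustration; the equivalent circuit is simplified and every numeric value is an assumption, not data from the paper.

```python
import cmath
import math

# Hypothetical single-branch CT equivalent circuit (all values assumed for
# illustration): the magnetizing branch diverts part of the referred
# primary current away from the burden.
N = 100.0                     # turns ratio (1:N)
I1 = 100.0 + 0.0j             # primary current phasor, A
Zb = 0.5 + 0.1j               # secondary burden impedance, ohm
Zm = 800.0 + 2500.0j          # magnetizing impedance, ohm (grows with
                              # core permeability, e.g. nanocrystalline)

# Current divider: the secondary current is the referred primary current
# times the fraction that does not flow through the magnetizing branch.
I2 = (I1 / N) * Zm / (Zm + Zb)

ratio_error_pct = (N * abs(I2) - abs(I1)) / abs(I1) * 100.0
phase_error_min = math.degrees(cmath.phase(I2) - cmath.phase(I1)) * 60.0

print(f"ratio error: {ratio_error_pct:+.3f} %")
print(f"phase error: {phase_error_min:+.2f} arcmin")
```

    Raising the magnetizing impedance, which is what a high-permeability nanocrystalline core does, drives both errors toward zero; this is the effect the paper verifies with FEM simulations.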

  3. An improved current potential method for fast computation of stellarator coil shapes

    CERN Document Server

    Landreman, Matt

    2016-01-01

    Several fast methods for computing stellarator coil shapes are compared, including the classical NESCOIL procedure [Merkel, Nucl. Fusion 27, 867 (1987)], its generalization using truncated singular value decomposition, and a Tikhonov regularization approach we call REGCOIL in which the squared current density is included in the objective function. Considering W7-X and NCSX geometries, and for any desired level of regularization, we find the REGCOIL approach simultaneously achieves lower surface-averaged and maximum values of both current density (on the coil winding surface) and normal magnetic field (on the desired plasma surface). This approach therefore can simultaneously improve the free-boundary reconstruction of the target plasma shape while substantially increasing the minimum distances between coils, preventing collisions between coils while improving access for ports and maintenance. The REGCOIL method also allows finer control over the level of regularization, and it eliminates two pathologies of NESCOIL.
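
    The regularization idea behind REGCOIL can be sketched in a few lines: instead of truncating an SVD, add a penalty proportional to the squared solution norm (standing in for the squared current density) to the least-squares objective. The sketch below is generic Tikhonov regularization on random matrices, not the actual NESCOIL/REGCOIL surface-current discretization.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((200, 50))   # stand-in for the normal-field operator
b = rng.standard_normal(200)         # stand-in for the target normal field
lam = 1e-2                           # regularization weight on current density

# Minimize ||A x - b||^2 + lam ||x||^2 via the regularized normal equations:
# (A^T A + lam I) x = A^T b
x = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ b)

field_error = np.linalg.norm(A @ x - b)   # residual normal field
current_norm = np.linalg.norm(x)          # proxy for current density
print(f"field residual {field_error:.3f}, solution norm {current_norm:.3f}")
```

    Sweeping lam continuously trades residual field against current density, which is the "finer control over the level of regularization" noted in the abstract.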

  4. An Improved Electron Pre-Sheath Model for TSS-1R Current Enhancement Computations

    Directory of Open Access Journals (Sweden)

    Chunpei Cai

    2017-03-01

    Full Text Available This report presents improved investigations of the Tethered Satellite System (TSS-1R) electron current enhancement due to magnetically limited collection. New analytical expressions are obtained for the potential and temperature changes across the pre-sheath. The mathematical treatments in this work are more rigorous than one past approach. Additional experimental measurements collected in the ionosphere during the TSS-1R mission are adopted for validation. The relations developed in this work bound these data points with two curves quite successfully; the average of these two curves is close to the curve fitted to the measurements, and an average enhancement 2.95 times larger than the Parker-Murphy theory is revealed. The results indicate that including the pre-sheath analysis is important for computing the electron current enhancement due to magnetic limitations.

  5. A Practitioner Model of the Use of Computer-Based Tools and Resources to Support Mathematics Teaching and Learning.

    Science.gov (United States)

    Ruthven, Kenneth; Hennessy, Sara

    2002-01-01

    Analyzes the pedagogical ideas underpinning teachers' accounts of the successful use of computer-based tools and resources to support the teaching and learning of mathematics. Organizes central themes to form a pedagogical model capable of informing the use of such technologies in classroom teaching and generating theoretical conjectures for…

  6. ANALYSIS OF THE CURRENT STATE OF MANAGEMENT OF FINANCIAL RESOURCES VIA THE TREASURY SYSTEM

    OpenAIRE

    Курганська, Е. І.

    2016-01-01

    The article analyzes modern financial management through the category of the treasury. Attention is paid to the specifics of treasury control and to the complexity of its structure and financial management system. The basic problem in the management of budgetary resources is the functioning of the treasury system, especially in a political crisis. It was revealed that the economic crisis significantly influences budgetary institutions that have accumulated budgetary funds. The treasury system is considered as...

  7. Resource-sparing and cost-effective strategies in current management of breast cancer

    Directory of Open Access Journals (Sweden)

    Munshi Anusheel

    2009-01-01

    Full Text Available Breast cancer is the leading cause of cancer death in women throughout the world. There have been significant advances in the practice of breast oncology over the past few years. However, most of these advances carry an associated price tag or are resource intensive. The present article discusses means to achieve cost-effectiveness in the treatment of breast cancer while retaining the benefits of modern anticancer approaches.

  8. Forest Resources and Timber Production of Ghana : Current Instruments for Sustainable Development

    OpenAIRE

    Mark Aferdi, Dadebo; Takeo, Shinohara; United Graduate School of Agricultural Sciences, Kagoshima Universlty; Faculty of Agriculture, University of the Ryukyus

    1999-01-01

    The decline of the natural tropical high forest has reached a critical stage in Ghana's forestry history. Timber resources are overexploited, degraded and further production prospects are questionable and of concern to forest management. The objective of this paper is to discuss some of the institutional measures and development instruments being taken in Ghana towards the feasibility of achieving sustainable management of the high forest for timber and other commodity products, as well as co...

  9. Data Resources for the Computer-Guided Discovery of Bioactive Natural Products.

    Science.gov (United States)

    Chen, Ya; de Bruyn Kops, Christina; Kirchmair, Johannes

    2017-08-30

    Natural products from plants, animals, marine life, fungi, bacteria, and other organisms are an important resource for modern drug discovery. Their biological relevance and structural diversity make natural products good starting points for drug design. Natural product-based drug discovery can benefit greatly from computational approaches, which are a valuable precursor or supplementary method to in vitro testing. We present an overview of 25 virtual and 31 physical natural product libraries that are useful for applications in cheminformatics, in particular virtual screening. The overview includes detailed information about each library, the extent of its structural information, and the overlap between different sources of natural products. In terms of chemical structures, there is a large overlap between freely available and commercial virtual natural product libraries. Of particular interest for drug discovery is that at least ten percent of known natural products are readily purchasable and many more natural products and derivatives are available through on-demand sourcing, extraction and synthesis services. Many of the readily purchasable natural products are of small size and hence of relevance to fragment-based drug discovery. There are also an increasing number of macrocyclic natural products and derivatives becoming available for screening.

  10. Disposal of waste computer hard disk drive: data destruction and resources recycling.

    Science.gov (United States)

    Yan, Guoqing; Xue, Mianqiang; Xu, Zhenming

    2013-06-01

    An increasing quantity of discarded computers is accompanied by a sharp increase in the number of hard disk drives to be eliminated. A waste hard disk drive is a special form of waste electrical and electronic equipment because it holds large amounts of information that is closely connected with its user. Therefore, the treatment of waste hard disk drives is an urgent issue in terms of data security, environmental protection and sustainable development. In the present study the degaussing method was adopted to destroy the residual data on the waste hard disk drives, and the housing of the disks was used as an example to explore the coating removal process, which is the most important pretreatment for aluminium alloy recycling. The key operating points determined for degaussing were: (1) keep the platter parallel with the magnetic field direction; and (2) increasing the magnetic field intensity B and the action time t significantly improves the degaussing effect. The coating removal experiment indicated that heating the waste hard disk drive housing at a temperature of 400 °C for 24 min was the optimum condition. A novel integrated technique for the treatment of waste hard disk drives is proposed herein. This technique offers the possibility of destroying residual data, recycling the recovered resources and disposing of the disks in an environmentally friendly manner.

  11. Discharge Fee Policy Analysis: A Computable General Equilibrium (CGE) Model of Water Resources and Water Environments

    Directory of Open Access Journals (Sweden)

    Guohua Fang

    2016-09-01

    Full Text Available To alleviate increasingly serious water pollution and shortages in developing countries, various kinds of policies have been implemented by local governments. It is vital to quantify and evaluate the performance and potential economic impacts of these policies. This study develops a Computable General Equilibrium (CGE) model to simulate the regional economic and environmental effects of discharge fees. Firstly, water resources and water environment factors are separated from the input and output sources of the National Economic Production Department. Secondly, an extended Social Accounting Matrix (SAM) of Jiangsu province is developed to simulate various scenarios. By changing the values of the discharge fees (increased by 50%, 100% and 150%), three scenarios are simulated to examine their influence on the overall economy and on each industry. The simulation results show that an increased fee will have a negative impact on Gross Domestic Product (GDP). However, waste water may be effectively controlled. Also, this study demonstrates that along with the economic costs, the increase of the discharge fee will lead to the upgrading of industrial structures from a situation of heavy pollution to one of light pollution, which is beneficial to the sustainable development of the economy and the protection of the environment.

  12. A new compensation current real-time computing method for power active filter based on double linear construction algorithm

    Institute of Scientific and Technical Information of China (English)

    LI; Zicheng; SUN; Yukun

    2006-01-01

    Starting from the detection principle that "when the load current is periodic, the integral over one cycle of the absolute value of the load current minus the fundamental active current is minimal", harmonic-current real-time detection methods for power active filters are proposed based on direct computation, a simple iterative algorithm and an optimal iterative algorithm. With the direct computation method, the amplitude of the fundamental active current can be accurately calculated when the load current is in a stable state. The simple and optimal iterative algorithms provide a way of judging the state of the load current. Building on the direct computation method, the two iterative algorithms, and precise definitions of basic concepts such as the true amplitude of the fundamental active current when the load current is in a varying state, a double linear construction idea is proposed: the amplitude of the fundamental active current at the sampling moment is accurately calculated using the first linear construction, and the conditions for processing the next sample are created using the second linear construction. On this basis, a harmonic-current real-time detection method for power active filters is proposed based on the double linear construction algorithm. The method features a small computational load, good real-time performance, and accurate calculation of the amplitude of the fundamental active current.
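
    The quoted detection principle can be checked numerically: for a periodic load current, the amplitude that minimizes the cycle integral of |i_load(t) - Ip*sin(wt)| coincides with the Fourier projection onto the fundamental. The sketch below is only such a numerical check on a synthetic waveform; it is not the paper's double linear construction algorithm.

```python
import numpy as np

f = 50.0                                  # fundamental frequency, Hz
t = np.linspace(0.0, 1.0 / f, 4000, endpoint=False)
w = 2.0 * np.pi * f

# Synthetic periodic load current: 10 A fundamental active component
# plus a 3rd-harmonic distortion (values are illustrative).
i_load = 10.0 * np.sin(w * t) + 3.0 * np.sin(3.0 * w * t)

# Detection principle: the amplitude Ip minimizing the cycle integral of
# |i_load - Ip*sin(wt)| should match the fundamental active amplitude.
candidates = np.linspace(0.0, 20.0, 2001)
cost = [np.mean(np.abs(i_load - Ip * np.sin(w * t))) for Ip in candidates]
i_min_integral = candidates[int(np.argmin(cost))]

# Cross-check with the direct Fourier projection onto sin(wt).
i_fourier = 2.0 * np.mean(i_load * np.sin(w * t))

print(i_min_integral, i_fourier)   # both close to 10.0
```

    The brute-force scan and the Fourier projection agree on a stable cycle; the paper's contribution is obtaining this amplitude in real time even while the load current is varying.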

  13. Social media as an open-learning resource in medical education: current perspectives

    Directory of Open Access Journals (Sweden)

    Sutherland S

    2017-06-01

    Full Text Available S Sutherland,1 A Jalali2 1Department of Critical Care, The Ottawa Hospital, ²Division of Clinical and Functional Anatomy, Faculty of Medicine, University of Ottawa, Ottawa, ON, Canada Purpose: Numerous studies evaluate the use of social media as an open-learning resource in education, but there is little published empirical evidence that such open-learning resources produce educational outcomes, particularly with regard to student performance. This study undertook a systematic review of the published literature in medical education to determine the state of the evidence from empirical studies that conduct an evaluation or research regarding social media and open-learning resources. Methods: The authors searched MEDLINE, ERIC, Embase, PubMed, Scopus, and Google Scholar from 2012 to 2017, using keywords related to social media, medical education, research, and evaluation, and restricting the search to peer-reviewed, English-language articles only. To meet inclusion criteria, manuscripts had to employ evaluative methods and undertake empirical research. Results: Empirical work designed to evaluate the impact of social media as an open-learning resource in medical education is limited, as only 13 studies met the inclusion criteria. The majority of these studies used undergraduate medical education as the backdrop to investigate open-learning resources such as Facebook, Twitter, and YouTube. YouTube appears to have little educational value due to the unsupervised nature of content added on a daily basis. Overall, extant reviews have demonstrated that we know a considerable amount about social media use, although to date its impacts remain unclear. Conclusion: There is a paucity of outcome-based, empirical studies assessing the impact of social media in medical education. The few empirical studies identified tend to focus on evaluating the affective outcomes of social media and medical education as opposed to

  14. Current direction, fish shellfish resource, and other data from moored current meter casts and other instruments in the Gulf of Mexico during the Brine Disposal project, 21 June 1978 - 24 June 1981 (NODC Accession 8200027)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Current direction, fish shellfish resource, and other data were collected using moored current meter casts and other instruments in the Gulf of Mexico from June 18,...

  15. MALDI imaging mass spectrometry: statistical data analysis and current computational challenges

    Directory of Open Access Journals (Sweden)

    Alexandrov Theodore

    2012-11-01

    Full Text Available Abstract Matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) imaging mass spectrometry, also called MALDI-imaging, is a label-free bioanalytical technique used for spatially resolved chemical analysis of a sample. Usually, MALDI-imaging is applied to a specially prepared tissue section thaw-mounted onto a glass slide. Tremendous development of the MALDI-imaging technique has been observed during the last decade. Currently, it is one of the most promising innovative measurement techniques in biochemistry and a powerful and versatile tool for spatially resolved chemical analysis of diverse sample types, ranging from biological and plant tissues to bio- and polymer thin films. In this paper, we outline computational methods for analyzing MALDI-imaging data with an emphasis on multivariate statistical methods, discuss their pros and cons, and give recommendations on their application. Methods of unsupervised data mining as well as supervised classification methods for biomarker discovery are elucidated. We also present a high-throughput computational pipeline for interpretation of MALDI-imaging data using spatial segmentation. Finally, we discuss current challenges associated with the statistical analysis of MALDI-imaging data.
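
    Spatial segmentation, the last step of the pipeline mentioned above, is at its core a clustering of per-pixel spectra followed by reshaping the cluster labels into an image. The sketch below uses plain k-means on a random stand-in datacube; real MALDI-imaging pipelines add peak picking, denoising, and spatially aware clustering.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
nx, ny, n_mz = 64, 64, 500          # stand-in image grid and m/z channels
cube = rng.random((nx, ny, n_mz))   # fabricated datacube, one spectrum per pixel

# Flatten pixels into a (n_pixels, n_mz) matrix and cluster the spectra.
spectra = cube.reshape(-1, n_mz)
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(spectra)

# Reshaping the labels recovers the spatial segmentation map.
segmentation = labels.reshape(nx, ny)
print(segmentation.shape, np.unique(segmentation))
```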

  16. Current strategies for improving access and adherence to antiretroviral therapies in resource-limited settings

    Directory of Open Access Journals (Sweden)

    Scanlon ML

    2013-01-01

    Full Text Available Michael L Scanlon,1,2 Rachel C Vreeman1,21Department of Pediatrics, Indiana University School of Medicine, Indianapolis, IN, USA; 2USAID, Academic Model Providing Access to Healthcare (AMPATH Partnership, Eldoret, KenyaAbstract: The rollout of antiretroviral therapy (ART significantly reduced human immunodeficiency virus (HIV-related morbidity and mortality, but good clinical outcomes depend on access and adherence to treatment. In resource-limited settings, where over 90% of the world’s HIV-infected population resides, data on barriers to treatment are emerging that contribute to low rates of uptake in HIV testing, linkage to and retention in HIV care systems, and suboptimal adherence rates to therapy. A review of the literature reveals limited evidence to inform strategies to improve access and adherence with the majority of studies from sub-Saharan Africa. Data from observational studies and randomized controlled trials support home-based, mobile and antenatal care HIV testing, task-shifting from doctor-based to nurse-based and lower level provider care, and adherence support through education, counseling and mobile phone messaging services. Strategies with more limited evidence include targeted HIV testing for couples and family members of ART patients, decentralization of HIV care, including through home- and community-based ART programs, and adherence promotion through peer health workers, treatment supporters, and directly observed therapy. There is little evidence for improving access and adherence among vulnerable groups such as women, children and adolescents, and other high-risk populations and for addressing major barriers. Overall, studies are few in number and suffer from methodological issues. Recommendations for further research include health information technology, social-level factors like HIV stigma, and new research directions in cost-effectiveness, operations, and implementation. Findings from this review make a

  17. Integrating GRID Tools to Build a Computing Resource Broker: Activities of DataGrid WP1

    Institute of Scientific and Technical Information of China (English)

    C. Anglano; S. Barale; et al.

    2001-01-01

    Resources on a computational Grid are geographically distributed, heterogeneous in nature, owned by different individuals or organizations with their own scheduling policies, and have different access cost models with dynamically varying loads and availability conditions. This makes traditional approaches to workload management, load balancing and scheduling inappropriate. The first work package (WP1) of the EU-funded DataGrid project is addressing the issue of optimizing the distribution of jobs onto Grid resources based on knowledge of the status and characteristics of these resources that is necessarily out-of-date (collected in a finite amount of time at a very loosely coupled site). We describe the DataGrid approach of integrating existing software components (from Condor, Globus, etc.) to build a Grid Resource Broker, and the early efforts to define a workable scheduling strategy.

  18. Construction and assessment of hierarchical edge elements for three-dimensional computations of eddy currents

    Energy Technology Data Exchange (ETDEWEB)

    Midtgaard, Ole-Morten

    1997-12-31

    This thesis considers the feasibility of doing calculations to optimize electrical machines without the need to build expensive prototypes. It deals with the construction and assessment of new, hierarchical, hexahedral edge elements for three-dimensional computations of eddy currents with the electric vector potential formulation. The new elements, five in all, give up to second-order approximations for both the magnetic field and the current density. Theoretical arguments showed these elements to be more economical for a given polynomial order of the approximated fields than the serendipity family of nodal elements. Further, it was pointed out how the support of a source field computed using edge elements can be made very small, provided that a proper spanning tree is used in the edge element mesh. This was exploited for the voltage forcing technique, where source fields were used as basis functions, with unknown total currents in voltage-forced conductors as degrees of freedom. The practical assessment of the edge elements showed that accuracy improves with increasing polynomial order, both for local and global quantities. The most economical element was, however, one giving only complete first-order approximations for both fields. Further, the edge elements turned out to be better than the nodal elements in practice as well. For the voltage forcing technique, source field basis functions with small support resulted in a large reduction of the CPU time for solving the main equation system, compared to source fields with large support. The new elements can be used in a p-type adaptive scheme, and they should also be applicable to other tangentially continuous field problems. 67 refs., 34 figs., 10 tabs.

  19. Doctors as managers of healthcare resources in Nigeria: Evolving roles and current challenges.

    Science.gov (United States)

    Ojo, Temitope Olumuyiwa; Akinwumi, Adebowale Femi

    2015-01-01

    Over the years, medical practice in Nigeria has evolved in scope and practice, in terms of changing disease patterns, patients' needs, and social expectations. In addition, there is a growing sentiment especially among the general public and some health workers that most doctors are bad managers. Besides drawing examples from some doctors in top management positions that have performed less creditably, critics also harp on the fact that more needs to be done to improve the training of doctors in health management. This article describes the role of doctors in this changing scene of practice and highlights the core areas where doctors' managerial competencies are required to improve the quality of healthcare delivery. Areas such as health care financing, essential drugs and supplies management, and human resource management are emphasized. Resources to be managed and various skills needed to function effectively at the different levels of management are also discussed. To ensure that doctors are well-skilled in managerial competencies, the article concludes by suggesting a curriculum review at undergraduate and postgraduate levels of medical training to include newer but relevant courses on health management in addition to the existing ones, whereas also advocating that doctors be incentivized to go for professional training in health management and not only in the core clinical specialties.

  20. Doctors as managers of healthcare resources in Nigeria: Evolving roles and current challenges

    Directory of Open Access Journals (Sweden)

    Temitope Olumuyiwa Ojo

    2015-01-01

    Full Text Available Over the years, medical practice in Nigeria has evolved in scope and practice, in terms of changing disease patterns, patients' needs, and social expectations. In addition, there is a growing sentiment especially among the general public and some health workers that most doctors are bad managers. Besides drawing examples from some doctors in top management positions that have performed less creditably, critics also harp on the fact that more needs to be done to improve the training of doctors in health management. This article describes the role of doctors in this changing scene of practice and highlights the core areas where doctors' managerial competencies are required to improve the quality of healthcare delivery. Areas such as health care financing, essential drugs and supplies management, and human resource management are emphasized. Resources to be managed and various skills needed to function effectively at the different levels of management are also discussed. To ensure that doctors are well-skilled in managerial competencies, the article concludes by suggesting a curriculum review at undergraduate and postgraduate levels of medical training to include newer but relevant courses on health management in addition to the existing ones, whereas also advocating that doctors be incentivized to go for professional training in health management and not only in the core clinical specialties.

  1. Water resources research program: nearshore currents at Point Beach, Wisconsin (1974-1975)

    Energy Technology Data Exchange (ETDEWEB)

    Saunders, K.D.; Van Loon, L.; Tome, C.; Harrison, W.

    1976-03-01

    Coastal currents of Lake Michigan were monitored at stations 0.4, 1.1, and 3.8 km from shore along a coast-perpendicular transect located 1 km south of the Wisconsin Electric Power Company's nuclear generating station. Complete specifications are given for current meter calibrations, electronics used, winter and summer mooring configurations, and for the performance of each meter and mooring assembly. Local winds were monitored at the power plant intake using a mechanical MRI wind system. The following types of graphs are presented for current and wind observations: (1) U, V flow components versus time, (2) specific kinetic energy versus time, (3) flow speeds and directions versus time, (4) composite velocity histograms and associated U, V-component histograms, and (5) progressive vector diagrams. A linear, optimal-filter design technique (Wiener filter) is used to estimate the efficiency of wind-driven, linear, nearshore current-prediction models. The poor predictions of the models probably relate to such factors as the lack of input for superimposed lake currents, time-base errors, and wind-stress formulas.
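
    The Wiener-filter step of the report amounts to finding the linear combination of lagged wind samples that best predicts the observed current in a least-squares sense. A minimal sketch with synthetic series follows (the lag count, response kernel, and noise level are assumptions for illustration, not values from the report):

```python
import numpy as np

rng = np.random.default_rng(2)
n, lags = 2000, 24   # hourly samples, one day of wind history

wind = rng.standard_normal(n)
# Synthetic "current" responding to wind through a decaying lagged kernel,
# plus noise standing in for unmodeled lake dynamics.
kernel = 0.5 * np.exp(-np.arange(lags) / 6.0)
current = np.convolve(wind, kernel)[:n] + 0.3 * rng.standard_normal(n)

# Build the lagged design matrix and solve the least-squares problem:
# this fit is a finite-length discrete Wiener filter.
X = np.column_stack([np.roll(wind, k) for k in range(lags)])[lags:]
y = current[lags:]
h, *_ = np.linalg.lstsq(X, y, rcond=None)

pred = X @ h
skill = 1 - np.var(y - pred) / np.var(y)   # fraction of variance explained
print(f"variance explained: {skill:.2f}")
```

    The report's point is that even this optimal linear filter predicts the observed currents poorly, implicating missing inputs such as superimposed lake-scale currents.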

  2. Applied Research on Computer Use in Human Resource Management Systems

    Institute of Scientific and Technical Information of China (English)

    葛航

    2014-01-01

    As information technology develops, enterprises in every industry are gradually adopting computer technology to manage their business. Human resource management, the foundational management module for enterprise development, has seen its management efficiency greatly improved as a result. This paper analyzes the current status of human resource management in China and describes the application of computer technology in human resource management, in the hope of contributing to the development of enterprises.

  3. Iterative reconstruction for quantitative computed tomography analysis of emphysema: consistent results using different tube currents

    Directory of Open Access Journals (Sweden)

    Yamashiro T

    2015-02-01

    Full Text Available Tsuneo Yamashiro,1 Tetsuhiro Miyara,1 Osamu Honda,2 Noriyuki Tomiyama,2 Yoshiharu Ohno,3 Satoshi Noma,4 Sadayuki Murayama1 On behalf of the ACTIve Study Group 1Department of Radiology, Graduate School of Medical Science, University of the Ryukyus, Nishihara, Okinawa, Japan; 2Department of Radiology, Osaka University Graduate School of Medicine, Suita, Osaka, Japan; 3Department of Radiology, Kobe University Graduate School of Medicine, Kobe, Hyogo, Japan; 4Department of Radiology, Tenri Hospital, Tenri, Nara, Japan Purpose: To assess the advantages of iterative reconstruction for quantitative computed tomography (CT) analysis of pulmonary emphysema. Materials and methods: Twenty-two patients with pulmonary emphysema underwent chest CT imaging using identical scanners with three different tube currents: 240, 120, and 60 mA. Scan data were converted to CT images using Adaptive Iterative Dose Reduction using Three Dimensional Processing (AIDR3D) and a conventional filtered back-projection mode. Thus, six scans with and without AIDR3D were generated per patient. All other scanning and reconstruction settings were fixed. The percent low attenuation area (LAA%; < -950 Hounsfield units) and the 15th percentile of lung density were automatically measured using a commercial workstation. Comparisons of LAA% and 15th percentile results between scans with and without AIDR3D were made by Wilcoxon signed-rank tests. Associations between body weight and measurement errors among these scans were evaluated by Spearman rank correlation analysis. Results: Overall, scan series without AIDR3D had higher LAA% and lower 15th percentile values than those with AIDR3D at each tube current (P<0.0001). For scan series without AIDR3D, lower tube currents resulted in higher LAA% values and lower 15th percentiles. The extent of emphysema was significantly different between each pair among scans when not using AIDR3D (LAA%, P<0.0001; 15th percentile, P<0.01), but was not
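
    Both emphysema indices used in the study are simple statistics over segmented lung voxels. The sketch below computes them on a fabricated array of Hounsfield units; the -950 HU threshold and the 15th percentile follow the abstract, while the lung mask and the HU distribution are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
# Stand-in lung voxels in Hounsfield units (real values would come from
# a segmented chest CT; this distribution is only illustrative).
lung_hu = rng.normal(loc=-860, scale=60, size=500_000)

# Percent low-attenuation area: fraction of lung voxels below -950 HU.
laa_percent = 100.0 * np.mean(lung_hu < -950)

# 15th percentile of the lung density histogram.
perc15 = np.percentile(lung_hu, 15)

print(f"LAA% = {laa_percent:.1f}%, 15th percentile = {perc15:.0f} HU")
```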

  4. Radiotherapy infrastructure and human resources in Switzerland. Present status and projected computations for 2020

    Energy Technology Data Exchange (ETDEWEB)

    Datta, Niloy Ranjan; Khan, Shaka; Marder, Dietmar [KSA-KSB, Kantonsspital Aarau, RadioOnkologieZentrum, Aarau (Switzerland); Zwahlen, Daniel [Kantonsspital Graubuenden, Department of Radiotherapy, Chur (Switzerland); Bodis, Stephan [KSA-KSB, Kantonsspital Aarau, RadioOnkologieZentrum, Aarau (Switzerland); University Hospital Zurich, Department of Radiation Oncology, Zurich (Switzerland)

    2016-09-15

    The purpose of this study was to evaluate the present status of radiotherapy infrastructure and human resources in Switzerland and compute projections for 2020. The European Society of Therapeutic Radiation Oncology "Quantification of Radiation Therapy Infrastructure and Staffing" guidelines (ESTRO-QUARTS) and those of the International Atomic Energy Agency (IAEA) were applied to estimate the requirements for teleradiotherapy (TRT) units, radiation oncologists (RO), medical physicists (MP) and radiotherapy technologists (RTT). The databases used for computation of the present gap and additional requirements are (a) Global Cancer Incidence, Mortality and Prevalence (GLOBOCAN) for cancer incidence, (b) the Directory of Radiotherapy Centres (DIRAC) of the IAEA for existing TRT units, (c) human resources from the recent ESTRO "Health Economics in Radiation Oncology" (HERO) survey, and (d) radiotherapy utilization (RTU) rates for each tumour site, published by the Ingham Institute for Applied Medical Research (IIAMR). In 2015, 30,999 of 45,903 cancer patients would have required radiotherapy. By 2020, this will have increased to 34,041 of 50,427 cancer patients. Switzerland presently has an adequate number of TRTs, but a deficit of 57 ROs, 14 MPs and 36 RTTs. By 2020, an additional 7 TRTs, 72 ROs, 22 MPs and 66 RTTs will be required. In addition, a realistic dynamic model for calculation of staff requirements due to anticipated changes in future radiotherapy practices has been proposed. This model could be tailor-made and individualized for any radiotherapy centre. A 9.8% increase in radiotherapy requirements is expected for cancer patients over the next 5 years. The present study should assist the stakeholders and health planners in designing an appropriate strategy for meeting future radiotherapy needs for Switzerland. (orig.)
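
    The gap computation behind such projections is essentially arithmetic over cancer incidence, radiotherapy utilization, and throughput norms. The sketch below reproduces the shape of that calculation; the incidence figure comes from the abstract, but the RTU rate, throughput norms, and current capacity are illustrative placeholders, not the ESTRO-QUARTS or IAEA values.

```python
import math

# Illustrative capacity-gap arithmetic in the spirit of ESTRO-QUARTS
# (throughput norms and current capacity are assumed placeholders).
cancer_incidence_2020 = 50_427       # projected cancer patients (from the abstract)
rtu_rate = 0.675                     # assumed overall radiotherapy utilization rate

patients_needing_rt = cancer_incidence_2020 * rtu_rate   # ~34,000

patients_per_trt = 450               # assumed annual patients per megavoltage unit
patients_per_ro = 250                # assumed annual patients per radiation oncologist

trt_required = math.ceil(patients_needing_rt / patients_per_trt)
ro_required = math.ceil(patients_needing_rt / patients_per_ro)

existing_trt, existing_ro = 70, 110  # assumed current capacity
print(f"TRT units required: {trt_required} (gap {trt_required - existing_trt})")
print(f"ROs required: {ro_required} (gap {ro_required - existing_ro})")
```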

  5. Computation of groundwater resources and recharge in Chithar River Basin, South India.

    Science.gov (United States)

    Subramani, T; Babu, Savithri; Elango, L

    2013-01-01

    Groundwater recharge and available groundwater resources in the Chithar River basin, Tamil Nadu, India, spread over an area of 1,722 km(2), have been estimated by considering various hydrological, geological, and hydrogeological parameters, such as rainfall infiltration, drainage, geomorphic units, land use, rock types, depth of weathered and fractured zones, nature of soil, water level fluctuation, saturated thickness of aquifer, and groundwater abstraction. The digital ground elevation models indicate that the regional slope of the basin is towards the east. The Proterozoic (Post-Archaean) basement of the study area consists of quartzite, calc-granulite, crystalline limestone, charnockite, and biotite gneiss with or without garnet. Three major soil types were identified: black cotton, deep red, and red sandy soils. The rainfall intensity gradually decreases from west to east. Groundwater occurs under water table conditions in the weathered zone and fluctuates between 0 and 25 m. The water table reaches its maximum in January, after the northeast monsoon, and its minimum in October. Groundwater abstraction for domestic/stock and irrigation needs in the Chithar River basin has been estimated as 148.84 MCM (million m(3)). Groundwater recharge due to monsoon rainfall infiltration has been estimated as 170.05 MCM based on the water level rise during the monsoon period. It is also estimated as 173.9 MCM using the rainfall infiltration factor. An amount of 53.8 MCM of water is contributed to groundwater from surface water bodies. Recharge of groundwater due to return flow from irrigation has been computed as 147.6 MCM. The static groundwater reserve in the Chithar River basin is estimated as 466.66 MCM and the dynamic reserve as about 187.7 MCM. In the present scenario, the aquifer is in a safe condition for the extraction of groundwater for domestic and irrigation purposes. If the existing water bodies are maintained properly, the extraction rate can be increased by about 10% to 15% in the future.
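
    The water-level-fluctuation estimate quoted above follows the standard relation recharge = area x water-table rise x specific yield. A small sketch reproduces the order of magnitude; the mean rise and specific yield below are assumed values chosen only to land near the reported 170 MCM, not figures from the study.

```python
# Water-table fluctuation method: recharge = area * rise * specific yield.
area_m2 = 1722.0 * 1e6      # basin area from the study, km^2 -> m^2
mean_rise_m = 3.3           # assumed monsoon water-level rise, m
specific_yield = 0.03       # assumed for weathered hard-rock aquifers

recharge_mcm = area_m2 * mean_rise_m * specific_yield / 1e6
print(f"monsoon recharge ~ {recharge_mcm:.0f} MCM")   # ~170 MCM
```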

  6. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  7. INCREASE THE RESOURCE OF CURRENT COLLECTOR ELEMENTS OF THE ELECTRIFIED HIGH-SPEED TRANSPORT IN OPERATING CONDITIONS

    Directory of Open Access Journals (Sweden)

    Y. L. Bolshakov

    2015-07-01

    Full Text Available Purpose. The paper aims to determine the main ways of increasing the service life and operating efficiency of the carbon contact inserts of current collectors on high-speed electric rolling stock. Methodology. The research relies on the theory of technical systems reliability, electromechanical processes, and statistics. Findings. Existing approaches to the production of current collector contact inserts in Europe and Ukraine were considered, and a number of information sources were analyzed. The most effective ways of increasing the current-carrying capacity and wear resistance of current collector elements were determined. It was established that the existing system for determining the manufacturing quality of current collector elements has a number of drawbacks that complicate incoming inspection and make diagnosing current collector elements in operation impossible. On this basis, a new test stand is proposed for the needs of the locomotive depot that avoids the existing difficulties with diagnosing current collector elements. The study also revealed pervasive violations of the technological standards for pantograph maintenance. Originality. Based on the results of operational research carried out at the locomotive depot, dependencies were obtained from which an operative system for diagnosing the state of current collector elements during operation is proposed. A comparative analysis of existing and prospective lines of development of current collector elements with high current-carrying capacity and durability identified design conditions for the optimal composition of the inserts. It was established that a significant proportion of failures occur because of an imperfect maintenance system, for which recommendations were developed on the basis of operational data. Practical value. Obtained results of the information sources

  8. Computer Assisted Surgery and Current Trends in Orthopaedics Research and Total Joint Replacements

    Science.gov (United States)

    Amirouche, Farid

    2008-06-01

    Musculoskeletal research has brought about revolutionary changes in our ability to perform high-precision surgery in joint replacement procedures. Recent advances in computer-assisted surgery as well as better materials have led to reduced wear and greatly enhanced the quality of life of patients. New surgical techniques that reduce the size of the incision and the damage to underlying structures have been the primary advance toward this goal. These techniques are known as MIS, or Minimally Invasive Surgery. Total hip and knee arthroplasties are at an all-time high, reaching 1.2 million surgeries per year in the USA. Primary joint failures are usually due to osteoarthritis, rheumatoid arthritis, osteonecrosis and other inflammatory arthritis conditions. The methods for THR and TKA are critical to the initial stability and longevity of the prostheses. This research aims at understanding the fundamental mechanics of joint arthroplasty and providing an insight into current challenges in patient-specific fitting, fixation, and stability. Both experimental and analytical work will be presented. We will examine the success of cementless total hip arthroplasty over the last 10 years and the role computer-assisted navigation is playing in follow-up studies. Cementless total hip arthroplasty attains permanent fixation by the ingrowth of bone into a porous coated surface. Loosening of an ingrown total hip arthroplasty occurs as a result of osteolysis of the periprosthetic bone and degradation of the bone-prosthesis interface. The osteolytic process occurs as a result of polyethylene wear particles produced by the metal-polyethylene articulation of the prosthesis. The total hip arthroplasty is a congruent joint, and the submicron wear particles produced are phagocytized by macrophages, initiating an inflammatory cascade. This cascade produces cytokines ultimately implicated in osteolysis. The resulting bone loss on both the acetabular and femoral sides eventually leads to component instability. As

  9. Computational dosimetry for grounded and ungrounded human models due to contact current

    Science.gov (United States)

    Chan, Kwok Hung; Hattori, Junya; Laakso, Ilkka; Hirata, Akimasa; Taki, Masao

    2013-08-01

    This study presents the computational dosimetry of contact currents for grounded and ungrounded human models. The uncertainty of the quasi-static (QS) approximation of the in situ electric field induced in a grounded/ungrounded human body due to the contact current is first estimated. Different scenarios of cylindrical and anatomical human body models are considered, and the results are compared with the full-wave analysis. In the QS analysis, the induced field in the grounded cylindrical model is calculated by the QS finite-difference time-domain (QS-FDTD) method, and compared with the analytical solution. Because no analytical solution is available for the grounded/ungrounded anatomical human body model, the results of the QS-FDTD method are then compared with those of the conventional FDTD method. The upper frequency limit for the QS approximation in the contact current dosimetry is found to be 3 MHz, with a relative local error of less than 10%. The error increases above this frequency, which can be attributed to the neglect of the displacement current. The QS or conventional FDTD method is used for the dosimetry of induced electric field and/or specific absorption rate (SAR) for a contact current injected into the index finger of a human body model in the frequency range from 10 Hz to 100 MHz. The in situ electric fields or SAR are compared with the basic restrictions in the international guidelines/standards. The maximum electric field or the 99th percentile value of the electric fields appear not only in the fat and muscle tissues of the finger, but also around the wrist, forearm, and the upper arm. Some discrepancies are observed between the basic restrictions for the electric field and SAR and the reference levels for the contact current, especially in the extremities. These discrepancies are shown by an equation that relates the current density, tissue conductivity, and induced electric field in the finger with a cross-sectional area of 1 cm².
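
    The closing relation between current density, conductivity, and field can be made concrete with rough numbers. In the sketch below, the 1 cm² cross-section follows the abstract, while the contact current and the effective tissue conductivity are illustrative assumptions:

```python
# Ohm's law in a conducting tissue: J = I / A, E = J / sigma.
I = 0.5e-3        # contact current, A (illustrative value)
A = 1.0e-4        # finger cross-section, m^2 (the 1 cm^2 cited in the abstract)
sigma = 0.35      # assumed effective tissue conductivity, S/m

J = I / A         # current density, A/m^2
E = J / sigma     # in situ electric field, V/m
print(f"J = {J:.1f} A/m^2, E = {E:.1f} V/m")
```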

  10. Data Security, Privacy, Availability and Integrity in Cloud Computing: Issues and Current Solutions

    Directory of Open Access Journals (Sweden)

    Sultan Aldossary

    2016-04-01

    Full Text Available Cloud computing changed the world around us. Now people are moving their data to the cloud since data is getting bigger and needs to be accessible from many devices. Therefore, storing data in the cloud has become the norm. However, there are many issues affecting data stored in the cloud, starting with the virtual machine, which is the means of sharing resources in the cloud, and ending with issues in cloud storage itself. In this paper, we present the issues that are preventing people from adopting the cloud and give a survey of solutions that have been proposed to minimize their risks. For example, the data stored in the cloud needs to be kept confidential, integrity-preserved, and available. Moreover, sharing the data stored in the cloud among many users is still an issue, since the cloud service provider cannot be trusted to manage authentication and authorization. In this paper, we list issues related to data stored in cloud storage and solutions to those issues, which differentiates this paper from others that focus on the cloud in general.

  11. Editorial: Special issue on resources for the computer security and information assurance curriculum, Issue 1: Curriculum Editorial Comments, Volumes 1 and 2

    Energy Technology Data Exchange (ETDEWEB)

    Frincke, Deb; Ouderkirk, Steven J.; Popovsky, Barbara

    2006-12-28

    This is a pair of articles to be used as the cover editorials for a special edition of the Journal of Educational Resources in Computing (JERIC) Special Edition on Resources for the Computer Security and Information Assurance Curriculum, volumes 1 and 2.

  12. Current status of dental caries diagnosis using cone beam computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Park, Young Seok; Ahn, Jin Soo; Kwon, Ho Beom; Lee, Seung Pyo [School of Dentistry, Seoul National University, Seoul (Korea, Republic of)

    2011-06-15

    The purpose of this article is to review the current status of dental caries diagnosis using cone beam computed tomography (CBCT). An online PubMed search was performed to identify studies on caries research using CBCT. Despite their usefulness, there are inherent limitations in the detection of caries lesions on conventional radiographs, mainly due to the two-dimensional (2D) representation of the lesions. Several efforts were made to investigate three-dimensional (3D) imaging of lesions, only to gain little popularity. Recently, CBCT was introduced and has been used for the diagnosis of caries in several reports. Some of them maintain the superiority of CBCT systems; however, this remains controversial. CBCT systems are promising, but they should not yet be considered a primary choice for caries diagnosis in everyday practice. Further studies under more standardized conditions should be performed in the near future.

  13. Computer Algorithms and Architectures for Three-Dimensional Eddy-Current Nondestructive Evaluation. Volume 3. Chapters 6-11

    Science.gov (United States)

    1989-01-20

    Final report SA/TR-2/89 (A003): Computer Algorithms and Architectures for Three-Dimensional Eddy-Current Nondestructive Evaluation. Only scattered OCR fragments of the scanned report remain legible, among them two reference entries: [A1] Aho, A., Hopcroft, J., Ullman, J., The Design and Analysis of Computer Algorithms, Addison-Wesley Publishing Company, 1974; and [A2] Anderson, B., Moore, J., Optimal...

  14. Computer Algorithms and Architectures for Three-Dimensional Eddy-Current Nondestructive Evaluation. Volume 1. Executive Summary

    Science.gov (United States)

    1989-01-20

    Final report SA/TR-2/89 (A003): Computer Algorithms and Architectures for Three-Dimensional Eddy-Current Nondestructive Evaluation, Volume 1: Executive Summary. The remainder of the scanned report documentation page is illegible OCR residue.

  15. Multimodality Neuromonitoring in Pediatric Neurocritical Care: Review of the Current Resources

    Science.gov (United States)

    Tovar-Spinoza, Zulma

    2015-01-01

    Brain insults in children represent a daily challenge in neurocritical care. Keeping a constant grasp on the various parameters of the injured pediatric brain may affect the patient's outcome. Currently, new advances provide clinicians with the ability to utilize several modalities to monitor brain function. This multimodal approach provides real-time information, leading to faster responses in management and, furthermore, helping avoid secondary insults to the injured brain. PMID:26719828

  16. Current practice in software development for computational neuroscience and how to improve it.

    Science.gov (United States)

    Gewaltig, Marc-Oliver; Cannon, Robert

    2014-01-01

    Almost all research work in computational neuroscience involves software. As researchers try to understand ever more complex systems, there is a continual need for software with new capabilities. Because of the wide range of questions being investigated, new software is often developed rapidly by individuals or small groups. In these cases, it can be hard to demonstrate that the software gives the right results. Software developers are often open about the code they produce and willing to share it, but there is little appreciation among potential users of the great diversity of software development practices and end results, and how this affects the suitability of software tools for use in research projects. To help clarify these issues, we have reviewed a range of software tools and asked how the culture and practice of software development affects their validity and trustworthiness. We identified four key questions that can be used to categorize software projects and correlate them with the type of product that results. The first question addresses what is being produced. The other three concern why, how, and by whom the work is done. The answers to these questions show strong correlations with the nature of the software being produced, and its suitability for particular purposes. Based on our findings, we suggest ways in which current software development practice in computational neuroscience can be improved and propose checklists to help developers, reviewers, and scientists to assess the quality of software and whether particular pieces of software are ready for use in research.

  17. Current practice in software development for computational neuroscience and how to improve it.

    Directory of Open Access Journals (Sweden)

    Marc-Oliver Gewaltig

    2014-01-01

    Full Text Available Almost all research work in computational neuroscience involves software. As researchers try to understand ever more complex systems, there is a continual need for software with new capabilities. Because of the wide range of questions being investigated, new software is often developed rapidly by individuals or small groups. In these cases, it can be hard to demonstrate that the software gives the right results. Software developers are often open about the code they produce and willing to share it, but there is little appreciation among potential users of the great diversity of software development practices and end results, and how this affects the suitability of software tools for use in research projects. To help clarify these issues, we have reviewed a range of software tools and asked how the culture and practice of software development affects their validity and trustworthiness. We identified four key questions that can be used to categorize software projects and correlate them with the type of product that results. The first question addresses what is being produced. The other three concern why, how, and by whom the work is done. The answers to these questions show strong correlations with the nature of the software being produced, and its suitability for particular purposes. Based on our findings, we suggest ways in which current software development practice in computational neuroscience can be improved and propose checklists to help developers, reviewers, and scientists to assess the quality of software and whether particular pieces of software are ready for use in research.

  18. Current Status of Human Resource Training Programs for Fostering RI-Biomics Professionals

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Dong-Eun; Jang, Beom-Su; Choi, Dae Seong [Advanced Radiation Technology Institute, Korea Atomic Energy Research Institute, Jeongeup 580-185 (Korea, Republic of); Park, Tai-jin [Radiation Research Division, Korea Radioisotope Association, Seoul 132-822, Republic of Korea (Korea, Republic of); Park, Sang Hyun [Advanced Radiation Technology Institute, Korea Atomic Energy Research Institute, Jeongeup 580-185 (Korea, Republic of); Radiation Research Division, Korea Radioisotope Association, Seoul 132-822, Republic of Korea (Korea, Republic of); Department of Radiobiotechnology and Applied Radioisotope Science, Korea University of Science and Technology, Deajeon 305-350 (Korea, Republic of)

    2015-07-01

    RI-Biomics is a state-of-the-art radiation fusion technology for evaluating in-vivo dynamics such as the absorption, distribution, metabolism and excretion (ADME) of new drug candidates and biomaterials using radioisotopes (RI), and for the quantitative evaluation of their efficacy via molecular imaging techniques and animal models. The RI-Biomics center is the sole comprehensive research and experiment complex in Korea that can simultaneously perform the radio-synthesis of drug candidates with radioisotopes, their analysis, and molecular imaging evaluation with animal models. Molecular imaging techniques, including nuclear imaging (SPECT and PET), near-infrared fluorescence (NIRF) imaging, and magnetic resonance imaging (MRI), are cutting-edge technologies for evaluating drug candidates. Since they allow in vivo real-time imaging of the diseased site, monitoring of the biodistribution of a drug, and determination of the optimal therapeutic efficacy following treatment, we have integrated RI-ADME and molecular imaging to provide useful information for drug evaluation and to accelerate the development of new drugs and biomaterials. The RI-Biomics center was established with a total investment of $18 million over the four years from 2009 to 2012 in order to develop a comprehensive RI-based analysis system for new drug development as an axis for national growth in the next generation. The RI-Biomics center has a labeling synthesis facility for the radio-synthesis of drug candidates with radioisotopes such as Tc-99m, I-125, I-131, F-18, H-3 and C-14 using hot cells. It also includes general RI analysis facilities, such as radio-HPLC, LC/MS, GC/MS and gamma counters for analyzing the radio-synthesized materials, and animal image analysis facilities equipped with small-animal imaging systems such as SPECT/PET/CT, 7 T MRI, an in-vivo optical imaging system and others. In order to establish a system to verify the safety and effectiveness of new drugs using RI, it is necessary to establish a human resource

  19. Historical Perspective of Traditional Indigenous Medical Practices: The Current Renaissance and Conservation of Herbal Resources

    Directory of Open Access Journals (Sweden)

    Si-Yuan Pan

    2014-01-01

    Full Text Available In recent years, increasing numbers of people have been choosing herbal medicines or products to improve their health conditions, either alone or in combination with others. Herbs are staging a comeback and herbal “renaissance” occurs all over the world. According to the World Health Organization, 75% of the world’s populations are using herbs for basic healthcare needs. Since the dawn of mankind, in fact, the use of herbs/plants has offered an effective medicine for the treatment of illnesses. Moreover, many conventional/pharmaceutical drugs are derived directly from both nature and traditional remedies distributed around the world. Up to now, the practice of herbal medicine entails the use of more than 53,000 species, and a number of these are facing the threat of extinction due to overexploitation. This paper aims to provide a review of the history and status quo of Chinese, Indian, and Arabic herbal medicines in terms of their significant contribution to the health promotion in present-day over-populated and aging societies. Attention will be focused on the depletion of plant resources on earth in meeting the increasing demand for herbs.

  20. Historical perspective of traditional indigenous medical practices: the current renaissance and conservation of herbal resources.

    Science.gov (United States)

    Pan, Si-Yuan; Litscher, Gerhard; Gao, Si-Hua; Zhou, Shu-Feng; Yu, Zhi-Ling; Chen, Hou-Qi; Zhang, Shuo-Feng; Tang, Min-Ke; Sun, Jian-Ning; Ko, Kam-Ming

    2014-01-01

    In recent years, increasing numbers of people have been choosing herbal medicines or products to improve their health conditions, either alone or in combination with others. Herbs are staging a comeback, and a herbal "renaissance" is occurring all over the world. According to the World Health Organization, 75% of the world's population uses herbs for basic healthcare needs. Since the dawn of mankind, in fact, the use of herbs/plants has offered an effective medicine for the treatment of illnesses. Moreover, many conventional/pharmaceutical drugs are derived directly from both nature and traditional remedies distributed around the world. Up to now, the practice of herbal medicine has entailed the use of more than 53,000 species, and a number of these are facing the threat of extinction due to overexploitation. This paper aims to provide a review of the history and status quo of Chinese, Indian, and Arabic herbal medicines in terms of their significant contribution to health promotion in present-day over-populated and aging societies. Attention will be focused on the depletion of plant resources on earth in meeting the increasing demand for herbs.

  1. Water resources climate change projections using supervised nonlinear and multivariate soft computing techniques

    Science.gov (United States)

    Sarhadi, Ali; Burn, Donald H.; Johnson, Fiona; Mehrotra, Raj; Sharma, Ashish

    2016-05-01

    Accurate projection of global warming on the probabilistic behavior of hydro-climate variables is one of the main challenges in climate change impact assessment studies. Due to the complexity of climate-associated processes, different sources of uncertainty influence the projected behavior of hydro-climate variables in regression-based statistical downscaling procedures. The current study presents a comprehensive methodology to improve the predictive power of the procedure to provide improved projections. It does this by minimizing the uncertainty sources arising from the high-dimensionality of atmospheric predictors, the complex and nonlinear relationships between hydro-climate predictands and atmospheric predictors, as well as the biases that exist in climate model simulations. To address the impact of the high dimensional feature spaces, a supervised nonlinear dimensionality reduction algorithm is presented that is able to capture the nonlinear variability among predictors through extracting a sequence of principal components that have maximal dependency with the target hydro-climate variables. Two soft-computing nonlinear machine-learning methods, Support Vector Regression (SVR) and Relevance Vector Machine (RVM), are engaged to capture the nonlinear relationships between predictands and atmospheric predictors. To correct the spatial and temporal biases over multiple time scales in the GCM predictands, the Multivariate Recursive Nesting Bias Correction (MRNBC) approach is used. The results demonstrate that this combined approach significantly improves the downscaling procedure in terms of precipitation projection.
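
    A minimal runnable sketch of the regression step in Python, with ordinary PCA standing in for the paper's supervised dimensionality reduction and scikit-learn's SVR as the nonlinear regressor; the arrays are synthetic stand-ins for gridded atmospheric predictors and a station-scale predictand:

        # Hedged sketch: PCA substitutes for the supervised reduction described
        # above; X and y are synthetic stand-ins for real predictor/predictand data.
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVR

        rng = np.random.default_rng(0)
        X = rng.normal(size=(500, 60))   # 500 time steps, 60 predictor grid cells
        y = X[:, :3].sum(axis=1) + rng.normal(scale=0.1, size=500)

        model = make_pipeline(StandardScaler(), PCA(n_components=5), SVR(C=10.0))
        model.fit(X[:400], y[:400])      # calibrate on the first 400 time steps
        print("holdout R^2:", model.score(X[400:], y[400:]))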

  2. A New Approach for a Better Load Balancing and a Better Distribution of Resources in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Abdellah IDRISSI

    2015-10-01

    Full Text Available Cloud computing is a new paradigm where data and services of Information Technology are provided via the Internet using remote servers. It represents a new way of delivering computing resources, allowing access to the network on demand. Cloud computing consists of several services, each of which can hold several tasks. Because task scheduling is an NP-complete problem, task management is an important element in cloud computing technology. To optimize the performance of virtual machines hosted in cloud computing, several task-scheduling algorithms have been proposed. In this paper, we present an approach that solves the problem optimally while taking into account QoS constraints based on the different user requests. This technique, based on the Branch and Bound algorithm, assigns tasks to different virtual machines while ensuring load balance and a better distribution of resources. The experimental results show that our approach gives very promising results for effective task planning.
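
    A hedged Python sketch of the branch-and-bound idea: enumerate task-to-VM assignments and prune any branch whose partial makespan already exceeds the best complete plan. The task costs and two-VM setup are hypothetical, and the paper's QoS constraints are not reproduced:

        def branch_and_bound(tasks, n_vms):
            """Exhaustive assignment of task costs to VMs, pruned by makespan."""
            best = {"makespan": float("inf"), "plan": None}

            def search(i, loads, plan):
                if max(loads) >= best["makespan"]:   # prune dominated branches
                    return
                if i == len(tasks):
                    best["makespan"], best["plan"] = max(loads), plan[:]
                    return
                for vm in range(n_vms):
                    loads[vm] += tasks[i]
                    plan.append(vm)
                    search(i + 1, loads, plan)
                    plan.pop()
                    loads[vm] -= tasks[i]

            search(0, [0] * n_vms, [])
            return best

        print(branch_and_bound([4, 7, 2, 5, 3], n_vms=2))
        # -> {'makespan': 11, 'plan': [...]} (an optimal two-VM split)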

  3. Computer Resources for Schools: Notes for Teachers and Students. [Educational Activities Kit.]

    Science.gov (United States)

    Computer Museum, Boston, MA.

    This kit features an introduction to the Computer Museum, a history of computer technology, and notes on how a computer works including hardware and software. A total of 20 exhibits are described with brief questions for use as a preview of the exhibit or as ideas for post-visit discussions. There are 24 classroom activities about the history and…

  4. An improved current potential method for fast computation of stellarator coil shapes

    Science.gov (United States)

    Landreman, Matt

    2017-04-01

    Several fast methods for computing stellarator coil shapes are compared, including the classical NESCOIL procedure (Merkel 1987 Nucl. Fusion 27 867), its generalization using truncated singular value decomposition, and a Tikhonov regularization approach we call REGCOIL in which the squared current density is included in the objective function. Considering W7-X and NCSX geometries, and for any desired level of regularization, we find the REGCOIL approach simultaneously achieves lower surface-averaged and maximum values of both current density (on the coil winding surface) and normal magnetic field (on the desired plasma surface). This approach therefore can simultaneously improve the free-boundary reconstruction of the target plasma shape while substantially increasing the minimum distances between coils, preventing collisions between coils while improving access for ports and maintenance. The REGCOIL method also allows finer control over the level of regularization, it preserves convexity to ensure the local optimum found is the global optimum, and it eliminates two pathologies of NESCOIL: the resulting coil shapes become independent of the arbitrary choice of angles used to parameterize the coil surface, and the resulting coil shapes converge rather than diverge as Fourier resolution is increased. We therefore contend that REGCOIL should be used instead of NESCOIL for applications in which a fast and robust method for coil calculation is needed, such as when targeting coil complexity in fixed-boundary plasma optimization, or for scoping new stellarator geometries.
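
    Schematically, the description above amounts to a Tikhonov-regularized least-squares objective in the current potential Φ (notation assumed here, not quoted from the paper):

        \chi^2(\Phi) \;=\; \underbrace{\int_{\mathrm{plasma}} B_n^2 \, dA}_{\chi^2_B} \;+\; \lambda \, \underbrace{\int_{\mathrm{coil}} K^2 \, dA}_{\chi^2_K}, \qquad \lambda \ge 0 .

    Because both terms are quadratic in Φ, the minimizer for each λ is obtained by solving a single linear system, which is consistent with the convexity property noted in the abstract.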

  5. Planning Committee for a National Resource for Computation in Chemistry. Final report, October 1, 1974--June 30, 1977

    Energy Technology Data Exchange (ETDEWEB)

    Bigeleisen, Jacob; Berne, Bruce J.; Cotton, F. Albert; Scheraga, Harold A.; Simmons, Howard E.; Snyder, Lawrence C.; Wiberg, Kenneth B.; Wipke, W. Todd

    1978-11-01

    The Planning Committee for a National Resource for Computation in Chemistry (NRCC) was charged with the responsibility of formulating recommendations regarding organizational structure for an NRCC including the composition, size, and responsibilities of its policy board, the relationship of such a board to the operating structure of the NRCC, to federal funding agencies, and to user groups; desirable priorities, growth rates, and levels of operations for the first several years; and facilities, access and site requirements for such a Resource. By means of site visits, questionnaires, and a workshop, the Committee sought advice from a wide range of potential users and organizations interested in chemical computation. Chemical kinetics, crystallography, macromolecular science, nonnumerical methods, physical organic chemistry, quantum chemistry, and statistical mechanics are covered.

  6. Current political commitments’ challenges for ex situ conservation of plant genetic resources for food and agriculture

    Directory of Open Access Journals (Sweden)

    Maria-Mihaela ANTOFIE

    2011-11-01

    Full Text Available This article is an overview of the capacity-building needs for supporting the implementation of political commitments and, furthermore, the development of new political, technical and scientific measures for ensuring the proper conservation of biodiversity, considering ex situ conservation tools and methods in a cost-effective way. Domesticated and wild species, threatened and non-threatened native species belonging to the natural capital may, due to anthropic pressure and climate change, be drastically affected in their conservation status in their ecosystems of origin. Thus, ex situ conservation is important to consider for ensuring the proper conservation of native species. Still, ex situ conservation is a tool that has been in use for many years in activities such as research, trade, industry, medicine, pharmaceuticals and agriculture. Romania needs to further develop its legislative framework in specific domains such as trade in exotic and native threatened species, as well as in other domains such as zoos and aquaria, seed exchange between botanical gardens, bioprospecting, rescue, capture and reintroduction of threatened wild species, collection, and access and benefit sharing. For agriculture, ex situ conservation measures should also be developed in close connection with breeding programmes dedicated to plant genetic resources for food and agriculture (i.e. gene bank conservation, breeding programmes, on-farm conservation). Only by harmonizing all these extremely sensitive domains dealing with ex situ conservation at the legal level, based on science, will it be possible in the future to secure food and ecosanogenesis, ensuring the appropriate status of in situ conservation of biodiversity as a whole. As it is not possible to apply conservation measures, either in situ or ex situ or both, to all species, it is appropriate to further develop strategic tools for prioritizing our efforts in a cost

  7. Groundwater resources of the Devils Postpile National Monument—Current conditions and future vulnerabilities

    Science.gov (United States)

    Evans, William C.; Bergfeld, Deborah

    2017-06-15

    This study presents an extensive database on groundwater conditions in and around Devils Postpile National Monument. The database contains chemical analyses of springs and the monument water-supply well, including major-ion chemistry, trace element chemistry, and the first information on a list of organic compounds known as emerging contaminants. Diurnal, seasonal, and annual variations in groundwater discharge and chemistry are evaluated from data collected at five main monitoring sites, where streams carry the aggregate flow from entire groups of springs. These springs drain the Mammoth Mountain area and, during the fall months, contribute a significant fraction of the San Joaquin River flow within the monument. The period of this study, from fall 2012 to fall 2015, includes some of the driest years on record, though the seasonal variability observed in 2013 might have been near normal. The spring-fed streams generally flowed at rates well below those observed during a sequence of wet years in the late 1990s. However, persistence of flow and reasonably stable water chemistry through the recent dry years are indicative of a sizeable groundwater system that should provide a reliable resource during similar droughts in the future. Only a few emerging contaminants were detected at trace levels below 1 microgram per liter (μg/L), suggesting that local human visitation is not degrading groundwater quality. No indication of salt from the ski area on the north side of Mammoth Mountain could be found in any of the groundwaters. Chemical data instead show that natural mineral water, such as that discharged from local soda springs, is the main source of anomalous chloride in the monument supply well and in the San Joaquin River. The results of the study are used to develop a set of recommendations for future monitoring to enable detection of deleterious impacts to groundwater quality and quantity.

  8. GEISA-97 spectroscopic database system related information resources: current status and perspectives

    Science.gov (United States)

    Chursin, Alexei A.; Jacquinet-Husson, N.; Lefevre, G.; Scott, Noelle A.; Chedin, Alain

    2000-01-01

    This paper presents the recently developed information content diffusion facilities, e.g., the WWW server of GEISA and the MS-DOS, WINDOWS-95/NT, and UNIX software packages, associated with the 1997 version of the GEISA (Gestion et Etude des Informations Spectroscopiques Atmospheriques; word translation: Management and Study of Atmospheric Spectroscopic Information) infrared spectroscopic databank developed at LMD (Laboratoire de Meteorologie Dynamique, France). The GEISA-97 individual-lines file covers 42 molecules (96 isotopic species) and contains 1,346,266 entries between 0 and 22,656 cm-1. GEISA-97 also has a catalog of cross-sections at different temperatures and pressures for species (such as chlorofluorocarbons) with complex spectra. The current version of the GEISA-97 cross-section databank contains 4,716,743 entries related to 23 molecules between 555 and 1700 cm-1.

  9. The ISMAR high frequency coastal radar network: Monitoring surface currents for management of marine resources

    DEFF Research Database (Denmark)

    Carlson, Daniel Frazier

    2015-01-01

    The Institute of Marine Sciences (ISMAR) of the National Research Council of Italy (CNR) established a High Frequency (HF) Coastal Radar Network for the measurement of the velocity of surface currents in coastal seas. The network consists of four HF radar systems located on the coast of the Gargano...... of geospatial data, a netCDF architecture has been defined on the basis of the Radiowave Operators Working Group (US ROWG) recommendations and compliant to the Climate and Forecast (CF) Metadata Conventions CF-1.6. The hourly netCDF files are automatically attached to a Thematic Real-time Environmental...... by the ISMAR HF radar network are presently used in a number of applications, ranging from oil spill and SAR to fishery and coastal management applications....

  10. Computation of a Single-phase Shell-Type Transformer Windings Forces Caused by Inrush and Short-circuit Currents

    Directory of Open Access Journals (Sweden)

    M. B.B. Sharifian

    2008-01-01

    Full Text Available This research studies the forces on the windings of a transformer due to inrush current. These forces are compared with the corresponding forces due to a short-circuit of the windings. A two-dimensional finite element computation of a single-phase shell-type transformer is carried out based on the maximum permissible inrush current, whose amplitude is the same as the rated short-circuit current. To verify the computation results, they are compared with those recently obtained using an Artificial Neural Network (ANN).
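
    For context, the quantity such a finite-element computation evaluates is the Lorentz force on the winding current density; in the two-dimensional formulation, the force per unit axial length on a winding cross-section S is given by the standard relation (not quoted from the paper)

        \mathbf{F} \;=\; \int_{S} \mathbf{J} \times \mathbf{B} \; dS ,

    where J is the current density in the winding (driven by the inrush or short-circuit current) and B is the magnetic flux density obtained from the field solution.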

  11. Current state of allocation of oral health human resources in northern China and future needs.

    Science.gov (United States)

    Zhang, Y; Lu, Z; Cheng, R; Liu, L

    2015-11-01

    The aim of the present investigation was to describe the distribution, structure and allocation of oral health services personnel, evaluate oral health service capacity and predict the needs for oral health services in northern China over the coming 10 years. The questionnaires were sent to all the dental medical institutions included in this study directly from the Sanitation Bureau and the Health Supervision Station. All the institutions and dental personnel were asked to fill out the questionnaires, and then the questionnaires were collected through postal service and email. In Liaoning Province, which is in northern China, there are a total of 5617 dentists, 87.8% of whom are located in urban areas. Dentists in rural areas were found to be less educated and specialized. The ratio of dentists to nurses to technicians was about 6:2:1, and the ratio of dentists to total population was 1:7682. It was predicted that, in 2020, the number of dentists could reach 13,207. This would meet the area's needs for oral health services. Currently, in northern China, the oral health infrastructure suffers from an insufficient number of dental professionals, disproportionate distribution and inappropriate structure. To improve social equity, it is necessary to adjust the distribution of dental personnel capable of providing oral health services. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  12. Resource discovery algorithm based on hierarchical model and Conscious search in Grid computing system

    Directory of Open Access Journals (Sweden)

    Nasim Nickbakhsh

    2017-03-01

    Full Text Available The distributed Grid system shares non-homogeneous resources at a vast scale in a dynamic manner. The manner of resource discovery strongly influences the efficiency and quality of the system's functionality. The “Bitmap” model is based on a hierarchical and conscious search model that allows for less traffic and a lower number of messages relative to other methods in this respect. The proposed method is likewise based on the hierarchical and conscious search model and enhances the Bitmap method with the objectives of reducing traffic, reducing the load of resource management processing, reducing the number of messages generated by resource discovery, and increasing the resource discovery speed. The proposed method and the Bitmap method are simulated with the Arena tool. The proposed model is abbreviated as RNTL.
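
    An illustrative Python sketch of the general bitmap idea (not the paper's RNTL algorithm): encoding each node's resource attributes as a bitmask reduces a discovery query to one cheap bitwise test per node, which is why such schemes generate little traffic. The attribute names and nodes are hypothetical:

        # Each attribute occupies one bit; a node advertises the OR of its bits.
        ATTRS = {"cpu>2GHz": 1 << 0, "ram>4GB": 1 << 1, "gpu": 1 << 2, "matlab": 1 << 3}

        def bitmap(attrs):
            mask = 0
            for a in attrs:
                mask |= ATTRS[a]
            return mask

        nodes = {"n1": bitmap(["cpu>2GHz", "ram>4GB"]),
                 "n2": bitmap(["cpu>2GHz", "ram>4GB", "gpu"])}
        query = bitmap(["ram>4GB", "gpu"])

        # A node matches when every queried bit is set in its bitmap.
        print([n for n, m in nodes.items() if m & query == query])   # -> ['n2']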

  13. A Simple and Resource-efficient Setup for the Computer-aided Drug Design Laboratory.

    Science.gov (United States)

    Moretti, Loris; Sartori, Luca

    2016-10-01

    Undertaking modelling investigations for Computer-Aided Drug Design (CADD) requires a proper environment. In principle, this could be done on a single computer, but the reality of a drug discovery program requires robustness and high-throughput computing (HTC) to efficiently support the research. Therefore, a more capable alternative is needed, but its implementation has no widespread solution. Here, the realization of such a computing facility is discussed; from general layout to technical details, all aspects are covered. © 2016 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Project Final Report: Ubiquitous Computing and Monitoring System (UCoMS) for Discovery and Management of Energy Resources

    Energy Technology Data Exchange (ETDEWEB)

    Tzeng, Nian-Feng; White, Christopher D.; Moreman, Douglas

    2012-07-14

    The UCoMS research cluster has spearheaded three research areas since August 2004, including wireless and sensor networks, Grid computing, and petroleum applications. The primary goals of UCoMS research are three-fold: (1) creating new knowledge to push forward the technology forefronts on pertinent research on the computing and monitoring aspects of energy resource management, (2) developing and disseminating software codes and toolkits for the research community and the public, and (3) establishing system prototypes and testbeds for evaluating innovative techniques and methods. Substantial progress and diverse accomplishments have been made by research investigators in their respective areas of expertise, working cooperatively on such topics as sensors and sensor networks, wireless communication and systems, and computational Grids, particularly as relevant to petroleum applications.

  15. Energy-Efficient Management of Data Center Resources for Cloud Computing: A Vision, Architectural Elements, and Open Challenges

    CERN Document Server

    Buyya, Rajkumar; Abawajy, Jemal

    2010-01-01

    Cloud computing is offering utility-oriented IT services to users worldwide. Based on a pay-as-you-go model, it enables hosting of pervasive applications from consumer, scientific, and business domains. However, data centers hosting Cloud applications consume huge amounts of energy, contributing to high operational costs and carbon footprints to the environment. Therefore, we need Green Cloud computing solutions that can not only save energy for the environment but also reduce operational costs. This paper presents vision, challenges, and architectural elements for energy-efficient management of Cloud computing environments. We focus on the development of dynamic resource provisioning and allocation algorithms that consider the synergy between various data center infrastructures (i.e., the hardware, power units, cooling and software), and holistically work to boost data center energy efficiency and performance. In particular, this paper proposes (a) architectural principles for energy-efficient management of ...

  16. Computational intelligence in gait research: a perspective on current applications and future challenges.

    Science.gov (United States)

    Lai, Daniel T H; Begg, Rezaul K; Palaniswami, Marimuthu

    2009-09-01

    Our mobility is an important daily requirement so much so that any disruption to it severely degrades our perceived quality of life. Studies in gait and human movement sciences, therefore, play a significant role in maintaining the well-being of our mobility. Current gait analysis involves numerous interdependent gait parameters that are difficult to adequately interpret due to the large volume of recorded data and lengthy assessment times in gait laboratories. A proposed solution to these problems is computational intelligence (CI), which is an emerging paradigm in biomedical engineering most notably in pathology detection and prosthesis design. The integration of CI technology in gait systems facilitates studies in disorders caused by lower limb defects, cerebral disorders, and aging effects by learning data relationships through a combination of signal processing and machine learning techniques. Learning paradigms, such as supervised learning, unsupervised learning, and fuzzy and evolutionary algorithms, provide advanced modeling capabilities for biomechanical systems that in the past have relied heavily on statistical analysis. CI offers the ability to investigate nonlinear data relationships, enhance data interpretation, design more efficient diagnostic methods, and extrapolate model functionality. These are envisioned to result in more cost-effective, efficient, and easy-to-use systems, which would address global shortages in medical personnel and rising medical costs. This paper surveys current signal processing and CI methodologies followed by gait applications ranging from normal gait studies and disorder detection to artificial gait simulation. We review recent systems focusing on the existing challenges and issues involved in making them successful. We also examine new research in sensor technologies for gait that could be combined with these intelligent systems to develop more effective healthcare solutions.

  17. Distributed computing in bioinformatics.

    Science.gov (United States)

    Jain, Eric

    2002-01-01

    This paper provides an overview of methods and current applications of distributed computing in bioinformatics. Distributed computing is a strategy of dividing a large workload among multiple computers to reduce processing time, or to make use of resources such as programs and databases that are not available on all computers. Participating computers may be connected either through a local high-speed network or through the Internet.
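
    A toy Python illustration of the workload-splitting strategy described above, using local cores as a stand-in for a cluster; the gc_content scoring function is a hypothetical example task:

        from multiprocessing import Pool

        def gc_content(seq):
            """Fraction of G/C bases in a DNA sequence (example workload)."""
            return (seq.count("G") + seq.count("C")) / len(seq)

        if __name__ == "__main__":
            sequences = ["ATGGCC", "TTTTAA", "GCGCGC", "ATATGC"] * 1000
            with Pool(processes=4) as pool:       # divide work over 4 workers
                scores = pool.map(gc_content, sequences)
            print(sum(scores) / len(scores))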

  18. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  19. University Students and Ethics of Computer Technology Usage: Human Resource Development

    Science.gov (United States)

    Iyadat, Waleed; Iyadat, Yousef; Ashour, Rateb; Khasawneh, Samer

    2012-01-01

    The primary purpose of this study was to determine the level of students' awareness about computer technology ethics at the Hashemite University in Jordan. A total of 180 university students participated in the study by completing the questionnaire designed by the researchers, named the Computer Technology Ethics Questionnaire (CTEQ). Results…

  20. Using Free Computational Resources to Illustrate the Drug Design Process in an Undergraduate Medicinal Chemistry Course

    Science.gov (United States)

    Rodrigues, Ricardo P.; Andrade, Saulo F.; Mantoani, Susimaire P.; Eifler-Lima, Vera L.; Silva, Vinicius B.; Kawano, Daniel F.

    2015-01-01

    Advances in, and dissemination of, computer technologies in the field of drug research now enable the use of molecular modeling tools to teach important concepts of drug design to chemistry and pharmacy students. A series of computer laboratories is described to introduce undergraduate students to commonly adopted "in silico" drug design…

  1. University Students and Ethics of Computer Technology Usage: Human Resource Development

    Science.gov (United States)

    Iyadat, Waleed; Iyadat, Yousef; Ashour, Rateb; Khasawneh, Samer

    2012-01-01

    The primary purpose of this study was to determine the level of students' awareness about computer technology ethics at the Hashemite University in Jordan. A total of 180 university students participated in the study by completing the questionnaire designed by the researchers, named the Computer Technology Ethics Questionnaire (CTEQ). Results…

  2. Assessment of knowledge and awareness among radiology personnel regarding current computed tomography technology and radiation dose

    Science.gov (United States)

    Karim, M. K. A.; Hashim, S.; Bradley, D. A.; Bahruddin, N. A.; Ang, W. C.; Salehhon, N.

    2016-03-01

    In this paper, we evaluate the level of knowledge and awareness concerning Computed Tomography (CT) technology and radiation doses among 120 radiology personnel working in 7 public hospitals in Johor, Malaysia, based on a set of questionnaires. Subjects were divided into two groups (medical professions (Med, n=32) and allied health professions (AH, n=88)). The questionnaires addressed: (1) demographic data, (2) relative radiation dose, and (3) knowledge of current CT technology. One-third of respondents from both groups were able to estimate the relative radiation dose for routine CT examinations. 68% of the allied health profession personnel knew of the Malaysian regulations entitled ‘Basic Safety Standard (BSS) 2010’, although notably 80% of them had previously attended a radiation protection course. No significant difference (p > 0.05) in mean scores of CT technology knowledge was detected between the two groups, with the medical professions producing a mean score of 26.7 ± 2.7 and the allied health professions a mean score of 25.2 ± 4.3. This study points to considerable variation among the respondents concerning their knowledge and awareness of radiation risks and CT optimization techniques.

  3. Current knowledge on tumour induction by computed tomography should be carefully used

    Energy Technology Data Exchange (ETDEWEB)

    Candela-Juan, Cristian [La Fe University and Polytechnic Hospital, Radioprotection Department, Valencia (Spain); Montoro, Alegria; Villaescusa, Juan Ignacio [La Fe University and Polytechnic Hospital, Radioprotection Department, Valencia (Spain); IIS La Fe, Biomedical Imaging Research Group GIBI230, Valencia (Spain); Ruiz-Martinez, Enrique; Marti-Bonmati, Luis [IIS La Fe, Biomedical Imaging Research Group GIBI230, Valencia (Spain); La Fe University and Polytechnic Hospital, Department of Radiology, Valencia (Spain)

    2014-03-15

    Risks associated with ionising radiation from medical imaging techniques have attracted the attention of the medical community and the general population. The aim is to determine the probability that a tumour is induced as a result of a computed tomography (CT) examination, since CT nowadays makes the biggest contribution to the collective dose. Several models of cancer induction have been reported in the literature, with diametrically different implications. This article reviews those models, focusing on the ones used by the scientific community to estimate CT detriments. Current estimates of the probability that a CT examination induces cancer are reported, highlighting its low magnitude (near the background level) and large sources of uncertainty. From this objective review, it is concluded that epidemiological data with more accurate dosimetric estimates are needed. Prediction of the number of tumours that will be induced in a population exposed to ionising radiation should be avoided or, if given, it should be accompanied by a realistic evaluation of its uncertainty and of the advantages of CT. Otherwise, such predictions may have a negative impact on both the medical community and patients. Reducing doses even further is not justified if that compromises clinical image quality in a necessary investigation. (orig.)

  4. Taper Preparation Variability Compared to Current Taper Standards Using Computed Tomography

    Directory of Open Access Journals (Sweden)

    Richard Gergi

    2012-01-01

    Full Text Available Introduction. The purpose of this study was to compare the taper variation in root canal preparations among Twisted Files and PathFiles-ProTaper .08 tapered rotary files to current standards. Methods. 60 root canals with severe angle of curvature (between 25° and 35°) and short radius (<10 mm) were selected. The canals were divided randomly into two groups of 30 each. After preparation with Twisted Files and PathFiles-ProTaper to size 25 taper .08, the diameter was measured using computed tomography (CT) at 1, 3, and 16 mm. Canal taper preparation was calculated at the apical third and at the middle-cervical third. Results. Of the 2 file systems, both fell within the ±.05 taper variability. All preparations demonstrated variability when compared to the nominal taper .08. In the apical third, mean taper was significantly different between TF and PathFiles-ProTaper (P < 0.0001; independent t-test). Mean taper was significantly higher with PathFile-ProTaper. In the middle-cervical third, mean taper was significantly higher with TF (P = 0.015; independent t-test). Conclusion. Taper preparations of the investigated size 25 taper .08 files were favorable but different from the nominal taper.
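
    Presumably the taper between two measured cross-sections is the change in canal diameter divided by the distance between the measurement levels; for the apical third (the 1 mm and 3 mm levels) this standard definition, not quoted from the paper, reads:

        \mathrm{taper} \;=\; \frac{d_{3\,\mathrm{mm}} - d_{1\,\mathrm{mm}}}{3\,\mathrm{mm} - 1\,\mathrm{mm}} .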

  5. Teacher Perspectives on the Current State of Computer Technology Integration into the Public School Classroom

    Science.gov (United States)

    Zuniga, Ramiro

    2009-01-01

    Since the introduction of computers into the public school arena over forty years ago, educators have been convinced that the integration of computer technology into the public school classroom will transform education. Joining educators are state and federal governments. Public schools and others involved in the process of computer technology…

  6. Clinical utility of dental cone-beam computed tomography: current perspectives

    Directory of Open Access Journals (Sweden)

    Jaju PP

    2014-04-01

    Full Text Available Prashant P Jaju,1 Sushma P Jaju2 (1Oral Medicine and Radiology, 2Conservative Dentistry and Endodontics, Rishiraj College of Dental Sciences and Research Center, Bhopal, India). Abstract: Panoramic radiography and computed tomography were the pillars of maxillofacial diagnosis. With the advent of cone-beam computed tomography, dental practice has seen a paradigm shift. This review article highlights the potential applications of cone-beam computed tomography in the fields of dental implantology and forensic dentistry, and its limitations in maxillofacial diagnosis. Keywords: dental implants, cone-beam computed tomography, panoramic radiography, computed tomography

  7. Toward Production From Gas Hydrates: Current Status, Assessment of Resources, and Simulation-Based Evaluation of Technology and Potential

    Energy Technology Data Exchange (ETDEWEB)

    Reagan, Matthew; Moridis, George J.; Collett, Timothy; Boswell, Ray; Kurihara, M.; Reagan, Matthew T.; Koh, Carolyn; Sloan, E. Dendy

    2008-02-12

    Gas hydrates are a vast energy resource with global distribution in the permafrost and in the oceans. Even if conservative estimates are considered and only a small fraction is recoverable, the sheer size of the resource is so large that it demands evaluation as a potential energy source. In this review paper, we discuss the distribution of natural gas hydrate accumulations, the status of the primary international R&D programs, and the remaining science and technological challenges facing commercialization of production. After a brief examination of gas hydrate accumulations that are well characterized and appear to be models for future development and gas production, we analyze the role of numerical simulation in the assessment of the hydrate production potential, identify the data needs for reliable predictions, evaluate the status of knowledge with regard to these needs, discuss knowledge gaps and their impact, and reach the conclusion that the numerical simulation capabilities are quite advanced and that the related gaps are either not significant or are being addressed. We review the current body of literature relevant to potential productivity from different types of gas hydrate deposits, and determine that there are consistent indications of a large production potential at high rates over long periods from a wide variety of hydrate deposits. Finally, we identify (a) features, conditions, geology and techniques that are desirable in potential production targets, (b) methods to maximize production, and (c) some of the conditions and characteristics that render certain gas hydrate deposits undesirable for production.

  8. Dynamic resource allocation engine for cloud-based real-time video transcoding in mobile cloud computing environments

    Science.gov (United States)

    Adedayo, Bada; Wang, Qi; Alcaraz Calero, Jose M.; Grecos, Christos

    2015-02-01

    The recent explosion in video-related Internet traffic has been driven by the widespread use of smart mobile devices, particularly smartphones with advanced cameras that are able to record high-quality videos. Although many of these devices offer the facility to record videos at different spatial and temporal resolutions, primarily with local storage considerations in mind, most users only ever use the highest quality settings. The vast majority of these devices are optimised for compressing the acquired video using a single built-in codec and have neither the computational resources nor battery reserves to transcode the video to alternative formats. This paper proposes a new low-complexity dynamic resource allocation engine for cloud-based video transcoding services that are both scalable and capable of being delivered in real-time. Firstly, through extensive experimentation, we establish resource requirement benchmarks for a wide range of transcoding tasks. The set of tasks investigated covers the most widely used input formats (encoder type, resolution, amount of motion and frame rate) associated with mobile devices and the most popular output formats derived from a comprehensive set of use cases, e.g. a mobile news reporter directly transmitting videos to the TV audience of various video format requirements, with minimal usage of resources both at the reporter's end and at the cloud infrastructure end for transcoding services.

  9. Methods of resource management and applications in computing systems based on cloud technology

    Directory of Open Access Journals (Sweden)

    Карина Андріївна Мацуєва

    2015-07-01

    Full Text Available This article describes the methods of resource management and applications that are part of an information system for science research (ISSR). The control model of requests in the ISSR is given, and results from operating a real cloud system using the additional load-distribution module programmed in Python are presented.

  10. Recommended Computer End-User Skills for Business Students by Fortune 500 Human Resource Executives.

    Science.gov (United States)

    Zhao, Jensen J.

    1996-01-01

    Human resources executives (83 responses from 380) strongly recommended 11 and recommended 46 end-user skills for business graduates. Core skills included use of keyboard, mouse, microcomputer, and printer; Windows; Excel; and telecommunications functions (electronic mail, Internet, local area networks, downloading). Knowing one application of…

  11. Recommendations for protecting National Library of Medicine Computing and Networking Resources

    Energy Technology Data Exchange (ETDEWEB)

    Feingold, R.

    1994-11-01

    Protecting Information Technology (IT) involves a number of interrelated factors. These include mission, available resources, technologies, existing policies and procedures, internal culture, contemporary threats, and strategic enterprise direction. In the face of this formidable list, a structured approach provides cost-effective actions that allow the organization to manage its risks. We face fundamental challenges that will persist for at least the next several years. It is difficult if not impossible to precisely quantify risk. IT threats and vulnerabilities change rapidly and continually. Limited organizational resources combined with mission restraints, such as availability and connectivity requirements, will ensure that most systems will not be absolutely secure (if such security were even possible). In short, there is no technical (or administrative) "silver bullet." Protection means employing a stratified series of recommendations, matching protection levels against information sensitivities. Adaptive and flexible risk management is the key to effective protection of IT resources. The cost of the protection must be kept less than the expected loss, and one must take into account that an adversary will not expend more to attack a resource than the value of its compromise to that adversary. Notwithstanding the difficulty if not impossibility of precisely quantifying risk, the aforementioned allows us to avoid the trap of choosing a course of action simply because "it's safer" or ignoring an area because no one had explored its potential risk. Recommendations for protecting IT resources begin with a discussion of contemporary threats and vulnerabilities, and then proceed from general to specific preventive measures. From a risk management perspective, it is imperative to understand that today the vast majority of threats are against UNIX hosts connected to the Internet.
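
    The cost rule above (keep protection cost below the expected loss) can be made concrete with the usual annualized-loss arithmetic; all figures below are hypothetical:

        # Annualized loss expectancy (ALE) = single loss expectancy (SLE)
        #                                  x annual rate of occurrence (ARO).
        single_loss_expectancy = 50_000      # dollars lost per incident
        annual_rate_of_occurrence = 0.3      # expected incidents per year
        annual_loss_expectancy = single_loss_expectancy * annual_rate_of_occurrence

        control_cost = 10_000                # dollars per year for the safeguard
        print(annual_loss_expectancy)        # 15000.0
        print("deploy control:", control_cost < annual_loss_expectancy)   # True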

  12. Brain computer interfaces for neurorehabilitation – its current status as a rehabilitation strategy post-stroke.

    Science.gov (United States)

    van Dokkum, L E H; Ward, T; Laffont, I

    2015-02-01

    The idea of using brain computer interfaces (BCI) for rehabilitation emerged relatively recently. Basically, BCI for neurorehabilitation involves the recording and decoding of local brain signals generated by the patient as he or she tries to perform a particular task (even if imperfectly), or during a mental imagery task. The main objective is to promote the recruitment of the selected brain areas involved and to facilitate neural plasticity. The recorded signal can be used in several ways: (i) to objectify and strengthen motor imagery-based training, by providing the patient feedback on the imagined motor task, for example, in a virtual environment; (ii) to generate a desired motor task via functional electrical stimulation or rehabilitative robotic orthoses attached to the patient's limb, encouraging and optimizing task execution as well as "closing" the disrupted sensorimotor loop by giving the patient the appropriate sensory feedback; (iii) to understand cerebral reorganizations after lesion, in order to influence or even quantify plasticity-induced changes in brain networks. For example, applying cerebral stimulation to re-equilibrate inter-hemispheric imbalance, as shown by functional recording of brain activity during movement, may help recovery. The potential usefulness of BCI for a patient population has been demonstrated on various levels, and the diversity of interface applications makes it adaptable to a large population. The position and status of these very new rehabilitation systems should now be considered with respect to our current, more or less validated, traditional methods, as well as in the light of the wide range of possible brain damage. The heterogeneity in post-damage expression inevitably complicates the decoding of brain signals and thus their use in pathological conditions, calling for controlled clinical trials.

  13. Coronary computed tomography angiography: overview of technical aspects, current concepts, and perspectives

    Energy Technology Data Exchange (ETDEWEB)

    Chartrand-Lefebvre, C.; Cadrin-Chenevert, A. [Univ. of Montreal Medical Centre, Radiology Dept., Montreal, Quebec (Canada)]. E-mail: chartrandlef@videotron.ca; Bordeleau, E. [Univ. of Montreal Medical Centre, Radiology Dept., Montreal, Quebec (Canada); Hopital Laval, St. Foy, Quebec (Canada); Ugolini, P.; Ouellet, R. [Montreal Inst. of Cardiology, Montreal, Quebec (Canada); Sablayrolles, J.-L. [Centre Cardiologique du Nord, Paris (France); Prenovault, J. [Univ. of Montreal Medical Centre, Radiology Dept., Montreal, Quebec (Canada)

    2007-04-15

    Multidetector-row electrocardiogram (ECG)-gated cardiac computed tomography (CT) will probably be a major noninvasive imaging option in the near future. Recent developments indicate that this new technology is improving rapidly. This article presents an overview of the current concepts, perspectives, and technical capabilities in coronary CT angiography (CTA). We have reviewed the recent literature on the different applications of this technology; of particular note are the many studies that have demonstrated the high negative predictive value (NPV) of coronary CTA, when performed under optimal conditions, for significant stenoses in native coronary arteries. This new technology's level of performance allows it to be used to evaluate the presence of calcified plaques, coronary bypass graft patency, and the origin and course of congenital coronary anomalies. Despite a high NPV, the robustness of the technology is limited by arrhythmias, the requirement of low heart rates, and calcium-related artifacts. Some improvements are needed in the imaging of coronary stents, especially the smaller stents, and in the detection and characterization of noncalcified plaques. Further studies are needed to more precisely determine the role of CTA in various symptomatic and asymptomatic patient groups. Clinical testing of 64-slice scanners has recently begun. As the technology improves, so does the spatial and temporal resolution. To date, this is being achieved through the development of systems with an increased number of detectors and shorter gantry rotation times, as well as the development of systems equipped with 2 X-ray tubes and the eventual development of flat-panel technology. Thus further improvement of image quality is expected. (author)

  14. Mediating Role of Psychological Resources on the Association Between Childhood Socioeconomic Status and Current Health in the Community Adult Population of Japan.

    Science.gov (United States)

    Kan, Chiemi; Kawakami, Norito; Umeda, Maki

    2015-12-01

    The majority of studies on the role of psychological resources linking childhood socioeconomic status (SES) and adult health have been conducted in Western countries. Empirical evidence for mediation effects of psychological resources is currently lacking in Japan. The purpose of this study was to investigate the mediating effect of psychological resources (mastery and sense of coherence [SOC]) on the association between childhood SES and current health. Analyses were conducted on cross-sectional data (1,497 men and 1,764 women) from the Japanese Study of Stratification, Health, Income, and Neighborhood Study (J-SHINE) in Tokyo. Psychological resources (mastery and SOC), childhood SES (parents' education and perceived childhood SES), and current health of adults (psychological distress measured by K6 and self-rated health) were measured using a self-report questionnaire. Mastery and SOC significantly and independently mediated the association between childhood SES and current health in the total sample after adjusting for age, gender, and respondent education, regardless of type of SES or health outcome indicators. Similar mediation effects were observed for both men and women. A few gender differences were observed; specifically, SOC significantly mediated the association between parents' education and current health only among women, and it mediated the association between perceived childhood SES and current health only among men. Overall, the findings underscore the importance of the mediating role of psychological resources in the association between childhood SES and current health.

  15. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operation has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  16. Aggregating Data for Computational Toxicology Applications: The U.S. Environmental Protection Agency (EPA) Aggregated Computational Toxicology Resource (ACToR) System

    Directory of Open Access Journals (Sweden)

    Elaine A. Cohen Hubal

    2012-02-01

    Full Text Available Computational toxicology combines data from high-throughput test methods, chemical structure analyses and other biological domains (e.g., genes, proteins, cells, tissues) with the goals of predicting and understanding the underlying mechanistic causes of chemical toxicity and of predicting the toxicity of new chemicals and products. A key feature of such approaches is their reliance on knowledge extracted from large collections of data and data sets in computable formats. The U.S. Environmental Protection Agency (EPA) has developed a large data resource called ACToR (Aggregated Computational Toxicology Resource) to support these data-intensive efforts. ACToR comprises four main repositories: core ACToR (chemical identifiers and structures, and summary data on hazard, exposure, use, and other domains), ToxRefDB (Toxicity Reference Database, a compilation of detailed in vivo toxicity data from guideline studies), ExpoCastDB (detailed human exposure data from observational studies of selected chemicals), and ToxCastDB (data from high-throughput screening programs, including links to underlying biological information related to genes and pathways). The EPA DSSTox (Distributed Structure-Searchable Toxicity) program provides expert-reviewed chemical structures and associated information for these and other high-interest public inventories. Overall, the ACToR system contains information on about 400,000 chemicals from 1100 different sources. The entire system is built using open source tools and is freely available to download. This review describes the organization of the data repository and provides selected examples of use cases.

  17. Distributed Factorization Computation on Multiple Volunteered Mobile Resource to Break RSA Key

    Science.gov (United States)

    Jaya, I.; Hardi, S. M.; Tarigan, J. T.; Zamzami, E. M.; Sihombing, P.

    2017-01-01

    Similar to other common asymmetric encryption schemes, RSA can be cracked by using a series of mathematical calculations. The private key used to decrypt the message can be computed from the public key. However, finding the private key may require a massive amount of calculation. In this paper, we propose a method to perform distributed computing to calculate RSA's private key. The proposed method uses multiple volunteered mobile devices to contribute during the calculation process. Our objective is to demonstrate how the use of volunteer computing on mobile devices may be a feasible option to reduce the time required to break a weak RSA encryption, and to observe the behavior and running time of the application on mobile devices.
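
    A toy Python sketch of the idea: split trial division of a deliberately weak RSA modulus across worker processes, then derive the private exponent from the recovered factors. The modulus size is illustrative only; real RSA moduli are far beyond this approach:

        import math
        from concurrent.futures import ProcessPoolExecutor

        def scan(args):
            """Search one slice of odd candidates for a factor of n."""
            n, lo, hi = args
            for p in range(lo | 1, hi, 2):
                if n % p == 0:
                    return p
            return None

        def crack(n, e, workers=4):
            limit = math.isqrt(n) + 1
            step = limit // workers + 1
            chunks = [(n, max(3, i * step), min(limit, (i + 1) * step))
                      for i in range(workers)]
            with ProcessPoolExecutor(workers) as ex:
                p = next(r for r in ex.map(scan, chunks) if r)   # first hit wins
            q = n // p
            return pow(e, -1, (p - 1) * (q - 1))                 # private exponent d

        if __name__ == "__main__":
            print(crack(n=2003 * 2011, e=17))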

  18. Efficient Resource Matching in Heterogeneous Grid Using Resource Vector

    CERN Document Server

    Addepallil, Srirangam V; Barnes, George L; 10.5121/ijcsit.2010.2301

    2010-01-01

    In this paper, a method for efficient scheduling to obtain optimum job throughput in a distributed campus grid environment is presented. Traditional job schedulers determine job scheduling using user and job resource attributes. User attributes are related to current usage, historical usage, user priority and project access. Job resource attributes mainly comprise soft requirements (compilers, libraries) and hard requirements like memory, storage and interconnect. A job scheduler dispatches jobs to a resource if a job's hard and soft requirements are met by that resource. In the current scenario, if a resource becomes unavailable during execution of a job, schedulers are presented with limited options, namely re-queuing the job or migrating the job to a different resource. Both options are expensive in terms of data and compute time. These situations can be avoided if the often-ignored factor, the availability time of a resource in a grid environment, is considered. We propose a resource rank approach, in which jobs are dispat...
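
    A hedged Python sketch of vector-style matching along these lines: a job is eligible only on resources whose attribute vector meets every hard requirement, and eligible resources are then ranked by their expected availability time. Field names and numbers are hypothetical:

        job = {"mem_gb": 8, "cores": 4}
        resources = [
            {"name": "r1", "mem_gb": 16, "cores": 8,  "avail_hours": 2.0},
            {"name": "r2", "mem_gb": 32, "cores": 16, "avail_hours": 48.0},
            {"name": "r3", "mem_gb": 4,  "cores": 8,  "avail_hours": 72.0},
        ]

        # Hard-requirement filter, then rank by how long the resource stays up.
        eligible = [r for r in resources
                    if r["mem_gb"] >= job["mem_gb"] and r["cores"] >= job["cores"]]
        ranked = sorted(eligible, key=lambda r: -r["avail_hours"])
        print([r["name"] for r in ranked])   # -> ['r2', 'r1']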

  19. A mathematical model for a distributed attack on targeted resources in a computer network

    Science.gov (United States)

    Haldar, Kaushik; Mishra, Bimal Kumar

    2014-09-01

    A mathematical model has been developed to analyze the spread of a distributed attack on critical targeted resources in a network. The model provides an epidemic framework with two sub-frameworks to consider the difference between the overall behavior of the attacking hosts and the targeted resources. The analysis focuses on obtaining threshold conditions that determine the success or failure of such attacks. Considering the criticality of the systems involved and the strength of the defence mechanism in place, a measure has been suggested that highlights the level of success achieved by the attacker. To understand the overall dynamics of the system in the long run, its equilibrium points have been obtained and their stability has been analyzed, and conditions for their stability have been outlined.
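
    A generic SIR-style sketch in Python to make the threshold idea concrete; this is a standard compartment model, not the paper's two-sub-framework formulation, and all parameters are hypothetical:

        import numpy as np
        from scipy.integrate import odeint

        beta, gamma = 0.4, 0.1        # infection and clean-up (recovery) rates
        print("R0 =", beta / gamma)   # the attack can spread when R0 > 1

        def sir(y, t):
            s, i, r = y
            return [-beta * s * i, beta * s * i - gamma * i, gamma * i]

        t = np.linspace(0, 100, 200)
        s, i, r = odeint(sir, [0.99, 0.01, 0.0], t).T
        print("peak infected fraction:", round(i.max(), 3))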

  20. Why We Should No Longer Only Repair, Polish and Iron Current Computer Science Educations.

    Science.gov (United States)

    Gruska, Jozef

    1993-01-01

    Describes shortcomings of computer science/engineering education and explains a new focus on informatics. Highlights include simulation, visualization, algorithmization, design of information processing models, parallel computing, a history of informatics, informatics versus physics and mathematics, and implications for education. (51 references)…

  1. Just Scan It!-Weapon Reconstruction in Computed Tomography on Historical and Current Swiss Military Guns.

    Science.gov (United States)

    Franckenberg, Sabine; Binder, Thomas; Bolliger, Stephan; Thali, Michael J; Ross, Steffen G

    2016-09-01

    Cross-sectional imaging, such as computed tomography, has been increasingly implemented in both historic and recent postmortem forensic investigations. It aids in determining cause and manner of death as well as in correlating injuries to possible weapons. This study illuminates the feasibility of reconstructing guns in computed tomography and gives a distinct overview of historic and recent Swiss Army guns.

  2. Sustainable supply chain management through enterprise resource planning (ERP): a model of sustainable computing

    OpenAIRE

    Broto Rauth Bhardwaj

    2015-01-01

    Green supply chain management (GSCM) is a driver of sustainable strategy. This topic is becoming increasingly important for both academia and industry. With the increasing demand for reducing carbon footprints, there is a need to study the drivers of sustainable development. There is also a need to develop a sustainability model. Using resource-based theory (RBT), the present model for sustainable strategy has been developed. On the basis of the data collected, the key drivers of sustainabili...

  3. Development of a Computational Framework for Stochastic Co-optimization of Water and Energy Resource Allocations under Climatic Uncertainty

    Science.gov (United States)

    Xuan, Y.; Mahinthakumar, K.; Arumugam, S.; DeCarolis, J.

    2015-12-01

    Owing to the lack of a consistent approach to assimilate probabilistic forecasts for water and energy systems, utilization of climate forecasts for conjunctive management of these two systems is very limited. Prognostic management of these two systems presents a stochastic co-optimization problem that seeks to determine reservoir releases and power allocation strategies while minimizing the expected operational costs subject to probabilistic climate forecast constraints. To address these issues, we propose a high performance computing (HPC) enabled computational framework for stochastic co-optimization of water and energy resource allocations under climate uncertainty. The computational framework embodies a new paradigm shift in which attributes of climate (e.g., precipitation, temperature) and its forecasted probability distribution are employed conjointly to inform seasonal water availability and electricity demand. The HPC enabled cyberinfrastructure framework is developed to perform detailed stochastic analyses, and to better quantify and reduce the uncertainties associated with water and power systems management by utilizing improved hydro-climatic forecasts. In this presentation, our stochastic multi-objective solver, extended from Optimus (Optimization Methods for Universal Simulators), is introduced. The solver uses a parallel cooperative multi-swarm method for the efficient solution of large-scale simulation-optimization problems on parallel supercomputers. The cyberinfrastructure harnesses HPC resources to perform intensive computations using ensemble forecast models of streamflow and power demand. The stochastic multi-objective particle swarm optimizer we developed is used to co-optimize water and power system models under constraints over a large number of ensembles. The framework sheds light on the application of climate forecasts and a cyber-innovation framework to improve management and promote the sustainability of water and energy systems.
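
    A minimal single-objective PSO sketch in Python, to illustrate the swarm search that the framework above applies in a far more elaborate multi-swarm, multi-objective form; the objective function is a stand-in:

        import numpy as np

        def pso(f, dim=2, n=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
            rng = np.random.default_rng(seed)
            x = rng.uniform(-5, 5, (n, dim))     # particle positions
            v = np.zeros((n, dim))               # particle velocities
            pbest = x.copy()                     # per-particle best positions
            pval = np.apply_along_axis(f, 1, x)
            gbest = pbest[pval.argmin()].copy()  # swarm-wide best position
            for _ in range(iters):
                r1, r2 = rng.random((2, n, dim))
                v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
                x = x + v
                val = np.apply_along_axis(f, 1, x)
                better = val < pval
                pbest[better], pval[better] = x[better], val[better]
                gbest = pbest[pval.argmin()].copy()
            return gbest, pval.min()

        print(pso(lambda z: (z ** 2).sum()))     # converges near the origin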

  4. Computer and Video Games in Family Life: The Digital Divide as a Resource in Intergenerational Interactions

    Science.gov (United States)

    Aarsand, Pal Andre

    2007-01-01

    In this ethnographic study of family life, intergenerational video and computer game activities were videotaped and analysed. Both children and adults invoked the notion of a digital divide, i.e. a generation gap between those who master and do not master digital technology. It is argued that the digital divide was exploited by the children to…

  5. Planning and Development of the Computer Resource at Baylor College of Medicine.

    Science.gov (United States)

    Ogilvie, W. Buckner, Jr.; And Others

    1979-01-01

    Describes the development and implementation of a plan at Baylor College of Medicine for providing computer support for both the administrative and scientific/research needs of the Baylor community. The cost-effectiveness of this plan is also examined. (Author/CMV)

  6. Computers for All Students: A Strategy for Universal Access to Information Resources.

    Science.gov (United States)

    Resmer, Mark; And Others

    This report proposes a strategy of putting networked computing devices into the hands of all students at institutions of higher education. It outlines the rationale for such a strategy, the options for financing, the required institutional support structure needed, and various implementation approaches. The report concludes that the resultant…

  7. The Portability of Computer-Related Educational Resources: An Overview of Issues and Directions.

    Science.gov (United States)

    Collis, Betty A.; De Diana, Italo

    1990-01-01

    Provides an overview of the articles in this special issue, which deals with the portability, or transferability, of educational computer software. Motivations for portable software relating to cost, personnel, and time are discussed, and factors affecting portability are described, including technical factors, educational factors, social/cultural…

  8. Use of cone beam computed tomography in implant dentistry: current concepts, indications and limitations for clinical practice and research.

    Science.gov (United States)

    Bornstein, Michael M; Horner, Keith; Jacobs, Reinhilde

    2017-02-01

    Diagnostic radiology is an essential component of treatment planning in the field of implant dentistry. This narrative review presents current concepts for the use of cone beam computed tomography imaging, before and after implant placement, in daily clinical practice and research. Guidelines for the selection of three-dimensional imaging are discussed, and limitations are highlighted. Current concepts of radiation dose optimization, including novel imaging modalities using low-dose protocols, are presented. For preoperative cross-sectional imaging, there are still no data demonstrating that cone beam computed tomography results in fewer intraoperative complications, such as nerve damage or bleeding incidents, or that implants inserted using preoperative cone beam computed tomography data sets for planning purposes exhibit higher survival or success rates. The use of cone beam computed tomography following the insertion of dental implants should be restricted to specific postoperative complications, such as damage to neurovascular structures or postoperative infections in relation to the maxillary sinus. Regarding peri-implantitis, the diagnosis and severity of the disease should be evaluated primarily on the basis of clinical parameters and of radiological findings from two-dimensional periapical radiographs. The use of cone beam computed tomography scans in clinical research might not yield any evident beneficial effect for the patients included. As many of the cone beam computed tomography scans performed for research have no direct therapeutic consequence, dose optimization measures should be implemented by using appropriate exposure parameters and by reducing the field of view to the actual region of interest.

  9. Ideas on Learning a New Language Intertwined with the Current State of Natural Language Processing and Computational Linguistics

    Science.gov (United States)

    Snyder, Robin M.

    2015-01-01

    In 2014, in conjunction with doing research in natural language processing and attending a global conference on computational linguistics, the author decided to learn a new foreign language, Greek, that uses a non-English character set. This paper/session will present/discuss an overview of the current state of natural language processing and…

  10. Toward production from gas hydrates: Current status, assessment of resources, and simulation-based evaluation of technology and potential

    Science.gov (United States)

    Moridis, G.J.; Collett, T.S.; Boswell, R.; Kurihara, M.; Reagan, M.T.; Koh, C.; Sloan, E.D.

    2009-01-01

    Gas hydrates (GHs) are a vast energy resource with global distribution in the permafrost and in the oceans. Even if conservative estimates are considered and only a small fraction is recoverable, the sheer size of the resource is so large that it demands evaluation as a potential energy source. In this review paper, we discuss the distribution of natural GH accumulations, the status of the primary international research and development (R&D) programs, and the remaining science and technological challenges facing the commercialization of production. After a brief examination of GH accumulations that are well characterized and appear to be models for future development and gas production, we analyze the role of numerical simulation in the assessment of the hydrate-production potential, identify the data needs for reliable predictions, evaluate the status of knowledge with regard to these needs, discuss knowledge gaps and their impact, and reach the conclusion that the numerical-simulation capabilities are quite advanced and that the related gaps either are not significant or are being addressed. We review the current body of literature relevant to potential productivity from different types of GH deposits and determine that there are consistent indications of a large production potential at high rates across long periods from a wide variety of hydrate deposits. Finally, we identify (a) features, conditions, geology and techniques that are desirable in potential production targets; (b) methods to maximize production; and (c) some of the conditions and characteristics that render certain GH deposits undesirable for production. Copyright © 2009 Society of Petroleum Engineers.

  11. Synthesis of current data for Hg in areas of geologic resource extraction contamination and aquatic systems in China.

    Science.gov (United States)

    Qiu, Guangle; Feng, Xinbin; Jiang, Guibin

    2012-04-01

    China has become the largest contributor of anthropogenic atmospheric mercury (Hg) in the world owing to its fast-growing economy and the world's largest population. Over the last two decades, Hg has become of increasing environmental concern in China and much has been published on its distribution, transportation, methylation, and bioaccumulation in aquatic systems and in areas contaminated by geologic resource extraction, such as coal-fired power plants, non-ferrous smelters, Hg mining and retorting sites, Au amalgamation sites, landfills, and chemical plants. Environmental compartments such as soil, water, air, and crops from areas of geologic resource extraction contamination, especially from Hg mining regions, exhibit elevated values of total Hg and MMHg. Risk assessments indicate that the consumption of rice, which strongly bioaccumulates MMHg, has become the dominant pathway of MMHg exposure for inhabitants of Hg mining areas. Low total-Hg concentrations, below 5 ng L-1, can be observed in rivers in remote areas, whereas high total-Hg concentrations reaching 1600 ng L-1 can be found in rivers in industrial and urban areas. Studies of hydropower reservoirs in southwest China indicate that old reservoirs act as net sinks for total Hg and net sources of MMHg, while newly established ones act as net sinks for both total Hg and MMHg, in sharp contrast to the evolution of biomethylation in reservoirs established in the boreal belt of North America and Eurasia. Fish from those reservoirs have relatively low levels of total Hg, which do not exceed the maximum total-Hg limit of 0.5 mg kg-1 recommended by the WHO. Currently, however, there is still a large data gap regarding Hg even in the areas mentioned above in China, which results in poor understanding of its environmental biogeochemistry. Moreover, for a better understanding of human and environmental health effects caused by the fast-growing economy, long-term Hg monitoring campaigns are…

  12. Method and apparatus for offloading compute resources to a flash co-processing appliance

    Energy Technology Data Exchange (ETDEWEB)

    Tzelnic, Percy; Faibish, Sorin; Gupta, Uday K.; Bent, John; Grider, Gary Alan; Chen, Hsing-bung

    2015-10-13

    Solid-State Drive (SSD) burst buffer nodes are interposed into a parallel supercomputing cluster to enable fast burst checkpointing of cluster memory to or from nearby interconnected solid-state storage, with asynchronous migration between the burst buffer nodes and slower, more distant disk storage. The SSD nodes also perform tasks offloaded from the compute nodes or associated with the checkpoint data. For example, the data for the next job are preloaded into the SSD node and uploaded very rapidly to the respective compute node just before that job starts. During a job, the SSD nodes perform fast visualization and statistical analysis on the checkpoint data. The SSD nodes can also perform data reduction and encryption of the checkpoint data.
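
    The staging pattern described, a fast synchronous burst to SSD followed by an asynchronous drain to slower disk, can be sketched schematically in Python; the tier names, sizes and timings below are invented for illustration and bear no relation to the patented implementation.

      import queue, threading, time

      burst_buffer = queue.Queue()          # stands in for the SSD burst-buffer nodes

      def drain_to_disk():
          """Asynchronously migrate checkpoints from the burst buffer to distant disk."""
          while True:
              job_id, data = burst_buffer.get()
              if job_id is None:            # sentinel: no more checkpoints
                  break
              time.sleep(0.5)               # slower, more distant disk storage (simulated)
              print(f"checkpoint {job_id} migrated to disk ({len(data)} bytes)")

      migrator = threading.Thread(target=drain_to_disk)
      migrator.start()

      for job_id in range(3):
          data = bytes(1024)                # compute-node memory image (simulated)
          burst_buffer.put((job_id, data))  # fast burst write; compute resumes immediately
          print(f"checkpoint {job_id} burst to SSD; compute node resumes")

      burst_buffer.put((None, b""))         # signal shutdown after the last checkpoint
      migrator.join()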

  13. Selected aspects of security mechanisms for cloud computingcurrent solutions and development perspectives

    OpenAIRE

    Aneta Poniszewska-Maranda

    2014-01-01

    The security aspects of cloud computing, especially the security of data, are becoming more and more important. It is necessary to find and develop new mechanisms to secure the cloud. The problem presented in the paper concerns the mechanisms for the security of cloud computing, with special attention paid to aspects of access control in clouds: the state of the art and the perspectives for the future.

  14. Terrestrial hydro-climatic change, lake shrinkage and water resource deterioration: Analysis of current to future drivers across Asia

    Science.gov (United States)

    Jarsjo, J.; Beygi, H.; Thorslund, J.

    2016-12-01

    Due to overlapping effects of different anthropogenic pressures and natural variability, the main drivers behind ongoing changes in the water cycle have in many cases not been identified, which complicates management of water resources. For instance, in many parts of the world, and not least in semi-arid and arid parts of Asia, lowered groundwater levels and shrinkage of surface water bodies, with associated salinization and water quality deterioration, constitute great challenges. With the aim of identifying the main drivers and mechanisms behind such changes, we here combine (i) historical observations of long-term, large-scale change, (ii) ensemble projections of expected future change from the climate models of the Coupled Model Intercomparison Project Phase 5 (CMIP5) and (iii) output from water balance modelling. Our particular focus is on regions near shrinking lakes. For the principal Lake Urmia in Iran, results show that agricultural intensification including irrigation expansion has clearly contributed to the surprisingly rapid water quality deterioration and lake shrinkage, from a 10% lake area reduction in 2002 to the current value of about 75% (leaving billions of tons of salt exposed in its basin). Nevertheless, runoff decrease due to climate change has had an even larger effect. For the Aral Sea in Central Asia, where problems accelerated much earlier (in the 1990s), land-use change and irrigation expansion can fully explain the disastrous surface water deficits and water quality problems in the extensive low-lying parts of the basin. However, projections show that climate-driven runoff decrease in the headwaters of the Aral Sea basin may become a dominant driver of continued change in the near future. More generally, the present results illustrate that mitigation measures that compensate only for land-use-driven effects may not reverse current trends of decreasing water availability, due to increasingly strong impacts of climate-driven runoff decrease. This has…

  15. Resources and costs for microbial sequence analysis evaluated using virtual machines and cloud computing.

    Directory of Open Access Journals (Sweden)

    Samuel V Angiuoli

    Full Text Available BACKGROUND: The widespread popularity of genomic applications is threatened by the "bioinformatics bottleneck" resulting from uncertainty about the cost and infrastructure needed to meet increasing demands for next-generation sequence analysis. Cloud computing services have been discussed as potential new bioinformatics support systems but have not been evaluated thoroughly. RESULTS: We present benchmark costs and runtimes for common microbial genomics applications, including 16S rRNA analysis, microbial whole-genome shotgun (WGS) sequence assembly and annotation, WGS metagenomics and large-scale BLAST. Sequence dataset types and sizes were selected to correspond to outputs typically generated by small- to mid-size facilities equipped with 454 and Illumina platforms, except for WGS metagenomics, where sampling of Illumina data was used. Automated analysis pipelines, as implemented in the CloVR virtual machine, were used in order to guarantee transparency, reproducibility and portability across different operating systems, including the commercial Amazon Elastic Compute Cloud (EC2), which was used to attach real dollar costs to each analysis type. We found considerable differences in the computational requirements, runtimes and costs associated with different microbial genomics applications. While all 16S analyses completed on a single-CPU desktop in under three hours, microbial genome and metagenome analyses utilized multi-CPU support of up to 120 CPUs on Amazon EC2, where each analysis completed in under 24 hours for less than $60. Representative datasets were used to estimate maximum data throughput on different cluster sizes and to compare costs between EC2 and comparable local grid servers. CONCLUSIONS: Although bioinformatics requirements for microbial genomics depend on dataset characteristics and the analysis protocols applied, our results suggest that smaller sequencing facilities (up to three Roche/454 or one Illumina GAIIx sequencers) invested…

  16. Cloud computing geospatial application for water resources based on free and open source software and open standards - a prototype

    Science.gov (United States)

    Delipetrev, Blagoj

    2016-04-01

    Presently, most existing software is desktop-based and designed to work on a single computer, which represents a major limitation in many ways, from limited processing and storage power to restricted accessibility and availability. The only feasible solution lies in the web and the cloud. This abstract presents research and development of a cloud computing geospatial application for water resources based on free and open source software and open standards, using a hybrid public-private cloud deployment model running on two separate virtual machines (VMs). The first (VM1) runs on Amazon Web Services (AWS) and the second (VM2) runs on a Xen cloud platform. The presented cloud application is developed using free and open source software, open standards and prototype code. The cloud application presents a framework for developing specialized cloud geospatial applications that need only a web browser to be used. This cloud application is the ultimate collaborative geospatial platform, because multiple users across the globe with an internet connection and a browser can jointly model geospatial objects, enter attribute data and information, execute algorithms, and visualize results. The presented cloud application is available all the time and accessible from everywhere; it is scalable, works in a distributed computing environment, creates a real-time multi-user collaboration platform, uses interoperable programming languages and components, and is flexible in including additional components. The cloud geospatial application is implemented as a specialized water resources application with three web services: 1) data infrastructure (DI), 2) support for water resources modelling (WRM), and 3) user management. The web services run on two VMs that communicate over the internet, providing services to users. The application was tested on the Zletovica river basin case study with multiple concurrent users. The application is a state…

  17. New resource for the computation of cartilage biphasic material properties with the interpolant response surface method.

    Science.gov (United States)

    Keenan, Kathryn E; Kourtis, Lampros C; Besier, Thor F; Lindsey, Derek P; Gold, Garry E; Delp, Scott L; Beaupre, Gary S

    2009-08-01

    Cartilage material properties are important for understanding joint function and diseases, but can be challenging to obtain. Three biphasic material properties (aggregate modulus, Poisson's ratio and permeability) can be determined using an analytical or finite element model combined with optimisation to find the material property values that best reproduce an experimental creep curve. The purpose of this study was to develop an easy-to-use resource to determine biphasic cartilage material properties. A Cartilage Interpolant Response Surface was generated from interpolation of finite element simulations of creep indentation tests. Creep indentation tests were performed on five sites across a tibial plateau. A least-squares residual search of the Cartilage Interpolant Response Surface resulted in a best-fit curve for each experimental condition, with corresponding material properties. These sites provided a representative range of aggregate modulus (0.48-1.58 MPa), Poisson's ratio (0.00-0.05) and permeability (1.7 x 10^-15 to 5.4 x 10^-15 m^4/N s) values found in human cartilage. The resource is freely available from https://simtk.org/home/va-squish.
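
    The least-squares residual search over a precomputed response surface can be pictured with a short Python sketch; the exponential creep-curve model below is a made-up stand-in for the finite element simulations, not the published surface.

      import numpy as np

      # Hypothetical stand-in for the precomputed FE creep curves: displacement vs. time
      # over a grid of (aggregate modulus HA [MPa], permeability k [10^-15 m^4/N s]).
      t = np.linspace(0.0, 300.0, 61)

      def creep_curve(HA, k):
          return (1.0 / HA) * (1.0 - np.exp(-k * t / 50.0))   # illustrative only

      HA_grid = np.linspace(0.4, 1.6, 25)
      k_grid = np.linspace(1.0, 6.0, 26)
      surface = {(HA, k): creep_curve(HA, k) for HA in HA_grid for k in k_grid}

      # "Experimental" curve: a grid point plus noise, then the least-squares search.
      rng = np.random.default_rng(1)
      measured = creep_curve(1.0, 3.0) + rng.normal(0.0, 0.005, t.size)
      best = min(surface, key=lambda p: np.sum((surface[p] - measured) ** 2))
      print("best-fit (HA, k):", best)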

  18. Current transformers with nanocrystalline alloy toroidal core: analytical, computational and experimental studies

    Directory of Open Access Journals (Sweden)

    Benedito Antonio Luciano

    2012-10-01

    Full Text Available In this paper, theoretical analysis and experimental results concerning the performance of toroidal cores used in current transformers are presented. For most problems concerning transformer design, analytical methods are useful, but numerical methods provide a better understanding of the transformer's electromagnetic behaviour. Numerical field solutions may be used to determine the electrical equivalent circuit parameters of toroidal-core current transformers. Since the exciting current of a current transformer alters the ratio and phase angle of the primary and secondary currents, it is made as small as possible through the use of high-permeability, low-loss magnetic material in the construction of the core. According to the experimental results presented in this work, in comparison with other soft magnetic materials, nanocrystalline alloys appear to be the best material for toroidal cores for current transformers.
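
    In equivalent-circuit terms (standard textbook relations, not reproduced from the paper), the primary ampere-turns divide between the secondary winding and the exciting branch, so the ratio error grows with the exciting current:

      \frac{I_p}{N} = I_s + I_e, \qquad
      \varepsilon = \frac{N I_s - I_p}{I_p} \times 100\%

    where I_e is the exciting current and N the turns ratio; a high-permeability, low-loss core keeps I_e, and with it both the ratio and phase-angle errors, small.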

  19. Out-patient management and non-attendance in the current economic climate. How best to manage our resources?

    LENUS (Irish Health Repository)

    Hennessy, D

    2010-03-01

    Outpatient non-attendance is a considerable source of inefficiency in the health service, wasting time and resources and potentially lengthening waiting lists. Given the current economic climate, methods need to be employed to reduce non-attendance. The aim was to analyse outpatient non-attendance and determine what factors influence attendance. A prospective audit over a two-month period of a tertiary-referral urological service was performed to determine the clinical and demographic profile of non-attendees. Of 737 appointments, 148 (20%) patients did not attend (DNA). A benign urological condition was evident in 116 cases (78%). This group of patients also accounted for the majority of new patients not attending (40/47), returning patients not attending (101/148) and patients who missed multiple appointments (43/49). Patients with benign conditions make up the majority of clinic non-attendance. Consideration may be given to discharging such patients back to their general practitioner after one unexplained non-attendance until other alternatives for follow-up are available.

  20. Studying the Earth's Environment from Space: Computer Laboratory Exercises and Instructor Resources

    Science.gov (United States)

    Smith, Elizabeth A.; Alfultis, Michael

    1998-01-01

    Studying the Earth's Environment From Space is a two-year project to develop a suite of CD-ROMs containing Earth System Science curriculum modules for introductory undergraduate science classes. Lecture notes, slides, and computer laboratory exercises, including actual satellite data and software, are being developed in close collaboration with Carla Evans of NASA GSFC Earth Sciences Directorate Scientific and Educational Endeavors (SEE) project. Smith and Alfultis are responsible for the Oceanography and Sea Ice Processes Modules. The GSFC SEE project is responsible for Ozone and Land Vegetation Modules. This document constitutes a report on the first year of activities of Smith and Alfultis' project.

  1. Computer-aided technology for fabricating complete dentures: systematic review of historical background, current status, and future perspectives.

    Science.gov (United States)

    Bidra, Avinash S; Taylor, Thomas D; Agar, John R

    2013-06-01

    Computer-aided technology is an emerging method for fabricating complete dentures. Consolidated information about historical background, current status, and scope for the future is lacking. The purpose of this systematic review was to analyze the existing literature on computer-aided technology for fabricating complete dentures and provide the reader with a historical background, current status, and future perspectives on this emerging technology. An electronic search of the English language literature between the periods of January 1957 and June 2012 was performed by using PubMed/MEDLINE with the following specific search terms: CAD-CAM complete dentures, digital complete dentures, computer dentures, designed dentures, machined dentures, manufactured dentures, milled dentures, and rapid prototyping dentures. Additionally, the search terms were used on the Google search engine to identify current commercial manufacturers and their protocols. A total of 1584 English language titles were obtained from the electronic database, and the systematic application of exclusion criteria resulted in the identification of 8 articles pertaining to computer-aided technology for complete dentures. Since the first published report in 1994, multiple authors have described different theoretical models and protocols for fabricating complete dentures with computer-aided technology. Although no clinical trials or clinical reports were identified in the scientific literature, the Google search engine identified 2 commercial manufacturers in the United States currently fabricating complete dentures with computer-aided design and computer-aided manufacturing (CAD/CAM) technology for clinicians world-wide. These manufacturers have definitive protocols in place and offer exclusive dental materials, techniques, and laboratory support. Their protocols contrast with conventional paradigms for fabricating complete dentures and allow the fabrication of complete dentures in 2 clinical appointments

  2. Construction of a micro-lecture teaching resource platform based on mobile cloud computing

    Institute of Scientific and Technical Information of China (English)

    朱静宜

    2015-01-01

    Mobile cloud computing is a mode of delivering and using information resource services in which mobile terminals obtain the required infrastructure, platform, software or applications through the mobile network in an on-demand, easily scalable way. With its efficient data storage and computing power, it has had a positive effect on the construction of micro-lecture teaching resource platforms. Against the background of current micro-lecture teaching resource platform construction, and drawing on the characteristics of mobile cloud computing and micro-lectures, this paper analyzes the overall architecture of the teaching resource platform and describes its construction.

  3. Unlocking the Treasures of the Ocean: Current Assessment and Future Perspectives of Seafloor Resources (C.F. Gauss Lecture)

    Science.gov (United States)

    Jegen, Marion

    2016-04-01

    Oceans cover 70% of the Earth's surface, and there is reason to believe that the wealth of mineral and carbon resources on the seafloor is similar to deposits on land. While offshore energy resources such as oil and gas are nowadays regarded as conventional, energy resources in the form of methane hydrates and seafloor mineral deposits are as yet unconventional and at best marginally economic. However, taking into account global population growth, geopolitics and technological development (both in terms of increasing industrialization and of the possibility to explore and mine seafloor resources), these resources might play a more fundamental role in the future. Resource assessment and understanding of the geological processes that form resources are topics in marine geosciences with broad relevance to society. The lecture presents an overview of the geophysical exploration of the seafloor and its resource potential. Starting from the physical parameter anomalies associated with resources, it explores marine technological developments for sensing them remotely at the seafloor. It also addresses the question of how well we can actually quantify the amount of resources from geophysical data. The process is illustrated with theoretical work as well as case studies from around the world.

  4. On current aspects of finite element computational fluid mechanics for turbulent flows

    Science.gov (United States)

    Baker, A. J.

    1982-01-01

    A set of nonlinear partial differential equations suitable for the description of a class of turbulent three-dimensional flow fields in select geometries is identified. On the basis of the concept of enforcing a penalty constraint to ensure accurate accounting of ordering effects, a finite element numerical solution algorithm is established for the equation set, and the theoretical aspects of accuracy, convergence and stability are identified and quantified. Hypermatrix constructions are used to formulate the reduction of the computational aspects of the theory to practice. The robustness of the algorithm, and of its computer program embodiment, has been verified for pertinent flow configurations.

  5. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase the availability of more sites, such that they can participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4-fold increase in throughput with respect to the LCG Resource Broker was observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and keep exercised all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...

  6. Some Reflections on Current Human Resources Accounting Research in China

    Institute of Scientific and Technical Information of China (English)

    吴长海; 王新; 刘刚

    2012-01-01

    Based on a review of the current research on human resources accounting, the article argues that current human resources accounting research still has a number of problems and defects. In view of these limitations, it puts forward several suggestions for future human resources accounting research, in the hope of informing related work.

  7. Cardea: Providing Support for Dynamic Resource Access in a Distributed Computing Environment

    Science.gov (United States)

    Lepro, Rebekah

    2003-01-01

    The environment framing the modern authorization process spans domains of administration, relies on many different authentication sources, and manages complex attributes as part of the authorization process. Cardea facilitates dynamic access control within this environment as a central function of an interoperable authorization framework. The system departs from the traditional authorization model by separating the authentication and authorization processes, distributing the responsibility for authorization data and allowing collaborating domains to retain control over their implementation mechanisms. Critical features of the system architecture and its handling of the authorization process differentiate the system from existing authorization components by addressing common needs not adequately addressed by existing systems. Continuing system research seeks to enhance the implementation of the current authorization model employed in Cardea, increase the robustness of current features, further the framework for establishing trust and promote interoperability with existing security mechanisms.

  8. Towards Sustaining Water Resources and Aquatic Ecosystems: Forecasting Watershed Risks to Current and Future Land Use Change

    Science.gov (United States)

    Lohse, K. A.; Newburn, D.; Opperman, J. J.; Brooks, C.; Merenlender, A.

    2005-05-01

    Sustaining aquatic resources requires managing existing threats and anticipating future impacts. Resource managers and planners often have limited understanding of the relative effects of human activities on stream conditions and how these effects will change over time. Here we assess and forecast the relative impacts of land use on sediment concentrations in Mediterranean-climate watersheds in California. We focus on the Russian River basin, which supports threatened salmonid populations vulnerable to high levels of fine sediment. We ask the following questions: (1) What are the relative impacts of three different land uses (urban, exurban and agriculture) on the patterns of fine sediment in streams? (2) What is the relative contribution of past and current changes in land use activities on these patterns? and (3) What are the effects of future development on these sediment levels? First, we characterized land use at the parcel scale to calibrate the relative impacts of exurban and urban land use on stream substrate quality, characterized by the concentration of fine sediment surrounding spawning gravels ('embeddedness') in 105 stream reaches. Second, we built multiple ordinal logistic regression models on a subset of watersheds (n=64) and then evaluated substrate quality predictions against observed data from another set of watersheds (n=41). Finally, we coupled these models with spatially explicit land use change models to project future stream conditions and associated uncertainties under different development scenarios for the year 2010. We found that the percent of urban housing and agriculture were significant predictors of in-stream embeddedness. Model results from parcel-level land use data indicated that changes in development were better predictors of fine sediment than total development in a single time period. In addition, our results indicate that exurban development is an important threat to stream systems; increases in the percent of total exurban…
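
    A minimal version of the model-building step, an ordinal logistic regression of an ordered embeddedness class on land-use predictors, could look like the following Python sketch; the data are synthetic, the variable names are invented, and statsmodels' OrderedModel is one possible tool, not necessarily the one the authors used.

      import numpy as np
      import pandas as pd
      from statsmodels.miscmodels.ordinal_model import OrderedModel

      rng = np.random.default_rng(0)
      n = 105                                       # one row per surveyed stream reach
      X = pd.DataFrame({
          "pct_urban": rng.uniform(0, 40, n),       # % urban housing in the watershed
          "pct_agriculture": rng.uniform(0, 60, n),
      })
      # Synthetic ordered response: embeddedness class 0 (low) .. 2 (high fines)
      latent = 0.06 * X["pct_urban"] + 0.04 * X["pct_agriculture"] + rng.logistic(0, 1, n)
      y = pd.Series(np.digitize(latent, [1.5, 3.0]))

      model = OrderedModel(y, X, distr="logit")
      res = model.fit(method="bfgs", disp=False)
      print(res.summary())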

  9. The Impact on Future Guidance Programs of Current Developments in Computer Science, Telecommunications, and Biotechnology.

    Science.gov (United States)

    Mitchell, Lynda K.; Hardy, Philippe L.

    The purpose of this chapter is to envision how the era of technological revolution will affect the guidance, counseling, and student support programs of the future. Advances in computer science, telecommunications, and biotechnology are discussed. These advances have the potential to affect dramatically the services of guidance programs of the…

  10. Assessment of rheumatic diseases with computational radiology: current status and future potential

    DEFF Research Database (Denmark)

    Peloschek, Philipp; Boesen, Mikael; Donner, Rene;

    2009-01-01

    In recent years, several computational image analysis methods to assess disease progression in rheumatic diseases have been presented. This review article explains the basics of these methods as well as their potential application in rheumatic disease monitoring; it covers radiography, sonography, and magnetic resonance imaging in quantitative analysis frameworks.

  11. Attentional Resource Allocation and Cultural Modulation in a Computational Model of Ritualized Behavior

    DEFF Research Database (Denmark)

    Nielbo, Kristoffer Laigaard; Sørensen, Jesper

    2016-01-01

    How do cultural and religious rituals influence human perception and cognition, and what separates the highly patterned behaviors of communal ceremonies from perceptually similar precautionary and compulsive behaviors? These are some of the questions that recent theoretical models and empirical studies address. Although ritualized behaviors are perceptually similar across a range of behavioral domains, symbolically mediated, experience-dependent information (so-called cultural priors) modulates perception such that communal ceremonies appear coherent and culturally meaningful, while compulsive behaviors remain incoherent and, in some cases, pathological. In this study, we extend a qualitative model of human action perception and understanding to include ritualized behavior. Based on previous experimental and computational studies, the model was simulated using instrumental and ritualized representations of realistic motor…

  12. Attentional Resource Allocation and Cultural Modulation in a Computational Model of Ritualized Behavior

    DEFF Research Database (Denmark)

    Nielbo, Kristoffer Laigaard; Sørensen, Jesper

    2015-01-01

    How do cultural and religious rituals influence human perception and cognition, and what separates the highly patterned behaviors of communal ceremonies from perceptually similar precautionary and compulsive behaviors? These are some of the questions that recent theoretical models and empirical studies address. Although ritualized behaviors are perceptually similar across a range of behavioral domains, symbolically mediated, experience-dependent information (so-called cultural priors) modulates perception such that communal ceremonies appear coherent and culturally meaningful, while compulsive behaviors remain incoherent and, in some cases, pathological. In this study, we extend a qualitative model of human action perception and understanding to include ritualized behavior. Based on previous experimental and computational studies, the model was simulated using instrumental and ritualized representations of realistic motor…

  13. II - Detector simulation for the LHC and beyond : how to match computing resources and physics requirements

    CERN Document Server

    CERN. Geneva

    2016-01-01

    Detector simulation at the LHC is one of the most computing-intensive activities. In these lectures we will show how physics requirements were met for the LHC experiments and extrapolate to future experiments (the FCC-hh case). At the LHC, detectors are complex, very precise and ambitious: this implies modern modelling tools for geometry and response. Events are busy and characterised by an unprecedented energy scale, with hundreds of particles to be traced and high energy showers to be accurately simulated. Furthermore, high luminosities imply many events in a bunch crossing and many bunch crossings to be considered at the same time. In addition, backgrounds not directly correlated to bunch crossings have also to be taken into account. Solutions chosen for ATLAS (a mixture of detailed simulation and fast simulation/parameterisation) will be described and CPU and memory figures will be given. An extrapolation to the FCC-hh case will be attempted, taking the calorimeter simulation as an example.

  14. I - Detector Simulation for the LHC and beyond: how to match computing resources and physics requirements

    CERN Document Server

    CERN. Geneva

    2016-01-01

    Detector simulation at the LHC is one of the most computing-intensive activities. In these lectures we will show how physics requirements were met for the LHC experiments and extrapolate to future experiments (the FCC-hh case). At the LHC, detectors are complex, very precise and ambitious: this implies modern modelling tools for geometry and response. Events are busy and characterised by an unprecedented energy scale, with hundreds of particles to be traced and high energy showers to be accurately simulated. Furthermore, high luminosities imply many events in a bunch crossing and many bunch crossings to be considered at the same time. In addition, backgrounds not directly correlated to bunch crossings have also to be taken into account. Solutions chosen for ATLAS (a mixture of detailed simulation and fast simulation/parameterisation) will be described and CPU and memory figures will be given. An extrapolation to the FCC-hh case will be attempted, taking the calorimeter simulation as an example.

  15. Computation of Wave, Tide and Wind Current for the South China Sea Under Tropical Cyclones

    Institute of Scientific and Technical Information of China (English)

    朱良生; 宋运法; 邱章; 陈秀华; 麦波强; 丘耀文; 宋丽莉

    2003-01-01

    Based on the third-generation oceanic wave prediction model (WAVEWATCH III), the third-generation nearshore wave calculation model (SWAN) and a mathematical tide, tidal current and cyclone current model, which have been improved, interconnected and expanded, a coupled model of offshore wave, tide and sea current under tropical cyclone surges in the South China Sea has been established. The coupled model is driven by the tropical cyclone wind field containing the background wind field. In order to test the hindcasting ability of the mathematical model, a comparison has been made between the calculated and observed waves for 15 cyclone cases, and between the calculated and observed water levels and current velocities for 7 cyclones. The verification results indicate that the calculated and observed results are in good agreement.

  16. Computer Calculations of Eddy-Current Power Loss in Rotating Titanium Wheels and Rims in Localized Axial Magnetic Fields

    Energy Technology Data Exchange (ETDEWEB)

    Mayhall, D J; Stein, W; Gronberg, J B

    2006-05-15

    We have performed preliminary computer-based, transient, magnetostatic calculations of the eddy-current power loss in rotating titanium-alloy and aluminum wheels and wheel rims in the predominantly axially-directed, steady magnetic fields of two small, solenoidal coils. These calculations have been undertaken to assess the eddy-current power loss in various possible International Linear Collider (ILC) positron target wheels. They have also been done to validate the simulation code module against known results published in the literature. The commercially available software package used in these calculations is the Maxwell 3D, Version 10, Transient Module from the Ansoft Corporation.

  17. The current status of cone beam computed tomography imaging in orthodontics

    OpenAIRE

    S. Kapila; Conley, R S; Harrell, W E

    2011-01-01

    Cone beam CT (CBCT) has become an increasingly important source of three dimensional (3D) volumetric data in clinical orthodontics since its introduction into dentistry in 1998. The purpose of this manuscript is to highlight the current understanding of, and evidence for, the clinical use of CBCT in orthodontics, and to review the findings to answer clinically relevant questions. Currently available information from studies using CBCT can be organized into five broad categories: 1, the assess...

  18. Optimization of hydrofoil for tidal current turbine based on particle swarm optimization and computational fluid dynamic method

    OpenAIRE

    Zhang De-Sheng; Chen Jian; Shi Wei-Dong; Shi Lei; Geng Lin-Lin

    2016-01-01

    The efficiency and cavitation performance of the hydrofoil are both key to the design of tidal current turbines. In this paper, the hydrofoil efficiency and lift coefficient were improved using the particle swarm optimization method and the XFoil code. The cavitation performance of the optimized hydrofoil was also examined using computational fluid dynamics. Numerical results show that the efficiency of the optimized hydrofoil was improved by 11%, ranging from…

  19. Adaptive TrimTree: Green Data Center Networks through Resource Consolidation, Selective Connectedness and Energy Proportional Computing

    Directory of Open Access Journals (Sweden)

    Saima Zafar

    2016-10-01

    Full Text Available A data center is a facility with a group of networked servers used by an organization for storage, management and dissemination of its data. The increase in data center energy consumption over the past several years is staggering; therefore, efforts are being initiated to achieve energy efficiency of various components of data centers. A main reason data centers exhibit high energy inefficiency is that most organizations run their data centers at full capacity 24/7. This results in a number of servers and switches being underutilized or even unutilized, yet working and consuming electricity around the clock. In this paper, we present Adaptive TrimTree, a mechanism that employs a combination of resource consolidation, selective connectedness and energy proportional computing for optimizing energy consumption in a Data Center Network (DCN). Adaptive TrimTree adopts a simple traffic-and-topology-based heuristic to find a minimum power network subset called the 'active network subset' that satisfies the existing network traffic conditions, while switching off the residual unused network components. A 'passive network subset' is also identified for redundancy; it consists of links and switches that may be required in the future, and this subset is toggled to a sleep state. An energy proportional computing technique is applied to the active network subset, adapting link data rates to the workload and thus maximizing energy optimization. We have compared our proposed mechanism with the fat-tree topology and with ElasticTree, a scheme based on resource consolidation. Our simulation results show that our mechanism saves 50%-70% more energy than fat-tree and 19.6% more than ElasticTree, with minimal impact on packet loss percentage and delay. Additionally, our mechanism copes better with traffic anomalies and surges due to the passive network provision.
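
    The flavor of the traffic-and-topology heuristic, keeping only as many switches powered as the present load requires while parking a redundant subset in sleep state, can be sketched as follows; the capacity figures and the greedy sizing rule are illustrative assumptions, not the published algorithm.

      import math

      LINK_CAPACITY_GBPS = 10.0     # assumed uniform link rate
      TOTAL_AGG_SWITCHES = 8        # aggregation switches available in the pod

      def plan_subsets(traffic_gbps, redundancy=1):
          """Greedy sizing: active subset carries the load, passive subset sleeps."""
          need = max(1, math.ceil(traffic_gbps / LINK_CAPACITY_GBPS))
          active = min(need, TOTAL_AGG_SWITCHES)
          passive = min(redundancy, TOTAL_AGG_SWITCHES - active)  # kept in sleep state
          off = TOTAL_AGG_SWITCHES - active - passive             # powered off
          return active, passive, off

      for load in (3.0, 42.0, 75.0):
          a, p, o = plan_subsets(load)
          print(f"load {load:5.1f} Gb/s -> active {a}, sleeping {p}, off {o}")

    Energy proportional computing would then additionally scale the data rate of each active link down to the residual load it actually carries.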

  20. Computer Generated Imagery (CGI) Current Technology and Cost Measures Feasibility Study.

    Science.gov (United States)

    1980-09-26

    …Commercial Airlines Organization (ICAO) publication. Academia offers another potential source of technology information, especially with respect to…as Computer Graphics World and the International Commercial Airlines Organization (ICAO) publication. In addition, the proceedings of conferences…and its viewing direction. Eyepoint: In a CIG ATD, the eyepoint is the simulated single point location of the observer's eye relative to a monocular…

  1. Information management. Computer resources for the occupational and environmental health nurse.

    Science.gov (United States)

    Amann, M C

    1999-12-01

    Occupational and environmental health nurses are responsible for the management of large amounts of very complex information, ranging from individual employee health records to reports that ensure corporate compliance. There are four primary tools available to the occupational health nurse to facilitate efficient management and use of health information: occupational health information systems, office support programs, communication systems, and the Internet and intranets. Selection and implementation of an integrated health information system requires the involvement of any organization that uses data processed by the system. A project management approach to implementation and maintenance of a system ensures adherence to timelines and attention to detail. The Internet provides access to a vast amount of information useful to both the occupational health professional and the employee. Intranets are internal systems that may facilitate distribution of health information to employees, maintenance of current health-related policies, and more efficient reporting procedures.

  2. Cranial electrotherapy stimulation and transcranial pulsed current stimulation: a computer based high-resolution modeling study.

    Science.gov (United States)

    Datta, Abhishek; Dmochowski, Jacek P; Guleyupoglu, Berkan; Bikson, Marom; Fregni, Felipe

    2013-01-15

    The field of non-invasive brain stimulation has developed significantly over the last two decades. Though two techniques of noninvasive brain stimulation, transcranial direct current stimulation (tDCS) and transcranial magnetic stimulation (TMS), are becoming established tools for research in neuroscience and for some clinical applications, related techniques that also show some promising clinical results have not been developed at the same pace. One of these related techniques is cranial electrotherapy stimulation (CES), a class of transcranial pulsed current stimulation (tPCS). In order to understand further the mechanisms of CES, we aimed to model CES using a magnetic resonance imaging (MRI)-derived finite element head model including cortical and subcortical structures. Cortical electric field (current density) peak intensities and distributions were analyzed. We evaluated different electrode configurations of CES, including in-ear and over-ear montages. Our results confirm that significant amounts of current pass the skull and reach cortical and subcortical structures. In addition, depending on the montage, induced currents in subcortical areas, such as the midbrain, pons, thalamus and hypothalamus, are of similar magnitude to those in cortical areas. Incremental variations of electrode position on the head surface also influence which cortical regions are modulated. The high-resolution modeling predictions suggest that details of the electrode montage influence current flow through superficial and deep structures. Finally, we present laptop-based methods for tPCS dose design using dominant frequency and spherical models. These modeling predictions and tools are the first step toward advancing rational and optimized use of tPCS and CES.

  3. A computational model of the ionic currents, Ca2+ dynamics and action potentials underlying contraction of isolated uterine smooth muscle.

    Directory of Open Access Journals (Sweden)

    Wing-Chiu Tong

    Full Text Available Uterine contractions during labor are discretely regulated by rhythmic action potentials (AP) of varying duration and form that serve to determine calcium-dependent force production. We have employed a computational biology approach to develop a fuller understanding of the complexity of excitation-contraction (E-C) coupling of uterine smooth muscle cells (USMC). Our overall aim is to establish a mathematical platform of sufficient biophysical detail to quantitatively describe known uterine E-C coupling parameters and thereby inform future empirical investigations of physiological and pathophysiological mechanisms governing normal and dysfunctional labors. From published and unpublished data we construct mathematical models for fourteen ionic currents of USMCs: Ca2+ currents (L- and T-type), Na+ current, a hyperpolarization-activated current, three voltage-gated K+ currents, two Ca2+-activated K+ currents, a Ca2+-activated Cl- current, a non-specific cation current, the Na+-Ca2+ exchanger, the Na+-K+ pump and a background current. The magnitudes and kinetics of each current system in a spindle-shaped single cell with a specified surface area:volume ratio are described by differential equations, in terms of maximal conductances, electrochemical gradients, voltage-dependent activation/inactivation gating variables and temporal changes in intracellular Ca2+ computed from known Ca2+ fluxes. These quantifications are validated by the reconstruction of the individual experimental ionic currents obtained under voltage clamp. Phasic contraction is modeled in relation to the time constant of changing [Ca2+]i. This integrated model is validated by its reconstruction of the different USMC AP configurations (spikes, plateaus and bursts of spikes), of the change from bursting to plateau-type AP produced by estradiol, and of simultaneous experimental recordings of spontaneous AP, [Ca2+]i and phasic force. In summary, our advanced mathematical model provides a powerful tool to…
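
    The "differential equations in terms of maximal conductances and gating variables" follow the familiar Hodgkin-Huxley template; the generic form below is an assumption about the style of the formulation, not the paper's actual parameterization:

      I_x = \bar{g}_x \, m^{p} h^{q} \, (V - E_x), \qquad
      \frac{dm}{dt} = \frac{m_\infty(V) - m}{\tau_m(V)}, \qquad
      C_m \frac{dV}{dt} = -\sum_x I_x

    where \bar{g}_x is the maximal conductance and E_x the reversal potential of current x, and m, h are voltage-dependent activation and inactivation gates; intracellular Ca2+ computed from the summed Ca2+ fluxes then drives phasic force with its own time constant.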

  4. Analysis on Current Situation and Countermeasure of Domestic Electronic Commerce Logistics in the Internet Age - Based on Resource Dependence Theory

    Directory of Open Access Journals (Sweden)

    Zhang Jiapeng

    2017-01-01

    Full Text Available This paper analyzes the status of e-commerce logistics in China in the current Internet era, combining SWOT analysis with AHP for an empirical analysis, and then puts forward the countermeasure, based on resource dependence theory, that e-commerce logistics resources should be shared. The empirical analysis finds that, in the Internet era, the disadvantages of and opportunities for the logistics sector carry the most weight, and that a resource-sharing strategy grounded in resource dependence theory is the more scientific choice. The rational use of Internet technology in the e-commerce logistics industry can achieve such 'sharing', which is of great significance for the industry's balanced, intelligent and optimized development.

  5. Current advances in molecular, biochemical, and computational modeling analysis of microalgal triacylglycerol biosynthesis.

    Science.gov (United States)

    Lenka, Sangram K; Carbonaro, Nicole; Park, Rudolph; Miller, Stephen M; Thorpe, Ian; Li, Yantao

    2016-01-01

    Triacylglycerols (TAGs) are highly reduced energy storage molecules ideal for biodiesel production. Microalgal TAG biosynthesis has been studied extensively in recent years, both at the molecular level and systems level through experimental studies and computational modeling. However, discussions of the strategies and products of the experimental and modeling approaches are rarely integrated and summarized together in a way that promotes collaboration among modelers and biologists in this field. In this review, we outline advances toward understanding the cellular and molecular factors regulating TAG biosynthesis in unicellular microalgae with an emphasis on recent studies on rate-limiting steps in fatty acid and TAG synthesis, while also highlighting new insights obtained from the integration of multi-omics datasets with mathematical models. Computational methodologies such as kinetic modeling, metabolic flux analysis, and new variants of flux balance analysis are explained in detail. We discuss how these methods have been used to simulate algae growth and lipid metabolism in response to changing culture conditions and how they have been used in conjunction with experimental validations. Since emerging evidence indicates that TAG synthesis in microalgae operates through coordinated crosstalk between multiple pathways in diverse subcellular destinations including the endoplasmic reticulum and plastids, we discuss new experimental studies and models that incorporate these findings for discovering key regulatory checkpoints. Finally, we describe tools for genetic manipulation of microalgae and their potential for future rational algal strain design. This comprehensive review explores the potential synergistic impact of pathway analysis, computational approaches, and molecular genetic manipulation strategies on improving TAG production in microalgae.
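
    Flux balance analysis, one of the methods the review covers, reduces to a linear program: maximize a target flux subject to steady-state mass balance S v = 0 and flux bounds. A toy three-reaction network in Python, with an invented stoichiometry, shows the shape of the computation.

      import numpy as np
      from scipy.optimize import linprog

      # Toy network: uptake -> precursor -> TAG. Rows of S are metabolites, columns reactions.
      S = np.array([
          [1, -1,  0],   # metabolite A: produced by uptake, consumed by conversion
          [0,  1, -1],   # metabolite B: produced by conversion, consumed by TAG synthesis
      ])
      bounds = [(0, 10), (0, 8), (0, None)]   # flux bounds per reaction
      c = np.array([0.0, 0.0, -1.0])          # maximize TAG flux (linprog minimizes)

      res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
      print("optimal fluxes:", res.x)         # -> [8, 8, 8]: the conversion bound limits TAG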

  6. URECA: Efficient Resource Location Middleware for Ubiquitous Environment

    Institute of Scientific and Technical Information of China (English)

    Donggeon Noh; Heonshik Shin

    2008-01-01

    We describe an effective resource location framework for ubiquitous computing environments populated by a diverse set of networks, devices, services and computational entities. Our framework provides context adaptation with the aid of a middleware service to improve the quality of resource location. A resource location protocol suited to each type of network locates resources effectively by dynamically reconfiguring to the current context. Our framework is further refined by support for interoperability between the different types of resource location protocols that occur across a hybrid ubiquitous network. These characteristics also reduce the control overhead of resource location, saving resources, decreasing latency and permitting a considerable degree of scalability.

  7. Computational and experimental studies of laser cutting of the current collectors for lithium-ion batteries

    Science.gov (United States)

    Lee, Dongkyoung; Patwa, Rahul; Herfurth, Hans; Mazumder, Jyotirmoy

    2012-07-01

    Sizing electrodes is an important step in lithium-ion battery manufacturing, since a poor cut edge affects battery performance significantly and sometimes leads to fire hazards. Mechanical cutting can result in poor cut quality with defects. The cutting quality can be improved by using a laser, owing to its high energy concentration, fast processing time, small heat-affected zone, and high precision. The cutting quality is strongly influenced by operating parameters such as laser power and scanning speed. We therefore performed numerical simulations to provide guidelines for achieving clean edge quality. In order to simulate laser cutting of electrodes for lithium-ion batteries, understanding the behavior of the current collectors is crucial. This study focuses on current collectors made of pure copper and aluminium. The numerical studies utilized a 3D self-consistent mathematical model for laser-material interaction. Observations of penetration time, depth, and threshold during laser cutting of current collectors are described. The model is validated experimentally by cutting current collectors and single-side-coated electrodes with a single-mode fiber laser. Laser cutting of copper depends on both laser intensity and interaction time, whereas laser cutting of aluminium depends more on laser intensity than on interaction time. Numerical and experimental results show good agreement.

  8. Grid Computing

    Science.gov (United States)

    Foster, Ian

    2001-08-01

    The term "Grid Computing" refers to the use, for computational purposes, of emerging distributed Grid infrastructures: that is, network and middleware services designed to provide on-demand and high-performance access to all important computational resources within an organization or community. Grid computing promises to enable both evolutionary and revolutionary changes in the practice of computational science and engineering based on new application modalities such as high-speed distributed analysis of large datasets, collaborative engineering and visualization, desktop access to computation via "science portals," rapid parameter studies and Monte Carlo simulations that use all available resources within an organization, and online analysis of data from scientific instruments. In this article, I examine the status of Grid computing circa 2000, briefly reviewing some relevant history, outlining major current Grid research and development activities, and pointing out likely directions for future work. I also present a number of case studies, selected to illustrate the potential of Grid computing in various areas of science.

  9. Future Performance Trend Indicators: A Current Value Approach to Human Resources Accounting. Report I. Internal Consistencies and Relationships to Performance By Site. Final Report.

    Science.gov (United States)

    Pecorella, Patricia A.; Bowers, David G.

    Analyses preparatory to construction of a suitable file for generating a system of future performance trend indicators are described. Such a system falls into the category of a current value approach to human resources accounting. It requires that there be a substantial body of data which: (1) uses the work group or unit, not the individual, as…

  10. The EGI-Engage EPOS Competence Center - Interoperating heterogeneous AAI mechanisms and Orchestrating distributed computational resources

    Science.gov (United States)

    Bailo, Daniele; Scardaci, Diego; Spinuso, Alessandro; Sterzel, Mariusz; Schwichtenberg, Horst; Gemuend, Andre

    2016-04-01

    manage the use of the subsurface of the Earth. EPOS started its Implementation Phase in October 2015 and is now actively working to integrate multidisciplinary data into a single e-infrastructure. Multidisciplinary data are organized and governed by the Thematic Core Services (TCS), European-wide organizations and e-infrastructures providing community-specific data and data products, and are driven by various scientific communities encompassing a wide spectrum of Earth science disciplines. TCS data, data products and services will be integrated into the Integrated Core Services (ICS) system, which will ensure their interoperability and provide access to these services for the scientific community as well as other users within society. The goal of the EPOS competence center (EPOS CC) is to tackle two of the main challenges the ICS will face in the near future by taking advantage of the technical solutions provided by EGI. To this end, we present the two pilot use cases the EGI-EPOS CC is developing: 1) the AAI pilot, dealing with the provision of transparent and homogeneous access to the ICS infrastructure for users holding different kinds of credentials (e.g. eduGAIN, OpenID Connect, X.509 certificates); here the focus is on the mechanisms that allow credential delegation. 2) The computational pilot, which improves the back-end services of an existing application in the field of computational seismology, developed in the context of the EC-funded project VERCE. The application allows the processing and comparison of data resulting from the simulation of seismic wave propagation following a real earthquake with real measurements recorded by seismographs. While the simulation data are produced directly by the users and stored in a data management system, the observations need to be pre-staged from institutional data services, which are maintained by the community itself. This use case aims at exploiting the EGI FedCloud e-infrastructure for Data

  11. Sectoral Vulnerabilities to Changing Water Resources: Current and Future Tradeoffs between Supply and Demand in the Conterminous U.S

    Science.gov (United States)

    Meldrum, J.; Averyt, K.; Caldwell, P.; Sun, G.; Huber-lee, A. T.; McNulty, S.

    2012-12-01

    Assessing the sustainability of human activities depends, in part, on the availability of water supplies to meet the demands of those activities. Thermoelectric cooling, agriculture, and municipal uses all compete for water supplies, but each sector differs in its characteristic ratio of water consumption versus withdrawals. This creates different implications for contributing to water supply stress and, conversely, vulnerabilities within each sector to changing water supplies. In this study, we use two measures of water stress, relating to water withdrawals and to water consumption, and calculate the role of each of these three sectors in contributing to the two different measures. We estimate water stress with an enhanced version of the Water Supply Stress Index (WaSSI), calculating the ratio of water demand to water supply at the 8-digit Hydrologic Unit Code (HUC) scale (Sun et al. 2008, 2011; Caldwell et al. 2011). Current water supplies are based on an integrated water balance and flow routing model of the conterminous United States, which accounts for surface water supply, groundwater supply, and major return flows. Future supplies are based on simulated regional changes in streamflow in 2050 from an ensemble of 12 climate models (Milly et al. 2005). We estimate water demands separately for agriculture, municipal uses, and thermoelectric cooling, with the first two based on Kenny et al. (2005) and the last on the approach of Averyt et al. (2011). We find substantial regional variation not only in the overall WaSSI for withdrawals and consumption but also in the contribution of the three water-use sectors to that total. Results suggest that the relative vulnerabilities of different sectors of human activity to water supply stress vary spatially and that policies for alleviating that stress must consider the specific, regional context of the tradeoffs between competing water demands. References: Averyt, K., Fisher, J., Huber-Lee, A., Lewis, A., Macknick, J., Madden, N
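
    As a minimal illustration of the stress calculation described above (a sketch only; the sector demands, supply value and units below are hypothetical, not data from the study), a WaSSI-style index reduces to a demand-to-supply ratio accumulated over the three sectors, computed once per demand measure:

        # Hypothetical demands for one 8-digit HUC (million gallons/day).
        withdrawals = {"thermoelectric": 120.0, "agriculture": 60.0, "municipal": 40.0}
        consumption = {"thermoelectric": 3.0, "agriculture": 48.0, "municipal": 10.0}
        supply = 300.0  # modeled water supply for the HUC, same units

        # Stress index: total demand divided by supply.
        wassi_withdrawal = sum(withdrawals.values()) / supply   # 0.73
        wassi_consumption = sum(consumption.values()) / supply  # 0.20

        # Per-sector contribution to the withdrawal-based measure.
        contribution = {s: d / supply for s, d in withdrawals.items()}
        print(wassi_withdrawal, wassi_consumption, contribution)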

  12. A computational investigation of cardiac caveolae as a source of persistent sodium current

    Directory of Open Access Journals (Sweden)

    Ian M. Besse

    2011-11-01

    Recent studies of cholesterol-rich membrane microdomains, called caveolae, reveal that caveolae are reservoirs of recruitable sodium ion channels. Caveolar channels constitute a substantial and previously unrecognized source of sodium current in cardiac cells. In this paper we model for the first time caveolar sodium currents and their contributions to cardiac action potential morphology. We show that the beta-agonist-induced opening of caveolae may have substantial impacts on peak overshoot, maximum upstroke velocity, and ultimately conduction velocity. Additionally, we show that prolonged action potentials and potentially arrhythmogenic afterdepolarizations can arise if caveolae open intermittently throughout the action potential. Our simulations suggest that there may exist routes to delayed repolarization, and to the arrhythmias associated with such delays, that are independent of channelopathies.
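
    To make the mechanism concrete, here is a minimal sketch of a Hodgkin-Huxley-style sodium current in which opening caveolae adds recruitable conductance (the parameter values and the scaling form are illustrative assumptions, not the authors' published model):

        def sodium_current(v, m, h, g_na=23.0, e_na=65.0,
                           cav_fraction=0.25, cav_open=0.0):
            """I_Na with a recruitable caveolar conductance reserve.

            cav_fraction: caveolar channels relative to sarcolemmal ones
            cav_open:     fraction of caveolae currently open (0..1)
            """
            g_eff = g_na * (1.0 + cav_fraction * cav_open)
            return g_eff * m ** 3 * h * (v - e_na)

        # Beta-agonist stimulation modeled as opening 80% of caveolae:
        print(sodium_current(-20.0, 0.9, 0.3, cav_open=0.8))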

  13. Integrated Computational Materials Engineering of Titanium: Current Capabilities Being Developed Under the Metals Affordability Initiative

    Science.gov (United States)

    Glavicic, M. G.; Venkatesh, V.

    2014-07-01

    A technical review of the titanium model development programs currently funded under the Metals Affordability Initiative is presented. Progress on the "Advanced Titanium Alloy Microstructure and Mechanical Property Modeling" and "ICME of Microtexture Evolution and its Effect on Cold Dwell/High/Low Cycle Fatigue Behavior of Dual Phase Titanium Alloys" programs is reviewed, followed by a discussion of the future modeling needs of the aerospace industry.

  14. Development, computer simulation and performance testing in sodium of an eddy current flowmeter

    Energy Technology Data Exchange (ETDEWEB)

    Sharma, Prashant, E-mail: pacific@igcar.gov.i [Indira Gandhi Centre for Atomic Research, Kalpakkam 603 102 (India); Suresh Kumar, S.; Nashine, B.K.; Veerasamy, R.; Krishnakumar, B.; Kalyanasundaram, P.; Vaidyanathan, G. [Indira Gandhi Centre for Atomic Research, Kalpakkam 603 102 (India)

    2010-03-15

    Sodium is used as a coolant in Liquid Metal Fast Breeder Reactors (LMFBRs). Sodium flow measurement is of prime importance for both the operational and safety aspects of a fast reactor. Various types of flowmeters, namely permanent magnet, saddle type and eddy current flowmeters, are used in FBRs. From the safety point of view, flow through the core should be assured under all operating conditions. This requires a flow sensor that can withstand the high-temperature sodium environment, meet the dimensional constraints and be amenable to maintenance. The eddy current flowmeter (ECFM) is one such device that meets these requirements. It is meant for measuring flow in the PFBR primary pump and also at the outlets of the fuel sub-assemblies to detect flow blockage. A simulation model of the ECFM was built and its output predicted for various flow rates and temperatures; the model was then validated by testing in a sodium loop. This paper deals with the design, simulation and sodium tests of the eddy current flowmeter for use in the Prototype Fast Breeder Reactor (PFBR).

  15. The current status of cone beam computed tomography imaging in orthodontics.

    Science.gov (United States)

    Kapila, S; Conley, R S; Harrell, W E

    2011-01-01

    Cone beam CT (CBCT) has become an increasingly important source of three dimensional (3D) volumetric data in clinical orthodontics since its introduction into dentistry in 1998. The purpose of this manuscript is to highlight the current understanding of, and evidence for, the clinical use of CBCT in orthodontics, and to review the findings to answer clinically relevant questions. Currently available information from studies using CBCT can be organized into five broad categories: 1, the assessment of CBCT technology; 2, its use in craniofacial morphometric analyses; 3, incidental and missed findings; 4, analysis of treatment outcomes; and 5, efficacy of CBCT in diagnosis and treatment planning. The findings in these topical areas are summarized, followed by current indications and protocols for the use of CBCT in specific cases. Despite the increasing popularity of CBCT in orthodontics, and its advantages over routine radiography in specific cases, the effects of information derived from these images in altering diagnosis and treatment decisions has not been demonstrated in several types of cases. It has therefore been recommended that CBCT be used in select cases in which conventional radiography cannot supply satisfactory diagnostic information; these include cleft palate patients, assessment of unerupted tooth position, supernumerary teeth, identification of root resorption and for planning orthognathic surgery. The need to image other types of cases should be made on a case-by-case basis following an assessment of benefits vs risks of scanning in these situations.

  16. Impact of tube current modulation on lesion conspicuity index in hi-resolution chest computed tomography

    Science.gov (United States)

    Szczepura, Katy; Tomkinson, David; Manning, David

    2017-03-01

    Tube current modulation is a method employed in CT to optimize the radiation dose to the patient. The acceptable noise (noise index) can be varied according to the level of optimization required; accepting higher noise reduces the patient dose. Recent research [1] suggests that measuring the conspicuity index (C.I.) of focal lesions within an image reflects a clinical reader's ability to perceive focal lesions better than traditional physical measures such as contrast-to-noise ratio (CNR) and signal-to-noise ratio (SNR). Software has been developed and validated to calculate the C.I. in DICOM images. The aim of this work is to assess the impact of tube current modulation on conspicuity index and CTDIvol, to indicate the benefits and limitations of tube current modulation for lesion detectability. Method: an anthropomorphic chest phantom ("Lungman") was used with inserted lesions of varying size and Hounsfield unit (HU) value; the range of sizes and HU values, including negative HU values, was chosen to represent the variation found in real lesions.
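
    For comparison with the conspicuity index, the traditional measures named above (CNR and SNR) reduce to simple region-of-interest statistics. A hedged sketch with synthetic pixel values follows (this is not the validated C.I. software, and the HU values are invented):

        import numpy as np

        def cnr(lesion, background):
            """Contrast-to-noise ratio from two ROIs (arrays of HU values)."""
            return (np.mean(lesion) - np.mean(background)) / np.std(background)

        def snr(roi):
            """Signal-to-noise ratio of a single ROI."""
            return np.mean(roi) / np.std(roi)

        rng = np.random.default_rng(0)
        lesion = rng.normal(-630.0, 25.0, 500)      # lesion pixels
        background = rng.normal(-820.0, 25.0, 500)  # surrounding lung pixels
        print(f"CNR = {cnr(lesion, background):.1f}")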

  17. Computational Modeling of Submarine Oil Spill with Current and Wave by FLUENT

    Directory of Open Access Journals (Sweden)

    Wei Li

    2013-05-01

    Oil spill models usually address the sea surface, and little research to date has examined submarine oil spills, so the simulation of a submarine pipeline oil spill is carried out in FLUENT to forecast the trajectory of the spilled oil. The coupling of pressure and velocity under unsteady conditions is solved with the pressure-implicit with splitting of operators (PISO) algorithm, and the nonlinear free-surface boundary condition is handled with the volume-of-fluid (VOF) method. The motion of the oil particles is simulated, and the quantity and trajectory of spilled oil under different operating pressures, current velocities and wavelengths are compared and analyzed. The results show that wave and current have important effects on the location of the oil and the area of the oil film on the sea surface. The underwater diffusion scope of the spilled oil is smaller at larger operating pressure or lower current velocity. As the wavelength increases, the water depth influenced by the wave, the scope of oil dispersion underwater and the oil film area on the surface all increase.

  19. Computation of currents induced by ELF electric fields in anisotropic human tissues using the Finite Integration Technique (FIT)

    Directory of Open Access Journals (Sweden)

    V. C. Motrescu

    2005-01-01

    In recent years, the task of estimating the currents induced within the human body by environmental electromagnetic fields has received increased attention from scientists around the world. While important progress has been made in this direction, the unpredictable behaviour of living biological tissue makes it difficult to quantify its reaction to electromagnetic fields and has kept the problem open. A successful alternative to the very difficult task of performing measurements is to compute the fields within a human body model using numerical methods implemented in software. One of the difficulties is that some tissue types exhibit anisotropic dielectric properties. Our work computes the currents induced by extremely low frequency (ELF) electric fields in anisotropic muscle tissue, using a human body model extended with muscle fibre orientations together with an extended version of the Finite Integration Technique (FIT) able to handle fully anisotropic dielectric properties.
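
    The anisotropy enters such a computation as a tensor conductivity built from the local fibre direction, so the induced current density J = sigma * E need not be parallel to the applied field. A minimal sketch follows (the conductivity values along and across the fibres are illustrative assumptions, not those of the cited model):

        import numpy as np

        def conductivity_tensor(fibre_dir, sigma_par=0.35, sigma_perp=0.11):
            """3x3 muscle conductivity tensor (S/m) for a given fibre direction."""
            a = np.asarray(fibre_dir, dtype=float)
            a /= np.linalg.norm(a)
            # sigma = sigma_perp * I + (sigma_par - sigma_perp) * a a^T
            return sigma_perp * np.eye(3) + (sigma_par - sigma_perp) * np.outer(a, a)

        e_field = np.array([0.0, 0.0, 1e-3])         # local ELF field (V/m)
        sigma = conductivity_tensor([1.0, 0.0, 1.0]) # fibres oblique to the field
        j = sigma @ e_field                          # current density (A/m^2)
        print(j)                                     # not parallel to e_field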

  20. The current status of the development of the technology on 3D computer simulation in Japan

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hee Reyoung; Park, Seung Kook; Chung, Un Soo; Jung, Ki Jung

    2002-05-01

    The development background and features of COSIDA, a 3D computer simulation system developed in Japan for analyzing dismantling procedures for nuclear facilities, are reviewed. The functions of its subsystems for work-area visualization, kinematics analysis and dismantling-scenario analysis were investigated. In the work-area visualization subsystem, the physical, geometrical and radiological properties of the work area are modelled in 2D or 3D. In the kinematics analysis subsystem, a command set covering basic work procedures was derived to control the motion of the models in a cyber space. The suitability of the command set was evaluated by applying COSIDA to programming, in the cyber space, the motions of the remote tools used to dismantle components of nuclear facilities.

  2. Computer aided-diagnosis of prostate cancer on multiparametric MRI: a technical review of current research.

    Science.gov (United States)

    Wang, Shijun; Burtt, Karen; Turkbey, Baris; Choyke, Peter; Summers, Ronald M

    2014-01-01

    Prostate cancer (PCa) is the most commonly diagnosed cancer among men in the United States. In this paper, we survey computer aided-diagnosis (CADx) systems that use multiparametric magnetic resonance imaging (MP-MRI) for detection and diagnosis of prostate cancer. We review and list mainstream techniques that are commonly utilized in image segmentation, registration, feature extraction, and classification. The performances of 15 state-of-the-art prostate CADx systems are compared through the area under their receiver operating characteristic curves (AUC). Challenges and potential directions to further the research of prostate CADx are discussed in this paper. Further improvements should be investigated to make prostate CADx systems useful in clinical practice.

  3. Current Advances in the Computational Simulation of the Formation of Low-Mass Stars

    Energy Technology Data Exchange (ETDEWEB)

    Klein, R I; Inutsuka, S; Padoan, P; Tomisaka, K

    2005-10-24

    Developing a theory of low-mass star formation (≈0.1 to 3 M☉) remains one of the most elusive and important goals of theoretical astrophysics. The star-formation process is the outcome of the complex dynamics of interstellar gas involving non-linear interactions of turbulence, gravity, magnetic field and radiation. The evolution of protostellar condensations, from the moment they are assembled by turbulent flows to the time they reach stellar densities, spans an enormous range of scales, resulting in a major computational challenge for simulations. Since the previous Protostars and Planets conference, dramatic advances in the development of new numerical algorithmic techniques have been successfully implemented on large scale parallel supercomputers. Among such techniques, Adaptive Mesh Refinement and Smoothed Particle Hydrodynamics have provided frameworks to simulate the process of low-mass star formation with a very large dynamic range. It is now feasible to explore the turbulent fragmentation of molecular clouds and the gravitational collapse of cores into stars self-consistently within the same calculation. The increased sophistication of these powerful methods comes with substantial caveats associated with the use of the techniques and the interpretation of the numerical results. In this review, we examine what has been accomplished in the field and present a critique of both numerical methods and scientific results. We stress that computational simulations should obey the available observational constraints and demonstrate numerical convergence. Failing this, results of large scale simulations do not advance our understanding of low-mass star formation.

  4. Optimisation of the usage of LHC and local computing resources in a multidisciplinary physics department hosting a WLCG Tier-2 centre

    Science.gov (United States)

    Barberis, Stefano; Carminati, Leonardo; Leveraro, Franco; Mazza, Simone Michele; Perini, Laura; Perlz, Francesco; Rebatto, David; Tura, Ruggero; Vaccarossa, Luca; Villaplana, Miguel

    2015-12-01

    We present the approach of the University of Milan Physics Department and the local unit of INFN to allow and encourage the sharing among different research areas of computing, storage and networking resources (the largest ones being those composing the Milan WLCG Tier-2 centre and tailored to the needs of the ATLAS experiment). Computing resources are organised as independent HTCondor pools, with a global master in charge of monitoring them and optimising their usage. The configuration has to provide satisfactory throughput for both serial and parallel (multicore, MPI) jobs. A combination of local, remote and cloud storage options are available. The experience of users from different research areas operating on this shared infrastructure is discussed. The promising direction of improving scientific computing throughput by federating access to distributed computing and storage also seems to fit very well with the objectives listed in the European Horizon 2020 framework for research and development.

  5. Dosage considerations for transcranial direct current stimulation in children: a computational modeling study.

    Directory of Open Access Journals (Sweden)

    Sudha Kilaru Kessler

    Transcranial direct current stimulation (tDCS) is being widely investigated in adults as a therapeutic modality for brain disorders involving abnormal cortical excitability or disordered network activity. Interest is also growing in studying tDCS in children. Limited empirical studies in children suggest that tDCS is well tolerated and may have a similar safety profile as in adults. However, in electrotherapy as in pharmacotherapy, dose selection in children requires special attention, and simple extrapolation from adult studies may be inadequate. Critical aspects of dose adjustment include (1) differences in neurophysiology and disease, and (2) variation in the brain electric field for a specified dose due to gross anatomical differences between children and adults. In this study, we used high-resolution MRI-derived finite element modeling simulations of two healthy children, ages 8 and 12 years, and three healthy adults with varying head size to compare differences in electric field intensity and distribution. Multiple conventional and high-definition tDCS montages were tested. Our results suggest that, on average, children will be exposed to higher peak electric fields for a given applied current intensity than adults, but there is likely to be overlap between adults with smaller head sizes and children. In addition, exposure is montage specific. Variations in peak electric fields were seen between the two pediatric models, despite comparable head sizes, suggesting that the relationship between neuroanatomic factors and bioavailable current dose is not trivial. In conclusion, caution is advised in using higher tDCS doses in children until (1) further modeling studies in a larger group shed light on the range of exposure possible by applied dose and age, and (2) further studies correlate bioavailable dose estimates from modeling studies with empirically tested physiologic effects, such as modulation of motor evoked potentials after stimulation.

  6. Water Resources Status and Availability Assessment in Current and Future Climate Change Scenarios for Beas River Basin of North Western Himalaya

    Science.gov (United States)

    Aggarwal, S. P.; Thakur, P. K.; Garg, V.; Nikam, B. R.; Chouksey, A.; Dhote, P.; Bhattacharya, T.

    2016-10-01

    The water resources status and availability of any river basin is of primary importance for its overall and sustainable development. This study assesses the status of water resources in the Beas river basin, located in the North Western Himalaya, under present and future climate change scenarios. A hydrological modelling approach is used to quantify the water balance components of the Beas river basin up to Pandoh. The variable infiltration capacity (VIC) model was run in energy balance mode at 1 km grid scale, with snow elevation zone files driving its snow module. The model was calibrated with National Centers for Environmental Prediction (NCEP) forcing data (Tmax, Tmin, rainfall and wind speed at 0.5° resolution) from 1 January 1999 to 31 December 2006. The additional component of glacier melt was added to the overall river runoff using a semi-empirical approach based on air temperature and on glacier type and extent data. The groundwater component is computed from the overall groundwater recharge by a water balance approach. The overall water balance is validated against river discharge data provided by the Bhakra Beas Management Board (BBMB) for 1994-2014. The VIC routing module was used to assess pixel-wise flow availability at daily, monthly and annual time scales. The mean monthly flow at Pandoh during the study period varied from 19 to 1581 m³/s in VIC and from 50 to 1556 m³/s in the observed data, with the minimum flow occurring in January and the maximum in August, and an annual R² of 0.68. The future climate change data are taken from the CORDEX database: runs of the NOAA-GFDL-ESM2M climate model for IPCC RCP scenarios 4.5 and 8.5 were used for South Asia on a 0.44° grid from 2006 to 2100. The climate forcing data for the VIC model were prepared using daily maximum and minimum near-surface air temperature, daily precipitation and
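
    The reported goodness of fit can be reproduced in a few lines once simulated and observed flow series are available. A sketch with invented monthly discharges follows (R² is computed here as 1 - SS_res/SS_tot, a Nash-Sutcliffe-style efficiency; the study's exact definition may differ):

        import numpy as np

        def r_squared(observed, simulated):
            obs, sim = np.asarray(observed, float), np.asarray(simulated, float)
            ss_res = np.sum((obs - sim) ** 2)
            ss_tot = np.sum((obs - obs.mean()) ** 2)
            return 1.0 - ss_res / ss_tot

        # Hypothetical mean monthly discharge (m^3/s), Jan..Dec.
        observed = [50, 60, 120, 300, 600, 900, 1300, 1556, 800, 300, 120, 60]
        simulated = [19, 45, 100, 280, 650, 950, 1250, 1581, 760, 280, 100, 40]
        print(f"R^2 = {r_squared(observed, simulated):.2f}")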

  7. Computation of antenna pattern correlation and MIMO performance by means of surface current distribution and spherical wave theory

    Directory of Open Access Journals (Sweden)

    O. Klemp

    2006-01-01

    In order to satisfy the stringent demand for an accurate prediction of MIMO channel capacity and diversity performance in wireless communications, more effective and suitable models that account for real antenna radiation behavior have to be taken into account. One of the main challenges is the accurate modeling of antenna correlation, which is directly related to the channel capacity or diversity gain that can be achieved in multi-element antenna configurations. Spherical wave theory in electromagnetics is a well-known technique for expressing antenna far fields by means of a compact field expansion with a reduced number of unknowns, and it was recently applied to derive an analytical approach for the computation of antenna pattern correlation. In this paper we present a novel and efficient computational technique that determines antenna pattern correlation from the evaluation of the surface current distribution by means of a spherical mode expansion.
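
    In that setting, the pattern correlation of two antennas can also be evaluated numerically as the magnitude-squared normalized inner product of their complex far-field patterns over the sphere. A sketch on a uniform angular grid with synthetic patterns follows (this brute-force quadrature stands in for, and does not reproduce, the paper's spherical-mode method):

        import numpy as np

        def pattern_correlation(e1, e2, theta):
            """Correlation of two complex far-field patterns on a (theta, phi) grid."""
            w = np.sin(theta)[:, None]                 # sin(theta) surface element
            num = np.abs(np.sum(e1 * np.conj(e2) * w)) ** 2
            den = np.sum(np.abs(e1) ** 2 * w) * np.sum(np.abs(e2) ** 2 * w)
            return num / den

        theta = np.linspace(1e-3, np.pi, 90)
        phi = np.linspace(0.0, 2.0 * np.pi, 180, endpoint=False)
        t, p = np.meshgrid(theta, phi, indexing="ij")
        e1 = np.sin(t) * np.exp(1j * p)                # synthetic pattern 1
        e2 = np.sin(t) * np.exp(2j * p)                # synthetic pattern 2
        print(f"correlation = {pattern_correlation(e1, e2, theta):.3f}")  # ~0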

  8. Exploring the structural requirements for jasmonates and related compounds as novel plant growth regulators: a current computational perspective.

    Science.gov (United States)

    Chen, Ke-Xian; Li, Zu-Guang

    2009-11-01

    Jasmonates and related compounds have recently been highlighted in plant physiology and plant molecular biology due to their significant regulatory roles in the signaling pathways for diverse aspects of plant development and survival. Although a considerable number of studies concerning their biological effects in different plants have been reported, the molecular details of the signaling mechanism are still poorly understood. This review sheds new light on the structural requirements for the bioactivity of jasmonic acid derivatives from a current computational perspective, which differs from previous research that has mainly focused on their biological evaluation, gene and metabolic regulation, and the enzymes of their biosynthesis. The computational results may contribute to further understanding the mechanism of drug-receptor interactions in the jasmonate signaling pathway and to designing novel plant growth regulators as highly effective, ecologically friendly pesticides.

  9. Robot-assisted and computer-enhanced therapies for children with cerebral palsy: current state and clinical implementation.

    Science.gov (United States)

    Meyer-Heim, Andreas; van Hedel, Hubertus J A

    2013-06-01

    The field of pediatric neurorehabilitation has evolved rapidly with the introduction of technological advances in recent years. Rehabilitation robotics and computer-assisted systems can complement conventional physiotherapy and occupational therapy. These systems appear promising, especially in children, where exciting and challenging virtual reality scenarios can increase the motivation to train intensively in a playful therapeutic environment. Despite promising experience and wide acceptance by patients and parents, only a few therapy systems have so far been evaluated in children, and well-designed randomized controlled studies in this field are still lacking. This narrative review aims to provide an overview of robot-assisted and computer-based therapies to date and the current level of evidence, and to share the authors' experience with the clinical implementation of these new technologies available for children with cerebral palsy.

  10. Evaluation of Current Computer Models Applied in the DOE Complex for SAR Analysis of Radiological Dispersion & Consequences

    Energy Technology Data Exchange (ETDEWEB)

    O' Kula, K. R. [Savannah River Site (SRS), Aiken, SC (United States); East, J. M. [Savannah River Site (SRS), Aiken, SC (United States); Weber, A. H. [Savannah River Site (SRS), Aiken, SC (United States); Savino, A. V. [Savannah River Site (SRS), Aiken, SC (United States); Mazzola, C. A. [Savannah River Site (SRS), Aiken, SC (United States)

    2003-01-01

    The evaluation of atmospheric dispersion/radiological dose analysis codes included fifteen models, identified either in authorization-basis safety analyses at DOE facilities or from regulatory and research agencies where past or current work warranted a model's inclusion. All computer codes examined were reviewed using general and specific evaluation criteria developed by the Working Group. The criteria were based on DOE Orders and other regulatory standards and guidance for performing bounding and conservative dose calculations, and fell into three categories: (1) Software Quality/User Interface; (2) Technical Model Adequacy; and (3) Application/Source Term Environment. A consensus-based, limited quantitative ranking process was used to establish an order of model preference, both overall and under specific conditions.

  11. Educational Experiences in Oceanography through Hands-On Involvement with Surface Drifters: an Introduction to Ocean Currents, Engineering, Data Collection, and Computer Science

    Science.gov (United States)

    Anderson, T.

    2015-12-01

    The Northeast Fisheries Science Center's (NEFSC) Student Drifters Program is providing education opportunities for students of all ages. Using GPS-tracked ocean drifters, various educational institutions can provide students with hands-on experience in physical oceanography, engineering, and computer science. In building drifters many high school and undergraduate students may focus on drifter construction, sometimes designing their own drifter or attempting to improve current NEFSC models. While learning basic oceanography younger students can build drifters with the help of an educator and directions available on the studentdrifters.org website. Once drifters are deployed, often by a local mariner or oceanographic partner, drifter tracks can be visualised on maps provided at http://nefsc.noaa.gov/drifter. With the lesson plans available for those interested in computer science, students may download, process, and plot the drifter position data with basic Python code provided. Drifter tracks help students to visualize ocean currents, and also allow them to understand real particle tracking applications such as in search and rescue, oil spill dispersion, larval transport, and the movement of injured sea animals. Additionally, ocean circulation modelers can use student drifter paths to validate their models. The Student Drifters Program has worked with over 100 schools, several of them having deployed drifters on the West Coast. Funding for the program often comes from individual schools and small grants but in the future will preferably come from larger government grants. NSF, Sea-Grant, NOAA, and EPA are all possible sources of funding, especially with the support of multiple schools and large marine education associations. The Student Drifters Program is a unique resource for educators, students, and scientists alike.
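
    In that spirit, a minimal sketch of the kind of plotting exercise described (the file name and column layout are assumptions for illustration, not the NEFSC data format):

        import csv
        import matplotlib.pyplot as plt

        # Read drifter fixes from a CSV with assumed columns: id, time, lat, lon.
        tracks = {}
        with open("drifter_positions.csv", newline="") as f:
            for row in csv.DictReader(f):
                lons, lats = tracks.setdefault(row["id"], ([], []))
                lons.append(float(row["lon"]))
                lats.append(float(row["lat"]))

        # One line per drifter; the along-track direction traces the current.
        for drifter_id, (lons, lats) in tracks.items():
            plt.plot(lons, lats, marker=".", markersize=2, label=drifter_id)
        plt.xlabel("Longitude")
        plt.ylabel("Latitude")
        plt.legend()
        plt.title("Student drifter tracks")
        plt.show()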

  12. Current trends in hardware and software for brain-computer interfaces (BCIs).

    Science.gov (United States)

    Brunner, P; Bianchi, L; Guger, C; Cincotti, F; Schalk, G

    2011-04-01

    A brain-computer interface (BCI) provides a non-muscular communication channel to people with and without disabilities. BCI devices consist of hardware and software. BCI hardware records signals from the brain, either invasively or non-invasively, using a series of device components. BCI software then translates these signals into device output commands and provides feedback. One may categorize different types of BCI applications into the following four categories: basic research, clinical/translational research, consumer products, and emerging applications. These four categories use BCI hardware and software, but have different sets of requirements. For example, while basic research needs to explore a wide range of system configurations, and thus requires a wide range of hardware and software capabilities, applications in the other three categories may be designed for relatively narrow purposes and thus may only need a very limited subset of capabilities. This paper summarizes technical aspects for each of these four categories of BCI applications. The results indicate that BCI technology is in transition from isolated demonstrations to systematic research and commercial development. This process requires several multidisciplinary efforts, including the development of better integrated and more robust BCI hardware and software, the definition of standardized interfaces, and the development of certification, dissemination and reimbursement procedures.

  13. Student use of computer tools designed to scaffold scientific problem-solving with hypermedia resources: A case study

    Science.gov (United States)

    Oliver, Kevin Matthew

    National science standards call for increasing student exposure to inquiry and real-world problem solving. Students can benefit from open-ended learning environments that stress the engagement of real problems and the development of thinking skills and processes. The Internet is an ideal resource for context-bound problems with its seemingly endless supply of resources. Problems may arise, however, since young students are cognitively ill-prepared to manage open-ended learning and may have difficulty processing hypermedia. Computer tools were used in a qualitative case study with 12 eighth graders to determine how such implements might support the process of solving open-ended problems. A preliminary study proposition suggested students would solve open-ended problems more appropriately if they used tools in a manner consistent with higher-order critical and creative thinking. Three research questions sought to identify: how students used tools, the nature of science learning in open-ended environments, and any personal or environmental barriers effecting problem solving. The findings were mixed. The participants did not typically use the tools and resources effectively. They successfully collected basic information, but infrequently organized, evaluated, generated, and justified their ideas. While the students understood how to use most tools procedurally, they lacked strategic understanding for why tool use was necessary. Students scored average to high on assessments of general content understanding, but developed artifacts suggesting their understanding of specific micro problems was naive and rife with misconceptions. Process understanding was also inconsistent, with some students describing basic problem solving processes, but most students unable to describe how tools could support open-ended inquiry. Barriers to effective problem solving were identified in the study. Personal barriers included naive epistemologies, while environmental barriers included a

  14. A GIS-based Computational Tool for Multidimensional Flow Velocity by Acoustic Doppler Current Profilers

    Science.gov (United States)

    Kim, D.; Winkler, M.; Muste, M.

    2015-06-01

    Acoustic Doppler Current Profilers (ADCPs) provide efficient and reliable flow measurements compared with other tools for characterizing riverine environments. In addition to the originally targeted discharge measurements, ADCPs are increasingly utilized to assess river flow characteristics. The newly developed VMS (Velocity Mapping Software) aims to provide an efficient process for quality assurance, to map velocity vectors for visualization, and to facilitate comparison with physical and numerical model results. VMS was designed to provide efficient and smooth workflows for processing groups of transects. The software allows the user to select a group of files and subsequently to conduct statistical and graphical quality assurance on the files as a group or individually, as appropriate. VMS also enables spatial averaging in the horizontal and vertical planes for ADCP data in a single transect or in multiple transects over the same or consecutive cross-sections. The analysis results are displayed in numerical and graphical formats.
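
    As an illustration of the vertical- and horizontal-plane averaging such a tool performs, a sketch with synthetic ADCP velocities (the ensembles-by-bins array layout and NaN masking are assumptions, not VMS's internal format):

        import numpy as np

        # Synthetic east-velocity transect (m/s): 200 ensembles x 30 depth bins,
        # with NaNs marking bins below the detected bottom.
        rng = np.random.default_rng(1)
        east = rng.normal(0.5, 0.1, (200, 30))
        east[:, 25:] = np.nan

        depth_avg = np.nanmean(east, axis=1)   # vertical-plane average per ensemble
        section_mean = depth_avg.mean()        # horizontal average over the transect
        print(f"cross-section mean east velocity: {section_mean:.2f} m/s")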

  15. Computation Of Transformer Losses Under The Effects Of Non-Sinusoidal Currents

    Directory of Open Access Journals (Sweden)

    Amit Gupta

    2011-12-01

    Transformers are normally designed and built for use at rated frequency with a perfectly sinusoidal load current. A non-linear load on a transformer leads to harmonic power losses, which cause increased operational costs and additional heating in power system components. This leads to higher losses, early fatigue of insulation, premature failure and reduction of the useful life of the transformer. To prevent these problems, the rated capacity of a transformer that supplies harmonic loads must be reduced. In this work a typical 100 kVA three-phase distribution transformer with real practical parameters is subjected to the non-linear loads generated by domestic loads. The equivalent losses and capacity of the distribution transformer are evaluated using the conventional method and also using a soft-computing technique, via a MATLAB simulation based on a valid model of the transformer under harmonic conditions. Finally, relations associated with transformer losses and life assessment are reviewed and analyzed, and the results obtained by the two methods are compared.
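
    The extra winding eddy-current heating that such a study quantifies is often summarized by a harmonic loss factor of the IEEE C57.110 type; a sketch with an assumed harmonic spectrum follows (the per-unit values are illustrative, not the paper's measured load):

        def harmonic_loss_factor(spectrum):
            """F_HL = sum(I_h^2 h^2) / sum(I_h^2) over {harmonic order: pu current}."""
            num = sum((i ** 2) * (h ** 2) for h, i in spectrum.items())
            den = sum(i ** 2 for i in spectrum.values())
            return num / den

        # Illustrative spectrum for a rectifier-rich domestic load.
        spectrum = {1: 1.00, 3: 0.35, 5: 0.20, 7: 0.10, 9: 0.05}
        f_hl = harmonic_loss_factor(spectrum)

        p_ec_rated = 0.10  # assumed per-unit eddy loss at rated sinusoidal current
        print(f"F_HL = {f_hl:.2f}; harmonic eddy loss = {f_hl * p_ec_rated:.3f} pu")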

  16. Information-seeking behavior and the use of online resources: a snapshot of current health sciences faculty.

    Science.gov (United States)

    De Groote, Sandra L; Shultz, Mary; Blecic, Deborah D

    2014-07-01

    The research assesses the information-seeking behaviors of health sciences faculty, including their use of online databases, journals, and social media. A survey was designed and distributed via email to 754 health sciences faculty at a large urban research university with 6 health sciences colleges. Twenty-six percent (198) of faculty responded. MEDLINE was the primary database utilized, with 78.5% respondents indicating they use the database at least once a week. Compared to MEDLINE, Google was utilized more often on a daily basis. Other databases showed much lower usage. Low use of online databases other than MEDLINE, link-out tools to online journals, and online social media and collaboration tools demonstrates a need for meaningful promotion of online resources and informatics literacy instruction for faculty. Library resources are plentiful and perhaps somewhat overwhelming. Librarians need to help faculty discover and utilize the resources and tools that libraries have to offer.

  17. Research on the Construction of Mobile Learning Resources Based on Cloud Computing

    Institute of Scientific and Technical Information of China (English)

    罗金玲

    2016-01-01

    Smart phones, handheld computers and other mobile terminals are now in ever wider use, yet the construction of mobile learning resource systems has developed comparatively slowly and cannot meet users' demand for mobile learning. This paper analyzes how cloud computing technology can be combined with the construction of mobile learning resources. It first introduces cloud computing technology and the current state of development of the mobile learning field, then discusses the necessity of combining cloud computing with mobile learning, and finally discusses how to build mobile learning resources supported by cloud computing.

  18. Current and future trends in educational computing: Implications for training language teachers

    Directory of Open Access Journals (Sweden)

    Jannie Botha

    2013-02-01

    In the first part of this paper an overview is given of the current state of educational technology as well as some future trends in this rapidly developing field. The focus is on developments with regard to hardware and software; it is pointed out that a clear distinction between hardware and software is not always possible. Specific reference is made to a hypertext programme and the promise it holds for the future. In the second part of the paper, against the background given in the first part, specific guidelines are proposed for training language teachers in computer-assisted language instruction (CALL).

  19. Economics and resources analysis of the potential use of reprocessing options by the current Spanish nuclear reactor park

    Energy Technology Data Exchange (ETDEWEB)

    Alvarez-Velarde, F.; Merino Rodriguez, I.; Gonzalez-Romero, E.

    2014-07-01

    Reprocessing of irradiated nuclear fuel serves multiple purposes, from Pu separation and recovery for MOX fuel fabrication to the reduction of high-level waste volume, and is currently implemented in several countries, such as France, Japan, Russia and the United Kingdom. This work explores the feasibility, in resource and economic terms, of implementing reprocessing for MOX fabrication in Spain. (Author)

  20. Development of online instructional resources for Earth system science education: An example of current practice from China

    Science.gov (United States)

    Dong, Shaochun; Xu, Shijin; Lu, Xiancai

    2009-06-01

    Educators around the world are striving to make science more accessible and relevant to students. Online instructional resources have become an integral component of tertiary science education and will continue to grow in influence and importance over the coming decades. This paper presents a case study in the iterative improvement of the online instructional resources provided for first-year undergraduates taking "Introductory Earth System Science" at Nanjing University in China. The online resources are used to support a student-centered learning model for Earth system science, resulting in a sustainable online instructional framework for students and instructors. The purpose of our practice is to make Earth system science education more accessible and exciting to students, to change instruction from a largely textbook-based, teacher-centered approach to a more interactive and student-centered one, and to promote the integration of knowledge and the development of deep understanding by students. An evaluation of learning performance and satisfaction, based on students' learning activities, was conducted to identify helpful components and perceptions. The feedback indicates that the use of online instructional resources has a positive impact on mitigating the challenges of Earth system science education and the potential to promote deep learning.