WorldWideScience

Sample records for chelonia storage cloud

  1. Chelonia: A self-healing, replicated storage system

    International Nuclear Information System (INIS)

    Kerr Nilsen, Jon; Read, Alex; Toor, Salman; Nagy, Zsombor

    2011-01-01

    Chelonia is a novel grid storage system designed to fill the requirements gap between those of large, sophisticated scientific collaborations which have adopted the grid paradigm for their distributed storage needs, and those of corporate business communities gravitating towards the cloud paradigm. Chelonia is an integrated system of heterogeneous, geographically dispersed storage sites which is easily and dynamically expandable and optimized for high availability and scalability. The architecture and implementation in terms of web services running inside the Advanced Resource Connector Hosting Environment Daemon (ARC HED) are described, and results of tests in both local-area and wide-area networks that demonstrate the fault tolerance, stability and scalability of Chelonia are presented. In addition, example setups for production deployments for small and medium-sized VOs are described.

  2. Chelonia: A self-healing, replicated storage system

    Science.gov (United States)

    Kerr Nilsen, Jon; Toor, Salman; Nagy, Zsombor; Read, Alex

    2011-12-01

    Chelonia is a novel grid storage system designed to fill the requirements gap between those of large, sophisticated scientific collaborations which have adopted the grid paradigm for their distributed storage needs, and those of corporate business communities gravitating towards the cloud paradigm. Chelonia is an integrated system of heterogeneous, geographically dispersed storage sites which is easily and dynamically expandable and optimized for high availability and scalability. The architecture and implementation in terms of web services running inside the Advanced Resource Connector Hosting Environment Daemon (ARC HED) are described, and results of tests in both local-area and wide-area networks that demonstrate the fault tolerance, stability and scalability of Chelonia are presented. In addition, example setups for production deployments for small and medium-sized VOs are described.

  3. Benchmarking Cloud Storage Systems

    OpenAIRE

    Wang, Xing

    2014-01-01

    With the rise of cloud computing, many cloud storage systems like Dropbox, Google Drive and Mega have been built to provide decentralized and reliable file storage. It is thus of prime importance to know their features, performance, and the best way to make use of them. In this context, we introduce BenchCloud, a tool designed as part of this thesis to conveniently and efficiently benchmark any cloud storage system. First, we provide a study of six commonly-used cloud storage systems to ident...
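
    At its core, benchmarking a storage service means timing the same operation across payload sizes. The sketch below shows the general shape of such a harness; it is not BenchCloud's actual code, and the backend is a hypothetical in-memory stand-in, since real providers require credentials.

```python
# Minimal storage-benchmark harness: time one operation over payload sizes.
# `op` is any callable that uploads/downloads a bytes payload.
import time

def benchmark(op, payload_sizes):
    """Return {payload_size: elapsed_seconds} for one storage operation."""
    results = {}
    for size in payload_sizes:
        payload = b"x" * size
        start = time.perf_counter()
        op(payload)
        results[size] = time.perf_counter() - start
    return results

# Dummy in-memory "cloud" standing in for a real upload call.
store = {}
def fake_upload(data):
    store[len(store)] = data

timings = benchmark(fake_upload, [1024, 1024 * 1024])
```

    Against a real provider, `fake_upload` would be replaced by the service's upload call, and each size would be repeated several times to average out network jitter.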

  4. PC-Cluster based Storage System Architecture for Cloud Storage

    OpenAIRE

    Yee, Tin Tin; Naing, Thinn Thu

    2011-01-01

    The design and architecture of a cloud storage system play a vital role in cloud computing infrastructure, improving both storage capacity and cost effectiveness. A cloud storage system usually provides users with efficient, elastic storage space. One of the challenges of such a system is that it is difficult to balance providing huge elastic storage capacity against the expensive investment this requires. In order to solve this issue in the cloud storage infrastructure, low ...

  5. CLOUD STORAGE SERVICES

    OpenAIRE

    Yan, Cheng

    2017-01-01

    Cloud computing is a hot topic in recent research and applications, because it is widely used in various fields. Up to now, Google, Microsoft, IBM, Amazon and other famous companies have proposed their own cloud computing applications and regard cloud computing as one of the most important strategies for the future. Cloud storage is the lower layer of a cloud computing system, which supports the services of the other layers above it. At the same time, it is an effective way to store and manage heavy...

  6. Entangled Cloud Storage

    DEFF Research Database (Denmark)

    Ateniese, Giuseppe; Dagdelen, Özgür; Damgård, Ivan Bjerre

    2012-01-01

    Entangled cloud storage enables a set of clients {P_i} to “entangle” their files {f_i} into a single clew c to be stored by a (potentially malicious) cloud provider S. The entanglement makes it impossible to modify or delete a significant part of the clew without affecting all files in c. A clew keeps the files in it private but still lets each client P_i recover his own data by interacting with S; no cooperation from other clients is needed. At the same time, the cloud provider is discouraged from altering or overwriting any significant part of c, as this will imply that none of the clients can recover their files. We provide theoretical foundations for entangled cloud storage, introducing the notion of an entangled encoding scheme that guarantees strong security requirements capturing the properties above. We also give a concrete construction based on privacy-preserving polynomial interpolation...
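
    The polynomial-interpolation idea behind such an encoding can be illustrated in a few lines: each client's datum becomes a point, the clew is the coefficient vector of the unique interpolating polynomial, and tampering with any coefficient perturbs every client's recovery. This is only a toy model of the all-or-nothing property (small field, no privacy preservation), not the paper's construction.

```python
# Toy "entanglement" via polynomial interpolation over a prime field.
# Illustrative only: the real scheme adds privacy-preserving machinery.
P = 2**61 - 1  # prime modulus for the toy field

def poly_mul_linear(poly, c):
    """Multiply polynomial `poly` (low-to-high coefficients) by (x - c) mod P."""
    out = [0] * (len(poly) + 1)
    for k, a in enumerate(poly):
        out[k] = (out[k] - c * a) % P
        out[k + 1] = (out[k + 1] + a) % P
    return out

def entangle(files):
    """files: {client_position: datum}. Returns the 'clew': coefficients of
    the unique polynomial passing through all (position, datum) points,
    built by Lagrange interpolation."""
    pts = sorted(files.items())
    clew = [0] * len(pts)
    for i, (xi, yi) in enumerate(pts):
        basis, denom = [1], 1
        for j, (xj, _) in enumerate(pts):
            if j != i:
                basis = poly_mul_linear(basis, xj)
                denom = denom * (xi - xj) % P
        scale = yi * pow(denom, P - 2, P) % P  # divide via Fermat inverse
        for k, b in enumerate(basis):
            clew[k] = (clew[k] + scale * b) % P
    return clew

def recover(clew, x):
    """Client at position x recovers its datum by evaluating the clew (Horner)."""
    acc = 0
    for c in reversed(clew):
        acc = (acc * x + c) % P
    return acc
```

    Modifying even one coefficient of the clew shifts every client's recovered value, which is exactly the deterrent against a tampering provider that the abstract describes.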

  7. Searchable Encryption in Cloud Storage

    OpenAIRE

    Ren-Junn Hwang; Chung-Chien Lu; Jain-Shing Wu

    2014-01-01

    Cloud outsourced storage is one of the important services in cloud computing. Cloud users upload data to cloud servers to reduce the cost of managing data and maintaining hardware and software. To ensure data confidentiality, users can encrypt their files before uploading them to a cloud system. However, retrieving exactly the target file from among the encrypted files is difficult for the cloud server. This study proposes a protocol for performing multi-keyword searches over encrypted cloud data by applying ...

  8. Locally Minimum Storage Regenerating Codes in Distributed Cloud Storage Systems

    Institute of Scientific and Technical Information of China (English)

    Jing Wang; Wei Luo; Wei Liang; Xiangyang Liu; Xiaodai Dong

    2017-01-01

    In distributed cloud storage systems, multiple node failures inevitably occur at the same time. The existing regenerating codes, including minimum storage regenerating (MSR) codes and minimum bandwidth regenerating (MBR) codes, are mainly designed to repair a single failed node or several failed nodes, and cannot meet the repair needs of distributed cloud storage systems. In this paper, we present locally minimum storage regenerating (LMSR) codes to recover multiple failed nodes at the same time. Specifically, the nodes in distributed cloud storage systems are divided into multiple local groups, and in each local group (4, 2) or (5, 3) MSR codes are constructed. Moreover, the grouping method of storage nodes and the repairing process of failed nodes in local groups are studied. Theoretical analysis shows that LMSR codes can achieve the same storage overhead as MSR codes. Furthermore, we verify by means of simulation that, compared with MSR codes, LMSR codes can reduce the repair bandwidth and disk I/O overhead effectively.
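
    The storage/bandwidth trade-off behind this line of work can be made concrete with the standard cut-set formulas for regenerating codes: per-node storage α and single-node repair bandwidth γ for a file of size M stored over k data nodes with d repair helpers. The functions below encode those textbook formulas, not the LMSR construction itself.

```python
# Per-node storage (alpha) and single-node repair bandwidth (gamma)
# at the two extreme points of the regenerating-code trade-off curve.
from fractions import Fraction

def msr_point(M, k, d):
    """(alpha, gamma) at the minimum-storage-regenerating point."""
    alpha = Fraction(M, k)
    gamma = Fraction(M * d, k * (d - k + 1))
    return alpha, gamma

def mbr_point(M, k, d):
    """(alpha, gamma) at the minimum-bandwidth-regenerating point,
    where repair bandwidth equals the amount stored per node."""
    alpha = Fraction(2 * M * d, k * (2 * d - k + 1))
    return alpha, alpha

# Example: M = 8 units on a (4, 2) code, repairing with d = 3 helpers.
msr_alpha, msr_gamma = msr_point(8, 2, 3)  # minimal storage, larger repair
mbr_alpha, mbr_gamma = mbr_point(8, 2, 3)  # minimal repair, larger storage
```

    MSR codes minimize α (hence the "same storage overhead" claim above), while MBR codes minimize γ; LMSR aims to keep MSR-level storage while cutting repair traffic by confining repairs to local groups.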

  9. Heterogeneous Data Storage Management with Deduplication in Cloud Computing

    OpenAIRE

    Yan, Zheng; Zhang, Lifang; Ding, Wenxiu; Zheng, Qinghua

    2017-01-01

    Cloud storage, as one of the most important services of cloud computing, helps cloud users break the bottleneck of restricted resources and expand their storage without upgrading their devices. In order to guarantee the security and privacy of cloud users, data are always outsourced in encrypted form. However, encrypted data can incur much waste of cloud storage and complicate data sharing among authorized users. We are still facing challenges in encrypted data storage and management with ...

  10. Critical Factors for Personal Cloud Storage Adoption in China

    Directory of Open Access Journals (Sweden)

    Jianya Wang

    2016-06-01

    Full Text Available Purpose: In order to explain and predict the adoption of personal cloud storage, this study explores the critical factors involved in the adoption of personal cloud storage and empirically validates their relationships to a user's intentions. Design/methodology/approach: Based on the technology acceptance model (TAM), network externality, trust, and an interview survey, this study proposes a personal cloud storage adoption model. We conducted an empirical analysis by structural equation modeling based on survey data obtained with a questionnaire. Findings: Among the adoption factors we identified, network externality has the most salient influence on a user's adoption intention, followed by perceived usefulness, individual innovation, perceived trust, perceived ease of use, and subjective norms. Cloud storage characteristics are the most important indirect factors, followed by awareness of personal cloud storage and perceived risk. However, although perceived risk is regarded as an important factor by other cloud computing researchers, we found that it has no significant influence. Also, subjective norms have no significant influence on perceived usefulness. This indicates that users are rational when they choose whether to adopt personal cloud storage. Research limitations: This study ignores time and cost factors that might affect a user's intention to adopt personal cloud storage. Practical implications: Our findings might be helpful in designing and developing personal cloud storage products, and helpful to regulators crafting policies. Originality/value: This study is one of the first research efforts to discuss Chinese users' personal cloud storage adoption, which should help to further the understanding of personal cloud adoption behavior among Chinese users.

  11. Cloud Storage and Bioinformatics in a private cloud deployment: Lessons for Data Intensive research

    OpenAIRE

    Chang, Victor; Walters, Robert John; Wills, Gary

    2013-01-01

    This paper describes service portability for a private cloud deployment, including a detailed case study about Cloud Storage and bioinformatics services developed as part of the Cloud Computing Adoption Framework (CCAF). Our Cloud Storage design and deployment is based on Storage Area Network (SAN) technologies, details of which include functionalities, technical implementation, architecture and user support. Experiments for data services (backup automation, data recovery and data migration) ...

  12. Inside Dropbox: Understanding Personal Cloud Storage Services

    NARCIS (Netherlands)

    Drago, Idilio; Mellia, Marco; Munafò, Maurizio M.; Sperotto, Anna; Sadre, R.; Pras, Aiko

    2012-01-01

    Personal cloud storage services are gaining popularity. With a rush of providers to enter the market and an increasing offer of cheap storage space, it is to be expected that cloud storage will soon generate a high amount of Internet traffic. Very little is known about the architecture and the

  13. Integration of End-User Cloud Storage for CMS Analysis

    CERN Document Server

    Riahi, Hassen; Álvarez Ayllón, Alejandro; Balcas, Justas; Ciangottini, Diego; Hernández, José M; Keeble, Oliver; Magini, Nicolò; Manzi, Andrea; Mascetti, Luca; Mascheroni, Marco; Tanasijczuk, Andres Jorge; Vaandering, Eric Wayne

    2018-01-01

    End-user Cloud storage is increasing rapidly in popularity in research communities thanks to the collaboration capabilities it offers, namely synchronisation and sharing. CERN IT has implemented a model of such storage named CERNBox, integrated with the CERN AuthN and AuthZ services. To exploit the use of end-user Cloud storage for distributed data analysis activity, the CMS experiment has started the integration of CERNBox as a Grid resource. This will allow CMS users to make use of their own storage in the Cloud for their analysis activities, as well as to benefit from synchronisation and sharing capabilities to achieve results faster and more effectively. It will provide an integration model of Cloud storage in the Grid, which is implemented and commissioned over the world’s largest computing Grid infrastructure, the Worldwide LHC Computing Grid (WLCG). In this paper, we present the integration strategy and infrastructure changes needed in order to transparently integrate end-user Cloud storage with...

  14. Keyword-based Ciphertext Search Algorithm under Cloud Storage

    Directory of Open Access Journals (Sweden)

    Ren Xunyi

    2016-01-01

    Full Text Available With the development of network storage services, cloud storage has the advantages of high scalability, low cost, unrestricted access and easy management. These advantages lead more and more small and medium enterprises to outsource large quantities of data to a third party. This approach frees small and medium enterprises from the costs of construction and maintenance, so it has broad market prospects. But at present many cloud storage service providers cannot protect data security, which results in leakage of user data, so many users still have to use traditional storage methods. This has become one of the important factors hindering the development of cloud storage. In this article, a keyword index is established by extracting keywords from the ciphertext data. After that, the encrypted data and the encrypted index are uploaded to the cloud server together. Users retrieve the related ciphertext by searching the encrypted index, which addresses the data leakage problem.
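
    The encrypted-index idea can be sketched with keyed hashes: the server stores only HMAC-derived tokens, so a search reveals neither the keyword nor the plaintext. This is a generic illustration of the searchable-encryption pattern, not the article's exact algorithm (document encryption itself is omitted here).

```python
# Encrypted keyword index: the server sees HMAC tokens, never keywords.
import hashlib
import hmac

def token(key: bytes, keyword: str) -> bytes:
    """Deterministic search token for a keyword under a secret key."""
    return hmac.new(key, keyword.lower().encode(), hashlib.sha256).digest()

def build_index(key, docs):
    """docs: {doc_id: text}. Returns {token: [doc_ids]} — safe to upload."""
    index = {}
    for doc_id, text in docs.items():
        for word in set(text.lower().split()):
            index.setdefault(token(key, word), []).append(doc_id)
    return index

def search(index, key, keyword):
    """The client derives the token locally; the server only matches bytes."""
    return sorted(index.get(token(key, keyword), []))

key = b"\x00" * 32  # demo key; use a random secret in practice
docs = {1: "cloud storage security", 2: "traditional storage method"}
index = build_index(key, docs)
```

    A real scheme would also hide result patterns and index sizes; this sketch only shows why the server can match queries without learning their content.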

  15. Security for cloud storage systems

    CERN Document Server

    Yang, Kan

    2014-01-01

    Cloud storage is an important service of cloud computing, which offers data owners a service for hosting their data in the cloud. This new paradigm of data hosting and data access services introduces two major security concerns. The first is the protection of data integrity: data owners may not fully trust the cloud server and may worry that data stored in the cloud could be corrupted or even removed. The second is data access control: data owners may worry that some dishonest servers provide data access to unauthorized users for profit, and thus they can no longer rely on the servers
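
    The first concern, integrity protection, is typically handled with challenge-response auditing: the verifier sends a fresh nonce so the server cannot replay a precomputed answer. Below is a minimal hash-based sketch of that flow; real auditing schemes use homomorphic authenticators precisely so the verifier need not hold a copy of the data, which this toy version still does.

```python
# Challenge-response integrity spot check (toy version).
import hashlib
import secrets

def prove(block: bytes, nonce: bytes) -> bytes:
    """The server's response: a hash bound to the fresh nonce."""
    return hashlib.sha256(nonce + block).digest()

def audit(stored_block: bytes, reference_block: bytes) -> bool:
    """Verifier challenges with a random nonce and checks the response."""
    nonce = secrets.token_bytes(16)
    return prove(stored_block, nonce) == prove(reference_block, nonce)
```

    Because the nonce changes on every audit, a server that has discarded or corrupted the block cannot answer correctly by caching old responses.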

  16. Analysis of multi cloud storage applications for resource constrained mobile devices

    Directory of Open Access Journals (Sweden)

    Rajeev Kumar Bedi

    2016-09-01

    Full Text Available Cloud storage, which can be a surrogate for all physical hardware storage devices, is a term which gives a reflection of an enormous advancement in engineering (Hung et al., 2012). However, there are many issues that need to be handled when accessing cloud storage on resource constrained mobile devices, due to inherent limitations of mobile devices such as limited storage capacity, processing power and battery backup (Yeo et al., 2014). There are many multi cloud storage applications available, which handle issues faced by single cloud storage applications. In this paper, we provide an analysis of different multi cloud storage applications developed for resource constrained mobile devices to check their performance on the basis of parameters such as battery consumption, CPU usage, data usage and time consumed, using a Sony Xperia ZL smartphone on a WiFi network. Lastly, conclusions and open research challenges in these multi cloud storage apps are discussed.

  17. Implementation of High-Availability Cloud Storage Using OwnCloud

    Directory of Open Access Journals (Sweden)

    Ikhwan Ar-Razy

    2016-04-01

    Full Text Available The implementation of practicum courses in the Department of Computer Engineering, Diponegoro University, has some drawbacks; one of them is that many lab assistants and practitioners experience difficulties with archiving. One solution to this problem is to implement a shared file storage system that is easy to use and can be accessed by both practitioners and lab assistants. The purpose of this research is to build a cloud-based storage system that is reliable, resistant to hardware failure, and highly available. This purpose is achieved through an appropriately designed methodology. The result of this research is a storage system implemented on the server side using virtualization and data replication (DRBD) as the storage method. The system is composed of two physical servers and one virtual server. The physical servers use Proxmox VE as the operating system and the virtual server uses Ubuntu Server. The OwnCloud application and files are stored in the virtual server. The file storage system has several major functions: upload, download, user management, remove, and restore. These functions are executed through web pages, a desktop application and an Android application.

  18. Towards Efficient Scientific Data Management Using Cloud Storage

    Science.gov (United States)

    He, Qiming

    2013-01-01

    A software prototype allows users to back up and restore data to/from both public and private cloud storage such as Amazon's S3 and NASA's Nebula. Unlike other off-the-shelf tools, this software ensures user data security in the cloud (through encryption), and minimizes users' operating costs by using space- and bandwidth-efficient compression and incremental backup. Parallel data processing utilities have also been developed by using massively scalable cloud computing in conjunction with cloud storage. One of the innovations in this software is using modified open source components to work with a private cloud like NASA Nebula. Another innovation is porting the complex backup-to-cloud software to embedded Linux, running on home networking devices, in order to benefit more users.

  19. ID based cryptography for secure cloud data storage

    OpenAIRE

    Kaaniche , Nesrine; Boudguiga , Aymen; Laurent , Maryline

    2013-01-01

    International audience; This paper addresses the security issues of storing sensitive data in a cloud storage service and the need for users to trust the commercial cloud providers. It proposes a cryptographic scheme for cloud storage, based on an original usage of ID-Based Cryptography. Our solution has several advantages. First, it provides secrecy for encrypted data which are stored in public servers. Second, it offers controlled data access and sharing among users, so that unauthorized us...

  20. Benchmarking personal cloud storage

    NARCIS (Netherlands)

    Drago, Idilio; Bocchi, Enrico; Mellia, Marco; Slatman, Herman; Pras, Aiko

    2013-01-01

    Personal cloud storage services are data-intensive applications already producing a significant share of Internet traffic. Several solutions offered by different companies attract more and more people. However, little is known about each service capabilities, architecture and - most of all -

  1. Notes on a storage manager for the Clouds kernel

    Science.gov (United States)

    Pitts, David V.; Spafford, Eugene H.

    1986-01-01

    The Clouds project is research directed towards producing a reliable distributed computing system. The initial goal is to produce a kernel which provides a reliable environment with which a distributed operating system can be built. The Clouds kernel consists of a set of replicated subkernels, each of which runs on a machine in the Clouds system. Each subkernel is responsible for the management of resources on its machine; the subkernel components communicate to provide the cooperation necessary to meld the various machines into one kernel. The implementation of a kernel-level storage manager that supports reliability is documented. The storage manager is a part of each subkernel and maintains the secondary storage residing at each machine in the distributed system. In addition to providing the usual data transfer services, the storage manager ensures that data being stored survives machine and system crashes, and that the secondary storage of a failed machine is recovered (made consistent) automatically when the machine is restarted. Since the storage manager is part of the Clouds kernel, efficiency of operation is also a concern.

  2. Using Cloud-based Storage Technologies for Earth Science Data

    Science.gov (United States)

    Michaelis, A.; Readey, J.; Votava, P.

    2016-12-01

    Cloud based infrastructure may offer several key benefits of scalability, built-in redundancy and reduced total cost of ownership as compared with a traditional data center approach. However, most of the tools and software systems developed for NASA data repositories were not developed with a cloud based infrastructure in mind and do not fully take advantage of commonly available cloud-based technologies. Object storage services are provided through all the leading public (Amazon Web Services, Microsoft Azure, Google Cloud, etc.) and private (OpenStack) clouds, and may provide a more cost-effective means of storing large data collections online. We describe a system that utilizes object storage rather than traditional file system based storage to vend earth science data. The system described is not only cost effective, but shows superior performance for running many different analytics tasks in the cloud. To enable compatibility with existing tools and applications, we outline client libraries that are API compatible with existing libraries for HDF5 and NetCDF4. Performance of the system is demonstrated using cloud services running on Amazon Web Services.
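
    A key enabler for HDF5/NetCDF-style access on object stores is the HTTP range request: a client library fetches only the byte ranges holding the chunks it needs instead of downloading whole files. The toy stand-in below mimics an S3 range GET against an in-memory object; the helper names are illustrative, not any real client library's API.

```python
# Chunked reads over an object store, simulated locally.

def range_get(obj: bytes, start: int, length: int) -> bytes:
    """Stand-in for `GET ... Range: bytes=start-(start+length-1)`."""
    return obj[start:start + length]

# Suppose a dataset is stored as fixed-size chunks laid out back to back;
# a tiny chunk index (here just arithmetic) maps chunk number -> offset.
CHUNK = 4

def read_chunk(obj: bytes, chunk_no: int) -> bytes:
    """Fetch exactly one chunk's bytes, never the whole object."""
    return range_get(obj, chunk_no * CHUNK, CHUNK)

# An 8-chunk object where chunk i is filled with byte value i.
blob = b"".join(bytes([i] * CHUNK) for i in range(8))
```

    Real libraries add a metadata layer (chunk shapes, compression, B-tree indices), but the cost model is the same: one small ranged request per needed chunk.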

  3. Scalable cloud without dedicated storage

    Science.gov (United States)

    Batkovich, D. V.; Kompaniets, M. V.; Zarochentsev, A. K.

    2015-05-01

    We present a prototype of a scalable computing cloud. It is intended to be deployed on the basis of a cluster without separate dedicated storage; the dedicated storage is replaced by distributed software storage. In addition, all cluster nodes are used both as computing nodes and as storage nodes. This solution increases utilization of the cluster resources as well as improving fault tolerance and performance of the distributed storage. Another advantage of this solution is high scalability with a relatively low initial and maintenance cost. The solution is built on the basis of open source components such as OpenStack, Ceph, etc.

  4. Security and efficiency data sharing scheme for cloud storage

    International Nuclear Information System (INIS)

    Han, Ke; Li, Qingbo; Deng, Zhongliang

    2016-01-01

    With the adoption and diffusion of the data sharing paradigm in cloud storage, there have been increasing demands and concerns for shared data security. Ciphertext-Policy Attribute-Based Encryption (CP-ABE) is becoming a promising cryptographic solution to the security problem of shared data in cloud storage. However, due to key escrow, backward security and inefficiency problems, existing CP-ABE schemes cannot be directly applied to cloud storage systems. In this paper, an effective and secure access control scheme for shared data is proposed to solve those problems. The proposed scheme refines the security of existing CP-ABE based schemes. Specifically, the key escrow and collusion problems are addressed by dividing the key generation center into several distributed semi-trusted parts. Moreover, a secrecy revocation algorithm is proposed to address not only backward secrecy but also the efficiency problems in existing CP-ABE based schemes. Furthermore, security and performance analyses indicate that the proposed scheme is both secure and efficient for cloud storage.

  5. Integration of cloud-based storage in BES III computing environment

    International Nuclear Information System (INIS)

    Wang, L; Hernandez, F; Deng, Z

    2014-01-01

    We present on-going work that aims to evaluate the suitability of cloud-based storage as a supplement to the Lustre file system for storing experimental data for the BES III physics experiment, and as a backend for storing files belonging to individual members of the collaboration. In particular, we discuss our findings regarding the support of cloud-based storage in the software stack of the experiment. We report on our development work that improves the support of CERN's ROOT data analysis framework and allows efficient remote access to data through several cloud storage protocols. We also present our efforts to provide the experiment with efficient command line tools for navigating and interacting with cloud storage-based data repositories, both from interactive sessions and grid jobs.

  6. Proactive replica checking to assure reliability of data in cloud storage with minimum replication

    Science.gov (United States)

    Murarka, Damini; Maheswari, G. Uma

    2017-11-01

    The two major issues for cloud storage systems are data reliability and storage costs. For data reliability protection, the multi-replica replication strategy mostly used in current clouds incurs huge storage consumption, leading to a large storage cost specifically for applications within the cloud. This paper presents a cost-efficient data reliability mechanism named PRCR to cut back cloud storage consumption. PRCR ensures the reliability of large cloud datasets with a replication degree that can also serve as a cost-effective benchmark for replication. The evaluation shows that, compared to the conventional three-replica approach, PRCR can maintain data reliability while consuming only a fraction of the storage, hence considerably minimizing the cloud storage cost.

  7. Partial storage optimization and load control strategy of cloud data centers.

    Science.gov (United States)

    Al Nuaimi, Klaithem; Mohamed, Nader; Al Nuaimi, Mariam; Al-Jaroodi, Jameela

    2015-01-01

    We present a novel approach to solve cloud storage issues and provide a fast load balancing algorithm. Our approach is based on partitioning and concurrent dual-direction download of files from multiple cloud nodes. Partitions of the files, rather than the full files, are saved on the cloud, which provides a good optimization of cloud storage usage. Only partial replication is used in this algorithm to ensure the reliability and availability of the data. Our focus is to improve performance and optimize storage usage by providing DaaS on the cloud. This algorithm solves the problem of having to fully replicate large data sets, which uses up a lot of precious space on the cloud nodes. Reducing the space needed will help in reducing the cost of providing such space. Moreover, performance is also increased, since multiple cloud servers will collaborate to provide the data to the cloud clients in a faster manner.
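
    The dual-direction idea can be sketched with two workers that claim partitions from opposite ends of the file until they meet in the middle. Here the "cloud nodes" are simulated by a local list of byte strings; this illustrates the scheduling pattern, not the paper's implementation.

```python
# Concurrent dual-direction download of file partitions (simulated).
import threading

def dual_direction_download(parts):
    """parts: list of byte-string partitions as held by cloud nodes.
    Two workers fetch from the head and tail simultaneously; returns
    the reassembled file."""
    result = [None] * len(parts)
    lock = threading.Lock()
    front, back = 0, len(parts) - 1

    def worker(from_front):
        nonlocal front, back
        while True:
            with lock:  # claim the next unclaimed index from one end
                if front > back:
                    return
                i = front if from_front else back
                if from_front:
                    front += 1
                else:
                    back -= 1
            result[i] = parts[i]  # stands in for a network fetch

    t1 = threading.Thread(target=worker, args=(True,))
    t2 = threading.Thread(target=worker, args=(False,))
    t1.start(); t2.start(); t1.join(); t2.join()
    return b"".join(result)
```

    Whichever worker is faster simply claims more partitions, so the meeting point adapts to the relative speed of the two source nodes — the load-balancing effect the abstract describes.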

  8. Partial Storage Optimization and Load Control Strategy of Cloud Data Centers

    Science.gov (United States)

    2015-01-01

    We present a novel approach to solve cloud storage issues and provide a fast load balancing algorithm. Our approach is based on partitioning and concurrent dual-direction download of files from multiple cloud nodes. Partitions of the files, rather than the full files, are saved on the cloud, which provides a good optimization of cloud storage usage. Only partial replication is used in this algorithm to ensure the reliability and availability of the data. Our focus is to improve performance and optimize storage usage by providing DaaS on the cloud. This algorithm solves the problem of having to fully replicate large data sets, which uses up a lot of precious space on the cloud nodes. Reducing the space needed will help in reducing the cost of providing such space. Moreover, performance is also increased, since multiple cloud servers will collaborate to provide the data to the cloud clients in a faster manner. PMID:25973444

  9. Evaluation of the Huawei UDS cloud storage system for CERN specific data

    International Nuclear Information System (INIS)

    Resines, M Zotes; Hughes, J; Wang, L; Heikkila, S S; Duellmann, D; Adde, G; Toebbicke, R

    2014-01-01

    Cloud storage is an emerging architecture aiming to provide increased scalability and access performance compared to more traditional solutions. CERN is evaluating this promise using Huawei UDS and OpenStack SWIFT storage deployments, focusing on the needs of high-energy physics. Both deployed setups implement S3, one of the protocols that are emerging as a standard in the cloud storage market. A set of client machines is used to generate I/O load patterns to evaluate the storage system performance. The presented read and write test results indicate scalability from both metadata and data perspectives. Further, the Huawei UDS cloud storage is shown to be able to recover from a major failure involving the loss of 16 disks. Both cloud storage systems are finally demonstrated to function as back-end storage systems to a filesystem, which is used to deliver high energy physics software.

  10. Evaluation of the Huawei UDS cloud storage system for CERN specific data

    Science.gov (United States)

    Zotes Resines, M.; Heikkila, S. S.; Duellmann, D.; Adde, G.; Toebbicke, R.; Hughes, J.; Wang, L.

    2014-06-01

    Cloud storage is an emerging architecture aiming to provide increased scalability and access performance compared to more traditional solutions. CERN is evaluating this promise using Huawei UDS and OpenStack SWIFT storage deployments, focusing on the needs of high-energy physics. Both deployed setups implement S3, one of the protocols that are emerging as a standard in the cloud storage market. A set of client machines is used to generate I/O load patterns to evaluate the storage system performance. The presented read and write test results indicate scalability from both metadata and data perspectives. Further, the Huawei UDS cloud storage is shown to be able to recover from a major failure involving the loss of 16 disks. Both cloud storage systems are finally demonstrated to function as back-end storage systems to a filesystem, which is used to deliver high energy physics software.

  11. Searchable Data Vault: Encrypted Queries in Secure Distributed Cloud Storage

    Directory of Open Access Journals (Sweden)

    Geong Sen Poh

    2017-05-01

    Full Text Available Cloud storage services allow users to efficiently outsource their documents anytime and anywhere. Such convenience, however, leads to privacy concerns. While storage providers may not read users’ documents, attackers may possibly gain access by exploiting vulnerabilities in the storage system. Documents may also be leaked by curious administrators. A simple solution is for the user to encrypt all documents before submitting them. This method, however, makes it impossible to efficiently search for documents as they are all encrypted. To resolve this problem, we propose a multi-server searchable symmetric encryption (SSE scheme and construct a system called the searchable data vault (SDV. A unique feature of the scheme is that it allows an encrypted document to be divided into blocks and distributed to different storage servers so that no single storage provider has a complete document. By incorporating the scheme, the SDV protects the privacy of documents while allowing for efficient private queries. It utilizes a web interface and a controller that manages user credentials, query indexes and submission of encrypted documents to cloud storage services. It is also the first system that enables a user to simultaneously outsource and privately query documents from a few cloud storage services. Our preliminary performance evaluation shows that this feature introduces acceptable computation overheads when compared to submitting documents directly to a cloud storage service.
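
    The dispersal property the SDV relies on — no single provider holds a complete document — can be illustrated by round-robin block placement across servers. Encryption is omitted here, and this is not the SDV's actual dispersal algorithm, just the distribution idea.

```python
# Split a document into blocks and scatter them across storage servers.

def distribute_blocks(doc: bytes, n_servers: int, block_size: int):
    """Round-robin the document's blocks across servers so that no single
    provider receives every block. Returns one (index, block) list per
    server; the index lets the client reassemble later."""
    blocks = [doc[i:i + block_size] for i in range(0, len(doc), block_size)]
    servers = [[] for _ in range(n_servers)]
    for idx, blk in enumerate(blocks):
        servers[idx % n_servers].append((idx, blk))
    return servers

def reassemble(servers):
    """Client-side: collect every (index, block) pair and re-order."""
    blocks = sorted(b for s in servers for b in s)
    return b"".join(blk for _, blk in blocks)
```

    With blocks additionally encrypted before dispersal (as in the SDV), even a provider that colludes with an attacker only holds ciphertext fragments of an incomplete document.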

  12. Privacy-Preserving Outsourced Auditing Scheme for Dynamic Data Storage in Cloud

    OpenAIRE

    Tu, Tengfei; Rao, Lu; Zhang, Hua; Wen, Qiaoyan; Xiao, Jia

    2017-01-01

    As information technology develops, cloud storage has been widely accepted for keeping volumes of data. A remote data auditing scheme enables a cloud user to confirm the integrity of her outsourced file via auditing against the cloud storage, without downloading the file from the cloud. In view of the significant computational cost caused by the auditing process, the outsourced auditing model is proposed to let the user outsource the heavy auditing task to a third party auditor (TPA). Although the first outso...

  13. ROBUST AND EFFICIENT PRIVACY PRESERVING PUBLIC AUDITING FOR REGENERATING-CODE-BASED CLOUD STORAGE

    OpenAIRE

    Tessy Vincent*, Mrs.Krishnaveni.V.V

    2017-01-01

    Cloud computing is gaining popularity because of its guaranteed services, such as online data storage and backup solutions, Web-based e-mail services and virtualized infrastructure. Users are allowed to access data stored in a cloud anytime, anywhere, using an Internet-connected device at low cost. To secure outsourced data in cloud storage against various corruptions, adding fault tolerance to cloud storage together with data integrity checking and failure reparation becomes c...

  14. A secure and efficient audit mechanism for dynamic shared data in cloud storage.

    Science.gov (United States)

    Kwon, Ohmin; Koo, Dongyoung; Shin, Yongjoo; Yoon, Hyunsoo

    2014-01-01

    With the popularization of cloud services, multiple users easily share and update their data through cloud storage. To ensure data integrity and consistency in cloud storage, audit mechanisms have been proposed. However, existing approaches have security vulnerabilities and incur substantial computational overhead. This paper proposes a secure and efficient audit mechanism for dynamic shared data in cloud storage. The proposed scheme prevents a malicious cloud service provider from deceiving an auditor. Moreover, it devises a new index table management method and reduces the auditing cost by employing less complex operations. We prove the resistance against several attacks and show lower computation cost and shorter auditing time compared with conventional approaches. The results show that the proposed scheme is secure and efficient for cloud storage services managing dynamic shared data.
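
As a rough illustration of the challenge-response pattern such audit mechanisms build on, here is a hedged Python sketch. Real schemes, including the one in this record, use homomorphic authenticators and index tables to support dynamic data; this toy version only spot-checks static blocks against an auditor that keeps per-block digests.

```python
import hashlib
import os
import random

BLOCK = 64

def split_blocks(data: bytes):
    return [data[i:i + BLOCK] for i in range(0, len(data), BLOCK)]

class Auditor:
    # Keeps only per-block digests, never the file itself.
    def __init__(self, data: bytes):
        self.tags = [hashlib.sha256(b).digest() for b in split_blocks(data)]

    def challenge(self, k=3):
        # A fresh nonce forces the server to recompute proofs for every audit.
        return random.sample(range(len(self.tags)), k), os.urandom(16)

    def verify(self, indices, nonce, proofs) -> bool:
        return all(hashlib.sha256(nonce + self.tags[i]).digest() == p
                   for i, p in zip(indices, proofs))

class CloudServer:
    def __init__(self, data: bytes):
        self.blocks = split_blocks(data)

    def prove(self, indices, nonce):
        # The server must hold the challenged blocks to derive their digests.
        return [hashlib.sha256(nonce + hashlib.sha256(self.blocks[i]).digest()).digest()
                for i in indices]
```

A server that has lost or corrupted the challenged blocks cannot reproduce the expected digests, so random spot checks detect corruption with high probability at low cost.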

  15. A Secure and Efficient Audit Mechanism for Dynamic Shared Data in Cloud Storage

    Science.gov (United States)

    2014-01-01

    With the popularization of cloud services, multiple users easily share and update their data through cloud storage. To ensure data integrity and consistency in cloud storage, audit mechanisms have been proposed. However, existing approaches have security vulnerabilities and incur substantial computational overhead. This paper proposes a secure and efficient audit mechanism for dynamic shared data in cloud storage. The proposed scheme prevents a malicious cloud service provider from deceiving an auditor. Moreover, it devises a new index table management method and reduces the auditing cost by employing less complex operations. We prove the resistance against several attacks and show lower computation cost and shorter auditing time compared with conventional approaches. The results show that the proposed scheme is secure and efficient for cloud storage services managing dynamic shared data. PMID:24959630

  16. Application of Cloud Storage on BIM Life-Cycle Management

    Directory of Open Access Journals (Sweden)

    Lieyun Ding

    2014-08-01

    Because of its high information intensity, strong consistency and convenient visualization features, building information modelling (BIM) has received widespread attention in the fields of construction and project management. However, the large amounts of information, the high degree of integration, the need for resource sharing between departments and the long time span of BIM applications raise challenges of data interoperability, security and cost that all slow down the adoption of BIM. This paper constructs a BIM cloud storage concept system that uses cloud storage, an advanced computer technology, to solve the problems of mass data processing, information security and cost in the existing application of BIM to full life-cycle management. The system takes full advantage of cloud storage techniques. Achievements are reached in four areas of BIM information management: security and licensing management, file management, work process management and collaborative management. The system expands the time and space scales, improves the level of participation, and reduces the cost of BIM. The construction of the BIM cloud storage system is one of the most important directions in the development of BIM, benefiting its promotion and further development to better serve construction and engineering project management.

  17. NAFFS: network attached flash file system for cloud storage on portable consumer electronics

    Science.gov (United States)

    Han, Lin; Huang, Hao; Xie, Changsheng

    Cloud storage technology has become a research hotspot in recent years, while existing cloud storage services are mainly designed for data storage needs over stable, high-speed Internet connections. Mobile Internet connections are often unstable and relatively slow. These native features of the mobile Internet limit the use of cloud storage on portable consumer electronics. The Network Attached Flash File System (NAFFS) presents the idea of taking a portable device's built-in NAND flash memory as the front-end cache of a virtualized cloud storage device. Modern portable devices with Internet connections have more than 1 GB of built-in NAND flash, which is quite enough for daily data storage. The data transfer rate of a NAND flash device is much higher than that of mobile Internet connections[1], and its non-volatile nature makes it very suitable as the cache device of Internet cloud storage on portable devices, which often have unstable power supplies and intermittent Internet connections. In the present work, NAFFS is evaluated with several benchmarks, and its performance is compared with traditional network attached file systems, such as NFS. Our evaluation results indicate that NAFFS achieves an average access speed of 3.38 MB/s, which is about 3 times faster than directly accessing cloud storage over a mobile Internet connection, and offers a more stable interface than directly using a cloud storage API. Unstable Internet connections and sudden power-off conditions are tolerated, and no data in the cache is lost in such situations.
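
The caching idea behind NAFFS can be caricatured in a few lines. This sketch is not NAFFS itself: the dict-backed "cloud", the LRU eviction policy and the explicit flush() call are assumptions made for brevity. The point it shows is that writes land in local (non-volatile) flash first and survive an offline period until the connection returns.

```python
from collections import OrderedDict

class FlashCachedCloud:
    """Local flash absorbs reads/writes; dirty entries reach the cloud only on flush()."""

    def __init__(self, cloud: dict, capacity: int = 4):
        self.cloud = cloud                  # stand-in for the remote object store
        self.cache = OrderedDict()          # path -> (data, dirty), in LRU order
        self.capacity = capacity

    def write(self, path: str, data: bytes):
        self.cache[path] = (data, True)     # no network needed; flash is non-volatile
        self.cache.move_to_end(path)
        self._evict()

    def read(self, path: str) -> bytes:
        if path in self.cache:
            self.cache.move_to_end(path)    # cache hit: no mobile-network round trip
            return self.cache[path][0]
        data = self.cloud[path]             # miss: fetch over the (slow) link
        self.cache[path] = (data, False)
        self._evict()
        return data

    def flush(self):
        # Called when the intermittent connection is up; pushes dirty entries out.
        for path, (data, dirty) in self.cache.items():
            if dirty:
                self.cloud[path] = data
                self.cache[path] = (data, False)

    def _evict(self):
        # Drop oldest *clean* entries only, so unsynced data is never lost.
        clean = [p for p, (_, dirty) in self.cache.items() if not dirty]
        while len(self.cache) > self.capacity and clean:
            del self.cache[clean.pop(0)]
```

Keeping dirty entries pinned until flush() succeeds is what makes a sudden power-off or connection drop tolerable, at the cost of the cache temporarily exceeding its target capacity.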

  18. Distributed Cloud Storage Using Network Coding

    OpenAIRE

    Sipos, Marton A.; Fitzek, Frank; Roetter, Daniel Enrique Lucani; Pedersen, Morten Videbæk

    2014-01-01

    Distributed storage is usually considered within a cloud provider to ensure availability and reliability of the data. However, the user is still directly dependent on the quality of a single system. It is also entrusting the service provider with large amounts of private data, which may be accessed by a successful attack on that cloud system or even be inspected by government agencies in some countries. This paper advocates a general framework for network coding enabled distributed storage over multi...

  19. Implementation and Performance Evaluation of Distributed Cloud Storage Solutions using Random Linear Network Coding

    DEFF Research Database (Denmark)

    Fitzek, Frank; Toth, Tamas; Szabados, Áron

    2014-01-01

    This paper advocates the use of random linear network coding for storage in distributed clouds in order to reduce storage and traffic costs in dynamic settings, i.e. when adding and removing numerous storage devices/clouds on-the-fly and when the number of reachable clouds is limited. We introduce...... various network coding approaches that trade-off reliability, storage and traffic costs, and system complexity relying on probabilistic recoding for cloud regeneration. We compare these approaches with other approaches based on data replication and Reed-Solomon codes. A simulator has been developed...... to carry out a thorough performance evaluation of the various approaches when relying on different system settings, e.g., finite fields, and network/storage conditions, e.g., storage space used per cloud, limited network use, and limited recoding capabilities. In contrast to standard coding approaches, our...
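
To make the coding idea in the two records above concrete, here is a hedged sketch of random linear network coding restricted to GF(2), i.e. each coded packet is the XOR of a random subset of source blocks. Practical systems, including those evaluated in the paper, typically use larger finite fields such as GF(2^8) and support recoding; this toy keeps only the encode/decode core.

```python
import random

def rlnc_encode(blocks, n_coded, rng):
    # Each coded packet carries its GF(2) coefficient vector plus the XOR payload.
    k, size = len(blocks), len(blocks[0])
    coded = []
    for _ in range(n_coded):
        coeffs = [rng.randint(0, 1) for _ in range(k)]
        if not any(coeffs):
            coeffs[rng.randrange(k)] = 1        # avoid useless all-zero packets
        payload = bytes(size)
        for c, b in zip(coeffs, blocks):
            if c:
                payload = bytes(x ^ y for x, y in zip(payload, b))
        coded.append((coeffs, payload))
    return coded

def rlnc_decode(coded, k):
    # Gaussian elimination over GF(2); returns None unless packets have full rank.
    mat = [c[:] for c, _ in coded]
    pay = [bytearray(p) for _, p in coded]
    row = 0
    for col in range(k):
        piv = next((r for r in range(row, len(mat)) if mat[r][col]), None)
        if piv is None:
            return None
        mat[row], mat[piv] = mat[piv], mat[row]
        pay[row], pay[piv] = pay[piv], pay[row]
        for r in range(len(mat)):
            if r != row and mat[r][col]:
                mat[r] = [a ^ b for a, b in zip(mat[r], mat[row])]
                pay[r] = bytearray(a ^ b for a, b in zip(pay[r], pay[row]))
        row += 1
    return [bytes(pay[i]) for i in range(k)]
```

Any k linearly independent coded packets, possibly fetched from different clouds, suffice to recover the original blocks, which is exactly the redundancy/traffic trade-off these papers explore.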

  20. INFORMATION SECURITY AND SECURE SEARCH OVER ENCRYPTED DATA IN CLOUD STORAGE SERVICES

    OpenAIRE

    Mr. A Mustagees Shaikh *; Prof. Nitin B. Raut

    2016-01-01

    Cloud computing is widely regarded as the next-generation architecture for IT enterprises, providing convenient remote access to data storage and application services. Cloud storage can potentially bring great economic savings for data owners and users, but data owners are widely concerned that their private data may be exposed to or mishandled by cloud providers. Hence, end-to-end encryption techniques and the fuzzy fingerprint technique have been used as solutions for secure cloud data st...

  1. A Survey on the Architectures of Data Security in Cloud Storage Infrastructure

    OpenAIRE

    T.Brindha; R.S.Shaji; G.P.Rajesh

    2013-01-01

    Cloud computing is an alluring technology that facilitates convenient, on-demand network access based on users' requirements, with nominal effort spent on management and interaction with cloud providers. Cloud storage serves as a dependable platform for long-term storage needs, enabling users to move data to the cloud rapidly and securely. It helps businesses and government agencies considerably decrease the economic overhead of data management, as they can sto...

  2. Edge-Based Efficient Search over Encrypted Data Mobile Cloud Storage.

    Science.gov (United States)

    Guo, Yeting; Liu, Fang; Cai, Zhiping; Xiao, Nong; Zhao, Ziming

    2018-04-13

    Smart sensor-equipped mobile devices sense, collect, and process data generated by the edge network to achieve intelligent control, but such mobile devices usually have limited storage and computing resources. Mobile cloud storage provides a promising solution owing to its rich storage resources, great accessibility, and low cost. But it also brings a risk of information leakage. The encryption of sensitive data is the basic step to resist the risk. However, deploying a high complexity encryption and decryption algorithm on mobile devices will greatly increase the burden of terminal operation and the difficulty to implement the necessary privacy protection algorithm. In this paper, we propose ENSURE (EfficieNt and SecURE), an efficient and secure encrypted search architecture over mobile cloud storage. ENSURE is inspired by edge computing. It allows mobile devices to offload the computation intensive task onto the edge server to achieve a high efficiency. Besides, to protect data security, it reduces the information acquisition of untrusted cloud by hiding the relevance between query keyword and search results from the cloud. Experiments on a real data set show that ENSURE reduces the computation time by 15% to 49% and saves the energy consumption by 38% to 69% per query.

  3. Edge-Based Efficient Search over Encrypted Data Mobile Cloud Storage

    Directory of Open Access Journals (Sweden)

    Yeting Guo

    2018-04-01

    Smart sensor-equipped mobile devices sense, collect, and process data generated by the edge network to achieve intelligent control, but such mobile devices usually have limited storage and computing resources. Mobile cloud storage provides a promising solution owing to its rich storage resources, great accessibility, and low cost. But it also brings a risk of information leakage. The encryption of sensitive data is the basic step to resist the risk. However, deploying a high-complexity encryption and decryption algorithm on mobile devices will greatly increase the burden of terminal operation and the difficulty of implementing the necessary privacy protection algorithm. In this paper, we propose ENSURE (EfficieNt and SecURE), an efficient and secure encrypted search architecture over mobile cloud storage. ENSURE is inspired by edge computing. It allows mobile devices to offload the computation-intensive task onto the edge server to achieve high efficiency. Besides, to protect data security, it reduces the information acquisition of the untrusted cloud by hiding the relevance between query keyword and search results from the cloud. Experiments on a real data set show that ENSURE reduces the computation time by 15% to 49% and saves the energy consumption by 38% to 69% per query.

  4. Edge-Based Efficient Search over Encrypted Data Mobile Cloud Storage

    Science.gov (United States)

    Liu, Fang; Cai, Zhiping; Xiao, Nong; Zhao, Ziming

    2018-01-01

    Smart sensor-equipped mobile devices sense, collect, and process data generated by the edge network to achieve intelligent control, but such mobile devices usually have limited storage and computing resources. Mobile cloud storage provides a promising solution owing to its rich storage resources, great accessibility, and low cost. But it also brings a risk of information leakage. The encryption of sensitive data is the basic step to resist the risk. However, deploying a high complexity encryption and decryption algorithm on mobile devices will greatly increase the burden of terminal operation and the difficulty to implement the necessary privacy protection algorithm. In this paper, we propose ENSURE (EfficieNt and SecURE), an efficient and secure encrypted search architecture over mobile cloud storage. ENSURE is inspired by edge computing. It allows mobile devices to offload the computation intensive task onto the edge server to achieve a high efficiency. Besides, to protect data security, it reduces the information acquisition of untrusted cloud by hiding the relevance between query keyword and search results from the cloud. Experiments on a real data set show that ENSURE reduces the computation time by 15% to 49% and saves the energy consumption by 38% to 69% per query. PMID:29652810

  5. The dCache scientific storage cloud

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    For over a decade, the dCache team has provided software for handling big data for a diverse community of scientists. The team has also amassed a wealth of operational experience from using this software in production. With this experience, the team has refined dCache with the goal of providing a "scientific cloud": a storage solution that satisfies all requirements of a user community by exposing different facets of dCache with which users interact. Recent development, as part of this "scientific cloud" vision, has introduced a new facet: a sync-and-share service, often referred to as "dropbox-like storage". This work has been strongly focused on local requirements, but will be made available in future releases of dCache, allowing others to adopt dCache solutions. In this presentation we will outline the current status of the work: both the successes and limitations, and the direction and time-scale of future work.

  6. Searchable attribute-based encryption scheme with attribute revocation in cloud storage.

    Science.gov (United States)

    Wang, Shangping; Zhao, Duqiao; Zhang, Yaling

    2017-01-01

    Attribute-based encryption (ABE) is a good way to achieve flexible and secure access control to data; attribute revocation is an extension of attribute-based encryption, and keyword search is an indispensable part of cloud storage. The combination of the two has important applications in cloud storage. In this paper, we construct a searchable attribute-based encryption scheme with attribute revocation in cloud storage. The keyword search in our scheme is attribute-based with access control: when the search succeeds, the cloud server returns the corresponding ciphertext to the user, who can then decrypt it. Besides, our scheme supports multiple-keyword search, which makes the scheme more practical. Under the decisional bilinear Diffie-Hellman exponent (q-BDHE) and decisional Diffie-Hellman (DDH) assumptions in the selective security model, we prove that our scheme is secure.

  7. Data backup security in cloud storage system

    OpenAIRE

    Atayan, Boris Gennadievich; National Polytechnic University of Armenia; Baghdasaryan, Tatevik Araevna; National Polytechnic University of Armenia

    2016-01-01

    A cloud backup system is proposed, which provides means for the effective creation, secure storage and restoration of backups in the Cloud. For data archiving, the new, efficient SGBP file format is used in the system, which is based on the DEFLATE compression algorithm. The proposed format enables fast creation of archives that can contain significant amounts of data. Modern approaches to backup archive protection are described in the paper. The SGBP format is also compared to the heavily used ZIP format (both Z...

  8. Analysis and Research on Spatial Data Storage Model Based on Cloud Computing Platform

    Science.gov (United States)

    Hu, Yong

    2017-12-01

    In this paper, the data processing and storage characteristics of cloud computing are analyzed and studied. On this basis, a cloud computing data storage model based on a BP neural network is proposed. In this data storage model, the server cluster is chosen according to the attributes of the data, completing a spatial data storage model with load balancing that offers feasibility and practical advantages.

  9. Factors Influencing the Adoption of Cloud Storage by Information Technology Decision Makers

    Science.gov (United States)

    Wheelock, Michael D.

    2013-01-01

    This dissertation uses a survey methodology to determine the factors behind the decision to adopt cloud storage. The dependent variable in the study is the intent to adopt cloud storage. Four independent variables are utilized including need, security, cost-effectiveness and reliability. The survey includes a pilot test, field test and statistical…

  10. Cloud Storage Server Configuration on a LAN in the Diploma III Manajemen Informatika Lab, UM Metro

    Directory of Open Access Journals (Sweden)

    Arif Hidayat

    2017-07-01

    Based on the results of system testing, it can be concluded that this Cloud Storage Server configuration on a LAN can serve as a data storage aid; stored data can be accessed through a folder on a computer or through an application tied to the account owner at the service concerned. Keywords: Cloud Storage Server Configuration; Cloud Storage on a LAN; Cloud Storage.

  11. Forensic Investigation of Cooperative Storage Cloud Service: Symform as a Case Study.

    Science.gov (United States)

    Teing, Yee-Yang; Dehghantanha, Ali; Choo, Kim-Kwang Raymond; Dargahi, Tooska; Conti, Mauro

    2017-05-01

    Researchers envisioned Storage as a Service (StaaS) as an effective solution to the distributed management of digital data. Cooperative storage cloud forensics is relatively new and an under-explored area of research. Using Symform as a case study, we seek to determine the data remnants from the use of cooperative cloud storage services. In particular, we consider both mobile devices and personal computers running various popular operating systems, namely Windows 8.1, Mac OS X Mavericks 10.9.5, Ubuntu 14.04.1 LTS, iOS 7.1.2, and Android KitKat 4.4.4. Potential artefacts recovered during the research include data relating to the installation and uninstallation of the cloud applications, log-in to and log-out from the Symform account using the client application, and file synchronization, as well as their time stamp information. This research contributes to an in-depth understanding of the types of terrestrial artifacts that are likely to remain after the use of a cooperative storage cloud on client devices. © 2016 American Academy of Forensic Sciences.

  12. Bio-Cryptography Based Secured Data Replication Management in Cloud Storage

    OpenAIRE

    Elango Pitchai

    2016-01-01

    Cloud computing is a new way of economical and efficient storage. A single-data-mart storage system is less secure because the data remain under a single data mart. This can lead to data loss through causes such as hacking and server failure. If an attacker chooses to attack a specific client, he can target a fixed cloud provider and try to gain access to the client’s information. This makes an easy job for attackers; both inside and outside attackers get the benefit of ...

  13. Designing a Secure Storage Repository for Sharing Scientific Datasets using Public Clouds

    Energy Technology Data Exchange (ETDEWEB)

    Kumbhare, Alok [Univ. of Southern California, Los Angeles, CA (United States); Simmhan, Yogesth [Univ. of Southern California, Los Angeles, CA (United States); Prasanna, Viktor [Univ. of Southern California, Los Angeles, CA (United States)

    2011-11-14

    As Cloud platforms gain increasing traction among scientific and business communities for outsourcing storage, computing and content delivery, there is also growing concern about the associated loss of control over private data hosted in the Cloud. In this paper, we present an architecture for a secure data repository service designed on top of a public Cloud infrastructure to support multi-disciplinary scientific communities dealing with personal and human subject data, motivated by the smart power grid domain. Our repository model allows users to securely store and share their data in the Cloud without revealing the plain text to unauthorized users, the Cloud storage provider or the repository itself. The system masks file names, user permissions and access patterns while providing auditing capabilities with provable data updates.

  14. An Enhanced Erasure Code-Based Security Mechanism for Cloud Storage

    Directory of Open Access Journals (Sweden)

    Wenfeng Wang

    2014-01-01

    Cloud computing offers a wide range of benefits, such as high performance, rapid elasticity, on-demand self-service, and low cost. However, data security continues to be a significant impediment in the promotion and popularization of cloud computing. To address the problem of data leakage caused by unreliable service providers and external cyber attacks, an enhanced erasure code-based security mechanism is proposed and elaborated in terms of four aspects: data encoding, data transmission, data placement, and data reconstruction, which ensure data security throughout the whole traversal into cloud storage. Based on this mechanism, we implement a secure cloud storage system (SCSS). The key design issues, including data division, construction of the generator matrix, data encoding, fragment naming, and data decoding, are also described in detail. Finally, we conduct an analysis of data availability and security, and a performance evaluation. Experimental results and analysis demonstrate that SCSS achieves high availability, strong security, and excellent performance.
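
The simplest member of the erasure-code family such mechanisms build on is a single XOR parity fragment (RAID-5 style). The sketch below is far weaker than the generator-matrix codes described in the record (it tolerates only one lost fragment) but shows the encode/reconstruct cycle that underlies data placement and reconstruction.

```python
def encode_with_parity(data: bytes, k: int):
    # Split into k equal fragments (zero-padded) plus one XOR parity fragment.
    size = -(-len(data) // k)               # ceiling division
    frags = [data[i * size:(i + 1) * size].ljust(size, b"\0") for i in range(k)]
    parity = bytes(size)
    for f in frags:
        parity = bytes(a ^ b for a, b in zip(parity, f))
    return frags + [parity]

def reconstruct(frags, missing: int) -> bytes:
    # XOR of all surviving fragments rebuilds the missing one (data or parity),
    # because every byte position XORs to zero across the full fragment set.
    size = len(next(f for f in frags if f is not None))
    out = bytes(size)
    for i, f in enumerate(frags):
        if i != missing:
            out = bytes(a ^ b for a, b in zip(out, f))
    return out
```

Placing the k+1 fragments on different clouds means the loss of any single provider is survivable while each provider stores only 1/k of the data plus parity overhead.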

  15. Efficient proof of ownership for cloud storage systems

    Science.gov (United States)

    Zhong, Weiwei; Liu, Zhusong

    2017-08-01

    Cloud storage systems use deduplication technology to save disk space and bandwidth, but the use of this technology has attracted targeted security attacks: an attacker can deceive the server into granting ownership of a file by obtaining only the hash value of the original file. To solve this security problem and to meet the different security requirements of files in cloud storage systems, an efficient and information-theoretically secure proof-of-ownership scheme supporting file rating is proposed. The K-means algorithm is used to implement file rating, and random seed technology and a pre-calculation method are used to achieve a safe and efficient proof-of-ownership scheme. Finally, the scheme is information-theoretically secure and achieves better performance in the most sensitive areas of client-side I/O and computation.
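
The weakness this record fixes, and the random-seed countermeasure, can be sketched as a challenge-response exchange. This hypothetical version omits the K-means file rating and the pre-calculation optimization; the point is only that a fresh seed makes a stored hash value useless as a proof of possession.

```python
import hashlib
import os

class StorageServer:
    """Holds the deduplicated file and challenges anyone claiming to own it."""

    def __init__(self, file_bytes: bytes):
        self.file = file_bytes
        self.seed = None

    def challenge(self) -> bytes:
        self.seed = os.urandom(16)          # fresh per attempt: replies can't be replayed
        return self.seed

    def check(self, response: bytes) -> bool:
        return response == hashlib.sha256(self.seed + self.file).digest()

def prove_ownership(seed: bytes, file_bytes: bytes) -> bytes:
    # Only a client possessing the whole file can compute this response.
    return hashlib.sha256(seed + file_bytes).digest()
```

An attacker who learned only `sha256(file)` cannot derive `sha256(seed + file)` from it, so the naive hash-based deduplication attack described in the abstract fails.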

  16. BMCloud: Minimizing Repair Bandwidth and Maintenance Cost in Cloud Storage

    Directory of Open Access Journals (Sweden)

    Chao Yin

    2013-01-01

    To protect data in cloud storage, fault tolerance and efficient recovery become very important. Recent studies have developed numerous solutions based on erasure code techniques to solve this problem using functional repairs. However, there are two limitations to address. The first is consistency, since the Encoding Matrix (EM) differs among clouds. The other is repair bandwidth, which is a widespread concern. We address these two problems from both theoretical and practical perspectives. We developed BMCloud, a new cloud storage system that aims to reduce both repair bandwidth and maintenance cost. The system employs both functional repair and exact repair and inherits the advantages of both. We propose the JUDGE_STYLE algorithm, which can judge whether the system should adopt exact repair or functional repair. We implemented a networked storage system prototype and demonstrated our findings. Compared with existing solutions, BMCloud can be used in engineering to save repair bandwidth and significantly reduce maintenance cost.

  17. Distributed Scheme to Authenticate Data Storage Security in Cloud Computing

    OpenAIRE

    B. Rakesh; K. Lalitha; M. Ismail; H. Parveen Sultana

    2017-01-01

    Cloud computing is the revolution in current-generation IT enterprise. Cloud computing displaces databases and application software to large data centres, where the management of services and data may not be predictable, whereas conventional solutions for IT services are under proper logical, physical and personnel controls. This attribute, however, poses different security challenges which have not been well understood. This work concentrates on cloud data storage security, which h...

  18. OpenStack Swift as Multi-Region Eventual Consistency Storage for ownCloud Primary Storage

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    As more users adopt AARNet’s CloudStor Plus offering within Australia, interim solutions deployed to overcome failures of various distributed replicated storage technologies haven’t kept pace with the growth in data volume. AARNet’s original design goal of user proximal data storage, combined with national and even international data replication for redundancy reasons continues to be a key driver for design choices. AARNet’s national network is over 90ms from end to end, and accommodating this has been a key issue with numerous software solutions, hindering attempts to provide both original design goals in a reliable real-time manner. With the addition of features to the ownCloud software allowing primary data storage on OpenStack Swift, AARNet has chosen to deploy Swift in a nation spanning multi-region ring to take advantage of Swift’s eventual consistency capabilities and the local region quorum functionality for fast writes. The scaling capability of Swift resolves the twin problems of geogr...

  19. A protect solution for data security in mobile cloud storage

    Science.gov (United States)

    Yu, Xiaojun; Wen, Qiaoyan

    2013-03-01

    It is popular to access cloud storage from mobile devices. However, this application suffers from data security risks, especially data leakage and privacy violations. These risks exist not only in the cloud storage system but also on the mobile client platform. To reduce the security risk, this paper proposes a new security solution that makes full use of searchable encryption and trusted computing technology. Given the performance limits of mobile devices, it proposes a trusted-proxy-based protection architecture. The basic design idea, deployment model and key flows are detailed. Analysis of security and performance shows the advantages of the solution.

  20. Multiobjective Reliable Cloud Storage with Its Particle Swarm Optimization Algorithm

    Directory of Open Access Journals (Sweden)

    Xiyang Liu

    2016-01-01

    Information abounds in all fields of real life; it is often recorded as digital data in computer systems and treated as an increasingly important resource. Its growing volume causes great difficulties in both storage and analysis. Massive data storage in cloud environments has a significant impact on the quality of service (QoS) of the systems, which is becoming an increasingly challenging problem. In this paper, we propose a multiobjective optimization model for reliable data storage in clouds that considers both the cost and the reliability of the storage service simultaneously. In the proposed model, the total cost is composed of storage space occupation cost, data migration cost, and communication cost. Following an analysis of the storage process, transmission reliability, equipment stability, and software reliability are taken into account in the storage reliability evaluation. To solve the proposed multiobjective model, a Constrained Multiobjective Particle Swarm Optimization (CMPSO) algorithm is designed. Finally, experiments are designed to validate the proposed model and the CMPSO algorithm. In the experiments, the proposed model is tested in cooperation with 3 storage strategies. Experimental results show that the proposed model is positive and effective, and that it performs much better in alliance with proper file splitting methods.
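
The two objectives in such a model can be made concrete with a toy evaluation. The per-node cost figures and the assumption of independent node availabilities below are inventions of this sketch, not parameters from the paper. A placement dominates another when it is at least as good on both objectives and strictly better on one; a swarm optimizer like CMPSO searches for the non-dominated (Pareto-optimal) placements.

```python
def total_cost(placement, storage, migration, comm):
    # Cost model: storage-space occupation + data migration + communication.
    return sum(storage[n] + migration[n] + comm[n] for n in placement)

def reliability(placement, avail):
    # Replicas on independent nodes: data is lost only if every replica fails.
    p_all_fail = 1.0
    for n in placement:
        p_all_fail *= 1.0 - avail[n]
    return 1.0 - p_all_fail

def pareto_front(candidates, cost_fn, rel_fn):
    # Keep placements for which no other candidate is at least as cheap and
    # at least as reliable, with a strict improvement in one objective.
    front = []
    for p in candidates:
        c, r = cost_fn(p), rel_fn(p)
        dominated = any(
            cost_fn(q) <= c and rel_fn(q) >= r and (cost_fn(q) < c or rel_fn(q) > r)
            for q in candidates)
        if not dominated:
            front.append(p)
    return front
```

For realistic numbers of nodes the candidate set is too large to enumerate, which is exactly why a metaheuristic such as particle swarm optimization is used to explore it.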

  1. Enhanced Obfuscation Technique for Data Confidentiality in Public Cloud Storage

    Directory of Open Access Journals (Sweden)

    Oli S. Arul

    2016-01-01

    With the advent of cloud computing, data storage has become a boon in information technology. At the same time, data storage in remote places has become an important issue. Many techniques are available to protect data confidentiality, but they do not completely serve the purpose. Obfuscation techniques come to the rescue for protecting data from malicious attacks. This paper proposes an obfuscation technique to encrypt the desired data type on the cloud, providing more protection from unknown hackers. The experimental results show that the time taken for obfuscation is low and the confidentiality percentage is high when compared with existing techniques.

  2. Electron cloud development in the Proton Storage Ring and in the Spallation Neutron Source

    International Nuclear Information System (INIS)

    Pivi, M.T.F.; Furman, M.A.

    2002-01-01

    We have applied our simulation code POSINST to evaluate the contribution to the growth rate of the electron-cloud instability in proton storage rings. Recent simulation results for the main features of the electron cloud in the storage ring of the Spallation Neutron Source (SNS) at Oak Ridge, and updated results for the Proton Storage Ring (PSR) at Los Alamos, are presented in this paper. A key ingredient in our model is a detailed description of the emitted secondary-electron energy spectrum. A refined model of the secondary emission process, including the so-called true secondary, rediffused and backscattered electrons, has recently been included in the electron-cloud code.

  3. Electron cloud dynamics in the Cornell Electron Storage Ring Test Accelerator wiggler

    Directory of Open Access Journals (Sweden)

    C. M. Celata

    2011-04-01

    The interference of stray electrons (also called “electron clouds”) with accelerator beams is important in modern intense-beam accelerators, especially those with beams of positive charge. In magnetic wigglers, used, for instance, for transverse emittance damping, the intense synchrotron radiation produced by the beam can generate an electron cloud of relatively high density. In this paper the complicated dynamics of electron clouds in wigglers is examined using the example of a wiggler in the Cornell Electron Storage Ring Test Accelerator experiment at the Cornell Electron Storage Ring. Three-dimensional particle-in-cell simulations with the WARP-POSINST computer code show different density and dynamics for the electron cloud at locations near the maxima of the vertical wiggler field when compared to locations near the minima. Dynamics in these regions, the electron cloud distribution vs longitudinal position, and the beam coherent tune shift caused by the wiggler electron cloud will be discussed.

  4. A hybrid filtering approach for storage optimization in main-memory cloud database

    Directory of Open Access Journals (Sweden)

    Ghada M. Afify

    2015-11-01

Enterprises and cloud service providers face a dramatic increase in the amount of data stored in private and public clouds. Data storage costs are growing rapidly because providers use a single high-performance storage tier for all cloud data. There is considerable potential to reduce cloud costs by classifying data into active (hot) and inactive (cold). In main-memory database research, recent works focus on approaches to identify hot/cold data. Most of these approaches track tuple accesses to identify hot/cold tuples. In contrast, we introduce a novel Hybrid Filtering Approach (HFA) that tracks both tuple and column accesses in main-memory databases. Our objective is to enhance performance in terms of three dimensions: storage space, query elapsed time, and CPU time. To validate the effectiveness of our approach, we realized a concrete implementation on Hekaton, SQL Server's memory-optimized engine, using the well-known TPC-H benchmark. Experimental results show that the proposed HFA outperforms the Hekaton approach in all performance dimensions. Specifically, HFA reduces storage space by an average of 44–96%, query elapsed time by an average of 25–93%, and CPU time by an average of 31–97% compared to the traditional database approach.
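The tracking idea in the abstract, counting accesses at both tuple and column granularity, can be sketched roughly as follows; the class, method, and threshold names are invented for illustration and are not taken from the paper:

```python
from collections import Counter

class HybridAccessTracker:
    """Toy sketch of hybrid hot/cold classification: counts accesses
    per tuple *and* per column, as the HFA abstract describes.
    Thresholds and the hot fraction are illustrative choices."""

    def __init__(self, hot_fraction=0.2):
        self.tuple_hits = Counter()
        self.column_hits = Counter()
        self.hot_fraction = hot_fraction

    def record(self, tuple_id, columns):
        self.tuple_hits[tuple_id] += 1
        for col in columns:
            self.column_hits[col] += 1

    def hot_tuples(self):
        # the top hot_fraction of tuples by access count stay in memory
        k = max(1, int(len(self.tuple_hits) * self.hot_fraction))
        return {t for t, _ in self.tuple_hits.most_common(k)}

    def cold_columns(self, threshold=1):
        # rarely touched columns are candidates for the cold tier
        return {c for c, n in self.column_hits.items() if n <= threshold}

tracker = HybridAccessTracker()
for _ in range(10):
    tracker.record(1, ["price", "qty"])   # tuple 1 is accessed often
tracker.record(2, ["comment"])            # tuple 2 and "comment" are cold
```

A real system would sample accesses rather than count every one, but the two-granularity bookkeeping is the point of the hybrid approach.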

  5. DESIGN AND IMPLEMENTATION OF A PRIVACY PRESERVED OFF-PREMISES CLOUD STORAGE

    OpenAIRE

    Sarfraz Nawaz Brohi; Mervat Adib Bamiah; Suriayati Chuprat; Jamalul-lail Ab Manan

    2014-01-01

Despite several cost-effective and flexible characteristics of cloud computing, some clients are reluctant to adopt this paradigm due to emerging security and privacy concerns. Organizations such as healthcare and the Payment Card Industry, where confidentiality of information is vital, are not confident enough to trust the security techniques and privacy policies offered by cloud service providers. Malicious attackers have violated cloud storage to steal, view, manipulate, and tamper with client…

  6. Mind Map Kolaboratif Memanfaatkan Groupware Berbasis Cloud Storage

    Directory of Open Access Journals (Sweden)

    Wenty Dwi Yuniarti

    2016-07-01

This study discusses the use of the groupware MindMup 2.0 to organize knowledge of a subtopic of electronics according to the Law of Mind Map rules. The work was carried out in a small group, in this case four students, collaboratively, synchronously, and without friction (zero friction), supported by cloud storage technology, Google Drive.

  7. Observation of magnetic resonances in electron clouds in a positron storage ring

    International Nuclear Information System (INIS)

    Pivi, M.T.F.; Ng, J.S.T.; Cooper, F.; Kharakh, D.; King, F.; Kirby, R.E.; Kuekan, B.; Spencer, C.M.; Raubenheimer, T.O.; Wang, L.F.

    2010-01-01

    The first experimental observation of magnetic resonances in electron clouds is reported. The resonance was observed as a modulation in cloud intensity for uncoated as well as TiN-coated aluminum surfaces in the positron storage ring of the PEP-II collider at SLAC. Electron clouds frequently arise in accelerators of positively charged particles, and severely impact the machines' performance. The TiN coating was found to be an effective remedy, reducing the cloud intensity by three orders of magnitude.

  8. Storage quality-of-service in cloud-based scientific environments: a standardization approach

    Science.gov (United States)

    Millar, Paul; Fuhrmann, Patrick; Hardt, Marcus; Ertl, Benjamin; Brzezniak, Maciej

    2017-10-01

When preparing the Data Management Plan for larger scientific endeavors, PIs have to balance the most appropriate qualities of storage space along the planned data life-cycle against price and available funding. Storage properties can be the media type, implicitly determining access latency and durability of stored data, the number and locality of replicas, as well as available access protocols or authentication mechanisms. Negotiations between the scientific community and the responsible infrastructures generally happen upfront, where the amount of storage space, the media types (disk, tape, SSD), and the foreseeable data life-cycles are negotiated. With the introduction of cloud management platforms, both in computing and storage, resources can be brokered to achieve the best price per unit of a given quality. However, in order to allow the platform orchestrator to programmatically negotiate the most appropriate resources, a standard vocabulary for the different properties of resources and a commonly agreed protocol to communicate them have to be available. To agree on a basic vocabulary for storage space properties, the storage infrastructure group in INDIGO-DataCloud, together with INDIGO-associated and external scientific groups, created a working group under the umbrella of the Research Data Alliance (RDA). As the communication protocol to query and negotiate storage qualities, the Cloud Data Management Interface (CDMI) has been selected. Necessary extensions to CDMI are defined in regular meetings between INDIGO and the Storage Networking Industry Association (SNIA). Furthermore, INDIGO is contributing to the SNIA CDMI reference implementation as the basis for interfacing the various storage systems in INDIGO to the agreed protocol and to provide an official open-source skeleton for systems not maintained by INDIGO partners.
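A toy illustration of the programmatic brokering the abstract motivates might look like the following; the property names and pool descriptions are invented stand-ins for the working group's vocabulary, not actual CDMI terms:

```python
# Hypothetical storage pools advertising quality properties (media
# type, replica count, latency class, price); names are illustrative.
pools = [
    {"name": "tape-archive", "media": "tape", "copies": 2, "latency": "high", "eur_per_tb": 4},
    {"name": "disk-std",     "media": "disk", "copies": 1, "latency": "low",  "eur_per_tb": 20},
    {"name": "ssd-fast",     "media": "ssd",  "copies": 1, "latency": "low",  "eur_per_tb": 90},
]

def broker(requirements, pools):
    """Return the cheapest pool satisfying every requested property,
    or None when no pool matches -- the 'best price per unit of a
    given quality' idea in miniature."""
    candidates = [p for p in pools
                  if all(p.get(k) == v for k, v in requirements.items())]
    return min(candidates, key=lambda p: p["eur_per_tb"]) if candidates else None

choice = broker({"latency": "low"}, pools)   # cheapest low-latency pool
```

The real negotiation happens over CDMI with an agreed vocabulary; this only shows why a shared set of property names is a precondition for any such matchmaking.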

  9. CRYPTOGRAPHIC SECURE CLOUD STORAGE MODEL WITH ANONYMOUS AUTHENTICATION AND AUTOMATIC FILE RECOVERY

    Directory of Open Access Journals (Sweden)

    Sowmiya Murthy

    2014-10-01

We propose a secure cloud storage model that addresses security and storage issues for cloud computing environments. Security is achieved by anonymous authentication, which ensures that cloud users remain anonymous while being duly authenticated. To achieve this goal, we propose a digital-signature-based authentication scheme with a decentralized architecture for distributed key management with multiple Key Distribution Centers. A homomorphic encryption scheme using the Paillier public-key cryptosystem is used for encrypting the data stored in the cloud. We incorporate a query-driven approach for validating the access policies defined by an individual user for his/her data, i.e., access is granted to a requester only if his credentials match the hidden access policy. Further, since data is vulnerable to loss or damage due to the vagaries of the network, we propose an automatic retrieval mechanism in which lost data is recovered by data replication and file replacement with a string matching algorithm. We describe a prototype implementation of our proposed model.
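The additive homomorphism that makes Paillier attractive for schemes like this can be demonstrated with a textbook implementation; the key sizes here are toy values for illustration only, and the paper embeds Paillier in a much larger protocol:

```python
import math
import random

# Textbook Paillier with tiny fixed primes (demonstration only;
# real deployments use primes of 1024+ bits). Requires Python 3.9+
# for math.lcm and three-argument pow with a negative exponent.
p, q = 1789, 1997
n = p * q
n2 = n * n
g = n + 1                          # a standard choice of generator
lam = math.lcm(p - 1, q - 1)

def L(u):
    return (u - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)   # modular inverse of L(g^lam mod n^2)

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

# Homomorphic addition: multiplying ciphertexts adds the plaintexts
c1, c2 = encrypt(41), encrypt(1)
assert decrypt((c1 * c2) % n2) == 42
```

This is why a server can aggregate encrypted values without learning them: it multiplies ciphertexts and only the key holder decrypts the sum.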

  10. A Highly Scalable Data Service (HSDS) using Cloud-based Storage Technologies for Earth Science Data

    Science.gov (United States)

    Michaelis, A.; Readey, J.; Votava, P.; Henderson, J.; Willmore, F.

    2017-12-01

Cloud-based infrastructure may offer several key benefits of scalability, built-in redundancy, security mechanisms, and reduced total cost of ownership as compared with a traditional data center approach. However, most of the tools and legacy software systems developed for online data repositories within the federal government were not developed with a cloud-based infrastructure in mind and do not fully take advantage of commonly available cloud-based technologies. Moreover, services based on object storage are well established and provided by all the leading cloud service providers (Amazon Web Services, Microsoft Azure, Google Cloud, etc.), and can often provide unmatched "scale-out" capabilities and data availability to a large and growing consumer base at a price point unachievable with in-house solutions. We describe a system that utilizes object storage rather than traditional file-system-based storage to vend earth science data. The system described is not only cost effective, but shows a performance advantage for running many different analytics tasks in the cloud. To enable compatibility with existing tools and applications, we outline client libraries that are API compatible with existing libraries for HDF5 and NetCDF4. Performance of the system is demonstrated using cloud services running on Amazon Web Services.
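The core object-storage idea, vending array data as independently fetchable chunk objects under flat keys, can be sketched as follows; the key scheme and chunk size are illustrative and not the actual HSDS layout:

```python
# Sketch: an array is split into fixed-size chunks stored as separate
# objects, so a reader fetches only the chunks a slice touches.
CHUNK = 4
store = {}   # stands in for an S3 bucket: flat key -> object

def put_array(name, values):
    for i in range(0, len(values), CHUNK):
        store[f"{name}/chunk{i // CHUNK}"] = values[i:i + CHUNK]

def read_slice(name, start, stop):
    out = []
    # only the chunks overlapping [start, stop) are retrieved
    for ci in range(start // CHUNK, (stop - 1) // CHUNK + 1):
        chunk = store[f"{name}/chunk{ci}"]
        lo = max(start, ci * CHUNK) - ci * CHUNK
        hi = min(stop, (ci + 1) * CHUNK) - ci * CHUNK
        out.extend(chunk[lo:hi])
    return out

put_array("temperature", list(range(10)))
assert read_slice("temperature", 3, 7) == [3, 4, 5, 6]
```

Because each chunk is an independent object, many readers (or analytics tasks) can fetch different chunks in parallel, which is where the "scale-out" advantage of object storage comes from.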

  11. Investigation of Storage Options for Scientific Computing on Grid and Cloud Facilities

    International Nuclear Information System (INIS)

    Garzoglio, Gabriele

    2012-01-01

    In recent years, several new storage technologies, such as Lustre, Hadoop, OrangeFS, and BlueArc, have emerged. While several groups have run benchmarks to characterize them under a variety of configurations, more work is needed to evaluate these technologies for the use cases of scientific computing on Grid clusters and Cloud facilities. This paper discusses our evaluation of the technologies as deployed on a test bed at FermiCloud, one of the Fermilab infrastructure-as-a-service Cloud facilities. The test bed consists of 4 server-class nodes with 40 TB of disk space and up to 50 virtual machine clients, some running on the storage server nodes themselves. With this configuration, the evaluation compares the performance of some of these technologies when deployed on virtual machines and on “bare metal” nodes. In addition to running standard benchmarks such as IOZone to check the sanity of our installation, we have run I/O intensive tests using physics-analysis applications. This paper presents how the storage solutions perform in a variety of realistic use cases of scientific computing. One interesting difference among the storage systems tested is found in a decrease in total read throughput with increasing number of client processes, which occurs in some implementations but not others.

  12. Properties of the electron cloud in a high-energy positron and electron storage ring

    International Nuclear Information System (INIS)

    Harkay, K.C.; Rosenberg, R.A.

    2003-01-01

    Low-energy, background electrons are ubiquitous in high-energy particle accelerators. Under certain conditions, interactions between this electron cloud and the high-energy beam can give rise to numerous effects that can seriously degrade the accelerator performance. These effects range from vacuum degradation to collective beam instabilities and emittance blowup. Although electron-cloud effects were first observed two decades ago in a few proton storage rings, they have in recent years been widely observed and intensely studied in positron and proton rings. Electron-cloud diagnostics developed at the Advanced Photon Source enabled for the first time detailed, direct characterization of the electron-cloud properties in a positron and electron storage ring. From in situ measurements of the electron flux and energy distribution at the vacuum chamber wall, electron-cloud production mechanisms and details of the beam-cloud interaction can be inferred. A significant longitudinal variation of the electron cloud is also observed, due primarily to geometrical details of the vacuum chamber. Such experimental data can be used to provide realistic limits on key input parameters in modeling efforts, leading ultimately to greater confidence in predicting electron-cloud effects in future accelerators.

  13. BMCloud: Minimizing Repair Bandwidth and Maintenance Cost in Cloud Storage

    OpenAIRE

    Yin, Chao; Xie, Changsheng; Wan, Jiguang; Hung, Chih-Cheng; Liu, Jinjiang; Lan, Yihua

    2013-01-01

    To protect data in cloud storage, fault tolerance and efficient recovery become very important. Recent studies have developed numerous solutions based on erasure code techniques to solve this problem using functional repairs. However, there are two limitations to address. The first one is consistency since the Encoding Matrix (EM) is different among clouds. The other one is repairing bandwidth, which is a concern for most of us. We addressed these two problems from both theoretical and practi...

  14. Cloud and virtual data storage networking

    CERN Document Server

    Schulz, Greg

    2011-01-01

The amount of data being generated, processed, and stored has reached unprecedented levels. Even during the recent economic crisis, there has been no slowdown or information recession. Instead, the need to process, move, and store data has only increased. Consequently, IT organizations are looking to do more with what they have while supporting growth along with new services without compromising on cost and service delivery. Cloud and Virtual Data Storage Networking, by savvy IT industry veteran Greg Schulz, looks at converging IT resources and management technologies for facilitating efficie

  15. Two-Level Verification of Data Integrity for Data Storage in Cloud Computing

    Science.gov (United States)

    Xu, Guangwei; Chen, Chunlin; Wang, Hongya; Zang, Zhuping; Pang, Mugen; Jiang, Ping

Data storage in cloud computing can save capital expenditure and relieve the burden of storage management for users. As loss or corruption of stored files may happen, many researchers focus on the verification of data integrity. However, massive numbers of users often bring large numbers of verification tasks to the auditor. Moreover, users also need to pay an extra fee for these verification tasks beyond the storage fee. Therefore, we propose a two-level verification of data integrity to alleviate these problems. The key idea is for users to routinely verify the data integrity themselves and for the auditor to arbitrate challenges between the user and the cloud provider according to the MACs and ϕ values. Extensive performance simulations show that the proposed scheme markedly decreases the auditor's verification tasks and the ratio of wrong arbitrations.
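A much-simplified sketch of the two-level idea, with plain HMACs standing in for the scheme's MACs and ϕ values (sharing the MAC key with the auditor is a simplification that real schemes avoid):

```python
import hashlib
import hmac

KEY = b"user-secret-key"   # toy key; real schemes separate user/auditor keys

def mac(block):
    return hmac.new(KEY, block, hashlib.sha256).digest()

def store(blocks):
    """What the cloud holds: each block alongside its MAC."""
    return [(b, mac(b)) for b in blocks]

def user_check(cloud_copy, expected_macs):
    """Level 1: the user routinely compares the cloud's copy against
    her local MAC list; returns indices of suspect blocks."""
    return [i for i, (blk, m) in enumerate(cloud_copy)
            if expected_macs[i] != m or mac(blk) != m]

def auditor_arbitrate(block, claimed_mac):
    """Level 2: the auditor recomputes the MAC only for a disputed
    block, settling the challenge between user and provider."""
    return hmac.compare_digest(mac(block), claimed_mac)

blocks = [b"alpha", b"beta", b"gamma"]
cloud = store(blocks)
local_macs = [m for _, m in cloud]
cloud[1] = (b"tampered", cloud[1][1])     # simulate silent corruption
assert user_check(cloud, local_macs) == [1]
assert auditor_arbitrate(*cloud[1]) is False
```

The point of the two levels is the workload split: routine checks cost the auditor nothing, and the auditor is invoked only for the rare disputed block.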

  16. Emittance growth induced by electron cloud in proton storage rings

    CERN Document Server

    Benedetto, Elena; Coppa, G

    2006-01-01

    In proton and positron storage rings with many closely spaced bunches, a large number of electrons can accumulate in the beam pipe due to various mechanisms (photoemission, residual gas ionization, beam-induced multipacting). The so-formed electron cloud interacts with the positively charged bunches, giving rise to instabilities, emittance growth and losses. This phenomenon has been observed in several existing machines such as the CERN Super Proton Synchrotron (SPS), whose operation has been constrained by the electron-cloud problem, and it is a concern for the Large Hadron Collider (LHC), under construction at CERN. The interaction between the beam and the electron cloud has features which cannot be fully taken into account by the conventional and known theories from accelerators and plasma physics. Computer simulations are indispensable for a proper prediction and understanding of the instability dynamics. The main feature which renders the beam-cloud interactions so peculiar is that the the electron cloud...

  17. On the Feasibility of a Network Coded Mobile Storage Cloud

    DEFF Research Database (Denmark)

    Sipos, Marton A.; Fitzek, Frank; Roetter, Daniel Enrique Lucani

    2015-01-01

Conventional cloud storage services offer relatively good reliability and performance in a cost-effective manner. However, they are typically structured in a centralized and highly controlled fashion. In more dynamic storage scenarios, these centralized approaches are infeasible and developing… to provide an effective and flexible erasure-correcting code. This paper identifies and answers key questions regarding the feasibility of such a system. We show that the mobile cloud has sufficient network resources to adapt to changes in node numbers and also study the redundancy level needed to maintain… data availability. We have found that as little as 75% redundancy is enough to offer 99.28% availability for the examined period, and essentially 100% availability is achieved when using 50% redundancy along with high-availability nodes. We have leveraged traces from a popular P2P mobile application…

  18. Measurements of the electron cloud in the APS storage ring

    International Nuclear Information System (INIS)

    Harkey, K. C.

    1999-01-01

Synchrotron radiation interacting with the vacuum chamber walls in a storage ring produces photoelectrons that can be accelerated by the beam, acquiring sufficient energy to produce secondary electrons in collisions with the walls. If the secondary-electron yield (SEY) coefficient of the wall material is greater than one, as is the case with the aluminum chambers in the Advanced Photon Source (APS) storage ring, a runaway condition can develop. As the electron cloud builds up along a train of stored positron or electron bunches, the possibility exists that a transverse perturbation of the head bunch will be communicated to trailing bunches due to interaction with the cloud. In order to characterize the electron cloud, a special vacuum chamber was built and inserted into the ring. The chamber contains 10 rudimentary electron-energy analyzers, as well as three targets coated with different materials. Measurements show that the intensity and electron energy distribution are highly dependent on the temporal spacing between adjacent bunches and the amount of current contained in each bunch. Furthermore, measurements using the different targets are consistent with what would be expected based on the SEY of the coatings. Data for both positron and electron beams are presented.

  19. An Intelligent Cloud Storage Gateway for Medical Imaging.

    Science.gov (United States)

    Viana-Ferreira, Carlos; Guerra, António; Silva, João F; Matos, Sérgio; Costa, Carlos

    2017-09-01

    Historically, medical imaging repositories have been supported by indoor infrastructures. However, the amount of diagnostic imaging procedures has continuously increased over the last decades, imposing several challenges associated with the storage volume, data redundancy and availability. Cloud platforms are focused on delivering hardware and software services over the Internet, becoming an appealing solution for repository outsourcing. Although this option may bring financial and technological benefits, it also presents new challenges. In medical imaging scenarios, communication latency is a critical issue that still hinders the adoption of this paradigm. This paper proposes an intelligent Cloud storage gateway that optimizes data access times. This is achieved through a new cache architecture that combines static rules and pattern recognition for eviction and prefetching. The evaluation results, obtained from experiments over a real-world dataset, show that cache hit ratios can reach around 80%, leading to reductions of image retrieval times by over 60%. The combined use of eviction and prefetching policies proposed can significantly reduce communication latency, even when using a small cache in comparison to the total size of the repository. Apart from the performance gains, the proposed system is capable of adjusting to specific workflows of different institutions.
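The combination of a static eviction rule with access-pattern prefetching can be illustrated with a toy gateway cache; LRU eviction and sequential prefetching here are simple stand-ins for the paper's actual policies, and all names are invented:

```python
from collections import OrderedDict

class GatewayCache:
    """Toy cache gateway: LRU eviction (a static rule) plus
    prefetching of the next image in a study (a stand-in for
    pattern recognition over imaging workflows)."""

    def __init__(self, capacity, fetch):
        self.capacity = capacity
        self.fetch = fetch            # callable pulling from cloud storage
        self.data = OrderedDict()

    def _insert(self, key):
        if key not in self.data:
            self.data[key] = self.fetch(key)
        self.data.move_to_end(key)    # mark as most recently used
        while len(self.data) > self.capacity:
            self.data.popitem(last=False)   # evict least recently used

    def get(self, study, index):
        key = (study, index)
        hit = key in self.data
        self._insert(key)
        self._insert((study, index + 1))    # prefetch next image in study
        return hit

cloud_reads = []
cache = GatewayCache(capacity=8, fetch=lambda k: cloud_reads.append(k) or k)
cache.get("studyA", 1)          # miss: fetches image 1, prefetches image 2
hit = cache.get("studyA", 2)    # hit thanks to the prefetch
assert hit is True
```

Prefetching is what converts the high latency of a remote repository into local-speed reads for the predictable, sequential access patterns typical of image viewing.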

  20. Privacy-Preserving Outsourced Auditing Scheme for Dynamic Data Storage in Cloud

    Directory of Open Access Journals (Sweden)

    Tengfei Tu

    2017-01-01

As information technology develops, cloud storage has been widely accepted for keeping volumes of data. A remote data auditing scheme enables a cloud user to confirm the integrity of her outsourced file via auditing against cloud storage, without downloading the file from the cloud. In view of the significant computational cost caused by the auditing process, an outsourced auditing model has been proposed to let the user outsource the heavy auditing task to a third-party auditor (TPA). Although the first outsourced auditing scheme can protect against a malicious TPA, it grants the TPA read access to the user's outsourced data, which is a potential risk to user data privacy. In this paper, we introduce the notion of User Focus for outsourced auditing, which emphasizes the idea of letting the user dominate her own data. Based on User Focus, our proposed scheme not only prevents the user's data from leaking to the TPA without depending on data encryption, but also avoids the use of an additional independent random source, which is very difficult to obtain in practice. We also describe how to make our scheme support dynamic updates. According to the security analysis and experimental evaluations, our proposed scheme is provably secure and significantly efficient.

  1. Mind Map Kolaboratif Memanfaatkan Groupware Berbasis Cloud Storage

    OpenAIRE

    Yuniarti, Wenty Dwi

    2016-01-01

Groupware is computer software designed to support collaboration among several users (Alan Dix et al., 2004: 663). Today groupware has evolved beyond multi-user software that can access the same data and share documents or rich media: with cloud storage technology, groupware supports storing documents online as artifacts, the products of collaborative work. In learning, collaboration is realized through groups or…

  2. Enhanced Obfuscation Technique for Data Confidentiality in Public Cloud Storage

    OpenAIRE

    Oli S. Arul; Arockiam L.

    2016-01-01

With the advent of cloud computing, data storage has become a boon in information technology. At the same time, data storage in remote places has raised important issues. Many techniques are available to ensure the protection of data confidentiality, but these techniques do not completely serve the purpose of protecting data. Obfuscation techniques come to the rescue for protecting data from malicious attacks. This paper proposes an obfuscation technique to encrypt the desired data type on the clou…

  3. Clone-based Data Index in Cloud Storage Systems

    Directory of Open Access Journals (Sweden)

    He Jing

    2016-01-01

Storage systems have been challenged by the development of cloud computing. Traditional data indexes cannot satisfy the requirements of cloud computing because of huge index volumes and the need for quick response times. Meanwhile, because of the increasing size of the data index and its dynamic characteristics, previous approaches, which rebuild the index or fully back it up before the data changes, cannot satisfy the needs of today's big-data indexing. To solve these problems, we propose a double-layer index structure that overcomes the throughput limitation of a single-point server. Then, a clone-based B+ tree structure is proposed to achieve high performance and adapt to dynamic environments. The experimental results show that our clone-based solution has high efficiency.

  4. Revised cloud storage structure for light-weight data archiving in LHD

    International Nuclear Information System (INIS)

    Nakanishi, Hideya; Masaki, Ohsuna; Mamoru, Kojima; Setsuo, Imazu; Miki, Nonomura; Masahiko, Emoto; Takashi, Yamamoto; Yoshio, Nagayama; Takahisa, Ozeki; Noriyoshi, Nakajima; Katsumi, Ida; Osamu, Kaneko

    2014-01-01

Highlights: • GlusterFS is adopted to replace IznaStor cloud storage in LHD. • GlusterFS and OpenStack/Swift are compared. • An SSD-based GlusterFS distributed replicated volume is separated from normal RAID storage. • The LABCOM system changes its storage technology every 4 years for cost efficiency. - Abstract: The LHD data archiving system has newly selected the GlusterFS distributed filesystem to replace the present cloud storage software named “IznaStor/dSS”. Even though the prior software provided many favorable functionalities, such as hot plug-and-play node insertion, internal auto-replication of data files, and symmetric load balancing between all member nodes, it proved poor at recovering from an accidental malfunction of a storage node. Once a failure happened, the recovery process usually took at least several days, or sometimes more than a week, with a heavy CPU load. In some cases nodes fell into the so-called “split-brain” or “amnesia” condition and could not recover from it. Since the recovery time depends strongly on the capacity of the failed node, individual HDD management is more desirable than large volumes of HDD arrays. In addition, the dynamic mutual awareness of data location information may be removed if some other static data distribution method can be applied. In this study, the candidate middleware “OpenStack/Swift” and “GlusterFS” have been tested using the real mass of LHD data for more than half a year, and finally GlusterFS has been selected to replace the present IznaStor. It implements very limited cloud storage functionality but a simplified RAID10-like structure, which may consequently provide lighter-weight read/write ability. Since the LABCOM data system is implemented to be independent of the storage structure, it is easy to swap out IznaStor for the new GlusterFS. The effective I/O speed is also confirmed to be on the same level as the estimated one from raw

  5. Revised cloud storage structure for light-weight data archiving in LHD

    Energy Technology Data Exchange (ETDEWEB)

    Nakanishi, Hideya, E-mail: nakanisi@nifs.ac.jp [National Institute for Fusion Science, 322-6 Oroshi-cho, Toki, Gifu 509-5292 (Japan); Masaki, Ohsuna; Mamoru, Kojima; Setsuo, Imazu; Miki, Nonomura; Masahiko, Emoto; Takashi, Yamamoto; Yoshio, Nagayama [National Institute for Fusion Science, 322-6 Oroshi-cho, Toki, Gifu 509-5292 (Japan); Takahisa, Ozeki [Japan Atomic Energy Agency, 801-1 Mukoyama, Naka, Ibaraki 311-0193 (Japan); Noriyoshi, Nakajima; Katsumi, Ida; Osamu, Kaneko [National Institute for Fusion Science, 322-6 Oroshi-cho, Toki, Gifu 509-5292 (Japan)

    2014-05-15

Highlights: • GlusterFS is adopted to replace IznaStor cloud storage in LHD. • GlusterFS and OpenStack/Swift are compared. • An SSD-based GlusterFS distributed replicated volume is separated from normal RAID storage. • The LABCOM system changes its storage technology every 4 years for cost efficiency. - Abstract: The LHD data archiving system has newly selected the GlusterFS distributed filesystem to replace the present cloud storage software named “IznaStor/dSS”. Even though the prior software provided many favorable functionalities, such as hot plug-and-play node insertion, internal auto-replication of data files, and symmetric load balancing between all member nodes, it proved poor at recovering from an accidental malfunction of a storage node. Once a failure happened, the recovery process usually took at least several days, or sometimes more than a week, with a heavy CPU load. In some cases nodes fell into the so-called “split-brain” or “amnesia” condition and could not recover from it. Since the recovery time depends strongly on the capacity of the failed node, individual HDD management is more desirable than large volumes of HDD arrays. In addition, the dynamic mutual awareness of data location information may be removed if some other static data distribution method can be applied. In this study, the candidate middleware “OpenStack/Swift” and “GlusterFS” have been tested using the real mass of LHD data for more than half a year, and finally GlusterFS has been selected to replace the present IznaStor. It implements very limited cloud storage functionality but a simplified RAID10-like structure, which may consequently provide lighter-weight read/write ability. Since the LABCOM data system is implemented to be independent of the storage structure, it is easy to swap out IznaStor for the new GlusterFS. The effective I/O speed is also confirmed to be on the same level as the estimated one from raw

  6. Implementing cloud storage with OpenStack Swift

    CERN Document Server

    Rajana, Kris; Varma, Sreedhar

    2014-01-01

    This tutorial-based book has a step-by-step approach for each topic, ensuring it is thoroughly covered and easy to follow. If you are an IT administrator who wants to enter the world of cloud storage using OpenStack Swift, then this book is ideal for you. Whether your job is to build, manage, or use OpenStack Swift, this book is an ideal way to move your career ahead. Only basic Linux and server technology skills are expected, to take advantage of this book.

  7. Electron cloud instabilities in the Proton Storage Ring and Spallation Neutron Source

    Directory of Open Access Journals (Sweden)

    M. Blaskiewicz

    2003-01-01

Electron cloud instabilities in the Los Alamos Proton Storage Ring and those foreseen for the Oak Ridge Spallation Neutron Source are examined theoretically, numerically, and experimentally.

  8. Optimizing the Use of Storage Systems Provided by Cloud Computing Environments

    Science.gov (United States)

    Gallagher, J. H.; Potter, N.; Byrne, D. A.; Ogata, J.; Relph, J.

    2013-12-01

Cloud computing systems present a set of features that include familiar computing resources (albeit augmented to support dynamic scaling of processing power) bundled with a mix of conventional and unconventional storage systems. The Linux base of many cloud environments (e.g., Amazon) makes it tempting to assume that any Unix software will run efficiently in this environment without change. OPeNDAP and NODC collaborated on a short project to explore how the S3 and Glacier storage systems provided by the Amazon cloud computing infrastructure could be used with a data server developed primarily to access data stored in a traditional Unix file system. Our work used the Amazon cloud system, but we strove for designs that could be adapted easily to other systems like OpenStack. Lastly, we evaluated different architectures from a computer security perspective. We found that there are considerable issues associated with treating S3 as if it were a traditional file system, even though doing so is conceptually simple. These issues include performance penalties: using a software tool that emulates a traditional file system to store data in S3 performs poorly compared to storing data directly in S3. We also found there are important benefits beyond performance to ensuring that data written to S3 can be directly accessed without relying on a specific software tool. To provide a hierarchical organization for the data stored in S3, we wrote 'catalog' files, using XML. These catalog files map discrete files to S3 access keys. Like a traditional file system's directories, the catalogs can also contain references to other catalogs, providing a simple but effective hierarchy overlaid on top of S3's flat storage space. An added benefit of these catalogs is that they can be viewed in a web browser; our storage scheme provides both efficient access for the data server and access via a web browser. We also looked at the Glacier storage system and
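The catalog idea described above can be sketched in a few lines; the XML element and attribute names here are invented for illustration and are not taken from the project:

```python
import xml.etree.ElementTree as ET

# Sketch: an XML catalog maps logical file names to S3 access keys
# and may reference child catalogs, overlaying a directory-like
# hierarchy on S3's flat keyspace.
def build_catalog(name, files, subcatalogs=()):
    root = ET.Element("catalog", name=name)
    for logical, s3key in files.items():
        ET.SubElement(root, "file", name=logical, key=s3key)
    for sub in subcatalogs:
        ET.SubElement(root, "catalog-ref", key=sub)   # like a subdirectory
    return ET.tostring(root, encoding="unicode")

def resolve(catalog_xml, logical_name):
    """Look up the S3 key for a logical file name in one catalog."""
    root = ET.fromstring(catalog_xml)
    for f in root.iter("file"):
        if f.get("name") == logical_name:
            return f.get("key")
    return None

xml_doc = build_catalog(
    "sst-2013",
    {"jan.nc": "bucket/ab12", "feb.nc": "bucket/cd34"},
    subcatalogs=["bucket/catalog-2012.xml"],
)
assert resolve(xml_doc, "feb.nc") == "bucket/cd34"
```

Because the catalog is plain XML stored alongside the data, both the data server and an ordinary web browser can follow it, which matches the dual-access benefit the abstract notes.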

  9. A keyword searchable attribute-based encryption scheme with attribute update for cloud storage.

    Science.gov (United States)

    Wang, Shangping; Ye, Jian; Zhang, Yaling

    2018-01-01

The ciphertext-policy attribute-based encryption (CP-ABE) scheme is a new type of data encryption primitive that is very suitable for cloud data storage because of its fine-grained access control. A keyword-based searchable encryption scheme enables users to quickly find interesting data stored in the cloud server without revealing any information about the searched keywords. In this work, we provide a keyword searchable attribute-based encryption scheme with attribute update for cloud storage, which is a combination of an attribute-based encryption scheme and a keyword searchable encryption scheme. The new scheme supports user attribute updates; in particular, when a user's attribute needs to be updated, only that user's secret key components related to the attribute need to be updated, while other users' secret keys and the ciphertexts related to this attribute need not be updated, with the help of the cloud server. In addition, we outsource the operations with high computation cost to the cloud server to reduce the user's computational burden. Moreover, our scheme is proven semantically secure against chosen ciphertext-policy and chosen-plaintext attack in the generic bilinear group model, and it is also proven semantically secure against chosen-keyword attack under the bilinear Diffie-Hellman (BDH) assumption.

  10. Move It or Lose It: Cloud-Based Data Storage

    Science.gov (United States)

    Waters, John K.

    2010-01-01

    There was a time when school districts showed little interest in storing or backing up their data to remote servers. Nothing seemed less secure than handing off data to someone else. But in the last few years the buzz around cloud storage has grown louder, and the idea that data backup could be provided as a service has begun to gain traction in…

  11. Information Storage and Management Storing, Managing, and Protecting Digital Information in Classic, Virtualized, and Cloud Environments

    CERN Document Server

    Services, EMC Education

    2012-01-01

    The new edition of a bestseller, now revised and updated throughout! This new edition of the unparalleled bestseller serves as a full training course all in one, and as the world's largest data storage company, EMC is the ideal author for such a critical resource. They cover the components of a storage system and the different storage system models, while also offering essential new material that explores the advances in existing technologies and the emergence of the "Cloud", as well as updates and vital information on new technologies. Features a separate section on the emerging area of cloud computi

  12. Cloud object store for archive storage of high performance computing data using decoupling middleware

    Science.gov (United States)

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2015-06-30

    Cloud object storage is enabled for archived data, such as checkpoints and results, of high performance computing applications using a middleware process. A plurality of archived files, such as checkpoint files and results, generated by a plurality of processes in a parallel computing system are stored by obtaining the plurality of archived files from the parallel computing system; converting the plurality of archived files to objects using a log structured file system middleware process; and providing the objects for storage in a cloud object storage system. The plurality of processes may run, for example, on a plurality of compute nodes. The log structured file system middleware process may be embodied, for example, as a Parallel Log-Structured File System (PLFS). The log structured file system middleware process optionally executes on a burst buffer node.
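
    The middleware flow in the record above (collect archived files, convert each to a named object, hand the objects to a cloud object store) can be sketched minimally. This is not PLFS itself; the key layout, checksumming, and the dict standing in for the object store are assumptions for illustration.

    ```python
    import hashlib

    # Toy sketch of a conversion middleware: archived files from a parallel
    # job become named objects in a cloud object store. A plain dict stands
    # in for the store; the key naming scheme is an illustrative assumption.

    def archive_to_object_store(archived_files, object_store, prefix="archive"):
        """archived_files: {path: bytes}. Returns the object keys written."""
        keys = []
        for path, data in sorted(archived_files.items()):
            digest = hashlib.sha256(data).hexdigest()[:8]
            key = f"{prefix}/{path.strip('/').replace('/', '_')}-{digest}"
            object_store[key] = data          # 'upload' to the object store
            keys.append(key)
        return keys

    store = {}
    keys = archive_to_object_store(
        {"/ckpt/rank0": b"state0", "/ckpt/rank1": b"state1"}, store)
    ```

    In the patented design this conversion is performed by a log-structured file system process (e.g. PLFS), optionally on a burst buffer node, rather than inline as here.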

  13. Cloud Robotics Platforms

    Directory of Open Access Journals (Sweden)

    Busra Koken

    2015-01-01

    Full Text Available Cloud robotics is a rapidly evolving field that allows robots to offload computation-intensive and storage-intensive jobs into the cloud. Robots are limited in terms of computational capacity, memory and storage. Cloud provides unlimited computation power, memory, storage and especially collaboration opportunity. Cloud-enabled robots are divided into two categories as standalone and networked robots. This article surveys cloud robotic platforms, standalone and networked robotic works such as grasping, simultaneous localization and mapping (SLAM) and monitoring.

  14. PREVALENCE OF BARNACLES (CRUSTACEA; CIRRIPEDIA) AND ITS POSSIBLE RELATION TO FIBROPAPILLOMATOSIS IN CHELONIA MYDAS

    Directory of Open Access Journals (Sweden)

    R. R. Zamana

    2017-10-01

    Full Text Available Fibropapillomatosis is a tumoral disease characterized by cutaneous tumors varying in size from 0.1 to more than 30 cm in diameter. It is considered a debilitating and potentially fatal disease for sea turtles, mainly affecting the species Chelonia mydas, although it has also been recorded in other species. Evidence suggests that the etiology of fibropapillomatosis is viral and is associated with polluted coastal areas with high human density, large inputs of industrial, domestic and agricultural waste, and marine biotoxins; however, factors such as parasites may be an additional element in the etiology of the disease. Leeches, barnacles, algae and digenetic trematodes have been suggested as possible additional factors in the etiology of fibropapillomatosis in green turtles (Chelonia mydas). The objective of the present study is to examine the possible association of barnacles with fibropapillomatosis in Chelonia mydas (green turtle). To this end, a literature survey was carried out on the presence of barnacles on Chelonia mydas and the possible relation of these barnacle species to fibropapillomatosis. The data were obtained from articles, databases, booklets, books, scientific journals, websites and theses. In this study, 20 barnacle species were found associated with Chelonia mydas. Of the 18 studies analyzed, the most frequent species were Chelonibia testudinaria (55.56%) and Platylepas hexastylos (33.34%). No study whose exclusive objective was to examine the relation between barnacles and fibropapillomatosis was found. However, some studies report the presence of barnacles on Chelonia mydas with fibropapillomas. There is possibly no association between barnacles and fibropapillomatosis; nevertheless, research aimed exclusively at studying the relation between barnacles and fibropapillomas is needed

  15. Replicas Strategy and Cache Optimization of Video Surveillance Systems Based on Cloud Storage

    Directory of Open Access Journals (Sweden)

    Rongheng Li

    2018-04-01

    Full Text Available With the rapid development of video surveillance technology, and especially the popularity of cloud-based video surveillance applications, video data has begun to grow explosively. However, in a cloud-based video surveillance system, replicas occupy a large amount of storage space, and the slow response to video playback constrains the performance of the system. In this paper, considering the characteristics of video data comprehensively, we propose a dynamic redundant replicas mechanism based on security levels that can dynamically adjust the number of replicas. Based on the location correlation between cameras, this paper also proposes a data cache strategy to improve the response speed of data reading. Experiments illustrate that: (1) our dynamic redundant replicas mechanism can save storage space while ensuring data security; (2) the cache mechanism can predict the playback behaviors of the users in advance and improve the response speed of data reading according to the location and time correlation of the front-end cameras; and (3) in terms of cloud-based video surveillance, our proposed approaches significantly outperform existing methods.
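
    The core policy idea (replica count driven by a per-stream security level, bounded by a ceiling) can be sketched in a few lines. The level range and counts below are assumptions for illustration, not the paper's actual parameters.

    ```python
    # Illustrative sketch of a dynamic replica policy: sensitive or critical
    # footage keeps more copies, routine footage keeps fewer, with a hard cap.
    # The 0..3 level scale and the base/cap values are our assumptions.

    def replica_count(security_level, base=1, max_replicas=5):
        """security_level: 0 (routine) .. 3 (critical). Returns replica count."""
        if not 0 <= security_level <= 3:
            raise ValueError("security_level must be in 0..3")
        return min(base + security_level, max_replicas)
    ```

    A real system would pair this with the paper's location-correlated cache, prefetching streams from cameras adjacent to the one currently being played back.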

  16. A Cloud Storage Platform in the Defense Context: Mobile Data Management With Unreliable Network Conditions

    NARCIS (Netherlands)

    Veen, J.S. van der; Bastiaans, M.; Jonge, M. de; Strijkers, R.J.

    2012-01-01

    This paper discusses a cloud storage platform in the defense context. The mobile and dismounted domains of defense organizations typically use devices that are light in storage, processing and communication capabilities. This means that it is difficult to store a lot of information on these devices

  17. Tests of Cloud Computing and Storage System features for use in H1 Collaboration Data Preservation model

    International Nuclear Information System (INIS)

    Łobodziński, Bogdan

    2011-01-01

    Based on the currently developing strategy for data preservation and long-term analysis in HEP, tests of a possible future cloud computing system based on the Eucalyptus private cloud platform and the petabyte-scale open-source storage system CEPH were performed for the H1 Collaboration. Improvements in computing power and the strong development of storage systems suggest that a single cloud computing resource supported at a given site will be sufficient for analysis requirements beyond the end-date of the experiments. This work describes our test-bed architecture, which could be applied to fulfill the requirements of the physics program of H1 after the end date of the Collaboration. We discuss the reasons why we chose the Eucalyptus platform and the CEPH storage infrastructure, as well as our experience with installation and support of these infrastructures. Using our first test results we examine performance characteristics, observed failure states, deficiencies, bottlenecks and scaling boundaries.

  18. Dynamic federation of grid and cloud storage

    Science.gov (United States)

    Furano, Fabrizio; Keeble, Oliver; Field, Laurence

    2016-09-01

    The Dynamic Federations project ("Dynafed") enables the deployment of scalable, distributed storage systems composed of independent storage endpoints. While the Uniform Generic Redirector at the heart of the project is protocol-agnostic, we have focused our effort on HTTP-based protocols, including S3 and WebDAV. The system has been deployed on testbeds covering the majority of the ATLAS and LHCb data, and supports geography-aware replica selection. The work done exploits the federation potential of HTTP to build systems that offer uniform, scalable, catalogue-less access to the storage and metadata ensemble and the possibility of seamless integration of other compatible resources such as those from cloud providers. Dynafed can exploit the potential of the S3 delegation scheme, effectively federating on the fly any number of S3 buckets from different providers and applying a uniform authorization to them. This feature has been used to deploy in production the BOINC Data Bridge, which uses the Uniform Generic Redirector with S3 buckets to harmonize the BOINC authorization scheme with the Grid/X509. The Data Bridge has been deployed in production with good results. We believe that the features of a loosely coupled federation of open-protocol-based storage elements open many possibilities of smoothly evolving the current computing models and of supporting new scientific computing projects that rely on massive distribution of data and that would appreciate systems that can more easily be interfaced with commercial providers and can work natively with Web browsers and clients.

  19. Making the most of cloud storage - a toolkit for exploitation by WLCG experiments

    Science.gov (United States)

    Alvarez Ayllon, Alejandro; Arsuaga Rios, Maria; Bitzes, Georgios; Furano, Fabrizio; Keeble, Oliver; Manzi, Andrea

    2017-10-01

    Understanding how cloud storage can be effectively used, either standalone or in support of its associated compute, is now an important consideration for WLCG. We report on a suite of extensions to familiar tools targeted at enabling the integration of cloud object stores into traditional grid infrastructures and workflows. Notable updates include support for a number of object store flavours in FTS3, Davix and gfal2, including mitigations for lack of vector reads; the extension of Dynafed to operate as a bridge between grid and cloud domains; protocol translation in FTS3; the implementation of extensions to DPM (also implemented by the dCache project) to allow 3rd party transfers over HTTP. The result is a toolkit which facilitates data movement and access between grid and cloud infrastructures, broadening the range of workflows suitable for cloud. We report on deployment scenarios and prototype experience, explaining how, for example, an Amazon S3 or Azure allocation can be exploited by grid workflows.

  20. An Efficient Symmetric Searchable Encryption Scheme for Cloud Storage

    Directory of Open Access Journals (Sweden)

    Xiuxiu Jiang

    2017-05-01

    Full Text Available Symmetric searchable encryption for cloud storage enables users to retrieve the documents they want in a privacy-preserving way, and has become a hotspot of research. In this paper, we propose an efficient keyword search scheme over encrypted cloud data. We first adopt a structure named an inverted matrix (IM) to build the search index. The IM consists of index vectors, each of which is associated with a keyword. We then map a keyword to an address used to locate the corresponding index vector. Finally, we mask the index vectors with pseudo-random bits to obtain an encrypted enlarged inverted matrix (EEIM). Through security analysis and experimental evaluation, we demonstrate the privacy and efficiency of our scheme, respectively. In addition, we further consider two extended practical search situations, i.e., occurrence queries and dynamic user management, and give two relevant schemes.
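
    The two building blocks named above (per-keyword index vectors, and masking with pseudo-random bits) can be illustrated with a toy construction. This is a simplified sketch under our own assumptions, not the paper's EEIM scheme: keys, addressing, and vector layout are all reduced to the bare mechanism.

    ```python
    import hmac, hashlib

    # Toy searchable index: each keyword gets a bit vector over document ids,
    # XOR-masked with pseudo-random bits derived from a secret key (HMAC as
    # the PRF), so the server holds only masked vectors. Illustrative only.

    def prf_bits(key, label, n):
        """Derive n pseudo-random bits from key and label via HMAC-SHA256."""
        out, counter = [], 0
        while len(out) < n:
            block = hmac.new(key, f"{label}|{counter}".encode(),
                             hashlib.sha256).digest()
            out.extend((byte >> i) & 1 for byte in block for i in range(8))
            counter += 1
        return out[:n]

    def build_index(docs, key):
        """docs: {doc_id: set(keywords)}. Returns (ids, masked index)."""
        ids = sorted(docs)
        index = {}
        for w in set().union(*docs.values()):
            vec = [1 if w in docs[d] else 0 for d in ids]
            mask = prf_bits(key, w, len(vec))
            index[w] = [v ^ m for v, m in zip(vec, mask)]   # masked vector
        return ids, index

    def search(ids, index, key, w):
        """Unmask the keyword's vector and return the matching doc ids."""
        if w not in index:
            return []
        mask = prf_bits(key, w, len(ids))
        return [d for d, v, m in zip(ids, index[w], mask) if v ^ m]

    k = b"secret-key"
    ids, idx = build_index({"d1": {"cloud", "s3"}, "d2": {"cloud"}}, k)
    ```

    In the real scheme the keyword is additionally mapped to an address rather than stored in the clear, and the matrix is enlarged to hide access patterns; both refinements are omitted here.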

  1. Using S3 cloud storage with ROOT and CvmFS

    Science.gov (United States)

    Arsuaga-Ríos, María; Heikkilä, Seppo S.; Duellmann, Dirk; Meusel, René; Blomer, Jakob; Couturier, Ben

    2015-12-01

    Amazon S3 is a widely adopted web API for scalable cloud storage that could also fulfill the storage requirements of the high-energy physics community. CERN has been evaluating this option using some key HEP applications such as ROOT and the CernVM filesystem (CvmFS) with S3 back-ends. In this contribution we present an evaluation of two versions of the Huawei UDS storage system stressed with a large number of clients executing HEP software applications. The performance of concurrently storing individual objects is presented alongside more complex data access patterns as produced by the ROOT data analysis framework. Both Huawei UDS generations show successful scalability by supporting multiple byte-range requests, in contrast with Amazon S3 or Ceph, which do not support these commonly used HEP operations. We further report the S3 integration with recent CvmFS versions and summarize the experience with CvmFS/S3 for publishing daily releases of the full LHCb experiment software stack.
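
    The byte-range operations mentioned above are expressed over HTTP with a standard Range header (ROOT typically reads many scattered byte ranges per file). A minimal sketch, with a local stand-in for the server side; the endpoint URL in the comment is a placeholder:

    ```python
    # Sketch of multi-range reads as used by HEP clients against S3-style
    # endpoints. range_header builds the RFC 7233 header value; read_ranges
    # is a local stand-in for what the server returns for those ranges.

    def range_header(ranges):
        """ranges: list of (start, end) byte offsets, inclusive."""
        return "bytes=" + ", ".join(f"{s}-{e}" for s, e in ranges)

    def read_ranges(blob, ranges):
        """Serve the requested inclusive byte ranges from an in-memory blob."""
        return [blob[s:e + 1] for s, e in ranges]

    hdr = range_header([(0, 99), (4096, 8191)])
    # Against a real endpoint this header would be attached to a GET, e.g.:
    # urllib.request.Request("https://s3.example.com/bucket/file.root",
    #                        headers={"Range": hdr})
    ```

    Note that plain S3 GetObject accepts a single range per request; serving several ranges in one round trip is exactly the capability the abstract reports as a differentiator.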

  2. Electron Cloud Simulations of a Proton Storage Ring Using Cold Proton Bunches

    International Nuclear Information System (INIS)

    Sato, Y.; Holmes, Jeffrey A.; Lee, S.Y.; Macek, R.

    2008-01-01

    Using the ORBIT code we study the sensitivity of electron cloud properties with respect to different proton beam profiles, the secondary electron yield (SEY) parameter, and the proton loss rate. Our model uses a cold proton bunch to generate primary electrons and electromagnetic field for electron cloud dynamics. We study the dependence of the prompt and swept electron signals vs the bunch charge and the recovery of electron clouds after sweeping on the beam loss rate and the SEY. The simulation results are compared with the experimental data measured at the proton storage ring at the Los Alamos National Laboratory. Our simulations indicate that the fractional proton loss rate in the field-free straight section may be an exponential function of proton beam charge and may also be lower than the averaged fractional proton loss rate over the whole ring.

  3. Resource Storage Management Model For Ensuring Quality Of Service In The Cloud Archive Systems

    Directory of Open Access Journals (Sweden)

    Mariusz Kapanowski

    2014-01-01

    Full Text Available Nowadays, service providers offer many IT services in the public or private cloud. The client can buy various kinds of services, like SaaS, PaaS, etc. Recently, Backup as a Service (BaaS) was introduced as a variety of SaaS. At the moment several different BaaS offerings are available for archiving data in the cloud, but they provide only a basic level of service quality. In the paper we propose a model which ensures QoS for BaaS, and some methods for management of storage resources aimed at achieving the required SLA. This model introduces a set of parameters responsible for the SLA level, which can be offered at a basic or higher level of quality. The storage systems (typically HSM), which are distributed between several Data Centres, are built from disk arrays, VTLs, and tape libraries. The RSMM model does not assume bandwidth reservation or control, but is rather focused on the management of storage resources.

  4. A novel data storage logic in the cloud.

    Science.gov (United States)

    Mátyás, Bence; Szarka, Máté; Járvás, Gábor; Kusper, Gábor; Argay, István; Fialowski, Alice

    2016-01-01

    Databases which store and manage long-term scientific information related to the life sciences are used to store huge amounts of quantitative attributes. Introduction of a new entity attribute requires modification of the existing data tables and the programs that use these data tables. The solution is to increase the virtual data tables while the number of screens remains the same. The main objective of the present study was to introduce a logic called Joker Tao (JT) which provides universal data storage for cloud-based databases. This means that all types of input data can be interpreted as an entity and an attribute at the same time, in the same data table.
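
    The "universal data table" claim can be illustrated with generic entity-attribute-value (EAV) modelling: every input becomes a row in one table, so adding a new attribute never changes the schema, and any value can later act as an entity itself. This is a sketch of the general idea under our assumptions, not the Joker Tao implementation.

    ```python
    # Illustrative EAV-style 'universal table': one table of
    # (entity, attribute, value) rows for all data. New attributes need no
    # schema change, and a stored value can be reused as an entity.

    class UniversalTable:
        def __init__(self):
            self.rows = []                      # the single table

        def put(self, entity, attribute, value):
            self.rows.append((entity, attribute, value))

        def get(self, entity, attribute):
            return [v for e, a, v in self.rows
                    if e == entity and a == attribute]

    t = UniversalTable()
    t.put("sample42", "species", "E. coli")
    t.put("sample42", "od600", 0.43)
    t.put("E. coli", "gram", "negative")       # a value reused as an entity
    ```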

  5. Efficient secure-channel free public key encryption with keyword search for EMRs in cloud storage.

    Science.gov (United States)

    Guo, Lifeng; Yau, Wei-Chuen

    2015-02-01

    Searchable encryption is an important cryptographic primitive that enables privacy-preserving keyword search on encrypted electronic medical records (EMRs) in cloud storage. Efficiency of such searchable encryption in a medical cloud storage system is very crucial as it involves client platforms such as smartphones or tablets that only have constrained computing power and resources. In this paper, we propose an efficient secure-channel free public key encryption with keyword search (SCF-PEKS) scheme that is proven secure in the standard model. We show that our SCF-PEKS scheme is not only secure against chosen keyword and ciphertext attacks (IND-SCF-CKCA), but also secure against keyword guessing attacks (IND-KGA). Furthermore, our proposed scheme is more efficient than other recent SCF-PEKS schemes in the literature.

  6. Survey of Storage and Fault Tolerance Strategies Used in Cloud Computing

    Science.gov (United States)

    Ericson, Kathleen; Pallickara, Shrideep

    Cloud computing has gained significant traction in recent years. Companies such as Google, Amazon and Microsoft have been building massive data centers over the past few years. Spanning geographic and administrative domains, these data centers tend to be built out of commodity desktops, with the total number of computers managed by these companies on the order of millions. Additionally, the use of virtualization allows a physical node to be presented as a set of virtual nodes, resulting in a seemingly inexhaustible set of computational resources. By leveraging economies of scale, these data centers can provision CPU, networking, and storage at substantially reduced prices, which in turn underpins the move by many institutions to host their services in the cloud.

  7. Federating Distributed Storage For Clouds In ATLAS

    CERN Document Server

    Berghaus, Frank; The ATLAS collaboration

    2017-01-01

    Input data for applications that run in cloud computing centres can be stored at distant repositories, often with multiple copies of the popular data stored at many sites. Locating and retrieving the remote data can be challenging, and we believe that federating the storage can address this problem. A federation would locate the closest copy of the data on the basis of GeoIP information. Currently we are using the dynamic data federation Dynafed, a software solution developed by CERN IT. Dynafed supports several industry standards for connection protocols like Amazon’s S3, Microsoft’s Azure, as well as WebDAV and HTTP. Dynafed functions as an abstraction layer under which protocol-dependent authentication details are hidden from the user, requiring the user to only provide an X509 certificate.
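
    The geography-aware selection described above (locate the closest copy using the client's GeoIP-derived coordinates) reduces to a nearest-site computation. A minimal sketch with great-circle distance; the site names and coordinates are illustrative, and real Dynafed applies this per-replica behind its redirector.

    ```python
    import math

    # Sketch of GeoIP-style closest-replica selection: given client
    # coordinates and the coordinates of sites holding a copy, redirect to
    # the site with the smallest great-circle (haversine) distance.

    def haversine_km(a, b):
        """Great-circle distance between (lat, lon) pairs in degrees."""
        lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
        h = (math.sin((lat2 - lat1) / 2) ** 2
             + math.cos(lat1) * math.cos(lat2)
             * math.sin((lon2 - lon1) / 2) ** 2)
        return 2 * 6371 * math.asin(math.sqrt(h))

    def closest_replica(client, replicas):
        """replicas: {site_name: (lat, lon)}. Returns the nearest site."""
        return min(replicas, key=lambda s: haversine_km(client, replicas[s]))

    site = closest_replica((46.2, 6.1),                      # near Geneva
                           {"CERN": (46.23, 6.05),
                            "TRIUMF": (49.25, -123.23),
                            "BNL": (40.87, -72.87)})
    ```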

  8. Global Software Development with Cloud Platforms

    Science.gov (United States)

    Yara, Pavan; Ramachandran, Ramaseshan; Balasubramanian, Gayathri; Muthuswamy, Karthik; Chandrasekar, Divya

    Offshore and outsourced distributed software development models and processes are facing challenges, previously unknown, with respect to computing capacity, bandwidth, storage, security, complexity, reliability, and business uncertainty. Clouds promise to address these challenges by adopting recent advances in virtualization, parallel and distributed systems, utility computing, and software services. In this paper, we envision a cloud-based platform that addresses some of these core problems. We outline a generic cloud architecture, its design and our first implementation results for three cloud forms - a compute cloud, a storage cloud and a cloud-based software service - in the context of global distributed software development (GSD). Our "compute cloud" provides computational services such as continuous code integration and a compile server farm, the "storage cloud" offers storage (block or file-based) services with an on-line virtual storage service, whereas the on-line virtual labs represent a useful cloud service. We note some of the use cases for clouds in GSD, the lessons learned with our prototypes, and identify challenges that must be conquered before realizing the full business benefits. We believe that in the future, software practitioners will focus more on these cloud computing platforms and see clouds as a means to support an ecosystem of clients, developers and other key stakeholders.

  9. Creation of a Unified Educational Space within a SLA University Classroom Using Cloud Storage and On-Line Applications

    Science.gov (United States)

    Karabayeva, Kamilya Zhumartovna

    2016-01-01

    In the present article the author gives evidence of effective application of cloud storage and on-line applications in the educational process of the higher education institution, as well as considers the problems and prospects of using cloud technologies in the educational process, when creating a unified educational space in the foreign language…

  10. Evolution of Cloud Storage as Cloud Computing Infrastructure Service

    OpenAIRE

    Rajan, Arokia Paul; Shanmugapriyaa

    2013-01-01

    Enterprises are driving towards less cost, more availability, agility, managed risk - all of which is accelerated towards Cloud Computing. Cloud is not a particular product, but a way of delivering IT services that are consumable on demand, elastic to scale up and down as needed, and follow a pay-for-usage model. Out of the three common types of cloud computing service models, Infrastructure as a Service (IaaS) is a service model that provides servers, computing power, network bandwidth and S...

  11. Protecting location privacy for outsourced spatial data in cloud storage.

    Science.gov (United States)

    Tian, Feng; Gui, Xiaolin; An, Jian; Yang, Pan; Zhao, Jianqiang; Zhang, Xuejun

    2014-01-01

    As cloud computing services and location-aware devices become fully developed, a large amount of spatial data needs to be outsourced to cloud storage providers, so research on privacy protection for outsourced spatial data has received increasing attention from academia and industry. As a kind of spatial transformation method, the Hilbert curve is widely used to protect the location privacy of spatial data. However, sufficient security analysis of the standard Hilbert curve (SHC) has seldom been conducted. In this paper, we propose an index modification method for SHC (SHC(∗)) and a density-based space filling curve (DSC) to improve the security of SHC; they partially violate the distance-preserving property of SHC, so as to achieve better security. We formally define indistinguishability and an attack model for measuring the privacy disclosure risk of spatial transformation methods. The evaluation results indicate that SHC(∗) and DSC are more secure than SHC, and DSC achieves the best index generation performance.
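
    For context, the standard Hilbert mapping that SHC builds on converts a 2D point to a 1D index; its distance-preserving tendency (nearby points get nearby indices) is precisely the property SHC(∗) and DSC deliberately weaken. Below is the well-known iterative algorithm for the forward mapping, included as background rather than as the paper's modified curves.

    ```python
    # Standard Hilbert curve mapping: (x, y) on an n x n grid (n a power of
    # two) -> distance d along the curve. This is the classic bit-twiddling
    # formulation; SHC* and DSC modify/replace this index to gain security.

    def xy2d(n, x, y):
        d = 0
        s = n // 2
        while s > 0:
            rx = 1 if (x & s) > 0 else 0
            ry = 1 if (y & s) > 0 else 0
            d += s * s * ((3 * rx) ^ ry)
            # rotate the quadrant so recursion sees a canonical orientation
            if ry == 0:
                if rx == 1:
                    x = s - 1 - x
                    y = s - 1 - y
                x, y = y, x
            s //= 2
        return d
    ```

    An attacker who knows the curve parameters can invert this mapping, which is why index modification (SHC(∗)) or a density-based curve (DSC) is needed on top of it.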

  12. Genomics With Cloud Computing

    OpenAIRE

    Sukhamrit Kaur; Sandeep Kaur

    2015-01-01

    Abstract: Genomics is the study of the genome, which produces large amounts of data requiring large storage and computation power. These issues are addressed by cloud computing, which provides various cloud platforms for genomics. These platforms provide many services to users, such as easy access to data, easy sharing and transfer, storage in hundreds of terabytes, and more computational power. Some cloud platforms are Google Genomics, DNAnexus and Globus Genomics. Various features of cloud computin

  13. Biomass and water storage dynamics of epiphytes in old-growth and secondary montane cloud forest stands in Costa Rica

    NARCIS (Netherlands)

    Koehler, L.; Tobon, C.; Frumau, K.F.A.; Bruijnzeel, L.A.

    2007-01-01

    Epiphytic biomass, canopy humus and associated canopy water storage capacity are known to vary greatly between old-growth tropical montane cloud forests but for regenerating forests such data are virtually absent. The present study was conducted in an old-growth cloud forest and in a 30-year-old

  14. Cloud networking understanding cloud-based data center networks

    CERN Document Server

    Lee, Gary

    2014-01-01

    Cloud Networking: Understanding Cloud-Based Data Center Networks explains the evolution of established networking technologies into distributed, cloud-based networks. Starting with an overview of cloud technologies, the book explains how cloud data center networks leverage distributed systems for network virtualization, storage networking, and software-defined networking. The author offers insider perspective to key components that make a cloud network possible such as switch fabric technology and data center networking standards. The final chapters look ahead to developments in architectures

  15. Prior-Based Quantization Bin Matching for Cloud Storage of JPEG Images.

    Science.gov (United States)

    Liu, Xianming; Cheung, Gene; Lin, Chia-Wen; Zhao, Debin; Gao, Wen

    2018-07-01

    Millions of user-generated images are uploaded to social media sites like Facebook daily, which translates to a large storage cost. However, there exists an asymmetry in upload and download data: only a fraction of the uploaded images are subsequently retrieved for viewing. In this paper, we propose a cloud storage system that reduces the storage cost of all uploaded JPEG photos, at the expense of a controlled increase in computation, mainly during download of the requested image subset. Specifically, the system first selectively re-encodes code blocks of uploaded JPEG images using coarser quantization parameters for smaller storage sizes. Then, during download, the system exploits known signal priors (a sparsity prior and a graph-signal smoothness prior) for reverse mapping to recover the original fine quantization bin indices, with either a deterministic guarantee (lossless mode) or a statistical guarantee (near-lossless mode). For fast reverse mapping, we use small dictionaries and sparse graphs that are tailored for specific clusters of similar blocks, which are classified via a tree-structured vector quantizer. During image upload, cluster indices identifying the appropriate dictionaries and graphs for the re-quantized blocks are encoded as side information using a differential distributed source coding scheme to facilitate reverse mapping during image download. Experimental results show that our system can reap significant storage savings (up to 12.05%) at roughly the same image PSNR (within 0.18 dB).
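
    The arithmetic behind the re-quantization step can be made concrete: a coefficient stored at a coarser step corresponds to a small set of candidate fine-step bins, and the priors choose among those candidates at download time. The uniform mid-tread quantizers below are a simplifying assumption for illustration, not the paper's exact codec.

    ```python
    import math

    # Toy model of re-quantization and reverse mapping: coarsen() is the
    # storage-side re-encode; candidate_fine_bins() enumerates the fine bins
    # a coarse bin could have come from (the set the priors disambiguate).

    def quantize(coeff, q):
        """Uniform quantization: coefficient -> bin index."""
        return round(coeff / q)

    def coarsen(fine_bin, q_fine, q_coarse):
        """Re-encode a fine bin index at the coarser step (upload side)."""
        return quantize(fine_bin * q_fine, q_coarse)

    def candidate_fine_bins(coarse_bin, q_fine, q_coarse):
        """All fine bins whose reconstruction lies in the coarse bin's cell."""
        lo = (coarse_bin - 0.5) * q_coarse
        hi = (coarse_bin + 0.5) * q_coarse
        first = math.ceil(lo / q_fine)
        last = math.floor(hi / q_fine)
        return list(range(first, last + 1))
    ```

    For example, with a fine step of 2 and a coarse step of 6, coarse bin 1 covers fine bins 2, 3 and 4; the lossless mode must pick the one true bin, while the near-lossless mode tolerates a bounded error among the candidates.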

  16. Efficiently Multi-User Searchable Encryption Scheme with Attribute Revocation and Grant for Cloud Storage.

    Science.gov (United States)

    Wang, Shangping; Zhang, Xiaoxue; Zhang, Yaling

    2016-01-01

    Ciphertext-policy attribute-based encryption (CP-ABE) focuses on the problem of access control, while keyword-based searchable encryption schemes focus on quickly finding the files a user is interested in within cloud storage. Designing a searchable, attribute-based encryption scheme is a new challenge. In this paper, we propose an efficient multi-user searchable attribute-based encryption scheme with attribute revocation and grant for cloud storage. In the new scheme, the attribute revocation and grant processes are delegated to a proxy server. Our scheme supports multiple attributes being revoked and granted simultaneously. Moreover, keyword search functionality is achieved in our proposed scheme. The security of our proposed scheme is reduced to the bilinear Diffie-Hellman (BDH) assumption. Furthermore, the scheme is proven secure under the security model of indistinguishability against selective ciphertext-policy and chosen plaintext attack (IND-sCP-CPA), and is also semantically secure under indistinguishability against chosen keyword attack (IND-CKA) in the random oracle model.

  17. Characterization of electron clouds in the Cornell Electron Storage Ring Test Accelerator using TE-wave transmission

    International Nuclear Information System (INIS)

    De Santis, S.; Byrd, J.M.; Billing, M.; Palmer, M.; Sikora, J.; Carlson, B.

    2010-01-01

    A relatively new technique for measuring the electron cloud density in storage rings has been developed and successfully demonstrated (S. De Santis, J.M. Byrd, F. Caspers, A. Krasnykh, T. Kroyer, M.T.F. Pivi, and K.G. Sonnad, Phys. Rev. Lett. 100, 094801 (2008).). We present the experimental results of a systematic application of this technique at the Cornell Electron Storage Ring Test Accelerator. The technique is based on the phase modulation of the TE mode transmitted in a synchrotron beam pipe caused by the periodic variation of the density of electron plasma. Because of the relatively simple hardware requirements, this method has become increasingly popular and has been since successfully implemented in several machines. While the principles of this technique are straightforward, quantitative derivation of the electron cloud density from the measurement requires consideration of several effects, which we address in detail.

  18. IBM Software Defined Storage and ownCloud Enterprise Edition - a perfect match for hyperscale Enterprise File Sync and Share

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    IBM Software Defined Storage, in particular the technology offering codenamed Elastic Storage (based on GPFS technology) has proven to be an ideal match for Enterprise File Sync and Share (EFSS) solutions that need highly scalable storage. The presentation will provide insight into the integration of Elastic Storage with the ownCloud Enterprise Edition (based on Open Source technology) software that showed impressive scalability and performance metrics during a proof-of-concept phase of an installation that is supposed to serve 300000 users when fully deployed.

  19. Beam scraping problems in storage rings: the black cloud

    International Nuclear Information System (INIS)

    Jones, L.W.

    1980-01-01

    The heavy ion, multi-GeV drivers for inertial confinement fusion are being designed to produce beams of an energy, power, and specific ionization sufficient to raise matter to thermonuclear temperatures. The magnitude of these parameters is so far beyond current experience that some problems raised warrant careful scrutiny. In particular, the consequence of some fraction of the beam lost on storage ring inflection septa, extraction channels, and beam-defining collimators seems potentially very serious. Unless carefully contained, a beam halo can easily vaporize the best refractory materials, and the resulting vapor cloud will interact destructively within microseconds with the following beam. The limits on beam flux which may be so lost for particular examples are orders of magnitude below current experience

  20. F2AC: A Lightweight, Fine-Grained, and Flexible Access Control Scheme for File Storage in Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Wei Ren

    2016-01-01

    Full Text Available Current file storage service models for cloud servers assume that users either belong to a single layer with different privileges or cannot authorize privileges iteratively. Thus, the access control is not fine-grained and flexible. Besides, most access control methods at cloud servers mainly rely on computationally intensive cryptographic algorithms and, especially, may not be able to support highly dynamic ad hoc groups with addition and removal of group members. In this paper, we propose a scheme called F2AC, which is a lightweight, fine-grained, and flexible access control scheme for file storage in mobile cloud computing. F2AC can not only achieve iterative authorization, authentication with tailored policies, and access control for dynamically changing accessing groups, but also provide access privilege transition and revocation. A new access control model called the directed tree with linked leaf model is proposed for further implementation in data structures and algorithms. Extensive analysis is given to justify the soundness and completeness of F2AC.
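
    The "iterative authorization" idea (a member delegates a subset of its own privileges, and revoking a node implicitly revokes everything granted beneath it) can be sketched over a plain directed tree. The structure and names below are our assumptions in the spirit of the scheme, not the paper's exact directed tree with linked leaf model.

    ```python
    # Illustrative delegation tree: members delegate subsets of their own
    # privileges; revoking a member transitively revokes their grantees.

    class AccessTree:
        def __init__(self, root, privileges):
            self.parent = {root: None}
            self.privs = {root: set(privileges)}

        def delegate(self, granter, grantee, privileges):
            privileges = set(privileges)
            if not privileges <= self.privs[granter]:
                raise PermissionError("cannot delegate privileges you lack")
            self.parent[grantee] = granter
            self.privs[grantee] = privileges

        def revoke(self, user):
            """Remove user and, transitively, everyone they authorized."""
            for child in [u for u, p in self.parent.items() if p == user]:
                self.revoke(child)
            self.parent.pop(user, None)
            self.privs.pop(user, None)

        def can(self, user, privilege):
            return privilege in self.privs.get(user, set())

    tree = AccessTree("owner", {"read", "write"})
    tree.delegate("owner", "alice", {"read", "write"})
    tree.delegate("alice", "bob", {"read"})    # iterative authorization
    tree.revoke("alice")                       # bob's grant falls with alice
    ```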

  1. Pengaruh Selang Waktu Peletakkan Terhadap Keberhasilan Penetasan Telur Penyu Hijau (Chelonia mydas L.) (Effect of Planting Time on Egg Hatching Success of Green Turtle (Chelonia mydas L.))

    Directory of Open Access Journals (Sweden)

    Edi Wibowo Kushartono

    2014-09-01

    One of the conservation efforts undertaken to protect the green turtle (Chelonia mydas L.) is relocation of the nest, whereby the eggs are moved from the natural nest to a semi-natural hatchery. The right time for the removal and reburial of the eggs is needed to obtain the maximum hatching rate. The purpose of this study was to determine the effect of the interval between laying and reburial on the hatching success of green turtle eggs. A randomized block design was used, based on three different nesting females, with reburial intervals of 2, 7, and 12 h as treatments. Environmental conditions were measured and observed throughout the incubation period. Observation of hatchling emergence began on day 50 of incubation. Nests were excavated on day 60 of incubation, and eggs that failed to hatch were opened manually for examination. The results showed no statistically significant effect of the reburial interval on hatching or emergence success; nevertheless, the best values were obtained at an interval of 2 h, followed by 12 h and 7 h. Keywords: hatching, green turtle (Chelonia mydas L.), semi-natural

  2. Cloud Computing for radiologists.

    Science.gov (United States)

    Kharat, Amit T; Safvi, Amjad; Thind, Ss; Singh, Amarjit

    2012-07-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources, such as computer software and hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets radiology free from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues, which need to be addressed to ensure the success of Cloud computing in the future.

  3. Cloud Computing for radiologists

    International Nuclear Information System (INIS)

    Kharat, Amit T; Safvi, Amjad; Thind, SS; Singh, Amarjit

    2012-01-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources, such as computer software and hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets radiology free from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues, which need to be addressed to ensure the success of Cloud computing in the future.

  4. Cloud computing for radiologists

    Directory of Open Access Journals (Sweden)

    Amit T Kharat

    2012-01-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources, such as computer software and hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets radiology free from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues, which need to be addressed to ensure the success of Cloud computing in the future.

  5. Exploiting Virtualization and Cloud Computing in ATLAS

    International Nuclear Information System (INIS)

    Harald Barreiro Megino, Fernando; Van der Ster, Daniel; Benjamin, Doug; De, Kaushik; Gable, Ian; Paterson, Michael; Taylor, Ryan; Hendrix, Val; Vitillo, Roberto A; Panitkin, Sergey; De Silva, Asoka; Walker, Rod

    2012-01-01

    The ATLAS Computing Model was designed around the concept of grid computing; since the start of data-taking, this model has proven very successful in the federated operation of more than one hundred Worldwide LHC Computing Grid (WLCG) sites for offline data distribution, storage, processing and analysis. However, new paradigms in computing, namely virtualization and cloud computing, present improved strategies for managing and provisioning IT resources that could allow ATLAS to more flexibly adapt and scale its storage and processing workloads on varied underlying resources. In particular, ATLAS is developing a “grid-of-clouds” infrastructure in order to utilize WLCG sites that make resources available via a cloud API. This work will present the current status of the Virtualization and Cloud Computing R&D project in ATLAS Distributed Computing. First, strategies for deploying PanDA queues on cloud sites will be discussed, including the introduction of a “cloud factory” for managing cloud VM instances. Next, performance results when running on virtualized/cloud resources at CERN LxCloud, StratusLab, and elsewhere will be presented. Finally, we will present the ATLAS strategies for exploiting cloud-based storage, including remote XROOTD access to input data, management of EC2-based files, and the deployment of cloud-resident LCG storage elements.

  6. Characterization of electron clouds in the Cornell Electron Storage Ring Test Accelerator using TE-wave transmission

    Directory of Open Access Journals (Sweden)

    S. De Santis

    2010-07-01

    A relatively new technique for measuring the electron cloud density in storage rings has been developed and successfully demonstrated [S. De Santis, J. M. Byrd, F. Caspers, A. Krasnykh, T. Kroyer, M. T. F. Pivi, and K. G. Sonnad, Phys. Rev. Lett. 100, 094801 (2008)]. We present the experimental results of a systematic application of this technique at the Cornell Electron Storage Ring Test Accelerator. The technique is based on the phase modulation of a TE mode transmitted in the synchrotron beam pipe, caused by the periodic variation of the electron plasma density. Because of its relatively simple hardware requirements, this method has become increasingly popular and has since been successfully implemented in several machines. While the principles of this technique are straightforward, quantitative derivation of the electron cloud density from the measurement requires consideration of several effects, which we address in detail.
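
As a rough illustration of how a density estimate follows from such a phase measurement, the sketch below inverts the standard low-density dispersion relation for a TE wave in a beam pipe, Δφ ≈ ω_p²L / (2c√(ω² − ω_c²)) with ω_p² = n_e e²/(ε₀m_e). The numerical inputs (phase shift, drift length, carrier and cutoff frequencies) are illustrative, not CesrTA parameters, and the quoted paper discusses several corrections beyond this first-order estimate.

```python
# Back-of-envelope inversion of the TE-wave phase-modulation relation:
#   delta_phi ≈ omega_p^2 * L / (2 c sqrt(omega^2 - omega_c^2)),
# with omega_p^2 = n_e e^2 / (eps0 m_e).
import math

e, m_e, eps0, c = 1.602e-19, 9.109e-31, 8.854e-12, 2.998e8

def cloud_density(delta_phi, L, f, f_cutoff):
    """Electron cloud density (m^-3) from a measured phase shift (rad)
    accumulated over a drift of length L (m) at carrier frequency f (Hz)
    in a pipe whose TE mode cuts off at f_cutoff (Hz)."""
    omega, omega_c = 2 * math.pi * f, 2 * math.pi * f_cutoff
    omega_p_sq = 2 * c * math.sqrt(omega**2 - omega_c**2) * delta_phi / L
    return omega_p_sq * eps0 * m_e / e**2

# e.g. 10 mrad of phase shift over 10 m, 2.1 GHz carrier, 1.9 GHz cutoff
print(f"{cloud_density(0.010, 10.0, 2.1e9, 1.9e9):.2e}")  # ≈ 1.06e+12 m^-3
```

Densities of order 10^11–10^12 m^-3 are the regime such measurements typically probe, which is why even milliradian-level phase resolution is useful.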

  7. A Secure Authenticate Framework for Cloud Computing Environment

    OpenAIRE

    Nitin Nagar; Pradeep k. Jatav

    2014-01-01

    Cloud computing is an important option for companies building and deploying their infrastructure and applications. Data storage in the cloud is easy compared to other data storage services. At the same time, security in the cloud environment is a challenging task. Security issues range from missing system configuration and a lack of proper updates to unwise user actions on remote data storage, any of which can expose users' private data and information to unwanted access. i...

  8. Federating Distributed Storage For Clouds In ATLAS

    CERN Document Server

    Berghaus, Frank; The ATLAS collaboration

    2017-01-01

    Input data for applications that run in cloud computing centres can be stored at distant repositories, often with multiple copies of popular data stored at many sites. Locating and retrieving the remote data can be challenging, and we believe that federating the storage can address this problem. A federation would locate the closest copy of the data, currently on the basis of GeoIP information. We are using the DynaFed data federation software solution developed by CERN IT. DynaFed supports several industry-standard connection protocols, such as Amazon's S3 and Microsoft's Azure, as well as WebDAV and HTTP. Protocol-dependent authentication is hidden from the user by using their X509 certificate. We have set up an instance of DynaFed and integrated it into the ATLAS Data Distribution Management system. We report on the challenges faced during the installation and integration. We have tested ATLAS analysis jobs submitted by the PanDA production system and we report on our first experiences with its op...
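
DynaFed's actual redirection logic lives in its configurable plugins; as a rough illustration of the GeoIP-closest-copy decision described above, the sketch below redirects a client to the replica with the smallest great-circle distance. The endpoint URLs and coordinates are invented for the example.

```python
# Hypothetical sketch of a federator's redirection decision: given replica
# endpoints with (lat, lon) obtained from a GeoIP lookup, send the client
# to the great-circle-closest copy.  Site names/coordinates are made up.
import math

def haversine_km(a, b):
    """Great-circle distance in km between two (lat, lon) pairs in degrees."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * math.asin(math.sqrt(h))

replicas = {
    "https://cern-s3.example/atlas/data.root":     (46.23, 6.05),
    "https://victoria-s3.example/atlas/data.root": (48.43, -123.37),
    "https://bnl-s3.example/atlas/data.root":      (40.87, -72.88),
}

def closest_replica(client_pos):
    return min(replicas, key=lambda url: haversine_km(client_pos, replicas[url]))

print(closest_replica((52.52, 13.40)))  # a client near Berlin → the CERN copy
```

Real deployments layer availability checks and protocol negotiation on top of this purely geometric choice.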

  9. Context-aware distributed cloud computing using CloudScheduler

    Science.gov (United States)

    Seuster, R.; Leavett-Brown, CR; Casteels, K.; Driemel, C.; Paterson, M.; Ring, D.; Sobie, RJ; Taylor, RP; Weldon, J.

    2017-10-01

    The distributed cloud using the CloudScheduler VM provisioning service is one of the longest-running systems for HEP workloads. It has run millions of jobs for ATLAS and Belle II over the past few years using private and commercial clouds around the world. Our goal is to scale the distributed cloud to the 10,000-core level, with the ability to run any type of application (low I/O, high I/O and high memory) on any cloud. To achieve this goal, we have been implementing changes that utilize context-aware computing designs that are currently employed in the mobile communication industry. Context-awareness makes use of real-time and archived data to respond to user or system requirements. In our distributed cloud, we have many opportunistic clouds with no local HEP services, software or storage repositories. A context-aware design significantly improves the reliability and performance of our system by locating the nearest instance of the required services. We describe how we are collecting and managing contextual information from our workload management systems, the clouds, the virtual machines and our services. This information is used not only to monitor the system but also to carry out automated corrective actions. We are incrementally adding new alerting and response services to our distributed cloud. This will enable us to scale the number of clouds and virtual machines. Further, a context-aware design will enable us to run analysis or high I/O applications on opportunistic clouds. We envisage an open-source HTTP data federation (for example, the DynaFed system at CERN) as a service that would provide us access to existing storage elements used by the HEP experiments.

  10. Status of experimental studies of electron cloud effects at the Los Alamos proton storage ring

    International Nuclear Information System (INIS)

    Macek, R.J.; Browman, A.A.; Borden, M.J.; Fitzgerald, D.H.; McCrady, R.C.; Spickermann, T.J.; Zaugg, T.J.

    2004-01-01

    Various electron cloud effects (ECE), including the two-stream (e-p) instability at the Los Alamos Proton Storage Ring (PSR), have been studied extensively for the past five years with the goal of understanding the phenomena, mitigating the instability and ultimately increasing beam intensity. The specialized diagnostics used in the studies are two types of electron detectors, the retarding field analyzer and the electron sweeping detector - which have been employed to measure characteristics of the electron cloud as functions of time, location in the ring and various influential beam parameters - plus a short stripline beam position monitor used to measure high-frequency motion of the beam centroid. Highlights of this research program are summarized along with more detail on recent results obtained since the ECLOUD'02 workshop. Recent work includes a number of parametric studies of the various factors that affect the electron cloud signals, studies of the sources of initial or 'seed' electrons, additional observations of electron cloud dissipation after the beam pulse is extracted, studies of the 'first pulse instability' issue, more data on electron suppression as a cure for the instability, and observations of the effect of a one-turn weak kick on intense beams in the presence of a significant electron cloud.

  11. What CFOs should know before venturing into the cloud.

    Science.gov (United States)

    Rajendran, Janakan

    2013-05-01

    There are three major trends in the use of cloud-based services for healthcare IT: Cloud computing involves the hosting of health IT applications in a service provider cloud. Cloud storage is a data storage service that can involve, for example, long-term storage and archival of information such as clinical data, medical images, and scanned documents. Data center colocation involves rental of secure space in the cloud from a vendor, an approach that allows a hospital to share power capacity and proven security protocols, reducing costs.

  12. Greening the Cloud

    NARCIS (Netherlands)

    van den Hoed, Robert; Hoekstra, Eric; Procaccianti, G.; Lago, P.; Grosso, Paola; Taal, Arie; Grosskop, Kay; van Bergen, Esther

    The cloud has become an essential part of our daily lives. We use it to store our documents (Dropbox), to stream our music and films (Spotify and Netflix) and, without giving it any thought, we use it to work on documents in the cloud (Google Docs). The cloud forms a massive storage and processing

  13. Genomics With Cloud Computing

    Directory of Open Access Journals (Sweden)

    Sukhamrit Kaur

    2015-04-01

    Genomics, the study of genomes, produces large amounts of data, for which large storage and computational power are needed. These needs are addressed by cloud computing, which provides various cloud platforms for genomics. These platforms offer many services to users, such as easy access to data, easy sharing and transfer, storage in the hundreds of terabytes, and greater computational power. Some such cloud platforms are Google Genomics, DNAnexus, and Globus Genomics. The benefits of cloud computing for genomics include easy access to and sharing of data, security of data, and lower cost for resources, but there are still drawbacks, such as the long time needed to transfer data and limited network bandwidth.

  14. Leveraging Cloud Computing to Improve Storage Durability, Availability, and Cost for MER Maestro

    Science.gov (United States)

    Chang, George W.; Powell, Mark W.; Callas, John L.; Torres, Recaredo J.; Shams, Khawaja S.

    2012-01-01

    The Maestro for MER (Mars Exploration Rover) software is the premiere operations and activity planning software for the Mars rovers, and it is required to deliver all of the processed image products to scientists on demand. These data span multiple storage arrays sized at 2 TB, and a backup scheme ensures data is not lost. In a catastrophe, these data would currently recover at 20 GB/hour, taking several days for a full restoration. A seamless solution provides access to highly durable, highly available, scalable, and cost-effective storage capabilities. This approach also employs a novel technique that enables storage of the majority of the data on the cloud and some data locally. This feature is used to store the most recent data locally in order to guarantee utmost reliability in case of an outage or disconnection from the Internet. This also obviates any changes to the software that generates the most recent data set, as it still has the same interface to the file system as it did before updates.
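
The tiering policy described above (everything durable in the cloud, the most recent data also held locally) can be sketched as a small write-through store. The class below is a hypothetical illustration, not the Maestro implementation; plain dicts stand in for the cloud and local backends.

```python
# Minimal sketch of cloud/local tiering: every product is written through
# to durable (cloud) storage, while the most recent N objects also stay on
# local disk so the newest data remain available during an Internet outage.
from collections import OrderedDict

class TieredStore:
    def __init__(self, local_capacity):
        self.cloud = {}                      # durable, unbounded (stand-in)
        self.local = OrderedDict()           # bounded, recency-ordered
        self.capacity = local_capacity

    def put(self, key, blob):
        self.cloud[key] = blob               # write-through to the cloud tier
        self.local[key] = blob
        self.local.move_to_end(key)          # mark as most recent
        while len(self.local) > self.capacity:
            self.local.popitem(last=False)   # evict the oldest local copy

    def get(self, key):
        if key in self.local:                # served even when offline
            return self.local[key]
        return self.cloud[key]               # otherwise fall back to the cloud

store = TieredStore(local_capacity=2)
for k in ("sol1.img", "sol2.img", "sol3.img"):
    store.put(k, b"...")
print(sorted(store.local))   # ['sol2.img', 'sol3.img'] – only the recent two
```

Because `put` and `get` expose an ordinary key/value interface, the software producing the newest products need not change, which is the point the abstract makes about the file-system interface.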

  15. Migrating enterprise storage applications to the cloud

    OpenAIRE

    Vrable, Michael Daniel

    2011-01-01

    Cloud computing has emerged as a model for hosting computing infrastructure and outsourcing management of that infrastructure. It offers the promise of simplified provisioning and management, lower costs, and access to resources that scale up and down with demand. Cloud computing has seen growing use for Web site hosting, large batch processing jobs, and similar tasks. Despite potential advantages, however, cloud computing is not much used for enterprise applications such as backup, shared fi...

  16. Moving towards Cloud Security

    OpenAIRE

    Edit Szilvia Rubóczki; Zoltán Rajnai

    2015-01-01

    Cloud computing hosts and delivers many different services via the Internet. There are a lot of reasons why people opt for using cloud resources. Cloud development is increasing fast while many related services lag behind, for example mass awareness of cloud security. The new generation uploads videos and pictures to cloud storage without a second thought, but only a few know about data privacy, data management and the ownership of data stored in the cloud. In an enterprise environment th...

  17. Cloud Robotics Model

    OpenAIRE

    Mester, Gyula

    2015-01-01

    Cloud Robotics was born from the merger of service robotics and cloud technologies. It allows robots to benefit from the powerful computational, storage, and communications resources of modern data centres. Cloud robotics allows robots to take advantage of the rapid increase in data transfer rates to offload tasks without hard real-time requirements. Cloud Robotics has rapidly gained momentum with initiatives by companies such as Google, Willow Garage and Gostai as well as more than a dozen a...

  18. Immunological evaluation of captive green sea turtle (Chelonia mydas) with ulcerative dermatitis

    Science.gov (United States)

    Muñoz, Fernando Alberto; Estrada-Parra, Sergio; Romero-Rojas, Andrés; Gonzalez-Ballesteros, Erik; Work, Thierry M.; Villaseñor-Gaona, Hector; Estrada-Garcia, Iris

    2013-01-01

    Ulcerative dermatitis (UD) is common in captive sea turtles and manifests as skin erosions and ulcers associated with gram-negative bacteria. This study compared clinically healthy and UD-affected captive turtles by evaluating hematology, histopathology, immunoglobulin levels, and delayed-type hypersensitivity assay. Turtles with UD had significantly lower weight, reduced delayed-type hypersensitivity (DTH) responses, and higher heterophil:lymphocyte ratios. This study is the first to assay DTH in green turtles (Chelonia mydas) and suggests that UD is associated with immunosuppression.

  19. The cloud storage service bwSync&Share at KIT

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    The Karlsruhe Institute of Technology introduced the bwSync&Share collaboration service in January 2014. The service is an on-premise alternative to existing public cloud storage solutions for students and scientists in the German state of Baden-Württemberg, allowing the synchronization and sharing of documents between multiple devices and users. The service is based on the commercial software PowerFolder and is deployed in a virtual environment to support high reliability and scalability for a potential 450,000 users. The integration of the state-wide federated identity management system (bwIDM) and a centralized helpdesk portal allows the service to be used by all academic institutions in the state of Baden-Württemberg. Since its launch, approximately 15 organizations and 8,000 users have joined the service. The talk gives an overview of the related challenges, technical and organizational requirements, current architecture, and future development plans.

  20. Status of the experimental studies of the electron cloud at the Los Alamos proton storage ring

    International Nuclear Information System (INIS)

    Macek, R.J.; Browman, A.A.; Borden, M.J.; Fitzgerald, D.H.; McCrady, R.C.; Spickermann, T.J.; Zaugg, T.J.

    2003-01-01

    The electron cloud (EC) at the Los Alamos Proton Storage Ring (PSR) has been studied extensively for the past several years with an overall aim to identify and measure its important characteristics, the factors that influence these characteristics, and to relate these to the two-stream (e-p) transverse instability long observed at PSR. Some new results since PAC2001 are presented.

  1. Privacy-Aware Relevant Data Access with Semantically Enriched Search Queries for Untrusted Cloud Storage Services.

    Science.gov (United States)

    Pervez, Zeeshan; Ahmad, Mahmood; Khattak, Asad Masood; Lee, Sungyoung; Chung, Tae Choong

    2016-01-01

    Privacy-aware search of outsourced data ensures relevant data access in the untrusted domain of a public cloud service provider. Subscriber of a public cloud storage service can determine the presence or absence of a particular keyword by submitting search query in the form of a trapdoor. However, these trapdoor-based search queries are limited in functionality and cannot be used to identify secure outsourced data which contains semantically equivalent information. In addition, trapdoor-based methodologies are confined to pre-defined trapdoors and prevent subscribers from searching outsourced data with arbitrarily defined search criteria. To solve the problem of relevant data access, we have proposed an index-based privacy-aware search methodology that ensures semantic retrieval of data from an untrusted domain. This method ensures oblivious execution of a search query and leverages authorized subscribers to model conjunctive search queries without relying on predefined trapdoors. A security analysis of our proposed methodology shows that, in a conspired attack, unauthorized subscribers and untrusted cloud service providers cannot deduce any information that can lead to the potential loss of data privacy. A computational time analysis on commodity hardware demonstrates that our proposed methodology requires moderate computational resources to model a privacy-aware search query and for its oblivious evaluation on a cloud service provider.
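
For context on the trapdoor-based baseline the abstract contrasts itself with, here is a toy keyword index in which the server matches client-supplied trapdoors (keyed hashes) without ever seeing plaintext keywords. This is a textbook construction, not the paper's index-based semantic scheme; the key, keywords, and document names are made up.

```python
# Toy trapdoor-style keyword search: the client indexes documents under
# HMACs of their keywords, so the server can match a submitted trapdoor
# against the index without learning the keyword itself.
import hmac, hashlib
from collections import defaultdict

KEY = b"client-side secret"          # never leaves the subscriber

def trapdoor(keyword):
    return hmac.new(KEY, keyword.encode(), hashlib.sha256).hexdigest()

# Client side: build the encrypted-keyword index before outsourcing.
index = defaultdict(set)
docs = {"doc1": ["storage", "cloud"], "doc2": ["cloud", "privacy"]}
for doc_id, words in docs.items():
    for w in words:
        index[trapdoor(w)].add(doc_id)

# Server side: given only trapdoors, intersect posting lists (conjunctive).
def search(index, trapdoors):
    hits = [index.get(t, set()) for t in trapdoors]
    return set.intersection(*hits) if hits else set()

print(search(index, [trapdoor("cloud"), trapdoor("privacy")]))   # {'doc2'}
```

The limitation the abstract points out is visible here: only pre-indexed, exact keywords can be queried, and semantically equivalent terms ("storage" vs. "repository") produce unrelated trapdoors, which is the gap the proposed index-based semantic method addresses.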

  2. A TRUSTWORTHY CLOUD FORENSICS ENVIRONMENT

    OpenAIRE

    Zawoad , Shams; Hasan , Ragib

    2015-01-01

    Part 5: CLOUD FORENSICS; The rapid migration from traditional computing and storage models to cloud computing environments has made it necessary to support reliable forensic investigations in the cloud. However, current cloud computing environments often lack support for forensic investigations and the trustworthiness of evidence is often questionable because of the possibility of collusion between dishonest cloud providers, users and forensic investigators. This chapt...

  3. Buffering PV output during cloud transients with energy storage

    Science.gov (United States)

    Moumouni, Yacouba

    This thesis considers the major types of energy storage as a means of mitigating the power-output transients that fast-moving cloud cover imposes on grid-tied CPV systems. The approach is to buffer the intermittent CPV output with an energy storage device (used batteries) purchased cheaply from EV owners or battery leasers. With the CPV connected to the grid through suitable energy storage, the main goal is to smooth out the intermittent solar power and the fluctuating grid load with a convenient control strategy. The thesis provides a detailed analysis, with accompanying Matlab code, of how to deliver a constant amount of power to the grid during the daytime and, in addition, shift less valuable off-peak electricity to the on-peak window between 1 pm and 7 pm, when the electricity price is much better. A range of constant base power levels was assumed: 15 kW, 20 kW, 21 kW, 22 kW, 23 kW, 24 kW, and 25 kW. The iterative solution increased the battery capacity in steps of 5 while decreasing the base supply by the same step size until satisfactory results were achieved. With the chosen battery capacity of 54 kWh, coupled to data from the Amonix CPV 7700 unit in Las Vegas over a 3-month period, 20 kW was found to be the largest constant load the system could supply to the utility company without interruption. Simulated results are presented to show the feasibility of the proposed scheme.
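
The iterative sizing loop described in the abstract can be sketched as follows. This is a hedged illustration only: the 24-hour profile is synthetic (not the Amonix CPV 7700 data), the step sizes and the half-charged starting state are assumptions, and the thesis's actual Matlab analysis includes price arbitrage and control details omitted here.

```python
# Sketch of iterative storage sizing: relax the constant base-power target
# while growing battery capacity in 5 kWh steps, until the battery state of
# charge (SoC) never goes negative over the CPV generation profile.

def feasible(cpv_kw, base_kw, capacity_kwh):
    """Can a battery of this capacity keep delivery constant at base_kw?"""
    soc = capacity_kwh / 2                    # assume a half-charged start
    for p in cpv_kw:                          # hourly power samples (kW)
        soc += p - base_kw                    # charge surplus, cover deficit
        if soc < 0:
            return False                      # battery would run dry
        soc = min(soc, capacity_kwh)          # clip at full charge
    return True

def size_system(cpv_kw, base_kw=25.0, capacity_kwh=5.0):
    while base_kw > 0:
        if feasible(cpv_kw, base_kw, capacity_kwh):
            return base_kw, capacity_kwh
        base_kw -= 1.0                        # relax the delivery target...
        capacity_kwh += 5.0                   # ...while growing the battery
    raise ValueError("no feasible constant base load")

# Synthetic single-day profile: 7 dark hours, a daytime ramp, 7 dark hours.
profile = [0]*7 + [5, 15, 30, 45, 50, 50, 45, 30, 15, 5] + [0]*7
print(size_system(profile))   # → (6.0, 100.0) for this synthetic profile
```

With real multi-month data and a daytime-only delivery window (as in the thesis), the feasible base load is considerably higher than this toy, full-day example suggests.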

  4. Security Architecture of Cloud Computing

    OpenAIRE

    V.KRISHNA REDDY; Dr. L.S.S.REDDY

    2011-01-01

    Cloud Computing offers services over the Internet with dynamically scalable resources. Cloud Computing services provide benefits to users in terms of cost and ease of use. Cloud Computing services need to address security during the transmission of sensitive data and critical applications to shared and public cloud environments. Cloud environments are scaling up to meet data processing and storage needs. Cloud computing environments have various advantages as well as disadvantages o...

  5. Virtualization and cloud computing in dentistry.

    Science.gov (United States)

    Chow, Frank; Muftu, Ali; Shorter, Richard

    2014-01-01

    The use of virtualization and cloud computing has changed the way we use computers. Virtualization is a method of placing software called a hypervisor on the hardware of a computer or a host operating system. It allows a guest operating system to run on top of the physical computer with a virtual machine (i.e., virtual computer). Virtualization allows multiple virtual computers to run on top of one physical computer and to share its hardware resources, such as printers, scanners, and modems. This increases the efficient use of the computer by decreasing costs (e.g., hardware, electricity, administration, and management) since only one physical computer is needed and running. This virtualization platform is the basis for cloud computing. It has expanded into areas of server and storage virtualization. One of the commonly used dental storage systems is cloud storage. Patient information is encrypted as required by the Health Insurance Portability and Accountability Act (HIPAA) and stored on off-site private cloud services for a monthly service fee. As computer costs continue to increase, so too will the need for more storage and processing power. Virtual and cloud computing will be a method for dentists to minimize costs and maximize computer efficiency in the near future. This article will provide some useful information on current uses of cloud computing.

  6. Towards Indonesian Cloud Campus

    OpenAIRE

    Thamrin, Taqwan; Lukman, Iing; Wahyuningsih, Dina Ika

    2013-01-01

    Nowadays, Cloud Computing is the most discussed term in business and academic environments. A cloud campus has many benefits, such as on-demand access to file storage, e-mail, databases, educational resources, and research applications and tools anywhere, for faculty, administrators, staff, students and other users in the university. Furthermore, a cloud campus reduces universities' IT complexity and cost. This paper discusses the implementation of the Indonesian cloud campus and various opportunities and benefits...

  7. Hybrid cloud for dummies

    CERN Document Server

    Hurwitz, Judith; Halper, Fern; Kirsch, Dan

    2012-01-01

    Understand the cloud and implement a cloud strategy for your business Cloud computing enables companies to save money by leasing storage space and accessing technology services through the Internet instead of buying and maintaining equipment and support services. Because it has its own unique set of challenges, cloud computing requires careful explanation. This easy-to-follow guide shows IT managers and support staff just what cloud computing is, how to deliver and manage cloud computing services, how to choose a service provider, and how to go about implementation. It also covers security and

  8. Cloud-Based Data Storage

    Science.gov (United States)

    Waters, John K.

    2011-01-01

    The vulnerability and inefficiency of backing up data on-site is prompting school districts to switch to more secure, less troublesome cloud-based options. District auditors are pushing for a better way to back up their data than the on-site, tape-based system that had been used for years. About three years ago, Hendrick School District in…

  9. Observation of Electron Cloud Instabilities and Emittance Dilution at the Cornell Electron-Positron Storage Ring Test Accelerator

    International Nuclear Information System (INIS)

    Holtzapple, R.L.; Campbell, R.C.; McArdle, K.E.; Miller, M.I.; Totten, M.M.; Tucker, S.L.; Billing, M.G.; Dugan, G.F.; Ramirez, G.A.; Sonnad, K.G.; Williams, H.A.; Flanagan, J.; Palmer, M.A.

    2016-01-01

    Electron cloud related emittance dilution and instabilities of bunch trains limit the performance of high-intensity circular colliders. One of the key goals of the Cornell electron-positron storage ring Test Accelerator (CesrTA) research program is to improve our understanding of how the electron cloud alters the dynamics of bunches within the train. Single-bunch beam diagnostics have been developed to measure the beam spectra and vertical beam size, two important dynamical signatures of beams interacting with the electron cloud, for bunch trains on a turn-by-turn basis. Experiments have been performed at CesrTA to probe the interaction of the electron cloud with stored positron bunch trains. The purpose of these experiments was to characterize the dependence of beam-electron cloud interactions on machine parameters such as bunch spacing, vertical chromaticity, and bunch current. The beam dynamics of the stored beam, in the presence of the electron cloud, were quantified using: 1) a gated beam position monitor (BPM) and spectrum analyzer to measure the bunch-by-bunch frequency spectrum of the bunch trains; 2) an x-ray beam size monitor to record the bunch-by-bunch, turn-by-turn vertical size of each bunch within the trains. In this paper we report on the observations from these experiments and analyze the effects of the electron cloud on the stability of bunches in a train under many different operational conditions.

  10. Ocular fibropapillomas of green turtles (Chelonia mydas).

    Science.gov (United States)

    Brooks, D E; Ginn, P E; Miller, T R; Bramson, L; Jacobson, E R

    1994-05-01

    Histologic evaluation of four eyes from three stranded juvenile green turtles (Chelonia mydas) from Florida, USA, revealed ocular fibropapillomas composed of an overlying hyperplastic epithelium, varying amounts of a thickened, well-vascularized, collagenous stroma, and a moderate-to-dense population of reactive fibroblasts. The histologic morphology of the ocular fibropapillomas varied depending on whether the eyelid, conjunctiva, limbus, or cornea was the primary site of tumor origin. Fibropapillomas arising from the limbus, conjunctiva, or eyelid tended to be polypoid or pedunculated with a high degree of arborization. They often filled the conjunctival fornices and extended externally to be ulcerated on the distal aspects. Corneal fibropapillomas were more sessile and multinodular with less arborization. Some corneal tumors consisted primarily of a broad fibrovascular stroma and mild epithelial hyperplasia, whereas others had a markedly hyperplastic epithelium supported by stalks of fibrovascular stromal tissue. In green turtles, ocular fibropapillomas may be locally invasive and associated with severe blindness and systemic debilitation.

  11. Privacy Preserving Face Retrieval in the Cloud for Mobile Users

    OpenAIRE

    Jin, Xin; Ge, Shiming; Song, Chenggen

    2017-01-01

    Recently, cloud storage and processing have been widely adopted. Mobile users in one family or one team may automatically back up their photos to the same shared cloud storage space. A powerful face detector trained and provided by a 3rd party may be used to retrieve the photo collection containing a specific group of persons from the cloud storage server. However, the privacy of the mobile users may be leaked to the cloud server providers. Meanwhile, the copyright of the face det...

  12. Mercury in the sea turtle Chelonia mydas (Linnaeus, 1758) from the Ceará coast, NE Brazil

    Directory of Open Access Journals (Sweden)

    Moisés F. Bezerra

    2012-03-01

    Full Text Available Mercury concentrations in carapace fragments of the green turtle Chelonia mydas from the Ceará coast in NE Brazil are reported. Concentrations varied from <0.34 to 856.6 ng.g-1 dry weight and were higher (mean 154.8 ng.g-1 dry weight) in juvenile individuals (n = 22), while the lowest concentrations (mean 2.5 ng.g-1 dry weight) were observed in adult/sub-adult individuals (n = 3). There was a significant negative correlation between animal size and Hg concentration, probably due to the difference in diet between juveniles and sub-adults/adults. Carapace fragments, which constitute non-invasive and non-lethal substrates, may be important for environmental monitoring of these endangered species.

  13. Risk in the Clouds?: Security Issues Facing Government Use of Cloud Computing

    Science.gov (United States)

    Wyld, David C.

    Cloud computing is poised to become one of the most important and fundamental shifts in how computing is consumed and used. Forecasts show that government will play a lead role in adopting cloud computing - for data storage, applications, and processing power, as IT executives seek to maximize their returns on limited procurement budgets in these challenging economic times. After an overview of the cloud computing concept, this article explores the security issues facing public sector use of cloud computing and looks at the risks and benefits of shifting to cloud-based models. It concludes with an analysis of the challenges that lie ahead for government use of cloud resources.

  14. DIDACTIC POTENTIAL OF CLOUD TECHNOLOGIES FOR MANAGEMENT OF EDUCATIONAL INSTITUTION

    Directory of Open Access Journals (Sweden)

    А А Заславский

    2016-12-01

    Full Text Available The article introduces the basic definitions of, and differences between, services in the cloud, cloud services, cloud applications and cloud data storage. It describes the basic cloud types that can be used over the Internet and over the LAN of an educational organization (Intranet), and the possibilities of using cloud services to improve the management of internal and external communications of an educational organization, as well as to support the joint work of its employees. A list is given of the core competencies that an employee of an educational organization should develop in order to use cloud services and cloud applications in their work. We describe the positive aspects of using cloud services and cloud technologies for the management of an educational institution, identify possible risks of using cloud technologies, and present options for using cloud technologies over the Internet and over an Intranet. We present a list of software for each category of the described cloud service types: storage and file synchronization, storage of bookmarks and notes, time management, and software applications. The article thus introduces a basic definition and classification of cloud services and offers examples of the methodical use of cloud services in the management of an educational organization.

  15. A price and performance comparison of three different storage architectures for data in cloud-based systems

    Science.gov (United States)

    Gallagher, J. H. R.; Jelenak, A.; Potter, N.; Fulker, D. W.; Habermann, T.

    2017-12-01

    Providing data services based on cloud computing technology that are equivalent to those developed for traditional computing and storage systems is critical for a successful migration to cloud-based architectures for data production, scientific analysis and storage. OPeNDAP Web-service capabilities (comprising the Data Access Protocol (DAP) specification plus open-source software for realizing DAP in servers and clients) are among the most widely deployed means for achieving data-as-service functionality in the Earth sciences. OPeNDAP services are especially common in traditional data center environments where servers offer access to datasets stored in (very large) file systems, and a preponderance of the source data for these services is stored in the Hierarchical Data Format Version 5 (HDF5). Three candidate architectures for serving NASA satellite Earth science HDF5 data via Hyrax running on Amazon Web Services (AWS) were developed, and their performance was examined for a set of representative use cases. Performance was assessed in terms of both runtime and incurred cost. The three architectures differ in how HDF5 files are stored in the Amazon Simple Storage Service (S3) and how the Hyrax server (as an EC2 instance) retrieves their data. Results for both serial and parallel access to HDF5 data in S3 will be presented. While the study focused on HDF5 data, OPeNDAP and the Hyrax data server, the architectures are generic and the analysis can be extrapolated to many different data formats, web APIs, and data servers.
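
    The runtime-versus-cost trade-off among such architectures can be sketched with a simple cost model. The prices and request counts below are illustrative placeholders (real S3 pricing varies by region and tier), not the paper's measured figures; the point is only how request-heavy subsetting trades against egress-heavy whole-file retrieval:

```python
def monthly_cost_usd(storage_gb, get_requests, egress_gb,
                     storage_price=0.023, get_price=0.0004, egress_price=0.09):
    """Toy S3-style cost model: per GB-month stored, per 1000 GET
    requests, and per GB of data transferred out (illustrative prices)."""
    return (storage_gb * storage_price
            + get_requests / 1000 * get_price
            + egress_gb * egress_price)

# Whole-file architecture: few large GETs, but full HDF5 files leave S3.
whole_file = monthly_cost_usd(1000, 10_000, 500)
# Subsetting architecture: many ranged GETs, but only requested bytes egress.
ranged = monthly_cost_usd(1000, 2_000_000, 20)
print(round(whole_file, 2), round(ranged, 2))
```

    Under these assumed numbers the request-heavy architecture is cheaper because egress dominates; with different access patterns the balance can flip, which is precisely why the paper compares architectures per use case.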

  16. A new data collaboration service based on cloud computing security

    Science.gov (United States)

    Ying, Ren; Li, Hua-Wei; Wang, Li na

    2017-09-01

    With the rapid development of cloud computing, the storage and usage of data have undergone revolutionary changes. Data owners can store data in the cloud. While bringing convenience, it also brings many new challenges to cloud data security. A key issue is how to support a secure data collaboration service that supports access and updates to cloud data. This paper proposes a secure, efficient and extensible data collaboration service, which prevents data leaks in cloud storage, supports one to many encryption mechanisms, and also enables cloud data writing and fine-grained access control.

  17. Cloud Optimized Image Format and Compression

    Science.gov (United States)

    Becker, P.; Plesea, L.; Maurer, T.

    2015-04-01

    Cloud-based image storage and processing require re-evaluation of formats and processing methods. For the true value of the massive volumes of earth observation data to be realized, the image data needs to be accessible from the cloud. Traditional file formats such as TIF and NITF were developed in the heyday of the desktop and assumed fast, low-latency file access. Other formats such as JPEG2000 provide streaming protocols for pixel data, but still require a server to have file access. These concepts no longer truly hold in cloud-based elastic storage and computation environments. This paper will provide details of a newly evolving image storage format (MRF) and compression that is optimized for cloud environments. Although the cost of storage continues to fall for large data volumes, there is still significant value in compression. For imagery data to be used in analysis and to exploit the extended dynamic range of the new sensors, lossless or controlled lossy compression is of high value. Compression decreases the data volumes stored and reduces the data transferred, but the reduced data size must be balanced against the CPU required to decompress. The paper also outlines a new compression algorithm (LERC) for imagery and elevation data that optimizes this balance. Advantages of the compression include its simple-to-implement algorithm that enables it to be efficiently accessed using JavaScript. Combining this new cloud-based image storage format and compression will help resolve some of the challenges of big image data on the internet.
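
    The controlled-lossy idea behind such compression (bound the per-value error, then entropy-code the result) can be illustrated with uniform quantization plus DEFLATE. This is a stdlib sketch of the error-bound guarantee only, not the actual LERC algorithm; the elevation data is synthetic:

```python
import struct
import zlib

def compress_bounded(values, max_error):
    """Quantize floats to a step of 2*max_error, then DEFLATE the integer
    indices. Guarantees |decoded - original| <= max_error (up to float eps)."""
    step = 2.0 * max_error
    lo = min(values)
    idx = [round((v - lo) / step) for v in values]
    raw = struct.pack(f"<{len(idx)}i", *idx)
    return lo, step, zlib.compress(raw, 9)

def decompress_bounded(lo, step, blob, count):
    idx = struct.unpack(f"<{count}i", zlib.decompress(blob))
    return [lo + i * step for i in idx]

# Synthetic, smoothly varying elevation profile with small jitter.
elevations = [100.0 + 0.1 * i + (0.03 if i % 7 == 0 else 0.0) for i in range(1000)]
lo, step, blob = compress_bounded(elevations, max_error=0.05)
decoded = decompress_bounded(lo, step, blob, len(elevations))
worst = max(abs(a - b) for a, b in zip(elevations, decoded))
print(f"max error {worst:.3f}; {len(blob)} bytes vs {4 * len(elevations)} raw")
```

    The quantization step, not the codec, is what enforces the error bound; swapping DEFLATE for a stronger coder changes only the size, never the guarantee.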

  18. SAFAX : an extensible authorization service for cloud environments

    NARCIS (Netherlands)

    Kaluvuri, S.P.; Egner, A.I.; Den Hartog, J.I.; Zannone, N.

    2015-01-01

    Cloud storage services have become increasingly popular in recent years. Users are often registered to multiple cloud storage services that suit different needs. However, the ad hoc manner in which data sharing between users is implemented leads to issues for these users. For instance, users are

  19. Research on Key Technologies of Cloud Computing

    Science.gov (United States)

    Zhang, Shufen; Yan, Hongcan; Chen, Xuebin

    With the development of multi-core processors, virtualization, distributed storage, broadband Internet and automatic management, a new type of computing mode named cloud computing has emerged. It distributes computation tasks over a resource pool consisting of massive numbers of computers, so that application systems can obtain computing power, storage space and software services according to demand. It concentrates all the computing resources and manages them automatically through software, without human intervention. This frees application providers from tedious details and lets them concentrate on their business, which is advantageous for innovation and cost reduction. The ultimate goal of cloud computing is to provide calculation, services and applications as a public facility, so that people can use computer resources just like water, electricity, gas and telephone. Currently, the understanding of cloud computing is still developing and changing, and cloud computing has no unanimous definition. This paper describes the three main service forms of cloud computing: SaaS, PaaS and IaaS; compares the definitions of cloud computing given by Google, Amazon, IBM and other companies; summarizes the basic characteristics of cloud computing; and focuses on key technologies such as data storage, data management, virtualization and the programming model.

  20. Cloud Computing: Architecture and Services

    OpenAIRE

    Ms. Ravneet Kaur

    2018-01-01

    Cloud computing is Internet-based computing, whereby shared resources, software, and information are provided to computers and other devices on demand, like the electricity grid. It is a method for delivering information technology (IT) services where resources are retrieved from the Internet through web-based tools and applications, as opposed to a direct connection to a server. Rather than keeping files on a proprietary hard drive or local storage device, cloud-based storage makes it possib...

  1. Cloud computing basics

    CERN Document Server

    Srinivasan, S

    2014-01-01

    Cloud Computing Basics covers the main aspects of this fast-moving technology so that both practitioners and students will be able to understand cloud computing. The author highlights the key aspects of this technology that a potential user might want to investigate before deciding to adopt this service. This book explains how cloud services can be used to augment existing services such as storage, backup and recovery. It addresses the details of how cloud security works and what users must be prepared for when they move their data to the cloud. The book also discusses how businesses could prepare for compliance with the law as well as industry standards such as the Payment Card Industry standard.

  2. Cardiovascular imaging environment: will the future be cloud-based?

    Science.gov (United States)

    Kawel-Boehm, Nadine; Bluemke, David A

    2017-07-01

    In cardiovascular CT and MR imaging, large datasets have to be stored, post-processed, analyzed and distributed. Besides basic assessment of volume and function in cardiac magnetic resonance imaging, for example, more sophisticated quantitative analysis is requested, requiring specific software. Many institutions cannot afford the various types of software or provide the expertise to perform sophisticated analysis. Areas covered: Various cloud services exist related to data storage and analysis specifically for cardiovascular CT and MR imaging. Instead of on-site data storage, cloud providers offer flexible storage services on a pay-per-use basis. To avoid the purchase and maintenance of specialized software for cardiovascular image analysis, e.g. to assess myocardial iron overload, MR 4D flow and fractional flow reserve, evaluation can be performed with cloud-based software by the consumer, or the complete analysis is performed by the cloud provider. However, challenges to widespread implementation of cloud services include regulatory issues regarding patient privacy and data security. Expert commentary: If patient privacy and data security are guaranteed, cloud imaging is a valuable option to cope with the storage of large image datasets and to offer sophisticated cardiovascular image analysis to institutions of all sizes.

  3. Mitigation of the electron-cloud effect in the PSR and SNS proton storage rings by tailoring the bunch profile

    International Nuclear Information System (INIS)

    Pivi, M.; Furman, M.A.

    2003-01-01

    For the storage ring of the Spallation Neutron Source (SNS) at Oak Ridge, and for the Proton Storage Ring (PSR) at Los Alamos, both with intense and very long bunches, the electron cloud develops primarily by the mechanism of trailing-edge multipacting. We show, by means of simulations for the PSR, how the resonant nature of this mechanism may be effectively broken by tailoring the longitudinal bunch profile at fixed bunch charge, resulting in a significant decrease in the electron-cloud effect. We briefly discuss the experimental difficulties expected in the implementation of this cure.

  4. Adaptive Network Coded Clouds: High Speed Downloads and Cost-Effective Version Control

    DEFF Research Database (Denmark)

    Sipos, Marton A.; Heide, Janus; Roetter, Daniel Enrique Lucani

    2018-01-01

    Although cloud systems provide a reliable and flexible storage solution, the use of a single cloud service constitutes a single point of failure, which can compromise data availability, download speed, and security. To address these challenges, we advocate the use of multiple cloud storage providers simultaneously, using network coding as the key enabling technology. Our goal is to study two challenges of network coded storage systems. First, the efficient update of the number of coded fragments per cloud in a system aggregating multiple clouds in order to boost the download speed of files. We developed a novel scheme using recoding with limited packets to trade off storage space, reliability, and data retrieval speed. Implementation and measurements with commercial cloud providers show that up to 9x less network use is needed compared to other network coding schemes, while maintaining similar...
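
    The simplest instance of coded multi-cloud storage is a single XOR parity fragment spread across providers. The sketch below is a toy (plain striping plus parity, not the paper's network-coding scheme with recoding), but it shows the core benefit: any one provider outage leaves the file recoverable. Provider names are made up:

```python
def split_with_parity(data: bytes):
    """Stripe data into two halves plus an XOR parity fragment,
    one fragment per (hypothetical) cloud provider."""
    if len(data) % 2:
        data += b"\x00"          # pad to even length (pad not stripped in this toy)
    a, b = data[: len(data) // 2], data[len(data) // 2 :]
    parity = bytes(x ^ y for x, y in zip(a, b))
    return {"cloud_a": a, "cloud_b": b, "cloud_c": parity}

def recover(fragments):
    """Rebuild the file from any two of the three fragments."""
    a, b, p = (fragments.get(k) for k in ("cloud_a", "cloud_b", "cloud_c"))
    if a is not None and b is not None:
        return a + b
    if a is not None:            # b lost: b = a XOR parity
        return a + bytes(x ^ y for x, y in zip(a, p))
    return bytes(x ^ y for x, y in zip(b, p)) + b   # a lost

stored = split_with_parity(b"turn-by-turn telemetry")
del stored["cloud_b"]            # simulate one provider being unreachable
print(recover(stored))           # prints b'turn-by-turn telemetry'
```

    Random linear network coding generalizes this: instead of one fixed parity, clouds hold random combinations of fragments, so the number of coded fragments per cloud can be tuned, and recoded, without re-reading the whole file.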

  5. Don't Trust the Cloud, Verify: Integrity and Consistency for Cloud Object Stores

    OpenAIRE

    Brandenburger, Marcus; Cachin, Christian; Knežević, Nikola

    2015-01-01

    Cloud services have turned remote computation into a commodity and enable convenient online collaboration. However, they require that clients fully trust the service provider in terms of confidentiality, integrity, and availability. Towards reducing this dependency, this paper introduces a protocol for verification of integrity and consistency for cloud object storage (VICOS), which enables a group of mutually trusting clients to detect data-integrity and consistency violations for a cloud ob...
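
    The kind of integrity checking VICOS provides can be illustrated, at its most basic, with a hash chain over object-store operations: clients that compare running digests detect a server that rewrites history. This is only the underlying primitive, not the VICOS protocol (which additionally handles consistency among mutually trusting clients); the operation strings are made up:

```python
import hashlib

def chain_digest(prev_digest: bytes, operation: str) -> bytes:
    """Fold one object-store operation record into the running hash chain."""
    return hashlib.sha256(prev_digest + operation.encode()).digest()

def replay(operations, start=b"\x00" * 32):
    d = start
    for op in operations:
        d = chain_digest(d, op)
    return d

history = ["PUT report.pdf v1", "PUT report.pdf v2", "GET report.pdf"]
honest = replay(history)

# A server that silently rewrites an earlier operation produces a digest
# that no longer matches what the clients computed:
tampered = replay(["PUT report.pdf v1", "PUT report.pdf vX", "GET report.pdf"])
print(honest == tampered)        # prints False
```

    Because each digest commits to the entire prefix of operations, two clients holding the same digest agree on the whole history, which is the property a forked or rolled-back store cannot fake.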

  6. Identity Management issues in Cloud Computing

    OpenAIRE

    Saini, Smita; Mann, Deep

    2014-01-01

    Cloud computing provides low-cost, on-demand services to users, an omnipresent network, and large storage capacity. Due to these features of cloud computing, web applications are moving towards the cloud, and this migration raises many issues for the cloud computing platform, such as privacy and security. Privacy issues are a major concern for cloud computing. Privacy means preserving the sensitive information of the cloud consumer, and the major issues to privacy are un...

  7. Coherent Radiation of Electron Cloud

    International Nuclear Information System (INIS)

    Heifets, S.

    2004-01-01

    The electron cloud in positron storage rings is pinched when a bunch passes by. For short bunches, the radiation due to the acceleration of electrons of the cloud is coherent. Detection of such radiation can be used to measure the density of the cloud. An estimate of the power and the time structure of the radiated signal is given in this paper.

  8. Cryptonite: A Secure and Performant Data Repository on Public Clouds

    Energy Technology Data Exchange (ETDEWEB)

    Kumbhare, Alok; Simmhan, Yogesh; Prasanna, Viktor

    2012-06-29

    Cloud storage has become immensely popular for maintaining synchronized copies of files and for sharing documents with collaborators. However, there is heightened concern about the security and privacy of Cloud-hosted data due to the shared infrastructure model and an implicit trust in the service providers. Emerging needs of secure data storage and sharing for domains like Smart Power Grids, which deal with sensitive consumer data, require the persistence and availability of Cloud storage but with client-controlled security and encryption, low key management overhead, and minimal performance costs. Cryptonite is a secure Cloud storage repository that addresses these requirements using a StrongBox model for shared key management. We describe the Cryptonite service and desktop client, discuss performance optimizations, and provide an empirical analysis of the improvements. Our experiments show that Cryptonite clients achieve a 40% improvement in file upload bandwidth over plaintext storage using the Azure Storage Client API despite the added security benefits, while our file download performance is 5 times faster than the baseline for files greater than 100MB.

  9. Towards Media Intercloud Standardization Evaluating Impact of Cloud Storage Heterogeneity

    OpenAIRE

    Aazam, Mohammad; StHilaire, Marc; Huh, EuiNam

    2016-01-01

    Digital media has been growing very rapidly, contributing to cloud computing's gain in popularity. Cloud computing provides ease of management of large amounts of data and resources. With many devices communicating over the Internet and rapidly increasing user demands, solitary clouds have to communicate with other clouds to fulfill demands and discover services elsewhere. This scenario is called intercloud computing, or cloud federation. Intercloud computing still lacks standard ar...

  10. Cloud Computing and Its Applications in GIS

    Science.gov (United States)

    Kang, Cao

    2011-12-01

    of cloud computing. This paper presents a parallel Euclidean distance algorithm that works seamlessly with the distributed nature of cloud computing infrastructures. The mechanism of this algorithm is to subdivide a raster image into sub-images and wrap them with a one pixel deep edge layer of individually computed distance information. Each sub-image is then processed by a separate node, after which the resulting sub-images are reassembled into the final output. It is shown that while any rectangular sub-image shape can be used, those approximating squares are computationally optimal. This study also serves as a demonstration of this subdivide and layer-wrap strategy, which would enable the migration of many truly spatial GIS algorithms to cloud computing infrastructures. However, this research also indicates that certain spatial GIS algorithms such as cost distance cannot be migrated by adopting this mechanism, which presents significant challenges for the development of cloud-based GIS systems. The third article is entitled "A Distributed Storage Schema for Cloud Computing based Raster GIS Systems". This paper proposes a NoSQL Database Management System (NDDBMS) based raster GIS data storage schema. NDDBMS has good scalability and is able to use distributed commodity computers, which make it superior to Relational Database Management Systems (RDBMS) in a cloud computing environment. In order to provide optimized data service performance, the proposed storage schema analyzes the nature of commonly used raster GIS data sets. It discriminates two categories of commonly used data sets, and then designs corresponding data storage models for both categories. As a result, the proposed storage schema is capable of hosting and serving enormous volumes of raster GIS data speedily and efficiently on cloud computing infrastructures. In addition, the scheme also takes advantage of the data compression characteristics of Quadtrees, thus promoting efficient data storage. 
Through
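
    The subdivide-and-layer-wrap strategy described in the second article can be illustrated with a toy local operation (a 3x3 minimum filter standing in here for the per-pixel distance step; the grid and band size are illustrative): each horizontal band is wrapped with a one-row halo, processed independently as a worker node would, and the interiors are stitched back together, matching the single-node result exactly:

```python
def min_filter(grid):
    """3x3 minimum filter over a 2-D list, clamped at the borders."""
    h, w = len(grid), len(grid[0])
    return [[min(grid[yy][xx]
                 for yy in range(max(0, y - 1), min(h, y + 2))
                 for xx in range(max(0, x - 1), min(w, x + 2)))
             for x in range(w)]
            for y in range(h)]

def tiled_min_filter(grid, band_rows=4):
    """Split the grid into horizontal bands wrapped with a one-row halo,
    filter each band independently, then keep only the interior rows."""
    h = len(grid)
    out = []
    for top in range(0, h, band_rows):
        lo, hi = max(0, top - 1), min(h, top + band_rows + 1)
        band = min_filter(grid[lo:hi])                  # one band per "node"
        out.extend(band[top - lo : top - lo + min(band_rows, h - top)])
    return out

grid = [[(3 * y + 7 * x) % 11 for x in range(10)] for y in range(10)]
print(tiled_min_filter(grid) == min_filter(grid))       # prints True
```

    A one-pixel halo suffices here because the stand-in operation is strictly local; the article's point is that the same decomposition works for Euclidean distance once the halo carries precomputed distance information, while algorithms like cost distance resist this treatment.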

  11. A Bit String Content Aware Chunking Strategy for Reduced CPU Energy on Cloud Storage

    Directory of Open Access Journals (Sweden)

    Bin Zhou

    2015-01-01

    Full Text Available In order to achieve energy savings and reduce the total cost of ownership, green storage has become a first priority for data centers. Detecting and deleting redundant data are the key factors in reducing the energy consumption of the CPU, while a high-performance, stable chunking strategy provides the groundwork for detecting redundant data. Existing chunking algorithms greatly reduce system performance when confronted with big data, wasting a lot of energy. Factors affecting chunking performance are analyzed and discussed in the paper, and a new fingerprint signature calculation is implemented. Furthermore, a Bit String Content Aware Chunking Strategy (BCCS) is put forward. This strategy reduces the cost of signature computation in the chunking process to improve system performance and cut down the energy consumption of the cloud storage data center. On the basis of the relevant test scenarios and test data of this paper, the advantages of the chunking strategy are verified.
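
    The content-defined chunking that such strategies build on can be sketched with a rolling window hash (a toy additive hash here, not the paper's bit-string fingerprint): chunk boundaries are derived from the content itself, so unchanged regions of a file keep their fingerprints across versions and are stored only once. The sample data is contrived so that the toy hash is guaranteed to find boundaries (runs of '@', byte value 64, make the 16-byte window sum a multiple of 64):

```python
import hashlib

WINDOW, MASK = 16, 0x3F

def chunks(data: bytes):
    """Content-defined chunking: declare a boundary where the additive
    hash of the last 16 bytes has its low six bits clear (real systems
    use Rabin or similar fingerprints, and min/max chunk sizes)."""
    out, start, rolling = [], 0, 0
    for i in range(len(data)):
        rolling += data[i]
        if i >= WINDOW:
            rolling -= data[i - WINDOW]          # slide the window
        if i - start >= WINDOW and rolling & MASK == 0:
            out.append(data[start : i + 1])
            start = i + 1
    if start < len(data):
        out.append(data[start:])
    return out

def dedup_store(data: bytes, store: dict):
    """Keep only chunks whose SHA-256 fingerprint is not yet stored;
    return the fingerprint recipe needed to rebuild the file."""
    recipe = []
    for c in chunks(data):
        fp = hashlib.sha256(c).hexdigest()
        store.setdefault(fp, c)
        recipe.append(fp)
    return recipe

store = {}
v1 = b"@" * 24 + b"log line\n" * 40 + b"@" * 24
r1 = dedup_store(v1, store)
r2 = dedup_store(v1 + b"one appended line\n", store)   # second, slightly longer backup
print(len(store), "unique chunks cover", len(r1) + len(r2), "recipe entries")
```

    Because boundaries depend only on content, the second backup reuses every chunk of the first except the final one, so the store grows by far less than a full copy.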

  12. Cloud ERP and Cloud Accounting Software in Romania

    Directory of Open Access Journals (Sweden)

    Gianina MIHAI

    2015-05-01

    Full Text Available Nowadays, Cloud Computing has become a more and more fashionable concept in the IT environment. There is no unanimous opinion on the definition of this concept, as it covers several versions of the newly emerged stage in IT. But in fact, Cloud Computing should not suggest anything other than simplicity. In short, simple terms, Cloud Computing can be defined as a solution for using external IT resources (servers, storage media, applications and services) via the Internet. Cloud Computing is nothing more than the promise of an easily accessible technology. Whether the promise will eventually turn into something certain remains to be seen. In our opinion it is too early to make an assertion. In this article, our purpose is to find out what the Romanian market offers in terms of ERP and accounting software applications in the Cloud and/or as services in the SaaS model. To this end, we conducted an extensive study, whose results we present in the following.

  13. Networking for the Cloud: Challenges and Trends

    NARCIS (Netherlands)

    Drago, Idilio; de Oliveira Schmidt, R.; Hofstede, R.J.; Sperotto, Anna; Karimzadeh Motallebi Azar, Morteza; Haverkort, Boudewijn R.H.M.; Pras, Aiko

    2013-01-01

    Cloud services have changed the way computing power is delivered to customers, by offering computing and storage capacity in remote data centers on demand over the Internet. The success of the cloud model, however, has not come without challenges. Cloud providers have repeatedly been related to

  14. Bioinformatics clouds for big data manipulation

    KAUST Repository

    Dai, Lin; Gao, Xin; Guo, Yan; Xiao, Jingfa; Zhang, Zhang

    2012-01-01

    -supplied cloud computing delivered over the Internet, in order to handle the vast quantities of biological data generated by high-throughput experimental technologies. Albeit relatively new, cloud computing promises to address big data storage and analysis issues

  15. Simulation of cloud data security processes and performance

    OpenAIRE

    Chand, K; Ramachandran, M; Kor, AL

    2015-01-01

    In the world of cloud computing, millions of people are using cloud computing for the purpose of business, education and socialization. Examples of cloud applications are: Google Drive for storage, Facebook for social networks, etc. Cloud users use the cloud computing infrastructure thinking that these services are easy and safe to use. However, there are security and performance issues to be addressed. This paper discusses how cloud users and cloud providers address performance and security ...

  16. Revocable Key-Aggregate Cryptosystem for Data Sharing in Cloud

    Directory of Open Access Journals (Sweden)

    Qingqing Gan

    2017-01-01

    Full Text Available With the rapid development of network and storage technology, cloud storage has become a new service mode, in which data sharing and user revocation are important functions. Therefore, according to the characteristics of cloud storage, a revocable key-aggregate encryption scheme is put forward based on the subset-cover framework. The proposed scheme not only has the key-aggregate characteristic, which greatly simplifies the user's key management, but can also revoke user access permissions, realizing flexible and effective access control. When user revocation occurs, it allows the cloud server to update the ciphertext so that revoked users cannot access the new ciphertext, while nonrevoked users do not need to update their private keys. In addition, a verification mechanism is provided in the proposed scheme, which can verify the updated ciphertext and ensure that the user revocation is performed correctly. Compared with existing schemes, this scheme can not only reduce the cost of key management and storage, but also realize user revocation and achieve user access control efficiently. Finally, the proposed scheme can be proved selectively secure against chosen-plaintext attacks in the standard model.
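
    The subset-cover framework the scheme builds on can be illustrated with the classic complete-subtree method over a small binary key tree. This toy (8 users, a made-up SHA-256 key derivation) shows only the revocation mechanics — a content key would be wrapped under each cover-node key — not the key-aggregate cryptosystem itself:

```python
import hashlib

DEPTH = 3                      # 8 users at the leaves of a binary key tree

def node_key(node_id: int, master: bytes) -> bytes:
    """Derive the key for one tree node (toy KDF, illustrative only)."""
    return hashlib.sha256(master + node_id.to_bytes(2, "big")).digest()

def user_nodes(leaf: int):
    """Node ids on the root-to-leaf path; each user holds these keys."""
    node = leaf + 2 ** DEPTH   # heap numbering: root is 1, leaves are 8..15
    path = []
    while node >= 1:
        path.append(node)
        node //= 2
    return path

def cover(revoked):
    """Complete-subtree cover: maximal subtrees containing no revoked leaf."""
    blocked = set()
    for leaf in revoked:
        blocked.update(user_nodes(leaf))
    chosen = []
    def walk(node):
        if node not in blocked:      # whole subtree is revocation-free
            chosen.append(node)
        elif node < 2 ** DEPTH:      # blocked internal node: descend
            walk(2 * node)
            walk(2 * node + 1)
    walk(1)
    return chosen

master = b"demo-master-secret"
revoked = {5}
header = {n: node_key(n, master) for n in cover(revoked)}  # content key would be
                                                           # wrapped under each
def can_decrypt(leaf):
    return any(n in header for n in user_nodes(leaf))

print(can_decrypt(0), can_decrypt(5))                      # prints True False
```

    Revoked users hold no key appearing in the cover, so a re-encrypted update excludes them, while every nonrevoked user keeps their original key set, which is the behavior the abstract describes.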

  17. Moving towards Cloud Security

    Directory of Open Access Journals (Sweden)

    Edit Szilvia Rubóczki

    2015-01-01

    Full Text Available Cloud computing hosts and delivers many different services via the Internet. There are many reasons why people opt for using cloud resources. Cloud development is increasing fast, while many related services lag behind, for example mass awareness of cloud security. The new generation uploads videos and pictures to cloud storage without a second thought, but only a few know about data privacy, data management and the ownership of data stored in the cloud. In an enterprise environment, users have to know the rules of cloud usage, yet they have little knowledge of traditional IT security. It is important to measure the level of their knowledge and to evolve the training system to develop security awareness. The article demonstrates the importance of proposing new metrics and algorithms for measuring the security awareness of corporate users and employees, so as to include the requirements of emerging cloud security.

  18. Wake Field of the e-Cloud

    International Nuclear Information System (INIS)

    Heifets, Samuel A

    2001-01-01

    The wake field of the cloud is derived analytically, taking into account the finite size of the cloud and the nonlinearity of the electron motion. The analytic expression for the effective transverse wake field caused by the electron cloud in a positron storage ring is derived. The derivation includes the frequency spread in the cloud, which is the main effect of the nonlinearity of electron motion in the cloud. This approach allows calculation of the Q-factor and study of the tune spread in a bunch.

  19. A New Electronic Commerce Architecture in the Cloud

    OpenAIRE

    Guigang Zhang; Chao Li; Sixin Xue; Yuenan Liu; Yong Zhang; Chunxiao Xing

    2012-01-01

    In this paper, the authors propose a new electronic commerce architecture in the cloud that satisfies the requirements of the cloud. This architecture includes five technologies: massive EC data storage technology in the cloud, massive EC data processing technology in the cloud, EC security management technology in the cloud, OLAP technology for EC in the cloud, and active EC technology in the cloud. Finally, a detailed discussion of future trends for EC in the cloud env...

  20. Inorganic elements in green sea turtles (Chelonia mydas): relationships among external and internal tissues

    Science.gov (United States)

    Faust, Derek R.; Hooper, Michael J.; Cobb, George P.; Barnes, Melanie; Shaver, Donna; Ertolacci, Shauna; Smith, Philip N.

    2014-01-01

    Inorganic elements from anthropogenic sources have entered marine environments worldwide and are detectable in marine organisms, including sea turtles. Threatened and endangered classifications of sea turtles have heretofore made assessments of contaminant concentrations difficult because of regulatory restrictions on obtaining samples using nonlethal techniques. In the present study, claw and skin biopsy samples were examined as potential indicators of internal tissue burdens in green sea turtles (Chelonia mydas). Significant relationships were observed between claw and liver, and claw and muscle concentrations of mercury, nickel, arsenic, and selenium (p turtles.

  1. Factors influencing the organizational adoption of cloud computing: a survey among cloud workers

    Directory of Open Access Journals (Sweden)

    Mark Stieninger

    2018-01-01

    Full Text Available Cloud computing presents an opportunity for organizations to leverage affordable, scalable, and agile technologies. However, even with the demonstrated value of cloud computing, organizations have been hesitant to adopt such technologies. Based on a multi-theoretical research model, this paper provides an empirical study targeted at better understanding the adoption of cloud services. An online survey addressing the factors derived from the literature for three specific popular cloud application types (cloud storage, cloud mail and cloud office) was undertaken. The research model was analyzed using variance-based structural equation modelling. Results show that the factors of compatibility, relative advantage, security and trust, as well as a lower level of complexity, lead to a more positive attitude towards cloud adoption. Complexity, compatibility, image, and security and trust have direct and indirect effects on relative advantage. These factors further explain a large part of the attitude towards cloud adoption, but not of its usage.

  2. Utilizing cloud storage architecture for long-pulse fusion experiment data storage

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Ming; Liu, Qiang [State Key Laboratory of Advanced Electromagnetic Engineering and Technology, Wuhan, Hubei (China); School of Electrical and Electronic Engineering, Huazhong University of Science and Technology, Wuhan, Hubei (China); Zheng, Wei, E-mail: zhenghaku@gmail.com [State Key Laboratory of Advanced Electromagnetic Engineering and Technology, Wuhan, Hubei (China); School of Electrical and Electronic Engineering, Huazhong University of Science and Technology, Wuhan, Hubei (China); Wan, Kuanhong; Hu, Feiran; Yu, Kexun [State Key Laboratory of Advanced Electromagnetic Engineering and Technology, Wuhan, Hubei (China); School of Electrical and Electronic Engineering, Huazhong University of Science and Technology, Wuhan, Hubei (China)

    2016-11-15

    Scientific data storage plays a significant role in research facilities. The explosion of data in recent years has made data access, acquisition, and management increasingly difficult, especially in fusion research. Future long-pulse experiments like ITER will generate extremely large volumes of data continuously over long periods, putting pressure on both write performance and scalability, and traditional databases have drawbacks such as inconvenient management and architectures that are hard to scale. Hence a new data storage system is essential. J-TEXTDB is a data storage and management system based on an application cluster and a storage cluster. It is designed for big data storage and access, aiming to improve read–write speed and optimize the data system structure. The application cluster provides data management functions and handles read and write operations from users; the storage cluster provides the storage services. Both clusters are composed of commodity servers, and simply adding servers to a cluster improves read–write performance, storage space, and redundancy, making the whole data system highly scalable and available. In this paper, we propose a data system architecture and data model to manage data more efficiently. Benchmarks of J-TEXTDB read and write performance are given.

  3. Utilizing cloud storage architecture for long-pulse fusion experiment data storage

    International Nuclear Information System (INIS)

    Zhang, Ming; Liu, Qiang; Zheng, Wei; Wan, Kuanhong; Hu, Feiran; Yu, Kexun

    2016-01-01

    Scientific data storage plays a significant role in research facilities. The explosion of data in recent years has made data access, acquisition, and management increasingly difficult, especially in fusion research. Future long-pulse experiments like ITER will generate extremely large volumes of data continuously over long periods, putting pressure on both write performance and scalability, and traditional databases have drawbacks such as inconvenient management and architectures that are hard to scale. Hence a new data storage system is essential. J-TEXTDB is a data storage and management system based on an application cluster and a storage cluster. It is designed for big data storage and access, aiming to improve read–write speed and optimize the data system structure. The application cluster provides data management functions and handles read and write operations from users; the storage cluster provides the storage services. Both clusters are composed of commodity servers, and simply adding servers to a cluster improves read–write performance, storage space, and redundancy, making the whole data system highly scalable and available. In this paper, we propose a data system architecture and data model to manage data more efficiently. Benchmarks of J-TEXTDB read and write performance are given.

  4. Use of cloud storage in medical information systems

    Directory of Open Access Journals (Sweden)

    Юлія Валеріївна Антонова-Рафі

    2016-06-01

    Full Text Available The aim of this work was to determine the applicability of cloud systems for the development and creation of medical information systems and for solving the medical and management tasks faced by present-day polyclinics and inpatient hospitals. As a result, the main advantages of using cloud technologies over the classic approach to building medical information systems are defined, together with the possible problems connected with implementing clouds in medicine.

  5. Architecting the cloud design decisions for cloud computing service models (SaaS, PaaS, and IaaS)

    CERN Document Server

    Kavis, Michael J

    2014-01-01

    An expert guide to selecting the right cloud service model for your business Cloud computing is all the rage, allowing for the delivery of computing and storage capacity to a diverse community of end-recipients. However, before you can decide on a cloud model, you need to determine what the ideal cloud service model is for your business. Helping you cut through all the haze, Architecting the Cloud is vendor neutral and guides you in making one of the most critical technology decisions that you will face: selecting the right cloud service model(s) based on a combination of both business and tec

  6. Satellite remote sensing and cloud modeling of St. Anthony, Minnesota storm clouds and dew point depression

    Science.gov (United States)

    Hung, R. J.; Tsao, Y. D.

    1988-01-01

    Rawinsonde data and geosynchronous satellite imagery were used to investigate the life cycles of St. Anthony, Minnesota's severe convective storms. It is found that the fully developed storm clouds, with overshooting cloud tops penetrating above the tropopause, collapsed about three minutes before the touchdown of the tornadoes. Results indicate that the probability of an outbreak of tornadoes causing greater damage increases with higher values of potential energy storage per unit area for overshooting cloud tops penetrating the tropopause. It is also found that clouds with a lower moisture content are less likely to grow into storm clouds than clouds with a higher moisture content.

  7. Cloud computing applications for biomedical science: A perspective.

    Science.gov (United States)

    Navale, Vivek; Bourne, Philip E

    2018-06-01

    Biomedical research has become a digital data-intensive endeavor, relying on secure and scalable computing, storage, and network infrastructure, which has traditionally been purchased, supported, and maintained locally. For certain types of biomedical applications, cloud computing has emerged as an alternative to locally maintained traditional computing approaches. Cloud computing offers users pay-as-you-go access to services such as hardware infrastructure, platforms, and software for solving common biomedical computational problems. Cloud computing services offer secure on-demand storage and analysis and are differentiated from traditional high-performance computing by their rapid availability and scalability of services. As such, cloud services are engineered to address big data problems and enhance the likelihood of data and analytics sharing, reproducibility, and reuse. Here, we provide an introductory perspective on cloud computing to help the reader determine its value to their own research.

  8. CHPS IN CLOUD COMPUTING ENVIRONMENT

    OpenAIRE

    K.L.Giridas; A.Shajin Nargunam

    2012-01-01

    Workflows have been utilized to characterize various forms of applications with high processing and storage space demands. To make the cloud computing environment more eco-friendly, our research project aims at reducing the e-waste accumulated by computers. In a hybrid cloud, the user has the flexibility offered by public cloud resources, which can be combined with the private resource pool as required. Our previous work described the process of combining the low range and mid range proce...

  9. Extended outlook: description, utilization, and daily applications of cloud technology in radiology.

    Science.gov (United States)

    Gerard, Perry; Kapadia, Neil; Chang, Patricia T; Acharya, Jay; Seiler, Michael; Lefkovitz, Zvi

    2013-12-01

    The purpose of this article is to discuss the concept of cloud technology, its role in medical applications and radiology, the role of the radiologist in using and accessing these vast resources of information, and privacy concerns and HIPAA compliance strategies. Cloud computing is the delivery of shared resources, software, and information to computers and other devices as a metered service. This technology has a promising role in the sharing of patient medical information and appears to be particularly suited for application in radiology, given the field's inherent need for storage and access to large amounts of data. The radiology cloud has significant strengths, such as providing centralized storage and access, reducing unnecessary repeat radiologic studies, and potentially allowing radiologic second opinions more easily. There are significant cost advantages to cloud computing because of a decreased need for infrastructure and equipment by the institution. Private clouds may be used to ensure secure storage of data and compliance with HIPAA. In choosing a cloud service, there are important aspects, such as disaster recovery plans, uptime, and security audits, that must be considered. Given that the field of radiology has become almost exclusively digital in recent years, the future of secure storage and easy access to imaging studies lies within cloud computing technology.

  10. A PACS archive architecture supported on cloud services.

    Science.gov (United States)

    Silva, Luís A Bastião; Costa, Carlos; Oliveira, José Luis

    2012-05-01

    Diagnostic imaging procedures have continuously increased over the last decade and this trend may continue in coming years, creating a great impact on storage and retrieval capabilities of current PACS. Moreover, many smaller centers do not have financial resources or requirements that justify the acquisition of a traditional infrastructure. Alternative solutions, such as cloud computing, may help address this emerging need. A tremendous amount of ubiquitous computational power, such as that provided by Google and Amazon, are used every day as a normal commodity. Taking advantage of this new paradigm, an architecture for a Cloud-based PACS archive that provides data privacy, integrity, and availability is proposed. The solution is independent from the cloud provider and the core modules were successfully instantiated in examples of two cloud computing providers. Operational metrics for several medical imaging modalities were tabulated and compared for Google Storage, Amazon S3, and LAN PACS. A PACS-as-a-Service archive that provides storage of medical studies using the Cloud was developed. The results show that the solution is robust and that it is possible to store, query, and retrieve all desired studies in a similar way as in a local PACS approach. Cloud computing is an emerging solution that promises high scalability of infrastructures, software, and applications, according to a "pay-as-you-go" business model. The presented architecture uses the cloud to setup medical data repositories and can have a significant impact on healthcare institutions by reducing IT infrastructures.

  11. Privacy Preserving Similarity Based Text Retrieval through Blind Storage

    Directory of Open Access Journals (Sweden)

    Pinki Kumari

    2016-09-01

    Full Text Available Cloud computing is advancing rapidly due to its many advantages, and more data owners are interested in outsourcing their data to cloud storage to centralize it. With huge numbers of files stored in the cloud, keyword-based search must be provided to data users. At the same time, to protect privacy, sensitive data is encrypted before being outsourced to the cloud server, which makes searching over the encrypted data difficult. In this system we propose similarity-based text retrieval from blind storage blocks in encrypted format. Blind storage provides additional security because data is stored at random positions in cloud storage. In the existing system, the data owner cannot encrypt the document data, as encryption was done only at the server end, and everyone could access the data because no private-key concept was applied to maintain its privacy. In our proposed system, the data owner encrypts the data himself using the RSA algorithm. RSA is a public-key cryptosystem widely used for sensitive data storage over the Internet. We use a text mining process to build the index files for user documents, and before encryption we also use NLP (natural language processing) techniques to identify keyword synonyms in the data owner's documents. The text mining process examines the text word by word and collects the literal meaning beyond the group of words that composes each sentence; those words are looked up in the WordNet API so that equivalent words can be identified for the index file. Our proposed system provides a more secure and authorized way of retrieving text in cloud storage with access control. Finally, our experimental results show that our system outperforms the existing one.
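
    The blind-storage idea described above can be sketched compactly: the positions of a document's blocks are derived from a keyed hash, so the storage server sees only accesses to apparently random locations. This is a toy illustration, not the paper's implementation; the array size, block size, and key handling are assumptions, and real blocks would hold ciphertext.

    ```python
    import hmac
    import hashlib

    ARRAY_SIZE = 2 ** 16   # size of the server-side block array (assumption)
    BLOCK = 16             # bytes per block (assumption)

    def block_indices(key: bytes, doc_id: str, n_blocks: int):
        """Derive pseudorandom, key-dependent positions for a document's blocks."""
        idx = []
        counter = 0
        while len(idx) < n_blocks:
            tag = hmac.new(key, f"{doc_id}:{counter}".encode(), hashlib.sha256).digest()
            pos = int.from_bytes(tag[:4], "big") % ARRAY_SIZE
            if pos not in idx:      # skip collisions within this document
                idx.append(pos)
            counter += 1
        return idx

    def store(storage: dict, key: bytes, doc_id: str, data: bytes):
        """Scatter a document's blocks across the array at keyed positions."""
        blocks = [data[i:i + BLOCK] for i in range(0, len(data), BLOCK)]
        for pos, blk in zip(block_indices(key, doc_id, len(blocks)), blocks):
            storage[pos] = blk      # a real system would store ciphertext here

    def fetch(storage: dict, key: bytes, doc_id: str, n_blocks: int) -> bytes:
        """Re-derive the same positions and reassemble the document."""
        return b"".join(storage[p] for p in block_indices(key, doc_id, n_blocks))

    storage = {}
    doc = b"similarity based text retrieval demo"
    store(storage, b"secret-key", "report.txt", doc)
    n_blocks = (len(doc) + BLOCK - 1) // BLOCK
    assert fetch(storage, b"secret-key", "report.txt", n_blocks) == doc
    ```

    Without the key, an observer of `storage` cannot tell which positions belong to which document, which is the property the abstract attributes to blind storage.
    
    
    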

  12. CLOUD COMPUTING OVERVIEW AND CHALLENGES: A REVIEW PAPER

    OpenAIRE

    Satish Kumar*, Vishal Thakur, Payal Thakur, Ashok Kumar Kashyap

    2017-01-01

    Cloud computing is the most resourceful, elastic, and scalable era of Internet technology, allowing computing resources to be used over the Internet successfully. Cloud computing provides not only speed, accuracy, storage capacity, and efficiency for computing, but has also helped propagate green computing and resource utilization. In this research paper, a brief description of cloud computing, cloud services, and cloud security challenges is given. Also the literature review o...

  13. MVC for content management on the cloud

    OpenAIRE

    McGruder, Crystal A.

    2011-01-01

    Approved for public release; distribution is unlimited. Cloud computing portrays a new model for providing IT services over the Internet. In cloud computing, resources are accessed from the Internet through web-based tools. Although cloud computing offers reduced cost, increased storage, high automation, flexibility, mobility, and the ability of IT to shift focus, there are other concerns such as the management, organization and structure of content on the cloud that large organizations sh...

  14. Enhancing the Security of Customer Data in Cloud Environments Using a Novel Digital Fingerprinting Technique

    Directory of Open Access Journals (Sweden)

    Nithya Chidambaram

    2016-01-01

    Full Text Available With the rapid rise of the Internet and electronics in people's lives, the data related to them has undergone a mammoth increase in magnitude. The data stored in the cloud can be sensitive and at times needs a proper file storage system with a strong security algorithm, and since the cloud is an open, shareable, elastic environment, it needs impenetrable, airtight security. This paper deals with furnishing a secure storage system for this purpose in the cloud. To become eligible to store data, a user has to register with the cloud database; this prevents unauthorized access. The files stored in the cloud are encrypted with the RSA algorithm, and a digital fingerprint is generated for each through an MD5 message digest before storage. RSA makes the data unreadable to anyone without the private key, and MD5 makes it impossible for any changes to the data to go unnoticed. After the application of RSA and MD5 before storage, the data becomes resistant to access or modification by any third party and by intruders of the cloud storage system. This application is tested on Amazon Elastic Compute Cloud Web Services.
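
    The encrypt-then-fingerprint flow described above can be sketched with textbook RSA and hashlib's MD5. The toy key parameters below are illustrative assumptions only; a real deployment would use a vetted RSA implementation with padding and keys of realistic size.

    ```python
    import hashlib

    # Toy textbook RSA (tiny primes, no padding; for illustration only).
    p, q = 61, 53
    n = p * q                      # public modulus
    phi = (p - 1) * (q - 1)
    e = 17                         # public exponent, coprime with phi
    d = pow(e, -1, phi)            # private exponent (modular inverse, Python 3.8+)

    def rsa_encrypt(data: bytes):
        # Encrypt byte-by-byte since the toy modulus only holds values < n.
        return [pow(b, e, n) for b in data]

    def rsa_decrypt(cipher) -> bytes:
        return bytes(pow(c, d, n) for c in cipher)

    def fingerprint(data: bytes) -> str:
        # MD5 digest recorded before storage; any later change is detectable.
        return hashlib.md5(data).hexdigest()

    doc = b"customer record"
    stored = rsa_encrypt(doc)      # what the cloud holds
    digest = fingerprint(doc)      # kept alongside as the digital fingerprint

    recovered = rsa_decrypt(stored)
    assert recovered == doc and fingerprint(recovered) == digest
    ```

    The assertion at the end mirrors the integrity check: if the stored data were modified, the recomputed MD5 digest would no longer match the recorded fingerprint.
    
    
    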

  15. High energy physics and cloud computing

    International Nuclear Information System (INIS)

    Cheng Yaodong; Liu Baoxu; Sun Gongxing; Chen Gang

    2011-01-01

    High Energy Physics (HEP) has been a strong promoter of computing technology, for example WWW (World Wide Web) and grid computing. In the new era of cloud computing, HEP still has strong demands, and major international high energy physics laboratories have launched a number of projects to research cloud computing technologies and applications. This paper describes the current developments in cloud computing and its applications in high energy physics. Some ongoing projects at the institutes of high energy physics, Chinese Academy of Sciences, including cloud storage, virtual computing clusters, and the BESⅢ elastic cloud, are also described briefly. (authors)

  16. A Science Cloud: OneSpaceNet

    Science.gov (United States)

    Morikawa, Y.; Murata, K. T.; Watari, S.; Kato, H.; Yamamoto, K.; Inoue, S.; Tsubouchi, K.; Fukazawa, K.; Kimura, E.; Tatebe, O.; Shimojo, S.

    2010-12-01

    The main methodologies of Solar-Terrestrial Physics (STP) so far are theoretical, experimental and observational, and computer simulation approaches. Recently "informatics" is expected to become a new (fourth) approach to STP studies. Informatics is a methodology for analyzing large-scale data (observation data and computer simulation data) to obtain new findings using a variety of data processing techniques. At NICT (National Institute of Information and Communications Technology, Japan) we are now developing a new research environment named "OneSpaceNet". OneSpaceNet is a cloud-computing environment specialized for scientific work, which connects many researchers through a high-speed network (JGN: Japan Gigabit Network). The JGN is a wide-area backbone network operated by NICT; it provides a 10G network and many access points (AP) over Japan. OneSpaceNet also provides rich computer resources for research, such as supercomputers, large-scale data storage, licensed applications, visualization devices (like a tiled display wall: TDW), databases/DBMS, cluster computers (4-8 nodes) for data processing, and communication devices. What is remarkable about using the science cloud is that a user simply prepares a terminal (a low-cost PC). Once the PC is connected to JGN2plus, the user can make full use of the rich resources of the science cloud. Using communication devices such as a video-conference system, streaming and reflector servers, and media players, users on OneSpaceNet can carry out research communications as if they belonged to the same (one) laboratory: they are members of a virtual laboratory. The specification of the computer resources on OneSpaceNet is as follows: the data storage we have developed so far is almost 1 PB in size, and the number of data files managed on the cloud storage is growing, now exceeding 40,000,000.
What is notable is that the disks forming the large-scale storage are distributed across 5 data centers over Japan (but the storage

  17. Digital forensic framework for a cloud environment

    CSIR Research Space (South Africa)

    Sibiya, G

    2012-05-01

    Full Text Available crimes that can be committed in the cloud include unauthorized access to resources in the cloud, money laundering, distributed denial-of-service attacks, storage of pirated software, music, movies, etc. In this section, two scenarios of criminal...

  18. Hierarchical remote data possession checking method based on massive cloud files

    Directory of Open Access Journals (Sweden)

    Ma Haifeng

    2017-06-01

    Full Text Available Cloud storage services enable users to migrate their data and applications to the cloud, which saves local data maintenance and brings great convenience to users. But in cloud storage, the storage servers may not be fully trustworthy, and how to verify the integrity of cloud data with lower overhead has become an increasingly pressing problem for users. Many remote data integrity protection methods have been proposed, but these methods authenticate cloud files one by one when verifying multiple files, so the computation and communication overhead remain high. Aiming at this problem, a hierarchical remote data possession checking (H-RDPC) method is proposed, which can provide efficient and secure remote data integrity protection and can support dynamic data operations. This paper gives the algorithm description, security, and false negative rate analysis of H-RDPC. The security analysis and experimental performance evaluation results show that the proposed H-RDPC is efficient and reliable in verifying massive cloud files, and it has a 32–81% improvement in performance compared with RDPC.
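
    The paper's H-RDPC scheme is not reproduced here, but the general idea of hierarchical integrity checking over many files can be illustrated with a Merkle-style hash tree: one root hash covers all files at once, instead of authenticating each file separately. The file contents and tree construction below are assumptions for illustration.

    ```python
    import hashlib

    def h(data: bytes) -> bytes:
        return hashlib.sha256(data).digest()

    def merkle_root(leaves):
        """Fold a list of leaf hashes up, level by level, to a single root hash."""
        level = leaves[:]
        while len(level) > 1:
            if len(level) % 2:        # duplicate the last node on odd-sized levels
                level.append(level[-1])
            level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        return level[0]

    # Hash each stored file once; the root commits to all of them together.
    files = [b"file-0 contents", b"file-1 contents", b"file-2 contents"]
    root = merkle_root([h(f) for f in files])

    # Later verification: re-hash the stored copies and compare one root,
    # rather than checking n files one by one.
    assert merkle_root([h(f) for f in files]) == root
    tampered = [files[0], b"file-1 CHANGED", files[2]]
    assert merkle_root([h(f) for f in tampered]) != root
    ```

    The hierarchy is what cuts the verification cost: a single comparison detects tampering anywhere in the file set, which is the flavor of saving the abstract's 32–81% figure refers to.
    
    
    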

  19. A Survey Paper on Privacy Issue in Cloud Computing

    OpenAIRE

    Yousra Abdul Alsahib S. Aldeen; Mazleena Salleh; Mohammad Abdur Razzaque

    2015-01-01

    In the past few years, cloud computing has become one of the popular paradigms for hosting and delivering services over the Internet. It has gained popularity by offering multiple computing services, such as cloud storage, cloud hosting and cloud servers, for various types of businesses as well as for academia. Though there are several benefits of cloud computing, it suffers from security and privacy challenges, and the privacy of a cloud system is a serious concern for its customers. Considering the privacy within the cloud th...

  20. Research About Attacks Over Cloud Environment

    Directory of Open Access Journals (Sweden)

    Li Jie

    2017-01-01

    Full Text Available Cloud computing is expected to continue expanding in the next few years, and people will start to see some of its benefits in their daily lives. Security of cloud computing environments is the set of control-based technologies and policies designed to adhere to regulatory compliance rules and protect the information, data, applications, and infrastructure associated with cloud use. In this paper we suggest a model for estimating cloud computing security and testing the services provided to users. The NG-Cloud (Next Generation Secure Cloud Storage) simulator is used and modified to implement the proposed model. The implementation achieves the security functions against the potential attacks defined in the proposed model. Finally, we also address some attacks over cloud computing to provide for the security and safety of the cloud.

  1. MOBILE CLOUD COMPUTING APPLIED TO HEALTHCARE APPROACH

    OpenAIRE

    Omar AlSheikSalem

    2016-01-01

    In the past few years it became clear that mobile cloud computing was established by integrating mobile computing and cloud computing to gain both storage space and processing speed. Integrating healthcare applications and services is one of the vast data approaches that can be adapted to mobile cloud computing. This work proposes a framework for global healthcare computing combining both mobile computing and cloud computing. This approach leads to integrating all of ...

  2. THE CLOUD COMPUTING INTRODUCTION IN EDUCATION: PROBLEMS AND PERSPECTIVES

    OpenAIRE

    Y. Dyulicheva

    2013-01-01

    The problems and perspectives of the cloud computing usage in education are investigated in the paper. The examples of the most popular cloud platforms such as Google Apps Education Edition and Microsoft Live@edu used in education are considered. The schema of an interaction between teachers and students in cloud is proposed. The abilities of the cloud storage such as Microsoft SkyDrive and Apple iCloud are considered.

  3. Sharing Planetary-Scale Data in the Cloud

    Science.gov (United States)

    Sundwall, J.; Flasher, J.

    2016-12-01

    On 19 March 2015, Amazon Web Services (AWS) announced Landsat on AWS, an initiative to make data from the U.S. Geological Survey's Landsat satellite program freely available in the cloud. Because of Landsat's global coverage and long history, it has become a reference point for all Earth observation work and is considered the gold standard of natural resource satellite imagery. Within the first year of Landsat on AWS, the service served over a billion requests for Landsat imagery and metadata, globally. Availability of the data in the cloud has led to new product development by companies and startups including Mapbox, Esri, CartoDB, MathWorks, Development Seed, Trimble, Astro Digital, Blue Raster and Timbr.io. The model of staging data for analysis in the cloud established by Landsat on AWS has since been applied to high resolution radar data, European Space Agency satellite imagery, global elevation data and EPA air quality models. This session will provide an overview of lessons learned throughout these projects. It will demonstrate how cloud-based object storage is democratizing access to massive publicly-funded data sets that have previously only been available to people with access to large amounts of storage, bandwidth, and computing power. Technical discussion points will include: The differences between staging data for analysis using object storage versus file storage Using object stores to design simple RESTful APIs through thoughtful file naming conventions, header fields, and HTTP Range Requests Managing costs through data architecture and Amazon S3's "requester pays" feature Building tools that allow users to take their algorithm to the data in the cloud Using serverless technologies to display dynamic frontends for massive data sets
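
    One of the technical points above, HTTP Range Requests against cloud object storage, can be illustrated offline. The helper below parses a `Range: bytes=start-end` header and slices a buffer the way an object store would serve a partial object; the header grammar follows RFC 7233, but the object key and contents are made-up examples.

    ```python
    def apply_range(obj: bytes, range_header: str) -> bytes:
        """Serve the byte span that a 'bytes=start-end' Range header asks for."""
        unit, _, spec = range_header.partition("=")
        if unit != "bytes":
            raise ValueError("only byte ranges supported")
        start_s, _, end_s = spec.partition("-")
        if start_s:                   # bytes=start-end, or bytes=start- (open ended)
            start = int(start_s)
            end = int(end_s) if end_s else len(obj) - 1
        else:                         # bytes=-N means the final N bytes (suffix range)
            start = len(obj) - int(end_s)
            end = len(obj) - 1
        return obj[start:end + 1]

    # A hypothetical object, keyed by a Landsat-style scene ID naming convention.
    scene = b"LC08_L1TP_139045_20170304_20170316_01_T1 header bytes then pixels..."

    header = apply_range(scene, "bytes=0-39")    # fetch only the leading metadata
    tail = apply_range(scene, "bytes=-9")        # fetch only the trailing bytes
    ```

    This is why ranged reads matter for planetary-scale data: a client can pull just the header or a single band of a multi-gigabyte scene without downloading the whole object.
    
    
    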

  4. BlueSky Cloud Framework: An E-Learning Framework Embracing Cloud Computing

    Science.gov (United States)

    Dong, Bo; Zheng, Qinghua; Qiao, Mu; Shu, Jian; Yang, Jie

    Currently, E-Learning has grown into a widely accepted way of learning. With the huge growth of users, services, education contents and resources, E-Learning systems are facing the challenges of optimizing resource allocation, dealing with dynamic concurrency demands, handling rapid storage growth requirements, and controlling costs. In this paper, an E-Learning framework based on cloud computing is presented, namely the BlueSky cloud framework. In particular, the architecture and core components of the BlueSky cloud framework are introduced. In the BlueSky cloud framework, physical machines are virtualized and allocated on demand to E-Learning systems. Moreover, the BlueSky cloud framework combines traditional middleware functions (such as load balancing and data caching) into a general architecture serving E-Learning systems. It delivers reliable, scalable and cost-efficient services, through which E-Learning organizations can establish systems in a simple way. The BlueSky cloud framework solves the challenges faced by E-Learning and improves the performance, availability and scalability of E-Learning systems.

  5. SAFAX - An Extensible Authorization Service for Cloud Environments

    Directory of Open Access Journals (Sweden)

    Samuel Paul Kaluvuri

    2015-05-01

    Full Text Available Cloud storage services have become increasingly popular in recent years. Users are often registered to multiple cloud storage services that suit different needs. However, the ad-hoc manner in which data sharing between users is implemented leads to issues for these users. For instance, users are required to define different access control policies for each cloud service they use and are responsible for synchronizing their policies across different cloud providers. Users do not have access to a uniform and expressive method to deal with authorization. Current authorization solutions cannot be applied as-is, since they cannot cope with challenges specific to cloud environments. In this paper, we analyze the challenges of data sharing in multi-cloud environments and propose SAFAX, an XACML based authorization service designed to address these challenges. SAFAX's architecture allows users to deploy their access control policies in a standard format, in a single location, and augment policy evaluation with information from user selectable external trust services. We describe the architecture of SAFAX, a prototype implementation based on this architecture, illustrate the extensibility through external trust services and discuss the benefits of using SAFAX from both the user's and cloud provider's perspectives.
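
    The XACML machinery that SAFAX builds on is large, but its core evaluation step, matching a request's attributes against policy rules and returning a permit/deny effect, can be sketched compactly. The attribute names, rule shape, and deny-by-default choice below are simplified assumptions for illustration, not SAFAX's actual schema.

    ```python
    def evaluate(policy: dict, request: dict) -> str:
        """Return the effect of the first rule whose conditions all match."""
        for rule in policy["rules"]:
            if all(request.get(attr) == val for attr, val in rule["match"].items()):
                return rule["effect"]
        return policy.get("default", "deny")   # fall through: deny by default

    # A single policy deployed in one place, covering several cloud services.
    policy = {
        "rules": [
            {"match": {"role": "owner", "action": "read"},  "effect": "permit"},
            {"match": {"role": "owner", "action": "write"}, "effect": "permit"},
            {"match": {"role": "guest", "action": "read"},  "effect": "permit"},
        ],
        "default": "deny",
    }

    print(evaluate(policy, {"role": "owner", "action": "write"}))  # permit
    print(evaluate(policy, {"role": "guest", "action": "write"}))  # deny
    ```

    Keeping one policy in a standard format, evaluated centrally for all providers, is the synchronization problem the abstract says ad-hoc per-cloud access control creates.
    
    
    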

  6. THE CLOUD COMPUTING INTRODUCTION IN EDUCATION: PROBLEMS AND PERSPECTIVES

    Directory of Open Access Journals (Sweden)

    Y. Dyulicheva

    2013-03-01

    Full Text Available The problems and perspectives of the cloud computing usage in education are investigated in the paper. The examples of the most popular cloud platforms such as Google Apps Education Edition and Microsoft Live@edu used in education are considered. The schema of an interaction between teachers and students in cloud is proposed. The abilities of the cloud storage such as Microsoft SkyDrive and Apple iCloud are considered.

  7. An Approach to Secure Resource Sharing Algorithm (SRSA) for Multi Cloud Environment

    OpenAIRE

    Er. Parul Indoria; Prof. Abhishek Didel

    2013-01-01

    Cloud computing is an idea intended to deliver computing and storage resources to a community of users. In a cloud computing environment, a user can use applications without installing them, while accessing personal files from any computer in the network. The cloud computing technology allows efficient computation by centralizing storage, memory and processing. The practice of computing across two or more data centers separated by the Internet is growing in popularity due to an explosion in scalable ...

  8. Prototyping manufacturing in the cloud

    Science.gov (United States)

    Ciortea, E. M.

    2017-08-01

    This paper attempts a theoretical approach to cloud systems and their impact on production systems. These systems are called cloud computing because they form a relatively new concept in informatics, representing distributed computing services, applications, access to information, and data storage without the user knowing the physical location and configuration of the systems. The advantages of this approach are above all computing speed and storage capacity without investment in additional configurations, synchronization of user data, and data processing using web applications. The disadvantage is the need to identify a solution for data security, the lack of which leads to user mistrust. The case study is applied to one module of a production system, because the system is complex.

  9. Cloud Computing Databases: Latest Trends and Architectural Concepts

    OpenAIRE

    Tarandeep Singh; Parvinder S. Sandhu

    2011-01-01

    Economic factors are leading to the rise of infrastructures that provide software and computing facilities as a service, known as cloud services or cloud computing. Cloud services can provide efficiencies for application providers, both by limiting up-front capital expenses and by reducing the cost of ownership over time. Such services are made available in a data center, using shared commodity hardware for computation and storage. There is a varied set of cloud services...

  10. TINJAUAN KEAMANAN SISTEM PADA TEKNOLOGI CLOUD COMPUTING

    Directory of Open Access Journals (Sweden)

    Yuli Fauziah

    2014-01-01

    Full Text Available From an information technology perspective, cloud computing can be defined as a technology that uses the Internet as a resource for computing that can be requested by users, and as a service with a virtual central server residing in the cloud (the Internet itself). Many companies want to move their applications and storage into cloud computing. This technology has become a trend among IT researchers and practitioners exploring the potential it can offer to the wider public. However, many security issues still arise because the technology is new. One such security issue is theft of information: the theft of data kept in the storage of applications that use cloud computing technology. The losses incurred by users of this technology are very large, since the stolen information may include a company's confidential data as well as other important data. Measures to prevent such data theft include avoiding security threats in the form of data loss or leakage and the hijacking of accounts or services; identity management and access control are the primary requirements for enterprise SaaS cloud computing. One method used for the authentication and authorization aspects of data security in cloud computing applications and services is single sign-on technology. Single sign-on (SSO) is a technology that allows network users to access resources in a network using only one user account. This technology is in great demand, particularly in very large, heterogeneous networks, including cloud computing networks. Using SSO, a user only needs to authenticate once to obtain access to all the services available in the network. Keywords: Storage, Applications, Software as a

  11. COMPARISON OF WHOLE BLOOD AND PLASMA GLUCOSE CONCENTRATIONS IN GREEN TURTLES (CHELONIA MYDAS) DETERMINED USING A GLUCOMETER AND A DRY CHEMISTRY ANALYZER.

    Science.gov (United States)

    Perrault, Justin R; Bresette, Michael J; Mott, Cody R; Stacy, Nicole I

    2018-01-01

    We compared glucose concentrations in whole blood and plasma from green turtles (Chelonia mydas) using a glucometer with plasma glucose analyzed by a dry chemistry analyzer. Whole blood glucose (glucometer) and plasma glucose (dry chemistry) had the best agreement (rs = 0.85) and a small negative bias (-0.08 mmol/L).

  12. CERNBox: Petabyte-Scale Cloud Synchronisation and Sharing Platform

    OpenAIRE

    Hugo González Labrador

    2016-01-01

    CERNBox is a cloud synchronisation service for end-users: it allows syncing and sharing files on all major mobile and desktop platforms (Linux, Windows, MacOSX, Android, iOS) aiming to provide offline availability to any data stored in the CERN EOS infrastructure. There is a high demand in the community for an easily accessible cloud storage solution such as CERNBox. Integration of the CERNBox service with the EOS storage back-end is the next step towards providing ’synchronisation and sharin...

  13. Privacy-preserving security solution for cloud services

    OpenAIRE

    L. Malina; J. Hajny; P. Dzurenda; V. Zeman

    2015-01-01

    We propose a novel privacy-preserving security solution for cloud services. Our solution is based on an efficient non-bilinear group signature scheme providing anonymous access to cloud services and shared storage servers. The novel solution offers anonymous authentication for registered users. Thus, users' personal attributes (age, valid registration, successful payment) can be proven without revealing users' identity, and users can use cloud services without any threat of profiling their...

  14. Trust: A Requirement for Cloud Technology Adoption

    OpenAIRE

    Akinwale O. Akinwunmi; Emmanuel A. Olajubu; G. Adesola Aderounmu

    2015-01-01

    Cloud computing is a recent model for enabling convenient, on-demand network access to a shared pool of configurable computing resources such as networks, servers, storage, applications, and services that can be rapidly provisioned and released with minimal management effort or service provider interaction. Studies have shown that cloud computing has the potential to benefit establishments, industries, and national and international economies. Despite the enormous benefits cloud computing techno...

  15. Bioinformatics and Microarray Data Analysis on the Cloud.

    Science.gov (United States)

    Calabrese, Barbara; Cannataro, Mario

    2016-01-01

    High-throughput platforms such as microarray, mass spectrometry, and next-generation sequencing are producing an increasing volume of omics data that needs large data storage and computing power. Cloud computing offers massively scalable computing and storage, data sharing, and on-demand anytime and anywhere access to resources and applications, and thus it may represent the key technology for facing those issues. In fact, in recent years it has been adopted for the deployment of different bioinformatics solutions and services both in academia and in industry. Despite this, cloud computing presents several issues regarding the security and privacy of data, which are particularly important when analyzing patient data, such as in personalized medicine. This chapter reviews the main academic and industrial cloud-based bioinformatics solutions, with a special focus on microarray data analysis solutions, and underlines the main issues and problems related to the use of such platforms for the storage and analysis of patient data.

  16. Cloud Computing and Its Impact on Business

    Directory of Open Access Journals (Sweden)

    James Tandy

    2013-12-01

    Full Text Available The purpose of this paper is to provide an overview of cloud computing and its development, as well as the advantages and disadvantages of cloud computing implementation at some companies. Some literature studies from journals, textbooks and internet sources are discussed. Based on these sources, cloud computing can be understood as a technology that utilizes internet services and a virtual central server for the purpose of maintaining data and applications. The existence of cloud computing itself causes a change in the way the information technology system works at a company. Security and data storage systems have become important factors for companies. Cloud computing technology provides a great advantage for most enterprises.

  17. Cloud computing for data-intensive applications

    CERN Document Server

    Li, Xiaolin

    2014-01-01

    This book presents a range of cloud computing platforms for data-intensive scientific applications. It covers systems that deliver infrastructure as a service, including: HPC as a service; virtual networks as a service; scalable and reliable storage; algorithms that manage vast cloud resources and applications runtime; and programming models that enable pragmatic programming and implementation toolkits for eScience applications. Many scientific applications in clouds are also introduced, such as bioinformatics, biology, weather forecasting and social networks. Most chapters include case studie

  18. Body size distribution demonstrates flexible habitat shift of green turtle (Chelonia mydas)

    Directory of Open Access Journals (Sweden)

    Ryota Hayashi

    2015-01-01

    Full Text Available Green turtles (Chelonia mydas), listed as Endangered on the IUCN Red List, have a broad migration area and undergo a habitat shift from pelagic (hatchling) to neritic (growth) zones. We studied habitat utilisation of the coastal feeding grounds around Okinawajima Island, Japan, in 103 green turtles. The western and eastern turtle aggregations off Okinawa had homogeneous genetic compositions, but different body size distributions. The western coastal feeding ground supported larger individuals than the eastern coastal feeding ground. Thus, green turtles appear to prefer different feeding grounds during their growth, and show a flexible habitat shift, including a secondary habitat shift from east to west around Okinawajima Island after they are recruited to the coastal habitats. This study suggests that maintaining coastal habitat diversity is important for green turtle conservation.

  19. Opportunity and Challenges for Migrating Big Data Analytics in Cloud

    Science.gov (United States)

    Amitkumar Manekar, S.; Pradeepini, G., Dr.

    2017-08-01

    Big Data analytics is a big topic nowadays. As data-generation capabilities become more demanding and more scalable, data acquisition and storage become crucial issues. Cloud storage is a widely used platform, and the technology will become crucial to executives handling data powered by analytics. The trend toward "big data as a service" is now discussed everywhere. On one hand, cloud-based big data analytics directly tackles ongoing issues of scale, speed, and cost; on the other, researchers are still working to solve security and other real-time problems of big data migration onto cloud-based platforms. This article is focused on finding possible ways to migrate big data to the cloud. Technology that supports coherent data migration, together with the possibility of performing big data analytics on a cloud platform, is in demand for a new era of growth. This article also gives information about available technologies and techniques for the migration of big data to the cloud.

  20. Dynamic Auditing Protocol for Efficient and Secure Data Storage in Cloud Computing

    OpenAIRE

    J. Noorul Ameen; J. Jamal Mohamed; N. Nilofer Begam

    2014-01-01

    In cloud computing, data is stored on cloud servers and retrieved from them by users (data consumers). However, there are some security challenges which create a need for independent auditing services to verify data integrity and safety in the cloud. Until now, numerous methods have been developed for remote integrity checking, but they only serve static archive data and cannot be applied to the auditing service if the data in the cloud is being dynamic...

  1. Research on cloud-based remote measurement and analysis system

    Science.gov (United States)

    Gao, Zhiqiang; He, Lingsong; Su, Wei; Wang, Can; Zhang, Changfan

    2015-02-01

    The promising potential of cloud computing and its convergence with technologies such as cloud storage, cloud push and mobile computing allows for the creation and delivery of a newer type of cloud service. Building on the idea of cloud computing, this paper presents a cloud-based remote measurement and analysis system. The system mainly consists of three parts: a signal acquisition client, a web server deployed on the cloud service, and a remote client. It is a special website developed using asp.net and Flex RIA technology, which resolves the trade-off between the two monitoring modes, B/S and C/S. The platform supplies customers with condition monitoring and data analysis services over the Internet and is deployed on the cloud server. The signal acquisition device is responsible for data collection (sensor data, audio, video, etc.) and pushes the monitoring data to the cloud storage database regularly. Data acquisition equipment in this system needs only data-collection and network functions, as found in smartphones and smart sensors. The system's scale can adjust dynamically according to the number of applications and users, so it does not cause a waste of resources. As a representative case study, we developed a prototype system based on the Ali cloud service using a rotor test rig as the research object. Experimental results demonstrate that the proposed system architecture is feasible.

  2. Telemedicine Based on Mobile Devices and Mobile Cloud Computing

    OpenAIRE

    Lidong Wang; Cheryl Ann Alexander

    2014-01-01

    Mobile devices such as smartphones and tablets support various kinds of mobile computing and services. They can access the cloud or offload their computation-intensive parts to cloud computing resources. Mobile cloud computing (MCC) integrates cloud computing into the mobile environment, which extends mobile devices' battery lifetime, improves their data storage capacity and processing power, and improves their reliability and information security. In this paper, the applications of smartphon...

  3. AVOCLOUDY: a simulator of volunteer clouds

    DEFF Research Database (Denmark)

    Sebastio, Stefano; Amoretti, Michele; Lluch Lafuente, Alberto

    2015-01-01

    The increasing demand of computational and storage resources is shifting users toward the adoption of cloud technologies. Cloud computing is based on the vision of computing as utility, where users no more need to buy machines but simply access remote resources made available on-demand by cloud...... application, intelligent agents constitute a feasible technology to add autonomic features to cloud operations. Furthermore, the volunteer computing paradigm—one of the Information and Communications Technology (ICT) trends of the last decade—can be pulled alongside traditional cloud approaches...... management solutions before their deployment in the production environment. However, currently available simulators of cloud platforms are not suitable to model and analyze such heterogeneous, large-scale, and highly dynamic systems. We propose the AVOCLOUDY simulator to fill this gap. This paper presents...

  4. Big Data X-Learning Resources Integration and Processing in Cloud Environments

    Directory of Open Access Journals (Sweden)

    Kong Xiangsheng

    2014-09-01

    Full Text Available The cloud computing platform has good flexibility characteristics, more and more learning systems are migrated to the cloud platform. Firstly, this paper describes different types of educational environments and the data they provide. Then, it proposes a kind of heterogeneous learning resources mining, integration and processing architecture. In order to integrate and process the different types of learning resources in different educational environments, this paper specifically proposes a novel solution and massive storage integration algorithm and conversion algorithm to the heterogeneous learning resources storage and management cloud environments.

  5. The Role of Standards in Cloud-Computing Interoperability

    Science.gov (United States)

    2012-10-01

    services are not shared outside the organization. CloudStack, Eucalyptus, HP, Microsoft, OpenStack, Ubuntu, and VMWare provide tools for building...center requirements • Developing usage models for cloud vendors • Independent IT consortium OpenStack (http://www.openstack.org) • Open-source...software for running private clouds • Currently consists of three core software projects: OpenStack Compute (Nova), OpenStack Object Storage (Swift

  6. Motion/imagery secure cloud enterprise architecture analysis

    Science.gov (United States)

    DeLay, John L.

    2012-06-01

    Cloud computing with storage virtualization and new service-oriented architectures brings a new perspective to the aspect of a distributed motion imagery and persistent surveillance enterprise. Our existing research is focused mainly on content management, distributed analytics, and WAN performance issues of distributed cloud-based technologies. The potential of leveraging cloud-based technologies for hosting motion imagery, imagery and analytics workflows for DOD and security applications is relatively unexplored. This paper examines technologies for managing, storing, processing and disseminating motion imagery and imagery within a distributed network environment. Finally, we propose areas for future research in the area of distributed cloud content management enterprises.

  7. CLOUD PRINTING: AN INNOVATIVE TECHNOLOGY USING MOBILE PHONE

    OpenAIRE

    Shammi Mehra*1, Azad Singh2 & Sandeep Boora3

    2017-01-01

    In recent years, cloud printing has become a popular topic in the field of communication, and organizations (public or private) are shifting their physical infrastructure to cloud storage. Mobile phones are the dominant access device for consumers and have become an essential part of life. Mobile phones with smart features are the recent driver behind cloud printing. Mobile phones can now be connected wirelessly to printers from any location and at any time in the world via cloud technolo...

  8. Cloud security - An approach with modern cryptographic solutions

    OpenAIRE

    Kostadinovska, Ivana

    2016-01-01

    The term “cloud computing” has been in the spotlight among IT specialists due to its potential for transforming the computer industry. Unfortunately, there are still some challenges to be resolved, and the security aspects of cloud-based computing environments remain at the core of interest. The goal of our work is to identify the main security issues of cloud computing and to present approaches to securing clouds. Our research also focuses on data and storage security layers. As a result, we f...

  9. Analyzing Data Remnant Remains on User Devices to Determine Probative Artifacts in Cloud Environment.

    Science.gov (United States)

    Ahmed, Abdulghani Ali; Xue Li, Chua

    2018-01-01

    Cloud storage service allows users to store their data online, so that they can remotely access, maintain, manage, and back up data from anywhere via the Internet. Although helpful, this storage creates a challenge to digital forensic investigators and practitioners in collecting, identifying, acquiring, and preserving evidential data. This study proposes an investigation scheme for analyzing data remnants and determining probative artifacts in a cloud environment. Using pCloud as a case study, this research collected the data remnants available on end-user device storage following the storing, uploading, and accessing of data in the cloud storage. Data remnants are collected from several sources, including client software files, directory listing, prefetch, registry, network PCAP, browser, and memory and link files. Results demonstrate that the collected remnants data are beneficial in determining a sufficient number of artifacts about the investigated cybercrime. © 2017 American Academy of Forensic Sciences.
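The collection-and-preservation step this record describes can be sketched as a small inventory pass: walk a directory that may hold client-software remnants and record each file's path, size, modification time and SHA-256 digest, so later alteration of the collected artifacts is detectable. The directory and file names here are hypothetical stand-ins, not the actual pCloud artifact locations from the study.

```python
import hashlib
import json
import os
import tempfile

def collect_remnants(root: str):
    """Inventory every file under `root` with metadata and a SHA-256
    digest, preserving enough to prove the artifacts were not altered."""
    records = []
    for dirpath, _dirs, files in os.walk(root):
        for name in sorted(files):
            path = os.path.join(dirpath, name)
            digest = hashlib.sha256()
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(65536), b""):
                    digest.update(chunk)
            stat = os.stat(path)
            records.append({"path": path,
                            "size": stat.st_size,
                            "mtime": stat.st_mtime,
                            "sha256": digest.hexdigest()})
    return records

# Demonstrate on a throwaway directory standing in for a client cache.
with tempfile.TemporaryDirectory() as root:
    with open(os.path.join(root, "sync_state.db"), "wb") as f:
        f.write(b"remnant data left by a cloud client")
    inventory = collect_remnants(root)
    print(json.dumps(inventory, indent=2))
```

Re-running `collect_remnants` on the seized copy and comparing digests is a simple way to show the evidential files are unchanged since acquisition.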

  10. Simulation of the interaction of positively charged beams and electron clouds

    International Nuclear Information System (INIS)

    Markovik, Aleksandar

    2013-01-01

    The incoherent (head-tail) effect on the bunch due to the interaction with electron clouds (e-clouds) leads to a blow-up of the transverse beam size in storage rings operating with positively charged beams. Moreover, e-cloud effects are considered to be the main limiting factor for high-current, high-brightness or high-luminosity operation of future machines. Therefore the simulation of e-cloud phenomena is a highly active field of research. The main focus of this work was the development of a tool for simulating the interaction of relativistic bunches with non-relativistic parasitic charged particles. The result is the particle-in-cell program MOEVE PIC Tracking, which can track a 3D bunch under the influence of its own and external electromagnetic fields but, first and foremost, simulates the interaction of relativistic positively charged bunches with initially static electrons. In MOEVE PIC Tracking the conducting beam pipe can be modeled with an arbitrary elliptical cross-section to achieve more accurate space charge field computations for both the bunch and the e-cloud. The simulation of the interaction between positron bunches and electron clouds in this work gave a detailed insight into the behavior of both particle species during and after the interaction. A further and ultimate goal of this work was a fast estimation of beam stability under the influence of e-clouds in the storage ring. The standard approach to simulating the stability of a single bunch is to track the bunch particles through the linear optics of the machine by multiplying the 6D vector of each particle with the transformation matrices describing the lattice. Thereby the action of the e-cloud on the bunch is approximated by a pre-computed wake kick which is applied at one or more points in the lattice. Following the idea of K. Ohmi, the wake kick was pre-computed as a two-variable function of the part of the bunch exiting the e-cloud and the subsequent parts of the bunch which receive a
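The "standard approach" in the last paragraph, multiplying particle coordinate vectors by lattice transfer matrices and applying a localized kick, can be illustrated with a deliberately tiny model: one transverse plane, beta = 1, and a constant-strength kick standing in for the pre-computed e-cloud wake. This is a didactic toy, not the 6D MOEVE PIC Tracking scheme itself.

```python
import math

def one_turn_matrix(phase_advance: float):
    """2x2 linear one-turn map for (x, x'), assuming beta = 1."""
    c, s = math.cos(phase_advance), math.sin(phase_advance)
    return ((c, s), (-s, c))

def track(x: float, xp: float, turns: int, phase_advance: float,
          kick_strength: float):
    m = one_turn_matrix(phase_advance)
    for _ in range(turns):
        # linear optics: multiply the coordinate vector by the turn matrix
        x, xp = m[0][0] * x + m[0][1] * xp, m[1][0] * x + m[1][1] * xp
        # localized "e-cloud" kick applied once per turn: deflects x'
        xp += kick_strength * x
    return x, xp

# Without the kick the invariant x^2 + x'^2 is preserved turn after
# turn; a nonzero kick changes the effective one-turn map and can
# drive the amplitude up, depending on tune and kick strength.
x, xp = track(1.0e-3, 0.0, 1000, 2 * math.pi * 0.31, 0.0)
print(x * x + xp * xp)  # stays at 1e-6 up to rounding
```

The tune (0.31) and kick strength are illustrative parameters; in the thesis setup the kick would instead be the pre-computed two-variable wake function evaluated along the bunch.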

  11. A Revolution in Information Technology - Cloud Computing

    OpenAIRE

    Divya BHATT

    2012-01-01

    What is the Internet? It is collection of “interconnected networks” represented as a Cloud in network diagrams and Cloud Computing is a metaphor for certain parts of the Internet. The IT enterprises and individuals are searching for a way to reduce the cost of computation, storage and communication. Cloud Computing is an Internet-based technology providing “On-Demand” solutions for addressing these scenarios that should be flexible enough for adaptation and responsive to requirements. The hug...

  12. Green turtles (Chelonia mydas) have novel asymmetrical antibodies

    Science.gov (United States)

    Work, Thierry M.; Dagenais, Julie; Breeden, Renee; Schneemann, Anette; Sung, Joyce; Hew, Brian; Balazs, George H.; Berestecky, John M.

    2015-01-01

    Igs in vertebrates comprise equally sized H and L chains, with exceptions such as H chain–only Abs in camels or natural Ag receptors in sharks. In Reptilia, Igs are known as IgYs. Using immunoassays with isotype-specific mAbs, in this study we show that green turtles (Chelonia mydas) have a 5.7S 120-kDa IgY comprising two equally sized H/L chains with truncated Fc and a 7S 200-kDa IgY comprised of two differently sized H chains bound to L chains and apparently often noncovalently associated with an antigenically related 90-kDa moiety. Both the 200- and 90-kDa 7S molecules are made in response to specific Ag, although the 90-kDa molecule appears more prominent after chronic Ag stimulation. Despite no molecular evidence of a hinge, electron microscopy reveals marked flexibility of Fab arms of 7S and 5.7S IgY. Both IgY can be captured with protein G or melon gel, but less so with protein A. Thus, turtle IgY share some characteristics with mammalian IgG. However, the asymmetrical structure of some turtle Ig and the discovery of an Ig class indicative of chronic antigenic stimulation represent striking advances in our understanding of immunology.

  13. Securing the Data Storage and Processing in Cloud Computing Environment

    Science.gov (United States)

    Owens, Rodney

    2013-01-01

    Organizations increasingly utilize cloud computing architectures to reduce costs and energy consumption both in the data warehouse and on mobile devices by better utilizing the computing resources available. However, the security and privacy issues with publicly available cloud computing infrastructures have not been studied to a sufficient depth…

  14. Cloud Computing Application on Transport Dispatching Informational Support Systems

    Directory of Open Access Journals (Sweden)

    Dmitry Olegovich Gusenitsa

    2015-05-01

    Full Text Available Transport dispatching informational support systems have received widespread attention due to their high information density, strong coherence and applicable visualization features. Nevertheless, the opportunities for applying such systems are significantly reduced by the large volume of data, complex integration requirements, the need for information exchange between different users, the time and cost of developing and implementing the informational support systems, problems with the compatibility of various data formats and security protocols, and high maintenance costs. This article reviews the possibility of creating a cloud data storage system for a transport dispatching informational support system (TDIS using modern computer technology to meet the challenges of mass data processing and information security and to reduce operational costs. The system is expected to make full use of the advantages offered by cloud storage technology. An integrated cloud will increase the amount of data available to the system, reduce the processing speed requirements and reduce the overall cost of system implementation. The creation and integration of cloud storage is one of the most important areas of TDIS development, stimulating and promoting the further development of TDIS to meet the requirements of its users.

  15. Dynamic Allocation and Efficient Distribution of Data Among Multiple Clouds Using Network Coding

    DEFF Research Database (Denmark)

    Sipos, Marton A.; Fitzek, Frank; Roetter, Daniel Enrique Lucani

    2014-01-01

    Distributed storage has attracted large interest lately from both industry and researchers as a flexible, cost-efficient, high-performance, and potentially secure solution for geographically distributed data centers, edge caching, or sharing storage among users. This paper studies the benefits...... of random linear network coding to exploit multiple commercially available cloud storage providers simultaneously, with the possibility to constantly adapt to changing cloud performance in order to optimize data retrieval times. The main contribution of this paper is a new data distribution mechanism...... that cleverly stores and moves data among different clouds in order to optimize performance. Furthermore, we investigate the trade-offs among storage space, reliability and data retrieval speed for our proposed scheme. By means of real-world implementation and measurements using well-known and publicly...
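The core mechanism named above, random linear network coding across several clouds, can be sketched in miniature: split the data into k chunks, store random linear combinations of them (here over GF(2), i.e. plain XORs, rather than the larger finite fields usually used in practice) on different providers, and rebuild the original from any k linearly independent coded pieces, whichever clouds answer fastest. This is a simplified illustration, not the paper's mechanism.

```python
import random

def encode(chunks, count, rng):
    """Produce `count` coded pieces, each a (coefficient bits, payload)."""
    pieces = []
    while len(pieces) < count:
        coeffs = [rng.randint(0, 1) for _ in chunks]
        if not any(coeffs):
            continue  # the all-zero combination carries no information
        payload = 0
        for bit, chunk in zip(coeffs, chunks):
            if bit:
                payload ^= chunk
        pieces.append((coeffs, payload))
    return pieces

def decode(pieces, k):
    """Gauss-Jordan elimination over GF(2); raises if rank < k."""
    rows = [(coeffs[:], payload) for coeffs, payload in pieces]
    for col in range(k):
        pivot = next((i for i in range(col, len(rows)) if rows[i][0][col]),
                     None)
        if pivot is None:
            raise ValueError("collected pieces are not linearly independent")
        rows[col], rows[pivot] = rows[pivot], rows[col]
        pc, pp = rows[col]
        for i in range(len(rows)):
            if i != col and rows[i][0][col]:
                c, p = rows[i]
                rows[i] = ([a ^ b for a, b in zip(c, pc)], p ^ pp)
    return [rows[i][1] for i in range(k)]

# Chunks are held as integers so XOR acts bytewise.
chunks = [int.from_bytes(part, "big")
          for part in (b"cloud-", b"stored", b"-data!")]
rng = random.Random(42)
while True:
    # over GF(2) a random batch can occasionally be rank-deficient,
    # in which case we simply fetch a fresh batch of coded pieces
    try:
        recovered = decode(encode(chunks, 6, rng), 3)
        break
    except ValueError:
        continue
print([c.to_bytes(6, "big") for c in recovered])
# → [b'cloud-', b'stored', b'-data!']
```

Because any k independent pieces suffice, the client can over-provision pieces across providers and decode from whichever subset arrives first, which is exactly what enables adapting to changing cloud performance.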

  16. A Computational- and Storage-Cloud for Integration of Biodiversity Collections

    Science.gov (United States)

    Matsunaga, A.; Thompson, A.; Figueiredo, R. J.; Germain-Aubrey, C.C; Collins, M.; Beeman, R.S; Macfadden, B.J.; Riccardi, G.; Soltis, P.S; Page, L. M.; Fortes, J.A.B

    2013-01-01

    A core mission of the Integrated Digitized Biocollections (iDigBio) project is the building and deployment of a cloud computing environment customized to support the digitization workflow and integration of data from all U.S. nonfederal biocollections. iDigBio chose to use cloud computing technologies to deliver a cyberinfrastructure that is flexible, agile, resilient, and scalable to meet the needs of the biodiversity community. In this context, this paper describes the integration of open source cloud middleware, applications, and third party services using standard formats, protocols, and services. In addition, this paper demonstrates the value of the digitized information from collections in a broader scenario involving multiple disciplines.

  17. A novel data storage logic in the cloud [version 3; referees: 2 approved, 1 not approved]

    Directory of Open Access Journals (Sweden)

    Bence Mátyás

    2017-08-01

    Full Text Available Databases which store and manage long-term scientific information related to the life sciences hold huge amounts of quantitative attributes. Introducing a new entity attribute requires modification of the existing data tables and of the programs that use those tables. A feasible solution is to increase the number of virtual data tables while the number of screens remains the same. The main objective of the present study was to introduce a logic called Joker Tao (JT), which provides universal data storage for cloud-based databases. This means all types of input data can be interpreted as an entity and an attribute at the same time, in the same data table.
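The "new attribute without schema change" property described above is the classic entity-attribute-value pattern, sketched below with SQLite. The table and column names are this example's own generic choices, not the Joker Tao paper's actual design.

```python
import sqlite3

# One universal table: every datum is a (entity, attribute, value) row,
# so introducing a new attribute never alters the schema or the code
# that reads it.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE facts (entity TEXT, attribute TEXT, value TEXT)")

def put(entity, attribute, value):
    conn.execute("INSERT INTO facts VALUES (?, ?, ?)",
                 (entity, attribute, str(value)))

def get(entity):
    """Reassemble one entity's attributes into a dict."""
    return dict(conn.execute(
        "SELECT attribute, value FROM facts WHERE entity = ?", (entity,)))

put("specimen-1", "species", "Chelonia mydas")
put("specimen-1", "mass_kg", 85.5)
# Adding "habitat" later needs new rows only, no ALTER TABLE:
put("specimen-1", "habitat", "neritic")
print(get("specimen-1"))
```

The trade-off is typical of EAV designs: maximal schema flexibility in exchange for weaker typing and more work at query time, which is why such logic suits heterogeneous long-term scientific records.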

  18. Attack on Privacy-Preserving Public Auditing Schemes for Cloud Storage

    Directory of Open Access Journals (Sweden)

    Baoyuan Kang

    2017-01-01

    Full Text Available With the development of the Internet, cloud computing has emerged to provide services to data users. But it is necessary for an auditor, acting on behalf of users, to check the integrity of the data stored in the cloud. The cloud server must also ensure the privacy of the data. In a usual public integrity check scheme, a linear combination of data blocks is needed for verification. But after several rounds of auditing on the same data blocks, the auditor might derive these blocks from the collected linear combinations. Recently, a number of privacy-preserving public auditing schemes have been proposed. With blinded linear combinations of data blocks, the authors of these schemes believed that the auditor cannot derive any information about the data blocks, and claimed that their schemes are provably secure in the random oracle model. In this paper, through detailed security analysis of these schemes, we show that they are vulnerable to an attack from a malicious cloud server who modifies the data blocks and succeeds in forging proof information for the data integrity check.
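The leakage the abstract warns about, an auditor recovering blocks from repeated unblinded proofs, is just linear algebra: with as many independent challenge/response pairs as there are blocks, the auditor solves a linear system. The block values and challenge coefficients below are invented for illustration.

```python
from fractions import Fraction

blocks = [17, 42, 99]  # secret data blocks held by the cloud

def audit_response(coeffs):
    """Cloud's proof for one challenge: sum of c_i * m_i, no blinding."""
    return sum(c * m for c, m in zip(coeffs, blocks))

challenges = [[2, 1, 1],  # three linearly independent challenge vectors
              [1, 3, 1],
              [1, 1, 4]]
responses = [audit_response(c) for c in challenges]

def solve(a, b):
    """Gauss-Jordan elimination with exact rational arithmetic."""
    n = len(b)
    m = [[Fraction(v) for v in row] + [Fraction(r)]
         for row, r in zip(a, b)]
    for col in range(n):
        piv = next(i for i in range(col, n) if m[i][col] != 0)
        m[col], m[piv] = m[piv], m[col]
        m[col] = [v / m[col][col] for v in m[col]]
        for i in range(n):
            if i != col and m[i][col] != 0:
                factor = m[i][col]
                m[i] = [v - factor * w for v, w in zip(m[i], m[col])]
    return [m[i][n] for i in range(n)]

recovered = solve(challenges, responses)
print(recovered == blocks)  # → True: the auditor has derived every block
```

This is why the schemes under attack blind the combinations with random masks; the paper's contribution is showing that the particular blinding used can still be subverted by a malicious server.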

  19. PHYSICAL CHARACTERISTICS OF CHELONIA MYDAS, LINN. 1758 NESTING IN KAIMANA-WEST PAPUA

    Directory of Open Access Journals (Sweden)

    Zeth Parinding

    2015-04-01

    Full Text Available The Venu Island Wildlife Reserve, Kaimana-West Papua, is one of the nesting places of Chelonia mydas, Linn. 1758 (green turtle, locally known as Jelepi). This study was conducted to identify the physical characteristics of Jelepi nesting sites in the Venu Island Wildlife Reserve, Kaimana-West Papua. Based on principal component (bi-plot) analysis of the physical characteristics, the main factors for Jelepi nesting and landing were: sand temperature at depths ≤30 cm (mean 27.42 °C ± stdev 0.47 °C), illumination between 11.00 and 13.00 (mean 229.25 lux ± stdev 326.50 lux), building area (mean 3.18 m2 ± stdev 16.74 m2), medium sand fraction at depths ≤30 cm (mean 39.86 ± stdev 16.11), and the number of Jelepi nesting holes (mean 1.52 nests ± stdev 1.12 nests).

  20. Data mining in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Ruxandra-Ştefania PETRE

    2012-10-01

    Full Text Available This paper describes how data mining is used in cloud computing. Data Mining is used for extracting potentially useful information from raw data. The integration of data mining techniques into normal day-to-day activities has become common place. Every day people are confronted with targeted advertising, and data mining techniques help businesses to become more efficient by reducing costs.Data mining techniques and applications are very much needed in the cloud computing paradigm. The implementation of data mining techniques through Cloud computing will allow the users to retrieve meaningful information from virtually integrated data warehouse that reduces the costs of infrastructure and storage.

  1. Technology Trends in Cloud Infrastructure

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    Cloud computing is growing at an exponential pace with an increasing number of workloads being hosted in mega-scale public clouds such as Microsoft Azure. Designing and operating such large infrastructures requires not only a significant capital spend for provisioning datacenters, servers, networking and operating systems, but also R&D investments to capitalize on disruptive technology trends and emerging workloads such as AI/ML. This talk will cover the various infrastructure innovations being implemented in large scale public clouds and opportunities/challenges ahead to deliver the next generation of scale computing. About the speaker Kushagra Vaid is the general manager and distinguished engineer for Hardware Infrastructure in the Microsoft Azure division. He is accountable for the architecture and design of compute and storage platforms, which are the foundation for Microsoft’s global cloud-scale services. He and his team have successfully delivered four generations of hyperscale cloud hardwar...

  2. Myoglobin Expression in Chelonia mydas Brain, Heart and Liver Tissues

    Directory of Open Access Journals (Sweden)

    RINI PUSPITANINGRUM

    2010-09-01

    Full Text Available An understanding of the underpinning physiology and biochemistry of animals is essential to properly understand the impact of anthropogenic changes and natural catastrophes upon the conservation of endangered species. An observation on the tissue location of the key respiratory protein, myoglobin, now opens up new opportunities for understanding how hypoxia tolerance impacts the diving lifestyle of turtles. The respiratory protein myoglobin has functions other than oxygen binding which are involved in hypoxia tolerance, including metabolism of reactive oxygen species and regulation of vascular function via metabolism of nitric oxide. Our work aims to determine whether myoglobin expression in the green turtle exists in multiple non-muscle tissues and to confirm the hypothesis that reptiles also have a distributed myoglobin expression which is linked to the hypoxia-tolerant trait. This initial work in hatchlings of the turtle Chelonia mydas confirms the presence of myoglobin transcript in brain, heart and liver tissues. Furthermore, it will serve as a tool for completing the sequence and generating an in situ hybridization probe for verifying the cell location in expressing tissues.

  4. Application of cloud services in education

    Directory of Open Access Journals (Sweden)

    G. Kiryakova

    2017-09-01

    Full Text Available Educational institutions in the information society rely heavily on information and communication technologies. They allow them to follow innovative pedagogical paradigms and approaches and to implement modern forms of training that are tailored to the needs and characteristics of the new generation of learners. More and more educational institutions are turning to the use of cloud services, because they are an extremely effective alternative for providing high-quality resources and services to all participants in the learning process at an affordable price. The main objective of this study is to present the most popular cloud services – cloud-based office suites and cloud storage services – focusing on their use in education.

  5. How to Cloud for Earth Scientists: An Introduction

    Science.gov (United States)

    Lynnes, Chris

    2018-01-01

    This presentation is a tutorial on getting started with cloud computing for working with Earth Observation datasets. We first discuss some of the main advantages that cloud computing can provide for the Earth scientist: copious processing power, immense and affordable data storage, and rapid startup time. We also talk about some of the challenges of getting the most out of cloud computing: re-organizing the way data are analyzed, handling node failures, and attending ...

  6. Security Architecture and Protocol for Trust Verifications Regarding the Integrity of Files Stored in Cloud Services

    Directory of Open Access Journals (Sweden)

    Alexandre Pinheiro

    2018-03-01

    Full Text Available Cloud computing is considered an interesting paradigm due to its scalability, availability and virtually unlimited storage capacity. However, it is challenging to organize a cloud storage service (CSS) that is safe from the client point-of-view and to implement this CSS in public clouds since it is not advisable to blindly consider this configuration as fully trustworthy. Ideally, owners of large amounts of data should trust their data to be in the cloud for a long period of time, without the burden of keeping copies of the original data, nor of accessing the whole content for verifications regarding data preservation. Due to these requirements, integrity, availability, privacy and trust are still challenging issues for the adoption of cloud storage services, especially when losing or leaking information can bring significant damage, be it legal or business-related. With such concerns in mind, this paper proposes an architecture for periodically monitoring both the information stored in the cloud and the service provider behavior. The architecture operates with a proposed protocol based on trust and encryption concepts to ensure cloud data integrity without compromising confidentiality and without overloading storage services. Extensive tests and simulations of the proposed architecture and protocol validate their functional behavior and performance.
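
Schemes like the one summarized above generally rest on a challenge-response check: the verifier asks the storage service for evidence about randomly chosen blocks and compares it against digests computed before upload. A minimal sketch of that generic technique (the block size, file layout and function names are illustrative assumptions, not the paper's actual protocol):

```python
import hashlib
import os
import random

BLOCK_SIZE = 4096  # assumed block size

def block_hashes(path):
    """Client side: precompute a SHA-256 digest per block before upload."""
    hashes = []
    with open(path, "rb") as f:
        while True:
            block = f.read(BLOCK_SIZE)
            if not block:
                break
            hashes.append(hashlib.sha256(block).hexdigest())
    return hashes

def server_respond(path, index):
    """Server side: return the digest of the challenged block."""
    with open(path, "rb") as f:
        f.seek(index * BLOCK_SIZE)
        return hashlib.sha256(f.read(BLOCK_SIZE)).hexdigest()

def audit(path, local_hashes, samples=8):
    """Verifier: challenge random blocks and compare against stored digests."""
    indices = random.sample(range(len(local_hashes)),
                            min(samples, len(local_hashes)))
    return all(server_respond(path, i) == local_hashes[i] for i in indices)
```

Sampling more blocks per audit raises the probability of catching corruption; the paper's protocol additionally protects confidentiality and avoids keeping the full digest list on the client, which this sketch does not attempt.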

  7. Security Architecture and Protocol for Trust Verifications Regarding the Integrity of Files Stored in Cloud Services.

    Science.gov (United States)

    Pinheiro, Alexandre; Dias Canedo, Edna; de Sousa Junior, Rafael Timoteo; de Oliveira Albuquerque, Robson; García Villalba, Luis Javier; Kim, Tai-Hoon

    2018-03-02

    Cloud computing is considered an interesting paradigm due to its scalability, availability and virtually unlimited storage capacity. However, it is challenging to organize a cloud storage service (CSS) that is safe from the client point-of-view and to implement this CSS in public clouds since it is not advisable to blindly consider this configuration as fully trustworthy. Ideally, owners of large amounts of data should trust their data to be in the cloud for a long period of time, without the burden of keeping copies of the original data, nor of accessing the whole content for verifications regarding data preservation. Due to these requirements, integrity, availability, privacy and trust are still challenging issues for the adoption of cloud storage services, especially when losing or leaking information can bring significant damage, be it legal or business-related. With such concerns in mind, this paper proposes an architecture for periodically monitoring both the information stored in the cloud and the service provider behavior. The architecture operates with a proposed protocol based on trust and encryption concepts to ensure cloud data integrity without compromising confidentiality and without overloading storage services. Extensive tests and simulations of the proposed architecture and protocol validate their functional behavior and performance.

  8. Adventures in Private Cloud: Balancing Cost and Capability at the CloudSat Data Processing Center

    Science.gov (United States)

    Partain, P.; Finley, S.; Fluke, J.; Haynes, J. M.; Cronk, H. Q.; Miller, S. D.

    2016-12-01

    Since the beginning of the CloudSat Mission in 2006, the CloudSat Data Processing Center (DPC) at the Cooperative Institute for Research in the Atmosphere (CIRA) has been ingesting data from the satellite and other A-Train sensors, producing data products, and distributing them to researchers around the world. The computing infrastructure was specifically designed to fulfill the requirements as specified at the beginning of what nominally was a two-year mission. The environment consisted of servers dedicated to specific processing tasks in a rigid workflow to generate the required products. To the benefit of science and with credit to the mission engineers, CloudSat has lasted well beyond its planned lifetime and is still collecting data ten years later. Over that period the requirements of the data processing system have greatly expanded and opportunities for providing value-added services have presented themselves. But while demands on the system have increased, the initial design allowed for very little expansion in terms of scalability and flexibility. The design did change to include virtual machine processing nodes and distributed workflows, but infrastructure management was still a time-consuming task when system modification was required to run new tests or implement new processes. To address the scalability, flexibility, and manageability of the system, Cloud computing methods and technologies are now being employed. The use of a public cloud like Amazon Elastic Compute Cloud or Google Compute Engine was considered but, among other issues, data transfer and storage cost becomes a problem, especially when demand fluctuates as a result of reprocessing and the introduction of new products and services. Instead, the existing system was converted to an on-premises private Cloud using the OpenStack computing platform and Ceph software-defined storage to reap the benefits of the Cloud computing paradigm. This work details the decisions that were made, the benefits that ...

  9. Cloud computing and digital media fundamentals, techniques, and applications

    CERN Document Server

    Li, Kuan-Ching; Shih, Timothy K

    2014-01-01

    Cloud Computing and Digital Media: Fundamentals, Techniques, and Applications presents the fundamentals of cloud and media infrastructure, novel technologies that integrate digital media with cloud computing, and real-world applications that exemplify the potential of cloud computing for next-generation digital media. It brings together technologies for media/data communication, elastic media/data storage, security, authentication, cross-network media/data fusion, interdevice media interaction/reaction, data centers, PaaS, SaaS, and more. The book covers resource optimization for multimedia clo ...

  10. Enabling Large-Scale Biomedical Analysis in the Cloud

    Directory of Open Access Journals (Sweden)

    Ying-Chih Lin

    2013-01-01

    Full Text Available Recent progress in high-throughput instrumentation has led to an astonishing growth in both the volume and the complexity of biomedical data collected from various sources. This planet-size data brings serious challenges to storage and computing technologies. Cloud computing is an alternative that can crack this nut, because it gives concurrent consideration to storage and high-performance computing on large-scale data. This work briefly introduces data-intensive computing systems and summarizes existing cloud-based resources in bioinformatics. These developments and applications would facilitate biomedical research by making the vast amount of diverse data meaningful and usable.

  11. Cloud Based Educational Systems and Its Challenges and Opportunities and Issues

    Science.gov (United States)

    Paul, Prantosh Kr.; Lata Dangwal, Kiran

    2014-01-01

    Cloud Computing (CC) is a set of hardware, software, networks, storage, and services that an interface combines to deliver aspects of computing as a service. Cloud Computing (CC) uses central remote servers to maintain data and applications. Practically, Cloud Computing (CC) is an extension of Grid computing with independence and…

  12. Merging Agents and Cloud Services in Industrial Applications

    OpenAIRE

    Francisco P. Maturana; Juan L. Asenjo; Neethu S. Philip; Shweta Chatrola

    2014-01-01

    A novel idea to combine agent technology and cloud computing for monitoring a plant floor system is presented. Cloud infrastructure has been leveraged as the main mechanism for hosting the data and processing needs of a modern industrial information system. The cloud offers unlimited storage and data processing in a near real-time fashion. This paper presents a software-as-a-service (SaaS) architecture for augmenting industrial plant-floor reporting capabilities. This reporting capability has...

  13. A framework for secure data sharing in the cloud | Akomolafe ...

    African Journals Online (AJOL)

    Cloud storage is not a new technology and it is being embraced more every day. Security and privacy concern of the data on the cloud is growing every day, this ... a framework that allows user revocation without re-encrypting previous data.

  14. Health condition of juvenile Chelonia mydas related to fibropapillomatosis in southeast Brazil

    Science.gov (United States)

    Renan de Deus Santos, Marcello; Silva Martins, Agnaldo; Baptistotte, Cecília; Work, Thierry M.

    2015-01-01

    Packed cell volume (PCV), plasma biochemistry, visual body condition (BC), and calculated body condition index (BCI) were evaluated in 170 wild juvenile green sea turtles Chelonia mydas from an aggregation in the effluent canal of a steel mill in Brazil. Occurrence of cutaneous fibropapillomatosis (FP) was observed in 44.1% of the animals examined. BCI alone did not differ significantly between healthy animals and those afflicted with FP. However, all turtles with low BCI were severely afflicted and were uremic, hypoglycemic, and anemic in relation to healthy animals. Severe FP was not always reflected by a poor health condition of the individual. Clinical evaluation and plasma biochemistry indicated that most animals afflicted with FP were in good health condition. Differences in FP manifestations and associated health conditions in different geographic regions must be assessed by long-term health monitoring programs to help define priorities for conservation efforts.

  15. Provenance based data integrity checking and verification in cloud environments

    Science.gov (United States)

    Haq, Inam Ul; Jan, Bilal; Khan, Fakhri Alam; Ahmad, Awais

    2017-01-01

    Cloud computing is a recent tendency in IT that moves computing and data away from desktop and hand-held devices into large-scale processing hubs and data centers, respectively. It has been proposed as an effective solution for data outsourcing and on-demand computing to control the rising cost of IT setups and management in enterprises. However, with Cloud platforms users’ data is moved into remotely located storage, so that users lose control over their data. This unique feature of the Cloud faces many security and privacy challenges which need to be clearly understood and resolved. One of the important concerns that needs to be addressed is providing proof of data integrity, i.e., correctness of the user’s data stored in Cloud storage. The data in Clouds is not physically accessible to the users. Therefore, a mechanism is required whereby users can check whether the integrity of their valuable data is maintained or compromised. For this purpose some methods have been proposed, such as mirroring, checksumming and the use of third-party auditors, among others. However, these methods use extra storage space by maintaining multiple copies of data, or require the presence of a third-party verifier. In this paper, we address the problem of proving data integrity in Cloud computing by proposing a scheme through which users are able to check the integrity of their data stored in Clouds. In addition, users can track violations of data integrity if they occur. For this purpose, we utilize a relatively new concept in Cloud computing called “Data Provenance”. Our scheme reduces the need for any third-party services, additional hardware support, and the replication of data items on the client side for integrity checking. PMID:28545151
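
The “Data Provenance” idea can be pictured as an append-only, hash-chained log of operations on the outsourced data: each entry commits to its predecessor, so a later modification of any record invalidates the rest of the chain. A hypothetical minimal illustration (not the scheme proposed in the paper):

```python
import hashlib
import json

def chain_entry(prev_digest, operation, payload):
    """Append one provenance record; each entry commits to its predecessor."""
    record = {"prev": prev_digest, "op": operation, "payload": payload}
    digest = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()).hexdigest()
    return record, digest

def verify_chain(records):
    """Recompute digests; editing any record invalidates everything after it."""
    prev = "0" * 64  # genesis value (assumed convention)
    for record in records:
        if record["prev"] != prev:
            return False
        prev = hashlib.sha256(
            json.dumps(record, sort_keys=True).encode()).hexdigest()
    return True
```

Because each digest covers the previous digest, a verifier who holds only the latest digest can detect retroactive tampering with any earlier operation.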

  16. Provenance based data integrity checking and verification in cloud environments.

    Science.gov (United States)

    Imran, Muhammad; Hlavacs, Helmut; Haq, Inam Ul; Jan, Bilal; Khan, Fakhri Alam; Ahmad, Awais

    2017-01-01

    Cloud computing is a recent tendency in IT that moves computing and data away from desktop and hand-held devices into large-scale processing hubs and data centers, respectively. It has been proposed as an effective solution for data outsourcing and on-demand computing to control the rising cost of IT setups and management in enterprises. However, with Cloud platforms users' data is moved into remotely located storage, so that users lose control over their data. This unique feature of the Cloud faces many security and privacy challenges which need to be clearly understood and resolved. One of the important concerns that needs to be addressed is providing proof of data integrity, i.e., correctness of the user's data stored in Cloud storage. The data in Clouds is not physically accessible to the users. Therefore, a mechanism is required whereby users can check whether the integrity of their valuable data is maintained or compromised. For this purpose some methods have been proposed, such as mirroring, checksumming and the use of third-party auditors, among others. However, these methods use extra storage space by maintaining multiple copies of data, or require the presence of a third-party verifier. In this paper, we address the problem of proving data integrity in Cloud computing by proposing a scheme through which users are able to check the integrity of their data stored in Clouds. In addition, users can track violations of data integrity if they occur. For this purpose, we utilize a relatively new concept in Cloud computing called "Data Provenance". Our scheme reduces the need for any third-party services, additional hardware support, and the replication of data items on the client side for integrity checking.

  17. Provenance based data integrity checking and verification in cloud environments.

    Directory of Open Access Journals (Sweden)

    Muhammad Imran

    Full Text Available Cloud computing is a recent tendency in IT that moves computing and data away from desktop and hand-held devices into large-scale processing hubs and data centers, respectively. It has been proposed as an effective solution for data outsourcing and on-demand computing to control the rising cost of IT setups and management in enterprises. However, with Cloud platforms users' data is moved into remotely located storage, so that users lose control over their data. This unique feature of the Cloud faces many security and privacy challenges which need to be clearly understood and resolved. One of the important concerns that needs to be addressed is providing proof of data integrity, i.e., correctness of the user's data stored in Cloud storage. The data in Clouds is not physically accessible to the users. Therefore, a mechanism is required whereby users can check whether the integrity of their valuable data is maintained or compromised. For this purpose some methods have been proposed, such as mirroring, checksumming and the use of third-party auditors, among others. However, these methods use extra storage space by maintaining multiple copies of data, or require the presence of a third-party verifier. In this paper, we address the problem of proving data integrity in Cloud computing by proposing a scheme through which users are able to check the integrity of their data stored in Clouds. In addition, users can track violations of data integrity if they occur. For this purpose, we utilize a relatively new concept in Cloud computing called "Data Provenance". Our scheme reduces the need for any third-party services, additional hardware support, and the replication of data items on the client side for integrity checking.

  18. Electron Cloud Generation and Trapping in a Quadrupole Magnet at the Los Alamos Proton Storage Ring

    International Nuclear Information System (INIS)

    Macek, Robert J.; Browman, Andrew A.; Ledford, John E.; TechSource, Santa Fe; Los Alamos; Borden, Michael J.; O'Hara, James F.; McCrady, Rodney C.; Rybarcyk, Lawrence J.; Spickermann, Thomas; Zaugg, Thomas J.; Pivi, Mauro T.F.

    2008-01-01

    Recent beam physics studies on the two-stream e-p instability at the LANL proton storage ring (PSR) have focused on the role of the electron cloud generated in quadrupole magnets, where primary electrons, which seed beam-induced multipacting, are expected to be largest due to grazing-angle losses from the beam halo. A new diagnostic to measure electron cloud formation and trapping in a quadrupole magnet has been developed, installed, and successfully tested at PSR. Beam studies using this diagnostic show that the 'prompt' electron flux striking the wall in a quadrupole is comparable to the prompt signal in the adjacent drift space. In addition, the 'swept' electron signal, obtained using the sweeping feature of the diagnostic after the beam was extracted from the ring, was larger than expected and decayed slowly with an exponential time constant of 50 to 100 μs. Other measurements include the cumulative energy spectra of prompt electrons and the variation of both prompt and swept electron signals with beam intensity. Experimental results were also obtained which suggest that a good fraction of the electrons observed in the adjacent drift space for the typical beam conditions in the 2006 run cycle were seeded by electrons ejected from the quadrupole.
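
A decay time constant such as the 50 to 100 μs quoted above is commonly extracted with a log-linear least-squares fit of the measured signal. A minimal sketch on synthetic data (the numbers are illustrative, not PSR measurements):

```python
import math

def fit_time_constant(times, signal):
    """Least-squares fit of ln(signal) = ln(A) - t/tau; returns tau."""
    logs = [math.log(s) for s in signal]
    n = len(times)
    mean_t = sum(times) / n
    mean_y = sum(logs) / n
    slope = (sum((t - mean_t) * (y - mean_y) for t, y in zip(times, logs))
             / sum((t - mean_t) ** 2 for t in times))
    return -1.0 / slope

# Synthetic swept-electron signal with an assumed tau of 75 microseconds
times = [i * 5.0 for i in range(40)]              # 0..195 us
signal = [10.0 * math.exp(-t / 75.0) for t in times]
tau = fit_time_constant(times, signal)            # recovers ~75 us
```

On noisy data a weighted fit (or a nonlinear fit of the exponential itself) is preferable, since the log transform inflates the weight of small-signal samples.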

  19. Electron cloud generation and trapping in a quadrupole magnet at the Los Alamos proton storage ring

    Directory of Open Access Journals (Sweden)

    Robert J. Macek

    2008-01-01

    Full Text Available Recent beam physics studies on the two-stream e-p instability at the LANL proton storage ring (PSR) have focused on the role of the electron cloud generated in quadrupole magnets where primary electrons, which seed beam-induced multipacting, are expected to be largest due to grazing angle losses from the beam halo. A new diagnostic to measure electron cloud formation and trapping in a quadrupole magnet has been developed, installed, and successfully tested at PSR. Beam studies using this diagnostic show that the “prompt” electron flux striking the wall in a quadrupole is comparable to the prompt signal in the adjacent drift space. In addition, the “swept” electron signal, obtained using the sweeping feature of the diagnostic after the beam was extracted from the ring, was larger than expected and decayed slowly with an exponential time constant of 50 to 100 μs. Other measurements include the cumulative energy spectra of prompt electrons and the variation of both prompt and swept electron signals with beam intensity. Experimental results were also obtained which suggest that a good fraction of the electrons observed in the adjacent drift space for the typical beam conditions in the 2006 run cycle were seeded by electrons ejected from the quadrupole.

  20. Security Architecture and Protocol for Trust Verifications Regarding the Integrity of Files Stored in Cloud Services †

    Science.gov (United States)

    2018-01-01

    Cloud computing is considered an interesting paradigm due to its scalability, availability and virtually unlimited storage capacity. However, it is challenging to organize a cloud storage service (CSS) that is safe from the client point-of-view and to implement this CSS in public clouds since it is not advisable to blindly consider this configuration as fully trustworthy. Ideally, owners of large amounts of data should trust their data to be in the cloud for a long period of time, without the burden of keeping copies of the original data, nor of accessing the whole content for verifications regarding data preservation. Due to these requirements, integrity, availability, privacy and trust are still challenging issues for the adoption of cloud storage services, especially when losing or leaking information can bring significant damage, be it legal or business-related. With such concerns in mind, this paper proposes an architecture for periodically monitoring both the information stored in the cloud and the service provider behavior. The architecture operates with a proposed protocol based on trust and encryption concepts to ensure cloud data integrity without compromising confidentiality and without overloading storage services. Extensive tests and simulations of the proposed architecture and protocol validate their functional behavior and performance. PMID:29498641

  1. Electron cloud effects: codes and simulations at KEK

    International Nuclear Information System (INIS)

    Ohmi, K

    2013-01-01

    Electron cloud effects have been studied at the KEK Photon Factory since 1995; the e-p instability had been studied in proton rings since 1965 at BINP, the ISR and the PSR. Study of electron cloud effects in the present style, based on numerical simulations, started in 1995 in positron storage rings. The instability observed at the KEK-PF gave strong impetus to the B factories, KEKB and PEPII, which were in the final stage of their design in those days. The history of cures for the electron cloud instability overlapped with the progress of luminosity performance at KEKB. The studies on electron cloud codes and simulations at KEK are presented. (author)

  2. Safety considerations on LPG storage tanks

    International Nuclear Information System (INIS)

    Paff, R.

    1993-01-01

    The safety of liquefied petroleum gas (LPG) storage tanks, in refineries, petrochemical plants, or distribution storage, is an important concern. Some serious accidents in recent years have highlighted the need for a good safety policy for such equipment. Accidents in LPG storage are mainly due to losses of containment of the LPG. Formation of a cloud can lead to an "Unconfined Vapor Cloud Explosion" (UVCE). Liquid leakage can lead to pool fires in the retention area. In some circumstances the heat input to the tank, combined with the loss of mechanical resistance of the steel at high temperature, can lead to a BLEVE, a "Boiling Liquid Expanding Vapor Explosion". It is obvious that such equipment needs a proper design, maintenance and operating policy. The details to be considered are set out. (4 figures). (Author)

  3. Capturing and analyzing wheelchair maneuvering patterns with mobile cloud computing.

    Science.gov (United States)

    Fu, Jicheng; Hao, Wei; White, Travis; Yan, Yuqing; Jones, Maria; Jan, Yih-Kuen

    2013-01-01

    Power wheelchairs have been widely used to provide independent mobility to people with disabilities. Despite great advancements in power wheelchair technology, research shows that wheelchair related accidents occur frequently. To ensure safe maneuverability, capturing wheelchair maneuvering patterns is fundamental to enable other research, such as safe robotic assistance for wheelchair users. In this study, we propose to record, store, and analyze wheelchair maneuvering data by means of mobile cloud computing. Specifically, the accelerometer and gyroscope sensors in smart phones are used to record wheelchair maneuvering data in real-time. Then, the recorded data are periodically transmitted to the cloud for storage and analysis. The analyzed results are then made available to various types of users, such as mobile phone users, traditional desktop users, etc. The combination of mobile computing and cloud computing leverages the advantages of both techniques and extends the smart phone's capabilities of computing and data storage via the Internet. We performed a case study to implement the mobile cloud computing framework using Android smart phones and Google App Engine, a popular cloud computing platform. Experimental results demonstrated the feasibility of the proposed mobile cloud computing framework.
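
The record-then-transmit pattern described above can be sketched as buffering sensor samples on the phone and flushing them to the cloud in fixed-size batches. The class and callback names below are hypothetical; on Android the samples would come from the platform's accelerometer and gyroscope APIs, and the upload callback would send each batch to the cloud service:

```python
import json
import time

class SensorBatcher:
    """Buffer (timestamp, ax, ay, az) samples and flush them in batches."""

    def __init__(self, upload, batch_size=100):
        self.upload = upload          # callback that ships one JSON batch
        self.batch_size = batch_size
        self.buffer = []

    def record(self, ax, ay, az):
        """Store one accelerometer sample; flush when the batch is full."""
        self.buffer.append({"t": time.time(), "ax": ax, "ay": ay, "az": az})
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self):
        """Serialize and send any pending samples, then clear the buffer."""
        if self.buffer:
            self.upload(json.dumps(self.buffer))
            self.buffer = []
```

Batching keeps the phone responsive and amortizes network overhead; a production version would also persist unsent batches locally so that connectivity loss does not drop data.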

  4. Provable Data Possession of Resource-constrained Mobile Devices in Cloud Computing

    OpenAIRE

    Jian Yang; Haihang Wang; Jian Wang; Chengxiang Tan; Dingguo Yu

    2011-01-01

    Benefiting from cloud storage services, users can save the cost of buying expensive storage and application servers, as well as of deploying and maintaining applications. Meanwhile, they lose physical control of their data. So effective methods are needed to verify the correctness of the data stored at cloud servers, which are the research issues that Provable Data Possession (PDP) faces. The most important features of PDP are: 1) support for public, unlimited numbers of times of verificat...

  5. Research on high-performance mass storage system

    International Nuclear Information System (INIS)

    Cheng Yaodong; Wang Lu; Huang Qiulan; Zheng Wei

    2010-01-01

    With the enlargement of scientific experiments, more and more data will be produced, which brings great challenges to the storage system. Large storage capacity and high data-access performance are both important for a mass storage system. This paper first reviews some popular storage systems, including network storage systems, SAN-based sharing systems, WAN file systems, object-based parallel file systems, hierarchical storage systems and cloud storage systems. Then some key technologies are presented. Finally, this paper takes the BES storage system as an example and introduces its requirements, architecture and operation results. (authors)

  6. Crushing data silos with ownCloud

    CERN Multimedia

    CERN. Geneva

    2013-01-01

    More and more people store their personal files and documents in cloud services like Dropbox, Google Drive, Skydrive or iCloud. The reason is that they provide convenient features to sync your files between devices and share them with others. We are heading full speed into a future where a huge piece of the personal information of the world is stored in very few centralized services. Questions emerge about what the impact on user privacy, surveillance, lawfulness of content and storage cost will be in the long run. I don't think that a world where most of the personal data of the world is stored on the servers of a handful of companies is a good one. This talk will discuss the problems of a future with centralized cloud file sync and share services and will present ownCloud as a possible solution. ownCloud is a free software project that offers a decentralized alternative to proprietary cloud services, where everybody can run their own cloud service, comparable with Dropbox, but on their own hardware and with full ...

  7. Utilizing HDF4 File Content Maps for the Cloud

    Science.gov (United States)

    Lee, Hyokyung Joe

    2016-01-01

    We present a prototype study demonstrating that HDF4 file content maps can be used to efficiently organize data in a cloud object storage system to facilitate cloud computing. This approach can be extended to any binary data format and to any existing big-data analytics solution powered by cloud computing, because the HDF4 file content map project started as a long-term preservation effort for NASA data that doesn't require the HDF4 APIs to access the data.
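
The core idea of a file content map is that once the byte offset and length of each data object are known, a reader can fetch exactly those bytes from object storage (e.g. with an HTTP Range request) without linking against the HDF4 library. A minimal sketch under assumed conventions; real HDF4 content maps are XML documents, and the map entries here are invented for illustration:

```python
import struct

# Hypothetical content map: dataset name -> (byte offset, element count).
# A real map would be parsed from the XML produced by the HDF4 mapping tool.
CONTENT_MAP = {"temperature": (128, 4)}  # four big-endian float32 values

def read_dataset(fetch_range, name):
    """Read one dataset knowing only its location; no HDF4 APIs needed.

    `fetch_range(offset, length)` abstracts the storage; against a cloud
    object store it would issue a GET with a `Range: bytes=...` header.
    """
    offset, count = CONTENT_MAP[name]
    raw = fetch_range(offset, 4 * count)          # 4 bytes per float32
    return struct.unpack(">%df" % count, raw)
```

Because only the requested byte range moves over the network, this pattern scales well for cloud analytics that touch a few variables inside very large archived files.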

  8. Automated Grid Monitoring for LHCb through HammerCloud

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    The HammerCloud system is used by CERN IT to monitor the status of the Worldwide LHC Computing Grid (WLCG). HammerCloud automatically submits jobs to WLCG computing resources, closely replicating the workflow of Grid users (e.g. physicists analyzing data). This allows computation nodes and storage resources to be monitored, software to be tested (somewhat like continuous integration), and new sites to be stress tested with a heavy job load before commissioning. The HammerCloud system has been in use for ATLAS and CMS experiments for about five years. This summer's work involved porting the HammerCloud suite of tools to the LHCb experiment. The HammerCloud software runs functional tests and provides data visualizations. HammerCloud's LHCb variant is written in Python, using the Django web framework and Ganga/DIRAC for job management.

  9. Privacy-preserving public auditing for data integrity in cloud

    Science.gov (United States)

    Shaik Saleem, M.; Murali, M.

    2018-04-01

    Cloud computing, which has attracted extensive attention from research communities and industry, offers a large pool of computing resources through virtualized sharing of storage, processing power, applications and services. Cloud users are provided with on-demand resources as they need them. An outsourced file can easily be tampered with, since it is stored in the databases of a third-party service provider; cloud users have no control over their data and hence no assurance of its integrity, so providing security assurance for user data has become one of the primary concerns for cloud service providers. Cloud servers are not responsible for any data loss, as they do not provide security assurance for the cloud user's data. Remote data integrity checking (RDIC) allows a data storage server to prove that it is truthfully storing an owner's data. RDIC is composed of a security model and ID-based RDIC, which is responsible for the security of every server and ensures the data privacy of the cloud user against the third-party verifier. Generally, by running a two-party RDIC protocol, clients can themselves check the data integrity of their cloud. In the two-party scenario the verification result, coming either from the data holder or the cloud server, may be considered one-sided. The public verifiability feature of RDIC gives all users the privilege to verify whether the original data have been modified or not. To ensure the transparency of publicly verifiable RDIC protocols, we assume there exists a TPA with the knowledge and capability to carry out the verification openly.

  10. Performance Measurements And Comparison For Gluster FS And Azure Blob Storage

    Directory of Open Access Journals (Sweden)

    Roopali VIj

    2015-08-01

    Full Text Available Abstract As the world of knowledge-based systems and digital knowledge sharing grows, business models involving document management and storage of large blocks of files are becoming the need of the hour. It is essential to select a correct and efficient file system to store the files in order to ease their retrieval and addition. Although using cloud technologies to store such data offers flexibility, the biggest challenge is choosing between a distributed file system mounted over cloud virtual machines and PaaS-based file storage available as a platform. Here we compare the performance of two environments, both deployed on the same storage account on the Azure cloud. One is the GlusterFS file system mounted on a virtual machine on Azure, and the other is PaaS-based Azure Blob storage, using a website for encryption hosted on Azure.
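
A comparison like the one above ultimately comes down to timing the same write/read operations against each backend. A backend-agnostic sketch (the two storage callables are placeholders; in the study's setting they would wrap GlusterFS-mounted file I/O and an Azure Blob client):

```python
import os
import time

def benchmark(write, read, payload_sizes, repeats=5):
    """Time repeated write/read of random payloads; return MB/s per size."""
    results = {}
    for size in payload_sizes:
        payload = os.urandom(size)
        start = time.perf_counter()
        for _ in range(repeats):
            write(payload)
        write_s = max(time.perf_counter() - start, 1e-9)  # guard fast backends
        start = time.perf_counter()
        for _ in range(repeats):
            data = read()
        read_s = max(time.perf_counter() - start, 1e-9)
        assert data == payload  # sanity check: backend returned what we stored
        mb = repeats * size / 1e6
        results[size] = {"write_MBps": mb / write_s, "read_MBps": mb / read_s}
    return results
```

Running identical payload sizes and repeat counts against both backends, ideally from the same Azure region, keeps network variability from dominating the comparison.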

  11. Identification of CD3+ T lymphocytes in the green turtle Chelonia mydas

    Science.gov (United States)

    Munoz, F.A.; Estrada-Parra, S.; Romero-Rojas, A.; Work, Thierry M.; Gonzalez-Ballesteros, E.; Estrada-Garcia, I.

    2009-01-01

    To understand the role of the immune system with respect to disease in reptiles, there is a need to develop tools to assess the host's immune response. An important tool is the development of molecular markers to identify immune cells, and these are limited for reptiles. We developed a technique for the cryopreservation of peripheral blood mononuclear cells and showed that a commercially available anti-CD3 epsilon chain antibody detects a subpopulation of CD3-positive peripheral blood lymphocytes in the marine turtle Chelonia mydas. In the thymus and in skin inoculated with phytohemagglutinin, the same antibody showed the classical staining pattern observed in mammals and birds. On Western blots, the anti-CD3 antibodies identified a 17.6 kDa band in membrane proteins of peripheral blood mononuclear cells, compatible in weight with previously described CD3 molecules. This is the first demonstration of CD3+ cells in reptiles using specific antibodies.

  12. Bioinformatics clouds for big data manipulation.

    Science.gov (United States)

    Dai, Lin; Gao, Xin; Guo, Yan; Xiao, Jingfa; Zhang, Zhang

    2012-11-28

    As advances in life sciences and information technology bring profound influences on bioinformatics due to its interdisciplinary nature, bioinformatics is experiencing a new leap-forward from in-house computing infrastructure into utility-supplied cloud computing delivered over the Internet, in order to handle the vast quantities of biological data generated by high-throughput experimental technologies. Albeit relatively new, cloud computing promises to address big data storage and analysis issues in the bioinformatics field. Here we review extant cloud-based services in bioinformatics, classify them into Data as a Service (DaaS), Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS), and present our perspectives on the adoption of cloud computing in bioinformatics. This article was reviewed by Frank Eisenhaber, Igor Zhulin, and Sandor Pongor.

  13. TRUSTED CLOUD COMPUTING FRAMEWORK FOR HEALTHCARE SECTOR

    OpenAIRE

    Mervat Adib Bamiah; Sarfraz Nawaz Brohi; Suriayati Chuprat; Jamalul-lail Ab Manan

    2014-01-01

    Cloud computing is rapidly evolving due to its efficient characteristics such as cost-effectiveness, availability and elasticity. Healthcare organizations and consumers lose control when they outsource their sensitive data and computing resources to a third-party Cloud Service Provider (CSP), which may raise security and privacy concerns related to data loss and misuse threats. Lack of consumers’ knowledge about their data storage location may lead to violating rules and r...

  14. CIMS: A Context-Based Intelligent Multimedia System for Ubiquitous Cloud Computing

    Directory of Open Access Journals (Sweden)

    Abhilash Sreeramaneni

    2015-06-01

    Full Text Available Mobile users spend a tremendous amount of time surfing multimedia contents over the Internet to pursue their interests, but a resource-constrained smart device demands more intensive computing tasks and drains its battery faster. To address the resource limitations of mobile devices (i.e., memory, maintenance cost, ease of access, and computing capacity), mobile cloud computing is needed. Several approaches have been proposed to confront the challenges of mobile cloud computing, but difficulties still remain; in the coming years, collecting contexts, processing them, and interchanging the results over a heavy network will cause vast computations and reduce battery life in mobiles. In this paper, we propose a “context-based intelligent multimedia system” (CIMS) for ubiquitous cloud computing. The main goal of this research is to lessen the computing load, storage complexity, and battery drain for mobile users by using pervasive cloud computing. To reduce the computing and storage burden on mobiles, the cloud server collects groups of user profiles with similarities by executing K-means clustering on users’ data (context and multimedia contents). The distribution process conveys real-time notifications to smartphone users according to what is stated in their profiles. We considered a mobile cloud offloading system, which decides the offloading actions to/from cloud servers. Context-aware decision-making (CAD) customizes the mobile device's performance against different requirements such as short response time and low energy consumption. The analysis shows that our CIMS takes advantage of cost-effective features to produce high-quality information for mobile (or smart device) users in real time. Moreover, CIMS lessens the computation and storage complexities for mobile users as well as cloud servers. Simulation analysis suggests that our approach is more efficient than existing approaches.
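The profile-grouping step can be sketched with a toy K-means pass. The code below is a pure-Python illustration over one hypothetical one-dimensional feature; the actual CIMS clusters much richer context and multimedia data:

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Plain K-means on 1-D points: assign each point to the
    nearest center, then recompute centers, and repeat."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda c: (p - centers[c]) ** 2)
            groups[nearest].append(p)
        centers = [sum(g) / len(g) if g else centers[i]
                   for i, g in enumerate(groups)]
    return centers, groups

# Hypothetical feature: daily minutes of video watched per user.
profiles = [5, 7, 6, 120, 130, 125]
centers, groups = kmeans(profiles, k=2)
print(sorted(round(c) for c in centers))  # two interest groups: [6, 125]
```

Once users fall into such groups, the server can push one notification stream per cluster instead of computing recommendations per device, which is the storage/computation saving the abstract describes.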

  15. GTZ: a fast compression and cloud transmission tool optimized for FASTQ files.

    Science.gov (United States)

    Xing, Yuting; Li, Gen; Wang, Zhenguo; Feng, Bolun; Song, Zhuo; Wu, Chengkun

    2017-12-28

    The dramatic development of DNA sequencing technology is generating truly big data, craving more storage and bandwidth. To speed up data sharing and bring data to computing resources faster and cheaper, it is necessary to develop a compression tool that can support efficient compression and transmission of sequencing data onto cloud storage. This paper presents GTZ, a compression and transmission tool optimized for FASTQ files. As a reference-free lossless FASTQ compressor, GTZ treats the different line types of FASTQ separately, utilizes adaptive context modelling to estimate their characteristic probabilities, and compresses data blocks with arithmetic coding. GTZ can also be used to compress multiple files or directories at once. Furthermore, as a tool for the cloud computing era, it can save compressed data locally or transmit it directly into the cloud, by choice. We evaluated the performance of GTZ on several diverse FASTQ benchmarks. Results show that in most cases it outperforms many other tools in terms of compression ratio, speed and stability. GTZ thus enables efficient lossless FASTQ data compression with simultaneous data transmission onto the cloud, and it emerges as a useful tool for NGS data storage and transmission in the cloud environment. GTZ is freely available online at: https://github.com/Genetalks/gtz .
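The line-type separation GTZ performs can be illustrated by demultiplexing a FASTQ file into its four streams and compressing each independently. Here zlib is only a stand-in for GTZ's adaptive context modelling and arithmetic coding, and the file content is synthetic:

```python
import zlib

# Synthetic FASTQ: each record is 4 lines (id, sequence, '+', qualities).
fastq = (
    "@read1\nACGTACGTACGT\n+\nIIIIIIIIIIII\n"
    "@read2\nACGTACGTACGA\n+\nIIIIIIIIIIHH\n"
) * 200

lines = fastq.splitlines()
streams = {
    "ids":   "\n".join(lines[0::4]),   # read identifiers
    "seqs":  "\n".join(lines[1::4]),   # nucleotide sequences
    "plus":  "\n".join(lines[2::4]),   # separator lines
    "quals": "\n".join(lines[3::4]),   # quality strings
}

# Compress each homogeneous stream on its own, then compare with
# compressing the interleaved file as a whole.
split_size = sum(len(zlib.compress(s.encode(), 9)) for s in streams.values())
mixed_size = len(zlib.compress(fastq.encode(), 9))
print(split_size < len(fastq), mixed_size < len(fastq))
```

On real data, a model trained on one homogeneous stream (only bases, or only quality scores) predicts symbols much better than a model seeing interleaved line types, which is why per-stream treatment pays off; this tiny synthetic input only demonstrates the mechanics.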

  16. The possibilities of cloud storage for business and education. Contemporary aspect

    Directory of Open Access Journals (Sweden)

    Fomicheva T.L.

    2017-01-01

    Full Text Available The article describes the possibilities of applying cloud computing to make business and educational processes more effective. Cloud computing gives an opportunity to optimize activity and to increase the effectiveness of work in almost every aspect of life, right up to solving the personal tasks of an ordinary user.

  17. Mash-up of techniques between data crawling/transfer, data preservation/stewardship and data processing/visualization technologies on a science cloud system designed for Earth and space science: a report of successful operation and science projects of the NICT Science Cloud

    Science.gov (United States)

    Murata, K. T.

    2014-12-01

    Data-intensive or data-centric science is the 4th paradigm, after observational and/or experimental science (1st paradigm), theoretical science (2nd paradigm) and numerical science (3rd paradigm). A science cloud is an infrastructure for this 4th methodology. The NICT Science Cloud is designed for big-data sciences of Earth, space and other fields, based on modern informatics and information technologies [1]. Data flow on the cloud passes through the following three techniques: (1) data crawling and transfer, (2) data preservation and stewardship, and (3) data processing and visualization. Original tools and applications for these techniques have been designed and implemented, and we mash them up on the NICT Science Cloud to build customized systems for each project. In this paper, we discuss science data processing through these three steps. For big-data science, data file deployment on a distributed storage system should be well designed in order to save storage cost and transfer time. We developed a high-bandwidth virtual remote storage system (HbVRS) together with a data crawling tool (NICTY/DLA) and a Wide-area Observation Network Monitoring (WONM) system. Data files are saved on the cloud storage system according to both the data preservation policy and the data processing plan. The storage system is built on distributed file system middleware (Gfarm: GRID datafarm). It is effective since disaster recovery (DR) and parallel data processing are carried out simultaneously, without moving the big data from storage to storage. Data files are managed through our Web application, WSDBank (World Science Data Bank). The big data on the cloud are processed via Pwrake, a workflow tool with high-bandwidth I/O. There are several visualization tools on the cloud: VirtualAurora for the magnetosphere and ionosphere, VDVGE for Google Earth, STICKER for urban environment data and STARStouch for multi-disciplinary data.
There are 30 projects running on the NICT

  18. Generalized Intelligent Framework for Tutoring (GIFT) Cloud/Virtual Open Campus Quick-Start Guide

    Science.gov (United States)

    2016-03-01

    ARL-CR-0796 ● MAR 2016 ● US Army Research Laboratory. The GIFT Account allows users to log into GIFT Cloud, manage their personal storage in GIFT Cloud, download GIFT Local, and access resources...

  19. Integration of XRootD into the cloud infrastructure for ALICE data analysis

    CERN Document Server

    Kompaniets, Mikhail; Svirin, Pavlo; Yurchenko, Volodymyr; Zarochentsev, Andrey

    2015-01-01

    Cloud technologies allow easy load balancing between different tasks and projects. From the viewpoint of data analysis in the ALICE experiment, the cloud makes it possible to deploy software using the CERN Virtual Machine (CernVM) and the CernVM File System (CVMFS), to run different (including outdated) versions of software for long-term data preservation, and to dynamically allocate resources for different computing activities, e.g. a grid site, the ALICE Analysis Facility (AAF), and possible usage for local projects or other LHC experiments. We present a cloud solution for Tier-3 sites based on OpenStack and Ceph distributed storage with an integrated XRootD-based storage element (SE). One of the key features of the solution is that Ceph is used as a backend for the Cinder Block Storage service for OpenStack and, at the same time, as a storage backend for XRootD, with redundancy and availability of data preserved by the Ceph settings. For faster and easier OpenStack deployment the Packstack solution was applied, which is ba...

  20. Evaluation of NetApp Cloud ONTAP and AltaVault using Amazon Web Services

    CERN Document Server

    Weisz, Michael

    2015-01-01

    As of now, the storage infrastructure at CERN consists almost exclusively of on-premise storage, i.e. storage which physically resides in the institution’s data center. While this offers certain advantages, such as full control regarding data security, it also holds many challenges, most importantly in terms of flexibility and scalability. For instance, the provisioning of new on-site storage takes some time, since the required storage needs to be ordered, delivered and installed before it can be used. Furthermore, there is maintenance work involved even after the initial setup, incurring ongoing costs of upkeep. At the same time, various cloud providers such as Amazon Web Services and Microsoft Azure have emerged in recent years, offering services to flexibly provision storage resources in the cloud in a scalable way. This project tries to explore and evaluate to what extent the on-site storage infrastructure at CERN could be extended using virtual NetApp storage offerings such as Clou...

  1. Quality of Experience Assessment of Video Quality in Social Clouds

    Directory of Open Access Journals (Sweden)

    Asif Ali Laghari

    2017-01-01

    Full Text Available Video sharing on social clouds is popular among users around the world. High-Definition (HD) videos have large file sizes, so storing them in cloud storage and streaming them at high quality from the cloud to the client are big problems for service providers. Social clouds compress the videos to save storage and to stream them over slow networks while providing quality of service (QoS). Compression decreases the quality compared to the original video, and parameters are changed during online play as well as after download. This degradation of video quality due to compression lowers the quality of experience (QoE) of end users. To assess the QoE of video compression, we conducted subjective QoE experiments by uploading, sharing and playing videos from social clouds. Three popular social clouds, Facebook, Tumblr and Twitter, were selected for uploading and playing videos online. The QoE was recorded using a questionnaire in which users reported their experience of the video quality they perceived. Results show that Facebook and Twitter compressed HD videos more than the other clouds; however, Facebook delivers better quality from its compressed videos than Twitter. Users therefore assigned low ratings to Twitter for online video quality, compared to Tumblr, which provided high-quality online play with less compression.

  2. Design Private Cloud of Oil and Gas SCADA System

    Directory of Open Access Journals (Sweden)

    Liu Miao

    2014-05-01

    Full Text Available SCADA (Supervisory Control and Data Acquisition) is a computer-based supervisory control system. SCADA systems are very important to oil and gas pipeline engineering. Cloud computing is fundamentally altering the expectations for how and when computing, storage and networking resources should be allocated, managed and consumed. In order to increase the resource utilization, reliability and availability of an oil and gas pipeline SCADA system, a SCADA system based on cloud computing is proposed in this paper. The paper introduces the system framework of the cloud-based SCADA system and the implementation details of its private cloud platform.

  3. Cloud Computing with iPlant Atmosphere.

    Science.gov (United States)

    McKay, Sheldon J; Skidmore, Edwin J; LaRose, Christopher J; Mercer, Andre W; Noutsos, Christos

    2013-10-15

    Cloud Computing refers to distributed computing platforms that use virtualization software to provide easy access to physical computing infrastructure and data storage, typically administered through a Web interface. Cloud-based computing provides access to powerful servers, with specific software and virtual hardware configurations, while eliminating the initial capital cost of expensive computers and reducing the ongoing operating costs of system administration, maintenance contracts, power consumption, and cooling. This eliminates a significant barrier to entry into bioinformatics and high-performance computing for many researchers. This is especially true of free or modestly priced cloud computing services. The iPlant Collaborative offers a free cloud computing service, Atmosphere, which allows users to easily create and use instances on virtual servers preconfigured for their analytical needs. Atmosphere is a self-service, on-demand platform for scientific computing. This unit demonstrates how to set up, access and use cloud computing in Atmosphere. Copyright © 2013 John Wiley & Sons, Inc.

  4. Mobile Cloud Computing for Telemedicine Solutions

    Directory of Open Access Journals (Sweden)

    Mihaela GHEORGHE

    2014-01-01

    Full Text Available Mobile Cloud Computing is a significant technology which combines the emerging domains of mobile computing and cloud computing, a combination that has produced one of the IT industry's most challenging and innovative trends. It is still at an early stage of development, but its main characteristics, advantages and the range of services provided by an Internet-based cluster system have a strong impact on the development of telemedicine solutions for overcoming the wide challenges the medical system is confronting. Mobile Cloud integrates cloud computing into the mobile environment and has the advantage of overcoming obstacles related to performance (e.g. battery life, storage and bandwidth), environment (e.g. heterogeneity, scalability and availability) and security (e.g. reliability and privacy), which are commonly present at the mobile computing level. In this paper, I present a comprehensive overview of mobile cloud computing, including definitions, services and the use of this technology for developing telemedicine applications.

  5. Measurement of Electron Cloud Effects in SPS

    CERN Document Server

    Jiménez, J M

    2004-01-01

    The electron cloud is not a new phenomenon; it was already observed in other machines such as the proton storage rings at BINP Novosibirsk and the Intersecting Storage Rings (ISR) at CERN. Inside an accelerator beam pipe, the electrons can collectively and coherently interact with the beam potential and degrade the performance of accelerators operating with intense, positively charged bunched beams. In the LHC, electron multipacting is expected to take place in the cold and warm beam pipes due to the presence of high-intensity bunched beams, creating an electron cloud. The additional heat load induced by the electron cloud on the LHC beam screens of the cold magnets in the LHC bending sections (the arcs represent ~21 km in length) was, and still is, considered one of the main possible limitations of LHC performance. Since 1997, and in parallel with the SPS studies with LHC-type beams, measurements in other machines or in the laboratory have been made to provide the input parameters required ...

  6. Mobile Computing and Cloud maturity - Introducing Machine Learning for ERP Configuration Automation

    Directory of Open Access Journals (Sweden)

    Elena Geanina ULARU

    2013-01-01

    Full Text Available Nowadays the smartphone market is clearly growing due to the new types of functionality that mobile devices offer and the role they play in everyday life. Their utility and benefits rely on the applications that can be installed on the device (the so-called mobile apps). Cloud computing is a way to enhance the world of mobile applications by providing disk space and freeing the user from local storage needs, thereby offering cheaper storage, wider accessibility and greater speed for business. In this paper we introduce various aspects of mobile computing and stress the importance of reaching cloud maturity by using machine learning to automate the configuration of software applications deployed on cloud nodes, using the open-source application ERP5 and SlapOS, an open-source operating system for Decentralized Cloud Computing.

  7. Cloud Based Educational Systems and Its Challenges and Opportunities and Issues

    OpenAIRE

    PAUL, Prantosh Kr.; DANGWAL, Kiran LATA

    2014-01-01

    Cloud Computing (CC) is actually a set of hardware, software, networks, storage and services that an interface combines to deliver aspects of computing as a service. Cloud Computing (CC) uses central remote servers to maintain data and applications. Practically, Cloud Computing (CC) is an extension of grid computing with independence and smarter tools and technological gradients. Healthy Cloud Computing helps in sharing of software, hardware, application and other packages with the help o...

  8. Data Security, Privacy, Availability and Integrity in Cloud Computing: Issues and Current Solutions

    OpenAIRE

    Sultan Aldossary; William Allen

    2016-01-01

    Cloud computing has changed the world around us. People are now moving their data to the cloud, since data is getting bigger and needs to be accessible from many devices; storing data in the cloud has therefore become the norm. However, many issues affect data stored in the cloud, ranging from the virtual machine, which is the means of sharing resources in the cloud, to issues with cloud storage itself. In this paper, we present the issues that are preventing people from adopting the cl...

  9. Bioinformatics clouds for big data manipulation

    Directory of Open Access Journals (Sweden)

    Dai Lin

    2012-11-01

    Full Text Available Abstract As advances in life sciences and information technology bring profound influences on bioinformatics due to its interdisciplinary nature, bioinformatics is experiencing a new leap-forward from in-house computing infrastructure into utility-supplied cloud computing delivered over the Internet, in order to handle the vast quantities of biological data generated by high-throughput experimental technologies. Albeit relatively new, cloud computing promises to address big data storage and analysis issues in the bioinformatics field. Here we review extant cloud-based services in bioinformatics, classify them into Data as a Service (DaaS), Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS), and present our perspectives on the adoption of cloud computing in bioinformatics. Reviewers: This article was reviewed by Frank Eisenhaber, Igor Zhulin, and Sandor Pongor.

  10. Bioinformatics clouds for big data manipulation

    KAUST Repository

    Dai, Lin

    2012-11-28

    As advances in life sciences and information technology bring profound influences on bioinformatics due to its interdisciplinary nature, bioinformatics is experiencing a new leap-forward from in-house computing infrastructure into utility-supplied cloud computing delivered over the Internet, in order to handle the vast quantities of biological data generated by high-throughput experimental technologies. Albeit relatively new, cloud computing promises to address big data storage and analysis issues in the bioinformatics field. Here we review extant cloud-based services in bioinformatics, classify them into Data as a Service (DaaS), Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS), and present our perspectives on the adoption of cloud computing in bioinformatics. This article was reviewed by Frank Eisenhaber, Igor Zhulin, and Sandor Pongor. © 2012 Dai et al.; licensee BioMed Central Ltd.

  11. Biomedical cloud computing with Amazon Web Services.

    Science.gov (United States)

    Fusaro, Vincent A; Patil, Prasad; Gafni, Erik; Wall, Dennis P; Tonellato, Peter J

    2011-08-01

    In this overview to biomedical computing in the cloud, we discussed two primary ways to use the cloud (a single instance or cluster), provided a detailed example using NGS mapping, and highlighted the associated costs. While many users new to the cloud may assume that entry is as straightforward as uploading an application and selecting an instance type and storage options, we illustrated that there is substantial up-front effort required before an application can make full use of the cloud's vast resources. Our intention was to provide a set of best practices and to illustrate how those apply to a typical application pipeline for biomedical informatics, but also general enough for extrapolation to other types of computational problems. Our mapping example was intended to illustrate how to develop a scalable project and not to compare and contrast alignment algorithms for read mapping and genome assembly. Indeed, with a newer aligner such as Bowtie, it is possible to map the entire African genome using one m2.2xlarge instance in 48 hours for a total cost of approximately $48 in computation time. In our example, we were not concerned with data transfer rates, which are heavily influenced by the amount of available bandwidth, connection latency, and network availability. When transferring large amounts of data to the cloud, bandwidth limitations can be a major bottleneck, and in some cases it is more efficient to simply mail a storage device containing the data to AWS (http://aws.amazon.com/importexport/). More information about cloud computing, detailed cost analysis, and security can be found in references.
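The $48 figure quoted above follows from a simple rate-times-duration calculation. The sketch below uses an illustrative price only (not a current AWS quote; m2.2xlarge is a legacy instance type):

```python
# Hypothetical on-demand pricing table; real AWS prices vary
# by region and change over time.
PRICE_PER_HOUR = {"m2.2xlarge": 1.00}

def job_cost(instance: str, hours: float) -> float:
    """Cost of renting one instance of the given type for a duration."""
    return PRICE_PER_HOUR[instance] * hours

print(job_cost("m2.2xlarge", 48))  # 48.0: ~$1/hour for 48 hours
```

The same arithmetic, extended with data-transfer and storage line items, is the basis of the detailed cost analysis the authors point to.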

  12. DPM — efficient storage in diverse environments

    International Nuclear Information System (INIS)

    Hellmich, Martin; Furano, Fabrizio; Smith, David; Rocha, Ricardo Brito da; Ayllón, Alejandro Álvarez; Manzi, Andrea; Keeble, Oliver; Calvet, Ivan; Regala, Miguel Antonio

    2014-01-01

    Recent developments, including low-power devices, cluster file systems and cloud storage, represent an explosion in the possibilities for deploying and managing grid storage. In this paper we present how different technologies can be leveraged to build a storage service with differing cost, power, performance, scalability and reliability profiles, using the popular storage solution Disk Pool Manager (DPM/dmlite) as the enabling technology. The storage manager DPM is designed for these new environments, allowing users to scale up and down as they need and optimizing their computing centers' energy efficiency and costs. DPM runs on high-performance machines, profiting from multi-core and multi-CPU setups. It supports separating the database from the metadata server (the head node), largely reducing its hard disk requirements. Since version 1.8.6, DPM is released in EPEL and Fedora, simplifying distribution and maintenance, and it also supports the ARM architecture besides i386 and x86_64, allowing it to run on the smallest low-power machines such as the Raspberry Pi or the CuBox. This usage is facilitated by the possibility to scale horizontally using a main database and a distributed memcached-powered namespace cache. Additionally, DPM supports a variety of storage pools in the backend, most importantly HDFS, S3-enabled storage, and cluster file systems, allowing users to fit their DPM installation exactly to their needs. In this paper, we investigate the power efficiency and total cost of ownership of various DPM configurations. We develop metrics to evaluate the expected performance of a setup in terms of both namespace and disk access, considering the overall cost including equipment, power consumption, and data/storage fees. The setups tested range from the lowest scale, using Raspberry Pis with only 700 MHz single cores and 100 Mbps network connections, over conventional multi-core servers, to typical virtual machine instances in cloud settings. We evaluate the
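The kind of cost metric the authors develop can be approximated by a toy total-cost-of-ownership model. All figures below are illustrative assumptions, not numbers from the paper:

```python
def total_cost(hardware_eur: float, watts: float,
               eur_per_kwh: float, years: float) -> float:
    """Toy TCO: purchase price plus electricity over the service life."""
    energy_eur = watts / 1000 * 24 * 365 * years * eur_per_kwh
    return hardware_eur + energy_eur

# Illustrative figures: a Raspberry-Pi-class node vs. a conventional server.
pi_node = total_cost(hardware_eur=50, watts=4, eur_per_kwh=0.20, years=3)
server = total_cost(hardware_eur=3000, watts=300, eur_per_kwh=0.20, years=3)
print(round(pi_node, 2), round(server, 2))  # 71.02 4576.8
```

A complete comparison would divide such totals by achievable namespace and disk throughput, which is where the low-power nodes have to justify their large price and power advantage.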

  13. Secure Architectures in the Cloud

    NARCIS (Netherlands)

    De Capitani di Vimercati, Sabrina; Pieters, Wolter; Probst, Christian W.

    2011-01-01

    This report documents the outcomes of Dagstuhl Seminar 11492 “Secure Architectures in the Cloud”. In cloud computing, data storage and processing are offered as services, and data are managed by external providers that reside outside the control of the data owner. The use of such services reduces

  14. CloudNeo: a cloud pipeline for identifying patient-specific tumor neoantigens.

    Science.gov (United States)

    Bais, Preeti; Namburi, Sandeep; Gatti, Daniel M; Zhang, Xinyu; Chuang, Jeffrey H

    2017-10-01

    We present CloudNeo, a cloud-based computational workflow for identifying patient-specific tumor neoantigens from next generation sequencing data. Tumor-specific mutant peptides can be detected by the immune system through their interactions with the human leukocyte antigen complex, and neoantigen presence has recently been shown to correlate with antitumor T-cell immunity and efficacy of checkpoint inhibitor therapy. However, computing capabilities to identify neoantigens from genomic sequencing data are a limiting factor for understanding their role. This challenge has grown as cancer datasets become increasingly abundant, making them cumbersome to store and analyze on local servers. Our cloud-based pipeline provides scalable computation capabilities for neoantigen identification while eliminating the need to invest in local infrastructure for data transfer, storage or compute. The pipeline is a Common Workflow Language (CWL) implementation of human leukocyte antigen (HLA) typing using Polysolver or HLAminer combined with custom scripts for mutant peptide identification and NetMHCpan for neoantigen prediction. We have demonstrated the efficacy of these pipelines on Amazon cloud instances through the Seven Bridges Genomics implementation of the NCI Cancer Genomics Cloud, which provides graphical interfaces for running and editing, infrastructure for workflow sharing and version tracking, and access to TCGA data. The CWL implementation is at: https://github.com/TheJacksonLaboratory/CloudNeo. For users who have obtained licenses for all internal software, integrated versions in CWL and on the Seven Bridges Cancer Genomics Cloud platform (https://cgc.sbgenomics.com/, recommended version) can be obtained by contacting the authors. jeff.chuang@jax.org. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.

  15. Data intensive ATLAS workflows in the Cloud

    CERN Document Server

    Rzehorz, Gerhard Ferdinand; The ATLAS collaboration

    2016-01-01

    This contribution reports on the feasibility of executing data intensive workflows on Cloud infrastructures. In order to assess this, the metric ETC = Events/Time/Cost is formed, which quantifies the different workflow and infrastructure configurations that are tested against each other. In these tests ATLAS reconstruction jobs are run, examining the effects of overcommitting (more parallel processes running than CPU cores available), scheduling (staggered execution) and scaling (number of cores). The desirability of commissioning storage in the cloud is evaluated, in conjunction with a simple analytical model of the system, and correlated with questions about the network bandwidth, caches and what kind of storage to utilise. In the end a cost/benefit evaluation of different infrastructure configurations and workflows is undertaken, with the goal of finding the maximum of the ETC value.
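The ETC figure of merit is straightforward to compute once a configuration has been benchmarked; a sketch with hypothetical numbers:

```python
def etc(events: int, hours: float, cost: float) -> float:
    """ETC = Events / Time / Cost: higher means more events processed
    per unit of wall-clock time and per unit of money."""
    return events / hours / cost

# Hypothetical infrastructure configurations being compared.
small = etc(events=10_000, hours=5.0, cost=2.0)   # cheap but slow
large = etc(events=10_000, hours=1.5, cost=8.0)   # fast but expensive
print(small > large)  # True: the cheap setup wins on ETC here
```

Maximizing ETC over configurations (core counts, overcommit factors, storage placement) is exactly the cost/benefit search the abstract describes.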

  16. Data intensive ATLAS workflows in the Cloud

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00396985; The ATLAS collaboration; Keeble, Oliver; Quadt, Arnulf; Kawamura, Gen

    2017-01-01

    This contribution reports on the feasibility of executing data intensive workflows on Cloud infrastructures. In order to assess this, the metric ETC = Events/Time/Cost is formed, which quantifies the different workflow and infrastructure configurations that are tested against each other. In these tests ATLAS reconstruction Jobs are run, examining the effects of overcommitting (more parallel processes running than CPU cores available), scheduling (staggered execution) and scaling (number of cores). The desirability of commissioning storage in the Cloud is evaluated, in conjunction with a simple analytical model of the system, and correlated with questions about the network bandwidth, caches and what kind of storage to utilise. In the end a cost/benefit evaluation of different infrastructure configurations and workflows is undertaken, with the goal to find the maximum of the ETC value.

  17. Características de la anidación de la tortuga verde Chelonia mydas (Testudinata, Cheloniidae) en la playa Caleta de los Piojos, Cuba, a partir de marcaciones externas

    Directory of Open Access Journals (Sweden)

    Ferrer Sánchez, Y.

    2007-12-01

    Nesting characteristics of the green turtle Chelonia mydas (Testudinata, Cheloniidae) at Caleta de los Piojos Beach, Cuba, determined from tagging studies. Green turtle (Chelonia mydas) females nesting during the 2002 and 2003 nesting seasons at Caleta de los Piojos Beach, Cuba, were studied using data from individual tagging. Nesting occurred on average twice per season, with a mean interval of 10.9 days. A high number of turtles nested only once per season (39% and 40%, respectively). The percentage of failed multiple nesting attempts was high in both seasons. However, the percentage of failed attempts prior to the first nesting was higher in the 2003 season. Vegetated areas seem to be the most suitable sites for nesting and have a significant effect on nest-site selection behaviour. Fidelity to the first nest site was high: 50.3% and 72.9% for 2002 and 2003, respectively. The observed mean clutch size (117 eggs) was closely related to body dimensions.

  18. Electron-cloud updated simulation results for the PSR, and recent results for the SNS

    International Nuclear Information System (INIS)

    Pivi, M.; Furman, M.A.

    2002-01-01

    Recent simulation results for the main features of the electron cloud in the storage ring of the Spallation Neutron Source (SNS) at Oak Ridge, and updated results for the Proton Storage Ring (PSR) at Los Alamos, are presented in this paper. A refined model for the secondary emission process, including the so-called true secondary, rediffused and backscattered electrons, has recently been included in the electron-cloud code.

  19. Shielded button electrodes for time-resolved measurements of electron cloud buildup

    International Nuclear Information System (INIS)

    Crittenden, J.A.; Billing, M.G.; Li, Y.; Palmer, M.A.; Sikora, J.P.

    2014-01-01

    We report on the design, deployment and signal analysis for shielded button electrodes sensitive to electron cloud buildup at the Cornell Electron Storage Ring. These simple detectors, derived from a beam-position monitor electrode design, have provided detailed information on the physical processes underlying the local production and the lifetime of electron densities in the storage ring. Digitizing oscilloscopes are used to record electron fluxes incident on the vacuum chamber wall in 1024 time steps of 100 ps or more. The fine time steps provide a detailed characterization of the cloud, allowing the independent estimation of processes contributing on differing time scales and providing sensitivity to the characteristic kinetic energies of the electrons making up the cloud. By varying the spacing and population of electron and positron beam bunches, we map the time development of the various cloud production and re-absorption processes. The excellent reproducibility of the measurements also permits the measurement of long-term conditioning of vacuum chamber surfaces.

  20. INTERNATIONAL EXPERIENCE OF CLOUD ORIENTED LEARNING ENVIRONMENT DESIGN IN SECONDARY SCHOOLS

    Directory of Open Access Journals (Sweden)

    Svitlana G. Lytvynova

    2014-06-01

    The article highlights foreign experience in the design of cloud oriented learning environments (COLE) in general secondary education. Projects in Russia, Germany, the Czech Republic, Australia, China, Israel, Africa, Singapore, Brazil, Egypt, Colombia and the United States are analyzed. The analysis of completed projects identified the common problems of implementing cloud oriented learning environments (security of personal data, technical problems of integrating cloud environments with existing systems, and productivity of cloud services) and their advantages for secondary education (mobility of participants, voluminous cloud data storage, universal accessibility, regular software updates, ease of use, etc.).

  1. ownCloud project at CNRS

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    CNRS will launch next November an ownCloud-based service with the intent to serve CNRS research units. The first step is to deploy this service as a beta solution for 2 months and 2,000 end users, and then to generalize this offer to all CNRS users (potentially 100,000 users). Our platform is based on ownCloud 7 community edition, with VMware for virtualization, a Galera/MariaDB cluster database and Scality for the distributed storage backend. We will try to present during this workshop our service implementation in detail, and discuss our choices, our concerns, … our troubles :)

  2. A new privacy preserving technique for cloud service user endorsement using multi-agents

    Directory of Open Access Journals (Sweden)

    D. Chandramohan

    2016-01-01

    In data analysis, storage services play a crucial part, yet user data can be compromised while those services are leveraged. In recent years, service users' valuable information has been exploited by unauthorized users and service providers. This paper examines privacy awareness and the importance of preserving users' secrecy in the current cloud computing era. The information kept in cloud environments gradually increases due to the cloud's elasticity and availability. However, highly sensitive information is under serious attack from various sources. Once private information is misused, the probability of a privacy breach increases, which in turn reduces users' trust in cloud providers. In the modern internet world, information management and maintenance is among the most decisive tasks. Information stored in the cloud by the finance, healthcare and government sectors, among others, makes this all the more challenging, since such tasks are to be handled globally. The present scenario therefore demands a new Petri-net Privacy Preserving Framework (PPPF) for safeguarding users' privacy and providing consistent, breach-less services from the cloud. This paper illustrates the design of PPPF and strengthens the cloud providers' trustworthiness among users. The proposed technique collaborates with a Privacy Preserving Cohesion Technique (PPCT) to develop, validate, promote and adapt to the need for data privacy. Moreover, this paper focuses on detecting and verifying unknown user intervention in the confidential data present in storage areas, and on ensuring the performance of the cloud services. It also acts as an information-preserving guard for high-secrecy data storage areas.

  3. One of the Approaches to Creation of Hybrid Cloud Secure Environment

    Directory of Open Access Journals (Sweden)

    Andrey Konstantinovich Kachko

    2014-02-01

    In response to the ever-growing needs in the storage and processing of data, the leading positions are occupied by informational-telecommunication systems operating on the basis of cloud computing. In this case, the key point in the use of cloud computing is the problem of information security. This article is primarily intended to cover the main information security issues that occur in cloud environments, and ways of solving them in the construction of an integrated information security management system on a cloud architecture.

  4. Calibration of LOFAR data on the cloud

    Science.gov (United States)

    Sabater, J.; Sánchez-Expósito, S.; Best, P.; Garrido, J.; Verdes-Montenegro, L.; Lezzi, D.

    2017-04-01

    New scientific instruments are starting to generate an unprecedented amount of data. The Low Frequency Array (LOFAR), one of the Square Kilometre Array (SKA) pathfinders, is already producing data on a petabyte scale. The calibration of these data presents a huge challenge for final users: (a) extensive storage and computing resources are required; (b) the installation and maintenance of the software required for the processing is not trivial; and (c) the requirements of calibration pipelines, which are experimental and under development, are quickly evolving. After encountering some limitations in classical infrastructures like dedicated clusters, we investigated the viability of cloud infrastructures as a solution. We found that the installation and operation of LOFAR data calibration pipelines is not only possible, but can also be efficient in cloud infrastructures. The main advantages were: (1) the ease of software installation and maintenance, and the availability of standard APIs and tools, widely used in the industry; this reduces the requirement for significant manual intervention, which can have a highly negative impact in some infrastructures; (2) the flexibility to adapt the infrastructure to the needs of the problem, especially as those demands change over time; (3) the on-demand consumption of (shared) resources. We found that a critical factor (also in other infrastructures) is the availability of scratch storage areas of an appropriate size. We found no significant impediments associated with the speed of data transfer, the use of virtualization, the use of external block storage, or the memory available (provided a minimum threshold is reached). Finally, we considered the cost-effectiveness of a commercial cloud like Amazon Web Services. While a cloud solution is more expensive than the operation of a large, fully-utilized cluster completely dedicated to LOFAR data reduction, we found that its costs are competitive if the number of datasets to be

  5. Trust Model to Enhance Security and Interoperability of Cloud Environment

    Science.gov (United States)

    Li, Wenjuan; Ping, Lingdi

    Trust is one of the most important means to improve security and enable interoperability of current heterogeneous, independent cloud platforms. This paper first analyzes several trust models used in large distributed environments and then introduces a novel cloud trust model to solve security issues in cross-cloud environments, in which a cloud customer can choose different providers' services, and resources in heterogeneous domains can cooperate. The model is domain-based: it groups one cloud provider's resource nodes into the same domain and sets a trust agent for it. It distinguishes two different roles, cloud customer and cloud server, and designs different strategies for them. In our model, trust recommendation is treated as one type of cloud service, just like computation or storage. The model achieves both identity authentication and behavior authentication. The results of emulation experiments show that the proposed model can efficiently and safely construct trust relationships in cross-cloud environments.
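The domain-based idea above, with a per-domain trust agent and trust recommendation consumed like any other cloud service, can be sketched as follows. This is a hedged illustration: the aggregation rule (a weighted mean of direct experience and recommendations), the neutral prior, and all names and numbers are assumptions for exposition, not the paper's actual formulas.

```python
# Hypothetical domain-based trust agent: blends direct behavioural
# observations with recommendations received from other domains.

class TrustAgent:
    """Trust agent for one provider's domain of resource nodes."""

    def __init__(self, domain: str) -> None:
        self.domain = domain
        self.direct: dict[str, list[float]] = {}  # node -> observed scores in [0, 1]

    def record(self, node: str, score: float) -> None:
        """Store a direct behavioural observation about a node."""
        self.direct.setdefault(node, []).append(score)

    def trust(self, node: str, recommendations=None, w_direct: float = 0.7) -> float:
        """Blend direct experience with recommendations from other domains."""
        own = self.direct.get(node, [])
        d = sum(own) / len(own) if own else 0.5  # neutral prior with no history
        recs = recommendations or []
        r = sum(recs) / len(recs) if recs else d
        return w_direct * d + (1 - w_direct) * r

agent = TrustAgent("providerA")
agent.record("node1", 0.9)
agent.record("node1", 0.8)
t = agent.trust("node1", recommendations=[0.6])  # 0.7 * 0.85 + 0.3 * 0.6
```

Weighting direct experience above recommendations is one common design choice in such models; the paper's own strategy per role (customer vs. server) may differ.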

  6. Sensory signals and neuronal groups involved in guiding the sea-ward motor behavior in turtle hatchlings of Chelonia agassizi

    Science.gov (United States)

    Fuentes, A. L.; Camarena, V.; Ochoa, G.; Urrutia, J.; Gutierrez, G.

    2007-05-01

    Turtle hatchlings display sea-ward oriented movements as soon as they emerge from the nest. Although most studies have emphasized the role of visual information in this process, less attention has been paid to other sensory modalities. Here, we evaluated the nature of the sensory cues used by turtle hatchlings of Chelonia agassizi to orient their movements towards the ocean. We recorded the time they took to crawl from the nest to the beach front (120 m long) under control conditions and under visually, olfactorily and magnetically deprived circumstances. Visually-deprived hatchlings displayed a high degree of disorientation. Olfactory deprivation and magnetic field distortion impaired, but did not abolish, sea-ward oriented movements. With regard to the neuronal mapping experiments, visual deprivation dramatically reduced c-fos expression in the whole brain. Hatchlings with their nares blocked revealed neurons with c-fos expression above control levels principally in the c and d areas, while those subjected to magnetic field distortion had widespread activation of neurons throughout the brain, predominantly in the dorsal ventricular ridge. The present results support the view that Chelonia agassizi hatchlings use predominantly visual cues to orient their movements towards the sea. Olfactory and magnetic cues may also be used, but their influence on hatchlings' oriented motor behavior is not as clear as it is for vision. This conclusion is supported by the fact that in the absence of olfactory and magnetic cues, the brain turns on the expression of c-fos in neuronal groups that, in the intact hatchling, are not normally involved in accomplishing the task.

  7. Managing IaaS and DBaaS clouds with Oracle Enterprise Manager Cloud Control 12c

    CERN Document Server

    Antani, Ved

    2013-01-01

    This book is a step-by-step tutorial filled with practical examples which will show readers how to configure and manage IaaS and DBaaS with Oracle Enterprise Manager. If you are a cloud administrator or a user of self-service provisioning systems offered by Enterprise Manager, this book is ideal for you. It will also help administrators who want to understand the chargeback mechanism offered by Enterprise Manager. An understanding of the basic building blocks of cloud computing, such as networking, virtualization, storage, and so on, is needed by those of you interested in this book.

  8. Mobile Cloud Computing: Resource Discovery, Session Connectivity and Other Open Issues

    NARCIS (Netherlands)

    Schüring, Markus; Karagiannis, Georgios

    2011-01-01

    Abstract—Cloud computing can be considered as a model that provides network access to a shared pool of resources, such as storage and computing power, which can be rapidly provisioned and released with minimal management effort. This paper describes a research activity in the area of mobile cloud

  9. Impact of different cloud deployments on real-time video applications for mobile video cloud users

    Science.gov (United States)

    Khan, Kashif A.; Wang, Qi; Luo, Chunbo; Wang, Xinheng; Grecos, Christos

    2015-02-01

    The latest trend of accessing mobile cloud services through wireless network connectivity has grown globally among both entrepreneurs and home end users. Although existing public cloud service vendors such as Google, Microsoft Azure, etc. are providing on-demand cloud services at affordable cost for mobile users, there are still a number of challenges in achieving high-quality mobile cloud based video applications, especially due to the bandwidth-constrained and error-prone mobile network connectivity, which is the communication bottleneck for end-to-end video delivery. In addition, existing accessible cloud networking architectures differ in terms of their implementation, services, resources, storage, pricing, support and so on, and these differences have a varied impact on the performance of cloud-based real-time video applications. Nevertheless, these challenges and impacts have not been thoroughly investigated in the literature. In our previous work, we implemented a mobile cloud network model that integrates localized and decentralized cloudlets (mini-clouds) and wireless mesh networks. In this paper, we deploy a real-time framework consisting of various existing Internet cloud networking architectures (Google Cloud, Microsoft Azure and Eucalyptus Cloud) and a cloudlet based on Ubuntu Enterprise Cloud over wireless mesh networking technology for mobile cloud end users. It is noted that the increasing trend of accessing real-time video streaming over HTTP/HTTPS is gaining popularity among both research and industrial communities, leveraging the existing web services and HTTP infrastructure of the Internet. To study the performance under different deployments using different public and private cloud service providers, we employ real-time video streaming over the HTTP/HTTPS standard, and conduct experimental evaluation and in-depth comparative analysis of the impact of different deployments on the quality of service for mobile video cloud users.

  10. Merging Agents and Cloud Services in Industrial Applications

    Directory of Open Access Journals (Sweden)

    Francisco P. Maturana

    2014-01-01

    A novel idea to combine agent technology and cloud computing for monitoring a plant floor system is presented. Cloud infrastructure has been leveraged as the main mechanism for hosting the data and processing needs of a modern industrial information system. The cloud offers unlimited storage and data processing in a near real-time fashion. This paper presents a software-as-a-service (SaaS) architecture for augmenting industrial plant-floor reporting capabilities. This reporting capability has been architected using networked agents, worker roles, and scripts for building a scalable data pipeline and analytics system.

  11. Secure system for personal finances on the cloud

    OpenAIRE

    Quintana i Vidal, Xavier

    2015-01-01

    This document contains the report of the final Master's project, which aims to create a Cloud for the secure storage of electronic invoices. (Abstract originally given in Spanish, English and Catalan.)

  12. Centralized Duplicate Removal Video Storage System with Privacy Preservation in IoT

    Directory of Open Access Journals (Sweden)

    Hongyang Yan

    2018-06-01

    In recent years, the Internet of Things (IoT) has found wide application and attracted much attention. Since most of the end-terminals in IoT have limited capabilities for storage and computing, it has become a trend to outsource the data from local to cloud computing. To further reduce the communication bandwidth and storage space, data deduplication has been widely adopted to eliminate the redundant data. However, since data collected in IoT are sensitive and closely related to users’ personal information, the privacy protection of users’ information becomes a challenge. As the channels, like the wireless channels between the terminals and the cloud servers in IoT, are public and the cloud servers are not fully trusted, data have to be encrypted before being uploaded to the cloud. However, encryption makes the performance of deduplication by the cloud server difficult because the ciphertext will be different even if the underlying plaintext is identical. In this paper, we build a centralized privacy-preserving duplicate removal storage system, which supports both file-level and block-level deduplication. In order to avoid the leakage of statistical information of data, Intel Software Guard Extensions (SGX) technology is utilized to protect the deduplication process on the cloud server. The results of the experimental analysis demonstrate that the new scheme can significantly improve the deduplication efficiency and enhance the security. It is envisioned that the duplicated removal system with privacy preservation will be of great use in the centralized storage environment of IoT.
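The file- and block-level deduplication that the abstract builds on can be sketched with content hashing. This is a minimal illustration of the generic technique only; it does not reproduce the paper's encrypted, SGX-protected scheme, and the tiny block size is an artificial choice for demonstration.

```python
# Minimal block-level deduplication: split data into fixed-size blocks and
# store each unique block once, keyed by its SHA-256 digest.
import hashlib

BLOCK_SIZE = 4  # bytes; unrealistically small, purely for illustration

store: dict[str, bytes] = {}  # content-addressed block store

def put_blocks(data: bytes) -> list[str]:
    """Store data block by block; duplicate blocks are not stored again."""
    refs = []
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        h = hashlib.sha256(block).hexdigest()
        store.setdefault(h, block)  # keep only the first copy of each block
        refs.append(h)
    return refs

def get(refs: list[str]) -> bytes:
    """Reassemble data from its block references."""
    return b"".join(store[h] for h in refs)

refs = put_blocks(b"AAAABBBBAAAA")  # the "AAAA" block is stored only once
```

File-level deduplication is the degenerate case where the whole file is one block. As the abstract notes, this scheme breaks down once blocks are encrypted with per-user keys, since identical plaintexts no longer yield identical ciphertexts.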

  13. Centralized Duplicate Removal Video Storage System with Privacy Preservation in IoT.

    Science.gov (United States)

    Yan, Hongyang; Li, Xuan; Wang, Yu; Jia, Chunfu

    2018-06-04

    In recent years, the Internet of Things (IoT) has found wide application and attracted much attention. Since most of the end-terminals in IoT have limited capabilities for storage and computing, it has become a trend to outsource the data from local to cloud computing. To further reduce the communication bandwidth and storage space, data deduplication has been widely adopted to eliminate the redundant data. However, since data collected in IoT are sensitive and closely related to users' personal information, the privacy protection of users' information becomes a challenge. As the channels, like the wireless channels between the terminals and the cloud servers in IoT, are public and the cloud servers are not fully trusted, data have to be encrypted before being uploaded to the cloud. However, encryption makes the performance of deduplication by the cloud server difficult because the ciphertext will be different even if the underlying plaintext is identical. In this paper, we build a centralized privacy-preserving duplicate removal storage system, which supports both file-level and block-level deduplication. In order to avoid the leakage of statistical information of data, Intel Software Guard Extensions (SGX) technology is utilized to protect the deduplication process on the cloud server. The results of the experimental analysis demonstrate that the new scheme can significantly improve the deduplication efficiency and enhance the security. It is envisioned that the duplicated removal system with privacy preservation will be of great use in the centralized storage environment of IoT.

  14. Dynamic federations: storage aggregation using open tools and protocols

    CERN Document Server

    Furano, Fabrizio; Brito da Rocha, Ricardo; Devresse, Adrien; Keeble, Oliver; Alvarez Ayllon, Alejandro

    2012-01-01

    A number of storage elements now offer standard protocol interfaces like NFS 4.1/pNFS and WebDAV for access to their data repositories, in line with the standardization effort of the European Middleware Initiative (EMI). The LCG File Catalogue (LFC) can also offer such features. Here we report on work that seeks to exploit the federation potential of these protocols and build a system that offers a unique view of the storage and metadata ensemble, and the possibility of integrating other compatible resources such as those from cloud providers. The challenge, here undertaken by the providers of dCache and DPM, and pragmatically open to other Grid and Cloud storage solutions, is to build such a system while being able to accommodate name translations from existing catalogues (e.g. LFCs), experiment-based metadata catalogues, or stateless algorithmic name translations, also known as "trivial file catalogues". Such so-called storage federations of standard protocols-based storage elements give a unique vie...

  15. CLOUD ACCOUNTING – A NEW PARADIGM OF ACCOUNTING POLICIES

    Directory of Open Access Journals (Sweden)

    Cristina PRICHICI

    2015-04-01

    In the current economic background, companies invest in finding complete solutions for the integration of all business functions (sales, logistics, accounting and so on), control, centralized coordination and harmonization of systems and financial management operations, data storage and resilience of services, as well as cost savings. The technological trend of recent years brings forward the concept of cloud computing, an innovative model of processing and storage of data that allows companies to run business processes on IT infrastructures in conditions of economic optimization. Cloud computing allows companies to effectively and economically use IT applications and infrastructures through the model "use as you need and pay as you go". However, before deploying data and applications in the virtual environment, organizations must take into account the implications of such a decision on the financial reporting process. In this respect, the paper aims to analyze the impact of cloud computing technology on the main operational modules used for obtaining accounting data for financial reporting.

  16. Blood profiles for a wild population of green turtles (Chelonia mydas) in the southern Bahamas: size-specific and sex-specific relationships.

    Science.gov (United States)

    Bolten, A B; Bjorndal, K A

    1992-07-01

    Blood biochemical profiles and packed cell volumes were determined for 100 juvenile green turtles, Chelonia mydas, from a wild population in the southern Bahamas. There was a significant correlation of body size to 13 of the 26 blood parameters measured. Only plasma uric acid and cholesterol were significantly different between male and female turtles. The relationship between total plasma proteins and plasma refractive index was significant. The equation for converting refractive index (Y) to total plasma proteins (X) is Y = 1.34 + 0.00217(X).
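The linear relationship reported in the abstract, Y = 1.34 + 0.00217·X with Y the plasma refractive index and X the total plasma protein, is easy to use in both directions. A hedged sketch follows; the units of X (commonly g/dL for total protein) are an assumption, as the record does not state them.

```python
# Linear relationship from the abstract: Y = 1.34 + 0.00217 * X,
# where Y is plasma refractive index and X is total plasma protein
# (units assumed g/dL; not stated in the record).

def refractive_index(total_protein: float) -> float:
    """Predicted plasma refractive index from total plasma protein."""
    return 1.34 + 0.00217 * total_protein

def total_protein(ri: float) -> float:
    """Invert the relationship: estimate protein from a refractometer reading."""
    return (ri - 1.34) / 0.00217
```

For example, a refractometer reading of about 1.351 would correspond to roughly 5 g/dL of total protein under this fit.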

  17. Integration of XRootD into the cloud infrastructure for ALICE data analysis

    Science.gov (United States)

    Kompaniets, Mikhail; Shadura, Oksana; Svirin, Pavlo; Yurchenko, Volodymyr; Zarochentsev, Andrey

    2015-12-01

    Cloud technologies allow easy load balancing between different tasks and projects. From the viewpoint of data analysis in the ALICE experiment, the cloud allows software to be deployed using the CERN Virtual Machine (CernVM) and the CernVM File System (CVMFS), different (including outdated) versions of software to be run for long-term data preservation, and resources to be dynamically allocated for different computing activities, e.g. a grid site, the ALICE Analysis Facility (AAF), and possible usage for local projects or other LHC experiments. We present a cloud solution for Tier-3 sites based on OpenStack and Ceph distributed storage with an integrated XRootD-based storage element (SE). One of the key features of the solution is the idea that Ceph is used as a backend for the Cinder Block Storage service of OpenStack and, at the same time, as a storage backend for XRootD, with redundancy and availability of data preserved by the Ceph settings. For faster and easier OpenStack deployment, the Packstack solution was applied, which is based on the Puppet configuration management system. Ceph installation and configuration operations are structured, converted to Puppet manifests describing node configurations, and integrated into Packstack. This solution can be easily deployed, maintained and used even by small groups with limited computing resources and small organizations, which often lack IT support. The proposed infrastructure has been tested on two different clouds (SPbSU & BITP) and integrates successfully with the ALICE data analysis model.
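Using Ceph as the backend for Cinder, as described above, is typically wired up through Cinder's RBD volume driver. The fragment below is an illustrative sketch only: the option names are standard Cinder RBD-driver settings, but the pool name, Ceph user and secret UUID are placeholders, and the paper's actual Packstack/Puppet-driven deployment may configure things differently.

```ini
# Illustrative cinder.conf fragment wiring Cinder to a Ceph RBD pool
# (all values below are placeholders).
[DEFAULT]
enabled_backends = ceph

[ceph]
volume_backend_name = ceph
volume_driver = cinder.volume.drivers.rbd.RBDDriver
rbd_pool = volumes
rbd_ceph_conf = /etc/ceph/ceph.conf
rbd_user = cinder
rbd_secret_uuid = 00000000-0000-0000-0000-000000000000
```

The same Ceph cluster can then be pointed at by XRootD's storage backend, so that data redundancy is handled once, at the Ceph level, for both services.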

  18. E-HEALTH CLOUD FOR NIGERIAN TEACHING HOSPITALS

    African Journals Online (AJOL)

    Administrator

    massive data storage and availability of resources on demand. With well over ... this issue by proposing a Cloud computing infrastructure for e-Health solutions in Nigeria. This will ... security and privacy as each application has its own virtual.

  19. Cloud and fog computing in 5G mobile networks emerging advances and applications

    CERN Document Server

    Markakis, Evangelos; Mavromoustakis, Constandinos X; Pallis, Evangelos

    2017-01-01

    This book focuses on the challenges and solutions related to cloud and fog computing for 5G mobile networks, and presents novel approaches to the frameworks and schemes that carry out storage, communication, computation and control in the fog/cloud paradigm.

  20. Satellite Cloud and Radiative Property Processing and Distribution System on the NASA Langley ASDC OpenStack and OpenShift Cloud Platform

    Science.gov (United States)

    Nguyen, L.; Chee, T.; Palikonda, R.; Smith, W. L., Jr.; Bedka, K. M.; Spangenberg, D.; Vakhnin, A.; Lutz, N. E.; Walter, J.; Kusterer, J.

    2017-12-01

    Cloud Computing offers new opportunities for large-scale scientific data producers to utilize Infrastructure-as-a-Service (IaaS) and Platform-as-a-Service (PaaS) IT resources to process and deliver data products in an operational environment where timely delivery, reliability, and availability are critical. The NASA Langley Research Center Atmospheric Science Data Center (ASDC) is building and testing a private and public facing cloud for users in the Science Directorate to utilize as an everyday production environment. The NASA SatCORPS (Satellite ClOud and Radiation Property Retrieval System) team processes and derives near real-time (NRT) global cloud products from operational geostationary (GEO) satellite imager datasets. To deliver these products, we will utilize the public facing cloud and OpenShift to deploy a load-balanced webserver for data storage, access, and dissemination. The OpenStack private cloud will host data ingest and computational capabilities for SatCORPS processing. This paper will discuss the SatCORPS migration towards, and usage of, the ASDC Cloud Services in an operational environment. Detailed lessons learned from use of prior cloud providers, specifically the Amazon Web Services (AWS) GovCloud and the Government Cloud administered by the Langley Managed Cloud Environment (LMCE) will also be discussed.

  1. Analytical treatment of the nonlinear electron cloud effect and the combined effects with beam-beam and space charge nonlinear forces in storage rings

    International Nuclear Information System (INIS)

    Gao Jie

    2009-01-01

    In this paper we first treat analytically some nonlinear beam dynamics problems in storage rings, such as beam dynamic apertures due to magnetic multipoles, wigglers, beam-beam effects and the nonlinear space charge effect, and then the nonlinear electron cloud effect combined with beam-beam and space charge effects. This analytical treatment is applied to BEPC II. The corresponding analytical expressions developed in this paper are useful both for understanding the physics behind these problems and for making quick practical hand estimations. (author)

  2. Keeping Genomic Data Safe on the Cloud

    OpenAIRE

    Rilak, Z.; Wernicke, S.; Bogicevic, I.

    2014-01-01

    With rapidly improving technology and decreasing costs, the amount of genetic data generated through Next Generation Sequencing (NGS) continues to grow at an exponential pace. Managing and processing this data requires significant compute power and storage. Cloud-based solutions and platforms offer virtually unlimited compute power and storage to meet this requirement, but raise concerns about data security. Standards such as HIPAA in the United States attempt to establish best practices for ...

  3. Automated Grid Monitoring for the LHCb Experiment Through HammerCloud

    CERN Document Server

    Dice, Bradley

    2015-01-01

    The HammerCloud system is used by CERN IT to monitor the status of the Worldwide LHC Computing Grid (WLCG). HammerCloud automatically submits jobs to WLCG computing resources, closely replicating the workflow of Grid users (e.g. physicists analyzing data). This allows computation nodes and storage resources to be monitored, software to be tested (somewhat like continuous integration), and new sites to be stress tested with a heavy job load before commissioning. The HammerCloud system has been in use for ATLAS and CMS experiments for about five years. This summer's work involved porting the HammerCloud suite of tools to the LHCb experiment. The HammerCloud software runs functional tests and provides data visualizations. HammerCloud's LHCb variant is written in Python, using the Django web framework and Ganga/DIRAC for job management.

  4. The Impact of Cloud Computing Technologies in E-learning

    Directory of Open Access Journals (Sweden)

    Hosam Farouk El-Sofany

    2013-01-01

    Cloud computing is a new computing model which, built on grid computing, distributed computing, parallel computing and virtualization technologies, defines the shape of a new technology. It is the core technology of the next generation of network computing platforms; in the field of education especially, cloud computing is the basic environment and platform of future E-learning. It provides secure data storage, convenient internet services and strong computing power. This article mainly focuses on research into the application of cloud computing in an E-learning environment. The research study shows that the cloud platform is valuable for both students and instructors in achieving the course objectives. The paper presents the nature, benefits and services of cloud computing as a platform for an e-learning environment.

  5. Electron cloud observations: a retrospective

    International Nuclear Information System (INIS)

    Harkay, K.

    2004-01-01

    A growing number of observations of electron cloud effects (ECEs) have been reported in positron and proton rings. Low-energy background electrons are ubiquitous in high-intensity particle accelerators. Amplification of the electron cloud (EC) can occur under certain operating conditions, potentially giving rise to numerous effects that can seriously degrade accelerator performance. EC observations and diagnostics have contributed to a better understanding of ECEs, in particular details of beam-induced multipacting and cloud saturation effects. Such experimental results can be used to provide realistic limits on key input parameters for modeling efforts and analytical calculations to improve prediction capability. Electron cloud effects are increasingly important phenomena in high-luminosity, high-brightness, or high-intensity machines: colliders, storage rings, damping rings, and heavy-ion beams. EC generation and instability modeling are increasingly complex and benchmarked against in situ data: the secondary emission yield δ and its low-energy value δ0, photon reflectivity, and secondary-electron energy distributions are important. Surface conditioning and the use of solenoidal windings in field-free regions are successful cures: will they be enough? What are the new observations, and how do they contribute to the body of work and to understanding the physics of the EC?

  6. A WSN based Environment and Parameter Monitoring System for Human Health Comfort: A Cloud Enabled Approach

    Directory of Open Access Journals (Sweden)

    Manohara Pai

    2014-05-01

    Full Text Available The number and type of sensors measuring physical and physiological parameters have seen a dramatic increase due to progress in MEMS and nanotechnology. Wireless Sensor Networks (WSNs) are in turn bringing new applications in environment monitoring and healthcare in order to improve the quality of service, especially in hospitals. The adequacy of WSNs for gathering critical information has provided a solution, but one with limited storage, computation and scalability. This limitation is addressed by integrating WSNs with cloud services. But once the data enters the cloud, the owner has no control over it; hence the confidentiality and integrity of the data being stored in the cloud are compromised. In this proposed work, a secure sensor-cloud architecture for applications in healthcare is implemented by integrating two different clouds. The sink node of the WSN outsources data into the cloud after performing operations to secure the data. Since the SaaS and IaaS environments of cloud computing are provided by two different cloud service providers (CSPs), neither CSP has complete information about the architecture. This provides inherent security, as data storage and data processing are done on different clouds.
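
    The architecture above relies on no single provider seeing the whole picture. A minimal sketch of one way a sink node could split a sensor reading across two clouds so that neither CSP alone learns anything (a simple XOR secret-sharing scheme; the reading format and variable names are invented for illustration, not taken from the paper):

```python
import os

def split_into_shares(data: bytes) -> tuple[bytes, bytes]:
    """Split data into two XOR shares; neither share alone reveals the data."""
    mask = os.urandom(len(data))                      # random pad of equal length
    share = bytes(a ^ b for a, b in zip(data, mask))  # data XOR pad
    return mask, share

def recombine(share_a: bytes, share_b: bytes) -> bytes:
    """XOR the two shares back together to recover the original reading."""
    return bytes(a ^ b for a, b in zip(share_a, share_b))

reading = b"temp=36.6C;pulse=72"                      # hypothetical sensor reading
for_storage_csp, for_processing_csp = split_into_shares(reading)
assert recombine(for_storage_csp, for_processing_csp) == reading
```

    Each share is indistinguishable from random noise on its own; only a party holding both (here, the data owner) can reconstruct the reading.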

  7. Atlas2 Cloud: a framework for personal genome analysis in the cloud.

    Science.gov (United States)

    Evani, Uday S; Challis, Danny; Yu, Jin; Jackson, Andrew R; Paithankar, Sameer; Bainbridge, Matthew N; Jakkamsetti, Adinarayana; Pham, Peter; Coarfa, Cristian; Milosavljevic, Aleksandar; Yu, Fuli

    2012-01-01

    Until recently, sequencing has primarily been carried out in large genome centers which have invested heavily in developing the computational infrastructure that enables genomic sequence analysis. The recent advancements in next-generation sequencing (NGS) have led to a wide dissemination of sequencing technologies and data to highly diverse research groups. It is expected that clinical sequencing will become part of diagnostic routines shortly. However, limited accessibility to computational infrastructure and high-quality bioinformatic tools, and the demand for personnel skilled in data analysis and interpretation, remain a serious bottleneck. To this end, cloud computing and Software-as-a-Service (SaaS) technologies can help address these issues. We successfully enabled the Atlas2 Cloud pipeline for personal genome analysis on two different cloud service platforms: a community cloud via the Genboree Workbench, and a commercial cloud via Amazon Web Services using the Software-as-a-Service model. We report a case study of personal genome analysis using our Atlas2 Genboree pipeline. We also outline a detailed cost structure for running Atlas2 Amazon on whole-exome capture data, providing cost projections in terms of storage, compute and I/O when running Atlas2 Amazon on a large data set. We find that providing a web interface and an optimized pipeline clearly facilitates the use of cloud computing for personal genome analysis, but for it to be routinely used for large-scale projects there needs to be a paradigm shift in the way we develop tools, in standard operating procedures, and in funding mechanisms.
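
    A cost projection of the kind the abstract describes is essentially a sum of per-resource charges. A hedged sketch of such an estimate, with purely illustrative rates (these are not real AWS prices, and the breakdown is not the paper's actual cost model):

```python
def monthly_cost(storage_gb: float, compute_hours: float, io_requests: int,
                 storage_rate: float = 0.10,   # $/GB-month, illustrative
                 compute_rate: float = 0.50,   # $/instance-hour, illustrative
                 io_rate: float = 0.01) -> float:  # $/1000 requests, illustrative
    """Estimate a monthly cloud bill as storage + compute + I/O components."""
    return (storage_gb * storage_rate
            + compute_hours * compute_rate
            + io_requests / 1000 * io_rate)

# e.g. 100 GB stored, 10 compute-hours, 2000 I/O requests -> ~$15.02/month
estimate = monthly_cost(100, 10, 2000)
```

    Separating the three terms makes it easy to see which resource dominates for a given workload, which is the point of the cost projections in the paper.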

  8. Migrating Educational Data and Services to Cloud Computing: Exploring Benefits and Challenges

    Science.gov (United States)

    Lahiri, Minakshi; Moseley, James L.

    2013-01-01

    "Cloud computing" is currently the "buzzword" in the Information Technology field. Cloud computing facilitates convenient access to information and software resources as well as easy storage and sharing of files and data, without the end users being aware of the details of the computing technology behind the process. This…

  9. Security and privacy preserving approaches in the eHealth clouds with disaster recovery plan.

    Science.gov (United States)

    Sahi, Aqeel; Lai, David; Li, Yan

    2016-11-01

    Cloud computing was introduced as an alternative storage and computing model in the health sector as well as other sectors to handle large amounts of data. Many healthcare companies have moved their electronic data to the cloud in order to reduce in-house storage, IT development and maintenance costs. However, storing the healthcare records in a third-party server may cause serious storage, security and privacy issues. Therefore, many approaches have been proposed to preserve security as well as privacy in cloud computing projects. Cryptographic-based approaches were presented as one of the best ways to ensure the security and privacy of healthcare data in the cloud. Nevertheless, the cryptographic-based approaches which are used to transfer health records safely remain vulnerable regarding security, privacy, or the lack of any disaster recovery strategy. In this paper, we review the related work on security and privacy preserving as well as disaster recovery in the eHealth cloud domain. Then we propose two approaches, the Security-Preserving approach and the Privacy-Preserving approach, and a disaster recovery plan. The Security-Preserving approach is a robust means of ensuring the security and integrity of Electronic Health Records, and the Privacy-Preserving approach is an efficient authentication approach which protects the privacy of Personal Health Records. Finally, we discuss how the integrated approaches and the disaster recovery plan can ensure the reliability and security of cloud projects. Copyright © 2016 Elsevier Ltd. All rights reserved.
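
    The integrity side of a security-preserving approach like the one described can be illustrated with a message authentication code: the client tags a record before upload and re-verifies it after download, so tampering in third-party storage is detectable. A minimal stdlib sketch (the key and record contents are hypothetical, and this is not the paper's actual protocol):

```python
import hashlib
import hmac

def protect(record: bytes, key: bytes) -> tuple[bytes, str]:
    """Attach an HMAC-SHA256 tag so later tampering can be detected."""
    tag = hmac.new(key, record, hashlib.sha256).hexdigest()
    return record, tag

def verify(record: bytes, tag: str, key: bytes) -> bool:
    """Recompute the tag and compare in constant time."""
    expected = hmac.new(key, record, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

key = b"shared-secret-key"                       # hypothetical key kept off-cloud
record, tag = protect(b"patient:123;bp:120/80", key)
assert verify(record, tag, key)                  # intact record passes
assert not verify(b"patient:123;bp:999/80", tag, key)  # tampering is detected
```

    A real Electronic Health Record scheme would combine this with encryption for confidentiality; the sketch isolates only the integrity check.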

  10. Benefits of cloud computing for PACS and archiving.

    Science.gov (United States)

    Koch, Patrick

    2012-01-01

    The goal of cloud-based services is to provide easy, scalable access to computing resources and IT services. The healthcare industry requires a private cloud that adheres to government mandates designed to ensure privacy and security of patient data while enabling access by authorized users. Cloud-based computing in the imaging market has evolved from a service that provided cost-effective disaster recovery for archived data to fully featured PACS and vendor-neutral archiving services that can address the needs of healthcare providers of all sizes. Healthcare providers worldwide are now using the cloud to distribute images to remote radiologists while supporting advanced reading tools, deliver radiology reports and imaging studies to referring physicians, and provide redundant data storage. Vendor-managed cloud services eliminate large capital investments in equipment and maintenance, as well as staffing for the data center, creating a reduction in total cost of ownership for the healthcare provider.

  11. A hazy outlook for cloud computing.

    Science.gov (United States)

    Perna, Gabriel

    2012-01-01

    Because of competing priorities as well as cost, security, and implementation concerns, cloud-based storage development has gotten off to a slow start in healthcare. CIOs, CTOs, and other healthcare IT leaders are adopting a variety of strategies in this area, based on their organizations' needs, resources, and priorities.

  12. BI-LEVEL AUTHENTICATION FOR EFFECTIVE DATA SHARING IN CLOUD VIA PRIVACY-PRESERVING AUTHENTICATION PROTOCOL

    OpenAIRE

    J. Jeya Praise; A. Sam Silva

    2017-01-01

    Cloud computing is an emerging technology of distributed computing where users can remotely store their data in cloud storage and enjoy on-demand cloud applications and services from a shared pool of configurable computing resources, without the burden of local infrastructure and maintenance. During data accessing, different users may share their data to achieve productive benefits. Storing the data in a third party's cloud system causes serious concern over data confidentiality. The e...

  13. The design of an m-Health monitoring system based on a cloud computing platform

    Science.gov (United States)

    Xu, Boyi; Xu, Lida; Cai, Hongming; Jiang, Lihong; Luo, Yang; Gu, Yizhi

    2017-01-01

    Compared to traditional medical services provided within hospitals, m-Health monitoring systems (MHMSs) face more challenges in personalised health data processing. To achieve personalised and high-quality health monitoring by means of new technologies, such as mobile network and cloud computing, in this paper, a framework of an m-Health monitoring system based on a cloud computing platform (Cloud-MHMS) is designed to implement pervasive health monitoring. Furthermore, the modules of the framework, which are Cloud Storage and Multiple Tenants Access Control Layer, Healthcare Data Annotation Layer, and Healthcare Data Analysis Layer, are discussed. In the data storage layer, a multiple tenant access method is designed to protect patient privacy. In the data annotation layer, linked open data are adopted to augment health data interoperability semantically. In the data analysis layer, the process mining algorithm and similarity calculating method are implemented to support personalised treatment plan selection. These three modules cooperate to implement the core functions in the process of health monitoring, which are data storage, data processing, and data analysis. Finally, we study the application of our architecture in the monitoring of antimicrobial drug usage to demonstrate the usability of our method in personal healthcare analysis.
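
    The multiple-tenant access method in the storage layer can be illustrated with a toy store in which every read and write is scoped to a tenant, so one hospital cannot read another's records. This is only a sketch of the general idea, not the paper's implementation; the class and tenant names are invented:

```python
class TenantStore:
    """Toy multi-tenant key-value store: reads never cross tenant boundaries."""

    def __init__(self):
        self._data: dict[str, dict[str, str]] = {}   # tenant -> {key: value}

    def put(self, tenant: str, key: str, value: str) -> None:
        self._data.setdefault(tenant, {})[key] = value

    def get(self, tenant: str, key: str):
        # Lookups are confined to the caller's own tenant partition.
        return self._data.get(tenant, {}).get(key)

store = TenantStore()
store.put("hospital-a", "patient-1", "hr=72")
assert store.get("hospital-a", "patient-1") == "hr=72"
assert store.get("hospital-b", "patient-1") is None   # no cross-tenant reads
```

    A production system would back this with authentication and per-tenant encryption keys; the sketch shows only the partitioning that protects patient privacy between tenants.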

  14. A Decision Matrix and Monitoring based Framework for Infrastructure Performance Enhancement in A Cloud based Environment

    OpenAIRE

    Alam, Mansaf; Shakil, Kashish Ara

    2014-01-01

    The cloud environment is very different from the traditional computing environment, and therefore tracking the performance of a cloud imposes additional requirements. The movement of data in the cloud is very fast; hence, the resources and infrastructure available at its disposal must be equally competent. Infrastructure-level performance in the cloud involves the performance of servers, network and storage, which act as the heart and soul driving the entire cloud business. Thus a constant improve...

  15. Towards a Cloud Computing Environment: Near Real-time Cloud Product Processing and Distribution for Next Generation Satellites

    Science.gov (United States)

    Nguyen, L.; Chee, T.; Minnis, P.; Palikonda, R.; Smith, W. L., Jr.; Spangenberg, D.

    2016-12-01

    The NASA LaRC Satellite ClOud and Radiative Property retrieval System (SatCORPS) processes and derives near real-time (NRT) global cloud products from operational geostationary satellite imager datasets. These products are being used in NRT to improve forecast models, aircraft icing warnings, and support for aircraft field campaigns. Next-generation satellites, such as the Japanese Himawari-8 and the upcoming NOAA GOES-R, present challenges for NRT data processing and product dissemination due to the increase in temporal and spatial resolution. The volume of data is expected to increase approximately tenfold. This increase in data volume will require additional IT resources to keep up with the processing demands to satisfy NRT requirements. In addition, these resources are not readily available due to cost and other technical limitations. To anticipate and meet these computing resource requirements, we have employed a hybrid cloud computing environment to augment the generation of SatCORPS products. This paper will describe the workflow to ingest, process, and distribute SatCORPS products and the technologies used. Lessons learned from working on both AWS Cloud and GovCloud will be discussed: benefits, similarities, and differences that could impact the decision to use cloud computing and storage. A detailed cost analysis will be presented. In addition, future cloud utilization, parallelization, and architecture layout will be discussed for GOES-R.

  16. Effects of Egg Incubation Methods on Locomotor Performances of Green Turtle (Chelonia mydas) Hatchlings

    International Nuclear Information System (INIS)

    Mohd Uzair Rusli; Joseph, J.; Hock-Chark, L.; Zainudin Bachol

    2015-01-01

    Effects of different incubation methods on the crawling and swimming ability of post-emergence green sea turtle (Chelonia mydas) hatchlings at the Cherating (Kuantan, Pahang) and Chagar Hutang (Pulau Redang, Terengganu) Turtle Sanctuaries were analysed during the 2009 nesting season. Mean crawling speeds of hatchlings incubated in a styrofoam box, in a beach hatchery and in situ were 0.042±0.008, 0.136±0.026 and 0.143±0.045 m/s, respectively. The crawling performance of hatchlings from the styrofoam box can be improved by keeping them for at least 48 h after their emergence. For swimming performance, all incubation methods showed significant differences in mean power-stroke rate during the early swimming effort, ranging from 93 to 114 strokes/min. However, no correlation was found between the morphological characteristics of hatchlings and swimming performance. The results from this study may give a different perspective on evaluating hatchling production, in terms of both hatchling morphological characteristics and locomotor performance. (author)

  17. Bio and health informatics meets cloud : BioVLab as an example.

    Science.gov (United States)

    Chae, Heejoon; Jung, Inuk; Lee, Hyungro; Marru, Suresh; Lee, Seong-Whan; Kim, Sun

    2013-01-01

    The exponential increase of genomic data brought by the advent of next- and third-generation sequencing (NGS) technologies and the dramatic drop in sequencing cost have turned biological and medical sciences into data-driven sciences. This revolutionary paradigm shift comes with challenges in terms of data transfer, storage, computation, and analysis of big bio/medical data. Cloud computing is a service model sharing a pool of configurable resources, which makes it a suitable workbench to address these challenges. From the medical or biological perspective, providing computing power and storage is the most attractive feature of cloud computing for handling the ever-increasing biological data. As data increases in size, many research organizations start to experience a lack of computing power, which becomes a major hurdle to achieving research goals. In this paper, we review the features of publicly available bio and health cloud systems in terms of graphical user interface, external data integration, security and extensibility of features. We then discuss the issues and limitations of current cloud systems and conclude with the suggestion of a biological cloud environment concept, which can be defined as a total workbench environment assembling computational tools and databases for analyzing bio/medical big data in particular application domains.

  18. Teaching, Learning, and Collaborating in the Cloud: Applications of Cloud Computing for Educators in Post-Secondary Institutions

    Science.gov (United States)

    Aaron, Lynn S.; Roche, Catherine M.

    2012-01-01

    "Cloud computing" refers to the use of computing resources on the Internet instead of on individual personal computers. The field is expanding and has significant potential value for educators. This is discussed with a focus on four main functions: file storage, file synchronization, document creation, and collaboration--each of which has…

  19. A survey on User’s security in cloud

    OpenAIRE

    Indal Singh; Rajesh Rai

    2014-01-01

    Cloud computing is a new wave in the field of information technology. Some see it as an emerging field in computer science. It consists of a set of resources and services offered through the Internet. Hence, “cloud computing” is also called “Internet computing.” The word “cloud” is a metaphor for describing the Web as a space where computing has been preinstalled and exists as a service. Operating systems, applications, storage, data, and processing capacity all exist on the W...

  20. Design Private Cloud of Oil and Gas SCADA System

    OpenAIRE

    Liu Miao; Mancang Yuan; Guodong Li

    2014-01-01

    A SCADA (Supervisory Control and Data Acquisition) system is a computer-based supervisory control system. SCADA systems are very important to oil and gas pipeline engineering. Cloud computing is fundamentally altering the expectations for how and when computing, storage and networking resources should be allocated, managed and consumed. In order to increase the resource utilization, reliability and availability of an oil and gas pipeline SCADA system, a SCADA system based on cloud computing is propos...

  1. A Novel Market-Oriented Dynamic Collaborative Cloud Service Platform

    Science.gov (United States)

    Hassan, Mohammad Mehedi; Huh, Eui-Nam

    In today's world the emerging Cloud computing (Weiss, 2007) offers a new computing model where resources such as computing power, storage, online applications and networking infrastructures can be shared as "services" over the internet. Cloud providers (CPs) are incentivized by the profits to be made by charging consumers for access to these services. Consumers, such as enterprises, are attracted by the opportunity to reduce or eliminate costs associated with "in-house" provision of these services.

  2. Policy and context management in dynamically provisioned access control service for virtualized Cloud infrastructures

    NARCIS (Netherlands)

    Ngo, C.; Membrey, P.; Demchenko, Y.; de Laat, C.

    2012-01-01

    Cloud computing is developing as a new wave of ICT technologies, offering a common approach to on-demand provisioning of computation, storage and network resources, which are generally referred to as infrastructure services. Most currently available commercial Cloud services are built and

  3. Electron-cloud simulation results for the PSR and SNS

    International Nuclear Information System (INIS)

    Pivi, M.; Furman, M.A.

    2002-01-01

    We present recent simulation results for the main features of the electron cloud in the storage ring of the Spallation Neutron Source (SNS) at Oak Ridge, and updated results for the Proton Storage Ring (PSR) at Los Alamos. In particular, a complete refined model of the secondary emission process, including the so-called true secondary, rediffused and backscattered electrons, has been included in the simulation code.

  4. DAФNE Operation with Electron-Cloud-Clearing Electrodes

    CERN Document Server

    Alesini, D; Gallo, A; Guiducci, S; Milardi, C; Stella, A; Zobov, Mikhail; De Santis, S; Demma, Theo; Raimondi, P

    2013-01-01

    The effects of an electron cloud (e-cloud) on beam dynamics are one of the major factors limiting the performance of high-intensity positron, proton, and ion storage rings. In the electron-positron collider DAΦNE, a horizontal beam instability due to the electron-cloud effect has been identified as one of the main limitations on the maximum stored positron beam current and as a source of beam quality deterioration. During the last machine shutdown, in order to mitigate this instability, special electrodes were inserted in all dipole and wiggler magnets of the positron ring. This has been the first installation of its kind anywhere in the world, since long metallic electrodes have been installed in all arcs of the collider's positron ring and are currently used during machine operation in collision. This has allowed a number of unprecedented measurements (e-cloud instability growth rates, transverse beam size variation, tune shifts along the bunch train) where the e-cloud contribution is clearly eviden...

  5. On Network Coded Distributed Storage

    DEFF Research Database (Denmark)

    Cabrera Guerrero, Juan Alberto; Roetter, Daniel Enrique Lucani; Fitzek, Frank Hanns Paul

    2016-01-01

    This paper focuses on distributed fog storage solutions, where a number of unreliable devices organize themselves in Peer-to-Peer (P2P) networks with the purpose to store reliably their data and that of other devices and/or local users and provide lower delay and higher throughput. Cloud storage systems typically rely on expensive infrastructure with centralized control to store, repair and access the data. This approach introduces a large delay for accessing and storing the data, driven in part by a high RTT between users and the cloud. These characteristics are at odds with the massive increase of devices and generated data in coming years as well as the requirements of low latency in many applications. We focus on characterizing optimal solutions for maintaining data availability when nodes in the fog continuously leave the network. In contrast with state-of-the-art data repair formulations, which...
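
    The repair problem addressed in the paper can be illustrated with the simplest possible code: a single XOR parity block that lets surviving nodes rebuild a block lost when a device leaves. Real network-coded storage uses much richer codes; this is only a toy instance with invented block contents:

```python
from functools import reduce

def xor_blocks(blocks):
    """Bytewise XOR of equal-length blocks (the simplest erasure code)."""
    return bytes(reduce(lambda x, y: x ^ y, column) for column in zip(*blocks))

data = [b"AAAA", b"BBBB", b"CCCC"]   # blocks held by three fog devices
parity = xor_blocks(data)            # parity block held by a fourth device

# The device holding data[1] leaves the network; rebuild its block
# from the two surviving data blocks plus the parity block.
rebuilt = xor_blocks([data[0], data[2], parity])
assert rebuilt == data[1]
```

    With a single parity block the system tolerates one departure; tolerating continuous churn, as the paper studies, requires codes that can regenerate redundancy from whatever nodes remain.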

  6. Cloud object store for checkpoints of high performance computing applications using decoupling middleware

    Science.gov (United States)

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2016-04-19

    Cloud object storage is enabled for checkpoints of high performance computing applications using a middleware process. A plurality of files, such as checkpoint files, generated by a plurality of processes in a parallel computing system are stored by obtaining said plurality of files from said parallel computing system; converting said plurality of files to objects using a log structured file system middleware process; and providing said objects for storage in a cloud object storage system. The plurality of processes may run, for example, on a plurality of compute nodes. The log structured file system middleware process may be embodied, for example, as a Parallel Log-Structured File System (PLFS). The log structured file system middleware process optionally executes on a burst buffer node.
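
    The middleware's core idea, concatenating many small per-process checkpoint files into an append-only log and cutting the log into cloud-sized objects, can be sketched as follows. This is a simplification for illustration, not the actual PLFS API; the file names and chunk size are invented:

```python
def files_to_objects(files: dict, chunk_size: int = 4):
    """Pack per-process checkpoint files into fixed-size cloud objects.

    Returns the objects plus an index mapping each file to its
    (offset, length) in the underlying log, so files can be read back.
    """
    log, index, offset = b"", {}, 0
    for name, data in files.items():
        index[name] = (offset, len(data))   # remember where each file landed
        log += data                          # append-only log of all data
        offset += len(data)
    objects = {f"obj-{i:04d}": log[i:i + chunk_size]
               for i in range(0, len(log), chunk_size)}
    return objects, index

# Two hypothetical checkpoint files from two MPI ranks:
objects, index = files_to_objects({"rank0.ckpt": b"abcd", "rank1.ckpt": b"efgh"})
```

    Reading a file back means consulting the index for its byte range and fetching only the objects that overlap it, which is what lets the middleware decouple the parallel file layout from the object store's interface.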

  7. Sleep Scheduling Schemes Based on Location of Mobile User in Sensor-Cloud

    OpenAIRE

    N. Mahendran; R. Priya

    2016-01-01

    Mobile cloud computing (MCC) with wireless sensor network (WSN) technology is attracting more attention from research scholars because it combines the data-gathering ability of sensors with the data-processing capacity of the cloud. This approach overcomes the limited data storage capacity and computational ability of sensor nodes. Finally, the stored data are sent to mobile users when a user sends a request. Most of the integrated sensor-cloud schemes fail to observe the following crite...

  8. Cloud Data Storage Federation for Scientific Applications

    NARCIS (Netherlands)

    Koulouzis, S.; Vasyunin, D.; Cushing, R.; Belloum, A.; Bubak, M.; an Mey, D.; Alexander, M.; Bientinesi, P.; Cannataro, M.; Clauss, C.; Costan, A.; Kecskemeti, G.; Morin, C.; Ricci, L.; Sahuquillo, J.; Schulz, M.; Scarano, V.; Scott, S.L.; Weidendorfer, J.

    2014-01-01

    Nowadays, data-intensive scientific research needs storage capabilities that enable efficient data sharing. This is of great importance for many scientific domains such as the Virtual Physiological Human. In this paper, we introduce a solution that federates a variety of systems ranging from file

  9. Reviews on Security Issues and Challenges in Cloud Computing

    Science.gov (United States)

    An, Y. Z.; Zaaba, Z. F.; Samsudin, N. F.

    2016-11-01

    Cloud computing is an Internet-based computing service provided by a third party, allowing the sharing of resources and data among devices. It is widely used in many organizations nowadays and is becoming more popular because it changes how the Information Technology (IT) of an organization is organized and managed. It provides many benefits, such as simplicity and lower costs, almost unlimited storage, minimal maintenance, easy utilization, backup and recovery, continuous availability, quality of service, automated software integration, scalability, flexibility and reliability, easy access to information, elasticity, quick deployment and a lower barrier to entry. While the use of cloud computing services increases in this new era, their security issues become a challenge. Cloud computing must be safe and secure enough to ensure the privacy of its users. This paper first outlines the architecture of cloud computing, then discusses the most common security issues of using the cloud and some solutions to them, since security is one of the most critical aspects of cloud computing due to the sensitivity of users' data.

  10. MUSE: An Efficient and Accurate Verifiable Privacy-Preserving Multikeyword Text Search over Encrypted Cloud Data

    OpenAIRE

    Xiangyang, Zhu; Hua, Dai; Xun, Yi; Geng, Yang; Xiao, Li

    2017-01-01

    With the development of cloud computing, services outsourcing in clouds has become a popular business model. However, due to the fact that data storage and computing are completely outsourced to the cloud service provider, sensitive data of data owners is exposed, which could bring serious privacy disclosure. In addition, some unexpected events, such as software bugs and hardware failure, could cause incomplete or incorrect results returned from clouds. In this paper, we propose an efficient ...

  11. A Cost-Effective Strategy for Storing Scientific Datasets with Multiple Service Providers in the Cloud

    OpenAIRE

    Yuan, Dong; Cui, Lizhen; Liu, Xiao; Fu, Erjiang; Yang, Yun

    2016-01-01

    Cloud computing provides scientists a platform that can deploy computation- and data-intensive applications without infrastructure investment. With excessive cloud resources and a decision support system, large generated data sets can be flexibly 1) stored locally in the current cloud, 2) deleted and regenerated whenever reused, or 3) transferred to a cheaper cloud service for storage. However, due to the pay-for-use model, the total application cost largely depends on the usage of computation, stor...

  12. Electron Cloud at Low Emittance in CesrTA

    CERN Document Server

    Palmer, Mark; Billing, Michael; Calvey, Joseph; Conolly, Christopher; Crittenden, James; Dobbins, John; Dugan, Gerald; Eggert, Nicholas; Fontes, Ernest; Forster, Michael; Gallagher, Richard; Gray, Steven; Greenwald, Shlomo; Hartill, Donald; Hopkins, Walter; Kreinick, David; Kreis, Benjamin; Leong, Zhidong; Li, Yulin; Liu, Xianghong; Livezey, Jesse; Lyndaker, Aaron; Makita, Junki; McDonald, Michael; Medjidzade, Valeri; Meller, Robert; O'Connell, Tim; Peck, Stuart; Peterson, Daniel; Ramirez, Gabriel; Rendina, Matthew; Revesz, Peter; Rider, Nate; Rice, David; Rubin, David; Sagan, David; Savino, James; Schwartz, Robert; Seeley, Robert; Sexton, James; Shanks, James; Sikora, John; Smith, Eric; Strohman, Charles; Williams, Heather; Antoniou, Fanouria; Calatroni, Sergio; Gasior, Marek; Jones, Owain Rhodri; Papaphilippou, Yannis; Pfingstner, Juergen; Rumolo, Giovanni; Schmickler, Hermann; Taborelli, Mauro; Asner, David; Boon, Laura; Garfinkel, Arthur; Byrd, John; Celata, Christine; Corlett, John; De Santis, Stefano; Furman, Miguel; Jackson, Alan; Kraft, Rick; Munson, Dawn; Penn, Gregory; Plate, David; Venturini, Marco; Carlson, Benjamin; Demma, Theo; Dowd, Rohan; Flanagan, John; Jain, Puneet; Kanazawa, Ken-ichi; Kubo, Kiyoshi; Ohmi, Kazuhito; Sakai, Hiroshi; Shibata, Kyo; Suetsugu, Yusuke; Tobiyama, Makoto; Gonnella, Daniel; Guo, Weiming; Harkay, Katherine; Holtzapple, Robert; Jones, James; Wolski, Andrzej; Kharakh, David; Ng, Johnny; Pivi, Mauro; Wang, Lanfa; Ross, Marc; Tan, Cheng-Yang; Zwaska, Robert; Schachter, Levi; Wilkinson, Eric

    2010-01-01

    The Cornell Electron Storage Ring (CESR) has been reconfigured as a test accelerator (CesrTA) for a program of electron cloud (EC) research at ultra low emittance. The instrumentation in the ring has been upgraded with local diagnostics for measurement of cloud density and with improved beam diagnostics for the characterization of both the low emittance performance and the beam dynamics of high intensity bunch trains interacting with the cloud. A range of EC mitigation methods have been deployed and tested and their effectiveness is discussed. Measurements of the electron cloud’s effect on the beam under a range of conditions are discussed along with the simulations being used to quantitatively understand these results

  13. Redundant VoD Streaming Service in a Private Cloud: Availability Modeling and Sensitivity Analysis

    OpenAIRE

    Rosangela Maria De Melo; Maria Clara Bezerra; Jamilson Dantas; Rubens Matos; Ivanildo José De Melo Filho; Paulo Maciel

    2014-01-01

    For several years cloud computing has been generating considerable debate and interest within IT corporations. Since cloud computing environments provide storage and processing systems that are adaptable, efficient, and straightforward, thereby enabling rapid infrastructure modifications to be made according to constantly varying workloads, organizations of every size and type are migrating to web-based cloud supported solutions. Due to the advantages of the pay-per-use ...

  14. CLOUD EDUCATIONAL RESOURCES FOR PHYSICS LEARNING RESEARCHES SUPPORT

    Directory of Open Access Journals (Sweden)

    Oleksandr V. Merzlykin

    2015-10-01

    Full Text Available A definition of cloud educational resource is given in the paper. Its program and information components are characterized. Virtualization is reviewed as the technological basis for transforming traditional electronic educational resources into cloud ones. The following levels of virtualization are described: data storage device virtualization (Data as a Service), hardware virtualization (Hardware as a Service), computer virtualization (Infrastructure as a Service), software system virtualization (Platform as a Service), «desktop» virtualization (Desktop as a Service), and software user interface virtualization (Software as a Service). Possibilities of designing a system of cloud educational resources to support physics learning research are shown, taking into account standards for learning-object metadata (accessed via the OAI-PMH protocol) and standards for learning-tool interoperability (LTI). An example of integrating cloud educational resources into the Moodle learning management system using OAI-PMH and LTI is given.
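
    Harvesting learning-object metadata over OAI-PMH comes down to issuing HTTP GET requests with a verb parameter. A minimal sketch of building such a request with the standard library (the repository URL is a placeholder, and no network call is made here):

```python
from urllib.parse import urlencode

def oai_pmh_request(base_url: str, verb: str = "ListRecords",
                    metadata_prefix: str = "oai_dc") -> str:
    """Build an OAI-PMH harvesting URL for the given repository endpoint."""
    query = urlencode({"verb": verb, "metadataPrefix": metadata_prefix})
    return f"{base_url}?{query}"

# Hypothetical repository endpoint; a real one would answer with XML
# records carrying Dublin Core metadata for each learning object.
url = oai_pmh_request("https://repository.example.org/oai")
```

    A learning management system such as Moodle would periodically issue requests like this to pull resource metadata, while LTI handles launching the resources themselves.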

  15. Towards extending IFC with point cloud data

    NARCIS (Netherlands)

    Krijnen, T.F.; Beetz, J.; Ochmann, S.; Vock, R.; Wessel, R.

    2015-01-01

    In this paper we suggest an extension to the Industry Foundation Classes model to integrate point cloud datasets. The proposal includes a schema extension to the core model allowing the storage of points either as Cartesian coordinates, points in parametric space of a surface associated with a

  16. Cloud computing: pushing the right managerial buttons

    NARCIS (Netherlands)

    S. Khanagha (Saeed)

    2015-01-01

    Despite the obvious organisational advantages offered by cloud computing, not all firms have adopted and adapted to the rapid changes that this new form of remote data storage represents. Recent research and practice have focused on the issue from the perspective of firms as a whole,

  17. Mobile cloud networking: mobile network, compute, and storage as one service on-demand

    NARCIS (Netherlands)

    Jamakovic, Almerima; Bohnert, Thomas Michael; Karagiannis, Georgios; Galis, A.; Gavras, A.

    2013-01-01

    The Future Communication Architecture for Mobile Cloud Services: Mobile Cloud Networking (MCN) is an EU FP7 Large-scale Integrating Project (IP) funded by the European Commission. The MCN project was launched in November 2012 for a period of 36 months. In total, 19 top-tier partners from industry and

  18. Cloud Computing: Key to IT Development in West Africa

    African Journals Online (AJOL)

    PROF. O. E. OSUAGWU

    2013-12-01

    Dec 1, 2013 ... Paper, explores the basic concepts of Cloud Computing and how the emerging technology could be harnessed to .... recovery of information than traditional system. Storage ... quickly meet business demand was an important ...

  19. Using cloud technologies to complement environmental information systems

    International Nuclear Information System (INIS)

    Schlachter, Thorsten; Duepmeier, Clemens; Weidemann, Rainer

    2013-01-01

    Cloud services can help to close the gap between available and published data by providing infrastructure, storage, services, or even whole applications. Within this paper we present some fundamental ideas on the use of cloud services for the construction of powerful services in order to toughen up environmental information systems for the needs of state-of-the-art web, portal, and mobile technologies. We include use cases for the provision of environmental information as well as for the collection of user-generated data. (orig.)

  20. Mobile healthcare information management utilizing Cloud Computing and Android OS.

    Science.gov (United States)

    Doukas, Charalampos; Pliakas, Thomas; Maglogiannis, Ilias

    2010-01-01

    Cloud Computing provides functionality for managing information data in a distributed, ubiquitous and pervasive manner supporting several platforms, systems and applications. This work presents the implementation of a mobile system that enables electronic healthcare data storage, update and retrieval using Cloud Computing. The mobile application is developed using Google's Android operating system and provides management of patient health records and medical images (supporting DICOM format and JPEG2000 coding). The developed system has been evaluated using the Amazon's S3 cloud service. This article summarizes the implementation details and presents initial results of the system in practice.

  1. Reliable IoT Storage: Minimizing Bandwidth Use in Storage Without Newcomer Nodes

    DEFF Research Database (Denmark)

    Zhao, Xiaobo; Lucani Rötter, Daniel Enrique; Shen, Xiaohong

    2018-01-01

    This letter characterizes the optimal policies for bandwidth use and storage for the problem of distributed storage in Internet of Things (IoT) scenarios, where lost nodes cannot be replaced by new nodes as is typically assumed in Data Center and Cloud scenarios. We develop an information flow model that captures the overall process of data transmission between IoT devices, from the initial preparation stage (generating redundancy from the original data) to the different repair stages with fewer and fewer devices. Our numerical results show that in a system with 10 nodes, the proposed optimal...

  2. Communicating Geography with the Cloud. GI_Forum|GI_Forum 2015 – Geospatial Minds for Society|

    OpenAIRE

    Silva, Durval; Donert, Karl

    2016-01-01

    Cloud computing is one of the hottest technological trends in education. According to MICROSOFT (2012), “With Cloud computing in education, you get powerful software and massive computing resources where and when you need them. […] Cloud services can be used to combine on-demand computing and storage, familiar experience with on-demand scalability and online services for anywhere, anytime access to powerful web-based tools.” This paper discusses the potentials that the cloud delivers for educ...

  3. D1.1 Research communities identification and definition report : Europeana Cloud: Unlocking Europe’s Research via The Cloud; WP1

    NARCIS (Netherlands)

    Waterman, MA Kees-Jan; Links, P.; Ekman, Stefan; Sjögren, Björn; Benardou, Agiatis

    Europeana Cloud’s main objectives are to provide new content, new metadata, a new linked storage system, new tools and services for researchers and a new platform, Europeana Research. For Europeana Cloud to achieve these objectives, it is essential that researchers’ needs are thoroughly understood.

  4. Secure and robust cloud computing for high-throughput forensic microsatellite sequence analysis and databasing.

    Science.gov (United States)

    Bailey, Sarah F; Scheible, Melissa K; Williams, Christopher; Silva, Deborah S B S; Hoggan, Marina; Eichman, Christopher; Faith, Seth A

    2017-11-01

    Next-generation Sequencing (NGS) is a rapidly evolving technology with demonstrated benefits for forensic genetic applications, and the strategies to analyze and manage the massive NGS datasets are currently in development. Here, the computing, data storage, connectivity, and security resources of the Cloud were evaluated as a model for forensic laboratory systems that produce NGS data. A complete front-to-end Cloud system was developed to upload, process, and interpret raw NGS data using a web browser dashboard. The system was extensible, demonstrating analysis capabilities of autosomal and Y-STRs from a variety of NGS instrumentation (Illumina MiniSeq and MiSeq, and Oxford Nanopore MinION). NGS data for STRs were concordant with standard reference materials previously characterized with capillary electrophoresis and Sanger sequencing. The computing power of the Cloud was implemented with on-demand auto-scaling to allow multiple file analysis in tandem. The system was designed to store resulting data in a relational database, amenable to downstream sample interpretations and databasing applications following the most recent guidelines in nomenclature for sequenced alleles. Lastly, a multi-layered Cloud security architecture was tested and showed that industry standards for securing data and computing resources were readily applied to the NGS system without disadvantageous effects for bioinformatic analysis, connectivity or data storage/retrieval. The results of this study demonstrate the feasibility of using Cloud-based systems for secured NGS data analysis, storage, databasing, and multi-user distributed connectivity. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Cloud archiving and data mining of High-Resolution Rapid Refresh forecast model output

    Science.gov (United States)

    Blaylock, Brian K.; Horel, John D.; Liston, Samuel T.

    2017-12-01

    Weather-related research often requires synthesizing vast amounts of data that need archival solutions that are both economical and viable during and past the lifetime of the project. Public cloud computing services (e.g., from Amazon, Microsoft, or Google) or private clouds managed by research institutions are providing object data storage systems potentially appropriate for long-term archives of such large geophysical data sets. We illustrate the use of a private cloud object store developed by the Center for High Performance Computing (CHPC) at the University of Utah. Since early 2015, we have been archiving thousands of two-dimensional gridded fields (each one containing over 1.9 million values over the contiguous United States) from the High-Resolution Rapid Refresh (HRRR) data assimilation and forecast modeling system. The archive is being used for retrospective analyses of meteorological conditions during high-impact weather events, assessing the accuracy of the HRRR forecasts, and providing initial and boundary conditions for research simulations. The archive is accessible interactively and through automated download procedures for researchers at other institutions that can be tailored by the user to extract individual two-dimensional grids from within the highly compressed files. Characteristics of the CHPC object storage system are summarized relative to network file system storage or tape storage solutions. The CHPC storage system is proving to be a scalable, reliable, extensible, affordable, and usable archive solution for our research.
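The archive's ability to "extract individual two-dimensional grids from within the highly compressed files" is commonly implemented for GRIB2 data with a sidecar index (.idx) file listing each field's starting byte, so a client can issue an HTTP Range request for just one grid. The sketch below assumes that wgrib2-style index format as an illustration; the CHPC archive's exact interface may differ.

```python
def parse_idx(idx_text):
    """Parse a wgrib2-style .idx sidecar: 'msg:offset:date:var:level:...'."""
    entries = []
    for line in idx_text.strip().splitlines():
        parts = line.split(":")
        entries.append({"offset": int(parts[1]), "var": parts[3], "level": parts[4]})
    return entries

def byte_range(entries, var, level):
    """Return (start, end) byte range of one field; end=None means read to EOF."""
    for i, e in enumerate(entries):
        if e["var"] == var and e["level"] == level:
            start = e["offset"]
            end = entries[i + 1]["offset"] - 1 if i + 1 < len(entries) else None
            return start, end
    raise KeyError(f"{var}/{level} not in index")

# Invented three-field index for illustration.
idx = """1:0:d=2017010100:TMP:2 m above ground:anl:
2:150000:d=2017010100:UGRD:10 m above ground:anl:
3:310000:d=2017010100:VGRD:10 m above ground:anl:"""
start, end = byte_range(parse_idx(idx), "UGRD", "10 m above ground")
# A client would then request the single grid with an HTTP header like
# {"Range": f"bytes={start}-{end}"} instead of downloading the whole file.
```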

  6. CLOUD TECHNOLOGY AS A WAY OF UKRAINIAN EDUCATION DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    T. Zaytseva

    2014-06-01

    This article is devoted to defining the forms and required components of cloud technology use in the training of subject teachers. To improve the learning process it is necessary to use such a powerful technology as 'cloud computing': it supports traditional forms of education and is also a new step in the development of education. Cloud technologies are an effective, efficient and flexible way to satisfy the needs of students acquiring new knowledge. A characteristic feature of our time is the rapid growth of cloud technology use; we are therefore witnessing the implementation of cloud technologies and services in higher and secondary education as well. A common information space in education is now being created, mostly using the cloud technologies provided by Microsoft and Google. Google Apps for Education contains free tools that allow teachers and students to communicate, teach and learn more effectively and efficiently. A significant advantage of using cloud services is that applications can be developed and large amounts of data stored on servers in distributed data-processing centers accessed via the Internet. That is why cloud technology is a powerful tool for activating students' self-guided work. Demand for professionals who know cloud computing technology will surely continue to grow.

  7. NASA Cloud-Based Climate Data Services

    Science.gov (United States)

    McInerney, M. A.; Schnase, J. L.; Duffy, D. Q.; Tamkin, G. S.; Strong, S.; Ripley, W. D., III; Thompson, J. H.; Gill, R.; Jasen, J. E.; Samowich, B.; Pobre, Z.; Salmon, E. M.; Rumney, G.; Schardt, T. D.

    2012-12-01

    Cloud-based scientific data services are becoming an important part of NASA's mission. Our technological response is built around the concept of specialized virtual climate data servers, repetitive cloud provisioning, image-based deployment and distribution, and virtualization-as-a-service (VaaS). A virtual climate data server (vCDS) is an Open Archival Information System (OAIS) compliant, iRODS-based data server designed to support a particular type of scientific data collection. iRODS is data grid middleware that provides policy-based control over collection-building, managing, querying, accessing, and preserving large scientific data sets. We have deployed vCDS Version 1.0 in the Amazon EC2 cloud using S3 object storage and are using the system to deliver a subset of NASA's Intergovernmental Panel on Climate Change (IPCC) data products to the latest CentOS federated version of the Earth System Grid Federation (ESGF), which is also running in the Amazon cloud. vCDS-managed objects are exposed to ESGF through FUSE (Filesystem in Userspace), which presents a POSIX-compliant filesystem abstraction to applications such as the ESGF server that require such an interface. A vCDS manages data as a distinguished collection for a person, project, lab, or other logical unit. A vCDS can manage a collection across multiple storage resources using rules and microservices to enforce collection policies. And a vCDS can federate with other vCDSs to manage multiple collections over multiple resources, thereby creating what can be thought of as an ecosystem of managed collections. With the vCDS approach, we are trying to enable the full information lifecycle management of scientific data collections and make tractable the task of providing diverse climate data services. In this presentation, we describe our approach, experiences, lessons learned, and plans for the future. (Figure captions: (A) vCDS/ESG system stack. (B) Conceptual architecture for NASA cloud-based data services.)

  8. A NEW CLOUD BASED SUPERVISORY CONTROL AND DATA ACQUISITION IMPLEMENTATION TO ENHANCE THE LEVEL OF SECURITY USING TESTBED

    OpenAIRE

    A. Shahzad; S. Musa; A. Aborujilah; M. Irfan

    2014-01-01

    Nowadays, cloud computing is an important and hot topic in the arena of information technology and computer systems. Several companies and educational institutes have deployed cloud infrastructures to gain benefits such as easy data access, software updates at minimal cost, large or unlimited storage, cost efficiency, backup storage and disaster recovery, and several other advantages compared with traditional network infrastructures. In this research paper, Supervisory Contro...

  9. A new community on the horizon: CS3 — Cloud Services for Synchronization and Sharing

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    CS3 (Cloud Services for Synchronization and Sharing) is a new and growing community, initiated at a kick-off event at CERN in 2014. Over the past few years CS3 workshops have become a point of excellence for cloud storage services, with four editions hosted across Europe: CERN, ETH Zurich, SURFSara Amsterdam and CYFRONET Krakow. The distinct mark of CS3 is that it grew bottom-up, without a central entity acting as a sponsor or providing top-down funding. At CS3, industry leaders such as Dropbox present alongside startups and established SMEs. CS3 also provides a discussion forum: what is the future of on-premise services deployed "in your lab" in the era of global transformation of the IT industry, with the advent of commercial ("public") cloud services on a massive scale? One of the strengths of the on-premise model for research institutions is the integration of cloud storage and file sharing services with other services available on site: Data Analysis Services, Cloud Infrastr...

  10. A Survey on Design and Implementation of Protected Searchable Data in the Cloud

    OpenAIRE

    Dowsley, R.; Michalas, A.; Nagel, M.; Paladi, N.

    2017-01-01

    While cloud computing has exploded in popularity in recent years thanks to the potential efficiency and cost savings of outsourcing the storage and management of data and applications, a number of vulnerabilities that led to multiple attacks have deterred many potential users. As a result, experts in the field argued that new mechanisms are needed in order to create trusted and secure cloud services. Such mechanisms would eradicate the suspicion of users towards cloud computing by providing t...

  11. Dynamic Delegation Approach Based on Access Control in the Electronic Evidence Storage of the Cloud Platform%基于云平台的电子证据存储中访问控制的动态委托方法

    Institute of Scientific and Technical Information of China (English)

    何月; 陈明

    2013-01-01

    This paper studies the necessity of delegation for electronic evidence storage on a dynamic cloud platform, showing that the trustworthiness of all resource sites is a prerequisite for secure electronic evidence storage computation on the cloud platform. Existing delegation classes and grid schemes are analyzed, and a fuzzy trust and delegation model (FTDM) for access control in cloud-platform electronic evidence storage is proposed, together with a two-stage fuzzy reasoning process over the trust relationships between electronic evidence storage entities, which determines the delegation class and the trust level of an authorized access. A CGRS resource selection algorithm based on K-means clustering of digital-forensics cloud platform resources is introduced and applied in a scalable digital forensics analysis platform used by practice departments. Analysis and experimental results verify the correctness and effectiveness of the resource selection algorithm for the analysis platform.
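The record's resource selection builds on K-means clustering of platform resources. As an illustration of that underlying primitive (not the paper's CGRS algorithm itself), a minimal 1-D K-means over hypothetical per-node load scores:

```python
import random

def kmeans_1d(values, k, iters=20, seed=0):
    """Plain Lloyd's algorithm on scalars: returns (centroids, labels)."""
    rng = random.Random(seed)
    centroids = rng.sample(values, k)
    labels = [0] * len(values)
    for _ in range(iters):
        # Assignment step: each value joins its nearest centroid.
        labels = [min(range(k), key=lambda j: abs(v - centroids[j])) for v in values]
        # Update step: centroid becomes the mean of its members (if any).
        for j in range(k):
            members = [v for v, lab in zip(values, labels) if lab == j]
            if members:
                centroids[j] = sum(members) / len(members)
    return centroids, labels

# Invented per-node load scores with two obvious groups.
loads = [0.1, 0.15, 0.12, 0.9, 0.95, 0.88]
centroids, labels = kmeans_1d(loads, k=2)
```

A resource selector could then prefer nodes from the low-load cluster.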

  12. Geometric data perturbation-based personal health record transactions in cloud computing.

    Science.gov (United States)

    Balasubramaniam, S; Kavitha, V

    2015-01-01

    Cloud computing is a new delivery model for information technology services and it typically involves the provision of dynamically scalable and often virtualized resources over the Internet. However, cloud computing raises concerns on how cloud service providers, user organizations, and governments should handle such information and interactions. Personal health records represent an emerging patient-centric model for health information exchange, and they are outsourced for storage by third parties, such as cloud providers. With these records, it is necessary for each patient to encrypt their own personal health data before uploading them to cloud servers. Current techniques for encryption primarily rely on conventional cryptographic approaches. However, key management issues remain largely unsolved with these cryptographic-based encryption techniques. We propose that personal health record transactions be managed using geometric data perturbation in cloud computing. In our proposed scheme, the personal health record database is perturbed using geometric data perturbation and outsourced to the Amazon EC2 cloud.
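Geometric data perturbation, as used in this record, transforms the data with a distance-preserving map (typically rotation plus translation, often with added noise) so that distance-based mining on the perturbed records stays meaningful while the original values are hidden. A minimal 2-D sketch of the rotation-plus-translation core, omitting the noise term; the sample records are invented:

```python
import math

def rotate_translate(points, theta, t):
    """Perturb 2-D records with a rotation by theta plus translation t.
    Rotations and translations preserve pairwise Euclidean distances,
    which is why clustering/mining results on the perturbed data remain valid."""
    c, s = math.cos(theta), math.sin(theta)
    tx, ty = t
    return [(c * x - s * y + tx, s * x + c * y + ty) for x, y in points]

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

records = [(1.0, 2.0), (3.0, 5.0), (-2.0, 0.5)]   # toy health-record features
perturbed = rotate_translate(records, theta=0.7, t=(10.0, -4.0))
```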

  15. Cloud Computing Boosts Business Intelligence of Telecommunication Industry

    Science.gov (United States)

    Xu, Meng; Gao, Dan; Deng, Chao; Luo, Zhiguo; Sun, Shaoling

    Business Intelligence has become an attractive topic in today's data-intensive applications, especially in the telecommunication industry. Meanwhile, Cloud Computing, which provides a supporting IT infrastructure with excellent scalability, large-scale storage, and high performance, has become an effective way to implement parallel data processing and data mining algorithms. BC-PDM (Big Cloud based Parallel Data Miner) is a new MapReduce-based parallel data mining platform developed by CMRI (China Mobile Research Institute) to meet the urgent requirements of business intelligence in the telecommunication industry. In this paper, the architecture, functionality and performance of BC-PDM are presented, together with an experimental evaluation and case studies of its applications. The evaluation results demonstrate both the usability and the cost-effectiveness of a Cloud Computing based Business Intelligence system in applications of the telecommunication industry.

  16. Secure Skyline Queries on Cloud Platform.

    Science.gov (United States)

    Liu, Jinfei; Yang, Juncheng; Xiong, Li; Pei, Jian

    2017-04-01

    Outsourcing data and computation to cloud server provides a cost-effective way to support large scale data storage and query processing. However, due to security and privacy concerns, sensitive data (e.g., medical records) need to be protected from the cloud server and other unauthorized users. One approach is to outsource encrypted data to the cloud server and have the cloud server perform query processing on the encrypted data only. It remains a challenging task to support various queries over encrypted data in a secure and efficient way such that the cloud server does not gain any knowledge about the data, query, and query result. In this paper, we study the problem of secure skyline queries over encrypted data. The skyline query is particularly important for multi-criteria decision making but also presents significant challenges due to its complex computations. We propose a fully secure skyline query protocol on data encrypted using semantically-secure encryption. As a key subroutine, we present a new secure dominance protocol, which can be also used as a building block for other queries. Finally, we provide both serial and parallelized implementations and empirically study the protocols in terms of efficiency and scalability under different parameter settings, verifying the feasibility of our proposed solutions.
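The skyline and dominance notions the protocol secures are simple on plaintext; the paper's contribution is evaluating them over encrypted data without revealing data, query, or result. For reference, a plaintext sketch of the two primitives (minimizing every dimension; the sample tuples are invented):

```python
def dominates(p, q):
    """p dominates q if p is <= q in every dimension and < in at least one
    (assuming smaller is better in all dimensions)."""
    return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

def skyline(points):
    """The skyline: points not dominated by any other point."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# e.g. (price, distance) tuples where both criteria should be minimized.
pts = [(2, 8), (4, 4), (8, 2), (6, 6), (9, 9)]
sky = skyline(pts)
```

The secure protocol in the record performs the same dominance comparisons as a subroutine, but on semantically-secure ciphertexts.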

  17. A New Trusted and Collaborative Agent Based Approach for Ensuring Cloud Security

    OpenAIRE

    Pal, Shantanu; Khatua, Sunirmal; Chaki, Nabendu; Sanyal, Sugata

    2011-01-01

    Determining the user's trust is a growing concern for ensuring privacy and security in a cloud computing environment. In the cloud, a user's data is stored in one or more remote server(s), which poses more security challenges for the system. One of the most important concerns is to protect users' sensitive information from other users and from hackers that may cause data leakage in cloud storage. With this security challenge in mind, this paper focuses on the development of a more secure clou...

  18. Development of a Survivable Cloud Multi-Robot Framework for Heterogeneous Environments

    Directory of Open Access Journals (Sweden)

    Isaac Osunmakinde

    2014-10-01

    Cloud robotics is a paradigm that allows robots to offload computationally intensive and data storage requirements into the cloud by providing a secure and customizable environment. The challenge for cloud robotics is the inherent problem of cloud disconnection. A major assumption made in the development of current cloud robotics frameworks is that the connection between the cloud and the robot is always available. However, for multi-robots working in heterogeneous environments, the connection between the cloud and the robots cannot always be guaranteed. This work addresses the challenge of disconnection in cloud robotics by proposing a survivable cloud multi-robotics (SCMR) framework for heterogeneous environments. The SCMR framework leverages the combination of a virtual ad hoc network formed by robot-to-robot communication and a physical cloud infrastructure formed by robot-to-cloud communications. The quality of service (QoS) of the SCMR framework was tested and validated by determining the optimal energy utilization and time of response (ToR) on drivability analysis with and without cloud connection. The resulting design trade-off is between the computation energy for on-robot execution and the offloading energy for cloud execution.

  19. SECURITY AND PRIVACY ISSUES IN CLOUD COMPUTING

    Directory of Open Access Journals (Sweden)

    Amina AIT OUAHMAN

    2014-10-01

    Today, cloud computing is defined and talked about across the ICT industry under different contexts and with different definitions attached to it. It is a new paradigm in the evolution of Information Technology, as it is one of the biggest revolutions in this field to have taken place in recent times. According to the National Institute of Standards and Technology (NIST), "cloud computing is a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction" [1]. The importance of Cloud Computing is increasing and it is receiving growing attention in the scientific and industrial communities. A study by Gartner [2] ranked Cloud Computing first among the top 10 most important technologies, with better prospects in successive years for companies and organizations. Clouds bring tremendous benefits for both individuals and enterprises: they support economic savings, outsourcing mechanisms, resource sharing, anywhere/anytime accessibility, on-demand scalability, and service flexibility. Clouds minimize the need for user involvement by masking technical details such as software upgrades, licenses, and maintenance from their customers. Clouds can also offer better security advantages over individual server deployments: since a cloud aggregates resources, cloud providers can charter expert security personnel, while typical companies may be limited to a network administrator who might not be well versed in cyber security issues. The new concepts introduced by the clouds, such as computation outsourcing, resource sharing, and external data warehousing, increase the security and privacy concerns and create new security challenges. Moreover, the large scale of the clouds, the proliferation of mobile access devices (e

  20. A State-of-the-Art Review of Cloud Forensics

    Directory of Open Access Journals (Sweden)

    Sameera Abdulrahman Almulla

    2014-12-01

    Cloud computing and digital forensics are emerging fields of technology. Unlike traditional digital forensics, where the target environment can be almost completely acquired, isolated and placed under the investigator's control, in cloud environments the distribution of computation and storage poses unique and complex challenges to investigators. Recently, the term "cloud forensics" has had an increasing presence in the field of digital forensics. In this state-of-the-art review, we include the most recent research efforts that used "cloud forensics" as a keyword and classify the literature into three dimensions: (1) survey-based, (2) technology-based, and (3) forensic-procedure-based. We discuss widely accepted international standards bodies and their efforts to cope with the current trend of cloud forensics. Our aim is not only to reference related work based on the discussed dimensions, but also to analyze them and generate a mind map that will help in identifying research gaps. Finally, we summarize existing digital forensics tools and the available simulation environments that can be used for evidence acquisition, examination and cloud forensics test purposes.

  1. Effects of glyphosate herbicide on the gastrointestinal microflora of Hawaiian green turtles (Chelonia mydas) Linnaeus.

    Science.gov (United States)

    Kittle, Ronald P; McDermid, Karla J; Muehlstein, Lisa; Balazs, George H

    2018-02-01

    In Hawaii, glyphosate-based herbicides frequently sprayed near shorelines may be affecting non-target marine species. Glyphosate inhibits aromatic amino acid biosynthesis (shikimate pathway), and is toxic to beneficial gut bacteria in cattle and chickens. Effects of glyphosate on gut bacteria in marine herbivorous turtles were assessed in vitro. When cultures of mixed bacterial communities from gastrointestinal tracts of freshly euthanized green turtles (Chelonia mydas) were exposed for 24 h to six glyphosate concentrations (plus a deionized water control), bacterial density was significantly lower at glyphosate concentrations ≥ 2.2×10⁻⁴ g L⁻¹ (absorbance measured at 600 nm wavelength). Using a modified Kirby-Bauer disk diffusion assay, the growth of four bacterial isolates (Pantoea, Proteus, Shigella, and Staphylococcus) was significantly inhibited by glyphosate concentrations ≥ 1.76×10⁻³ g L⁻¹. Reduced growth or lower survival of gut bacteria in green turtles exposed to glyphosate could have adverse effects on turtle digestion and overall health. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. An approximate dynamic programming approach to resource management in multi-cloud scenarios

    Science.gov (United States)

    Pietrabissa, Antonio; Priscoli, Francesco Delli; Di Giorgio, Alessandro; Giuseppi, Alessandro; Panfili, Martina; Suraci, Vincenzo

    2017-03-01

    The programmability and the virtualisation of network resources are crucial to deploy scalable Information and Communications Technology (ICT) services. The increasing demand for cloud services, mainly devoted to storage and computing, requires a new functional element, the Cloud Management Broker (CMB), aimed at managing multiple cloud resources to meet the customers' requirements and, simultaneously, to optimise their usage. This paper proposes a multi-cloud resource allocation algorithm that manages resource requests with the aim of maximising the CMB revenue over time. The algorithm is based on Markov decision process modelling and relies on reinforcement learning techniques to find an approximate solution online.
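As a deliberately tiny illustration of the reinforcement-learning approach the paper relies on, here is a tabular Q-learning sketch for a one-resource admission problem. The state space, the revenue of 1 per accepted request, and the 0.5 release probability are invented for the example and are not the paper's model:

```python
import random

def train(episodes=2000, capacity=3, seed=1):
    """Tabular Q-learning for a toy admission-control MDP.
    State = number of free resource units; actions: 0 = reject, 1 = accept.
    Accepting (when a unit is free) earns revenue 1 and occupies one unit;
    each step one busy unit may complete with probability 0.5.
    All dynamics and rewards here are illustrative only."""
    rng = random.Random(seed)
    Q = {(s, a): 0.0 for s in range(capacity + 1) for a in (0, 1)}
    alpha, gamma, eps = 0.1, 0.9, 0.1
    for _ in range(episodes):
        s = capacity
        for _ in range(20):                         # steps per episode
            if rng.random() < eps:                  # epsilon-greedy exploration
                a = rng.choice((0, 1))
            else:
                a = max((0, 1), key=lambda x: Q[(s, x)])
            if a == 1 and s > 0:
                r, s2 = 1.0, s - 1                  # admit: revenue, one unit busy
            else:
                r, s2 = 0.0, s                      # reject (or nothing free)
            if s2 < capacity and rng.random() < 0.5:
                s2 += 1                             # a busy unit frees up
            target = r + gamma * max(Q[(s2, 0)], Q[(s2, 1)])
            Q[(s, a)] += alpha * (target - Q[(s, a)])
            s = s2
    return Q

Q = train()
```

After training, the learned Q-values encode the admission policy: with free capacity, accepting a request should have higher estimated value than rejecting it.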

  3. Report of Enodiotrema megachondrus (Looss, 1899) Looss, 1901 (Digenea: Plagiorchiidae) in a green turtle Chelonia mydas Linnaeus, 1758 (Testudines, Cheloniidae) from Brazil

    Directory of Open Access Journals (Sweden)

    Werneck M. R.

    2016-12-01

    This paper describes the occurrence of Enodiotrema megachondrus (Looss, 1899) Looss, 1901 in a juvenile green sea turtle (Chelonia mydas Linnaeus, 1758) found on the coast of Brazil. This parasite has been described in Caretta caretta from Egypt, France, the Mediterranean Sea, the Madeira Archipelago, the Adriatic Sea and the USA; in C. mydas from Egypt and the USA; in Eretmochelys imbricata from Cuba; in Lepidochelys olivacea from Mexico and Costa Rica; and in Lepidochelys kempii from the USA. This note represents the first report of E. megachondrus in a green sea turtle in the South-West Atlantic Ocean.

  4. Towards Dynamic Remote Data Auditing in Computational Clouds

    Science.gov (United States)

    Khurram Khan, Muhammad; Anuar, Nor Badrul

    2014-01-01

    Cloud computing is a significant shift of computational paradigm in which computing as a utility and remote data storage have great potential. Enterprises and businesses are now more interested in outsourcing their data to the cloud to lessen the burden of local data storage and maintenance. However, the outsourced data and the computation outcomes are not continuously trustworthy, owing to the data owners' lack of control and physical possession. To address this issue, researchers have focused on designing remote data auditing (RDA) techniques. The majority of these techniques, however, apply only to static archive data and cannot audit dynamically updated outsourced data. We propose an effective RDA technique based on algebraic signature properties for cloud storage systems and also present a new data structure capable of efficiently supporting dynamic data operations such as append, insert, modify, and delete. Moreover, this data structure makes our method applicable to large-scale data with minimal computation cost. A comparative analysis with state-of-the-art RDA schemes shows that the proposed scheme is secure and highly efficient in terms of computation and communication overhead on the auditor and server. PMID:25121114
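
The auditing scheme above rests on algebraic signatures, whose defining property is that the signature of a sum of blocks equals the sum of the blocks' signatures, so an auditor can verify a combined server response without the raw data. A toy illustration of that homomorphism (real schemes operate on Galois-field symbols; integers modulo a prime are used here only to keep the sketch short):

```python
# Toy illustration of the algebraic-signature property the RDA scheme
# relies on: sig(a + b) == sig(a) + sig(b). The modulus and base element
# below are arbitrary choices for this sketch, not the paper's parameters.

P = 2**31 - 1          # prime modulus (assumption for the toy)
ALPHA = 7              # fixed public base element

def sig(block):
    """Signature = sum of byte_i * ALPHA**i  (mod P)."""
    return sum(b * pow(ALPHA, i, P) for i, b in enumerate(block)) % P

a = bytes([1, 2, 3, 4])
b = bytes([9, 8, 7, 6])
combined = bytes(x + y for x, y in zip(a, b))   # byte-wise sum (no overflow here)

# Homomorphism: signature of the combined block equals the sum of signatures.
print(sig(combined) == (sig(a) + sig(b)) % P)
```

Because the signature is linear in the block symbols, the equality holds for any pair of equal-length blocks whose byte-wise sums do not overflow.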

  5. Towards Dynamic Remote Data Auditing in Computational Clouds

    Directory of Open Access Journals (Sweden)

    Mehdi Sookhak

    2014-01-01

    Full Text Available Cloud computing is a significant shift of computational paradigm in which computing as a utility and remote data storage have great potential. Enterprises and businesses are now more interested in outsourcing their data to the cloud to lessen the burden of local data storage and maintenance. However, the outsourced data and the computation outcomes are not continuously trustworthy, owing to the data owners' lack of control and physical possession. To address this issue, researchers have focused on designing remote data auditing (RDA) techniques. The majority of these techniques, however, apply only to static archive data and cannot audit dynamically updated outsourced data. We propose an effective RDA technique based on algebraic signature properties for cloud storage systems and also present a new data structure capable of efficiently supporting dynamic data operations such as append, insert, modify, and delete. Moreover, this data structure makes our method applicable to large-scale data with minimal computation cost. A comparative analysis with state-of-the-art RDA schemes shows that the proposed scheme is secure and highly efficient in terms of computation and communication overhead on the auditor and server.

  6. Identification of the Rice Wines with Different Marked Ages by Electronic Nose Coupled with Smartphone and Cloud Storage Platform.

    Science.gov (United States)

    Wei, Zhebo; Xiao, Xize; Wang, Jun; Wang, Hui

    2017-10-31

    In this study, a portable electronic nose (E-nose) was self-developed to identify rice wines with different marked ages; all operations of the E-nose were controlled by a dedicated Smartphone application. The sensor array of the E-nose comprised 12 MOS sensors, and the obtained response values were transmitted to the Smartphone through a wireless communication module. Aliyun served as a cloud storage platform for the responses and identification models. The measurement of the E-nose consisted of the taste information obtained phase (TIOP) and the aftertaste information obtained phase (AIOP). The area feature data obtained from the TIOP and from the TIOP-AIOP were applied to identify rice wines using pattern recognition methods. Principal component analysis (PCA), locally linear embedding (LLE) and linear discriminant analysis (LDA) were applied for the classification of the wine samples. LDA based on the area feature data obtained from the TIOP-AIOP proved a powerful tool and showed the best classification results. Partial least-squares regression (PLSR) and support vector machine (SVM) were applied for predicting the marked ages, and SVM (R² = 0.9942) performed much better than PLSR.
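
The classification step described above maps 12-sensor feature vectors to wine-age classes. As a standard-library stand-in for the paper's PCA/LDA/SVM pipeline (which needs numeric libraries), a nearest-centroid classifier on synthetic 12-dimensional "sensor" vectors sketches the idea; the class means and noise level below are invented for illustration:

```python
import random

# Stand-in for the E-nose classification step: nearest-centroid
# classification of synthetic 12-dimensional sensor vectors. The three
# age classes and their mean responses are assumptions for this sketch.

random.seed(1)
AGES = {"3yr": [0.2] * 12, "5yr": [0.5] * 12, "8yr": [0.8] * 12}

def sample(mean):
    """One noisy 12-sensor reading around a class mean."""
    return [m + random.gauss(0, 0.05) for m in mean]

train = [(age, sample(mean)) for age, mean in AGES.items() for _ in range(20)]

def centroid(vectors):
    return [sum(col) / len(col) for col in zip(*vectors)]

centroids = {age: centroid([v for a, v in train if a == age]) for age in AGES}

def classify(x):
    dist = lambda c: sum((xi - ci) ** 2 for xi, ci in zip(x, c))
    return min(centroids, key=lambda age: dist(centroids[age]))

test_vec = sample(AGES["5yr"])
print(classify(test_vec))   # well-separated classes -> "5yr"
```

With class means this far apart relative to the noise, the toy classifier separates the three ages perfectly; the real study's LDA/SVM handle overlapping, correlated sensor responses.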

  7. Identification of the Rice Wines with Different Marked Ages by Electronic Nose Coupled with Smartphone and Cloud Storage Platform

    Directory of Open Access Journals (Sweden)

    Zhebo Wei

    2017-10-01

    Full Text Available In this study, a portable electronic nose (E-nose) was self-developed to identify rice wines with different marked ages; all operations of the E-nose were controlled by a dedicated Smartphone application. The sensor array of the E-nose comprised 12 MOS sensors, and the obtained response values were transmitted to the Smartphone through a wireless communication module. Aliyun served as a cloud storage platform for the responses and identification models. The measurement of the E-nose consisted of the taste information obtained phase (TIOP) and the aftertaste information obtained phase (AIOP). The area feature data obtained from the TIOP and from the TIOP-AIOP were applied to identify rice wines using pattern recognition methods. Principal component analysis (PCA), locally linear embedding (LLE) and linear discriminant analysis (LDA) were applied for the classification of the wine samples. LDA based on the area feature data obtained from the TIOP-AIOP proved a powerful tool and showed the best classification results. Partial least-squares regression (PLSR) and support vector machine (SVM) were applied for predicting the marked ages, and SVM (R² = 0.9942) performed much better than PLSR.

  8. Leveraging the Cloud for Robust and Efficient Lunar Image Processing

    Science.gov (United States)

    Chang, George; Malhotra, Shan; Wolgast, Paul

    2011-01-01

    The Lunar Mapping and Modeling Project (LMMP) is tasked to aggregate lunar data, from the Apollo era to the latest instruments on the LRO spacecraft, into a central repository accessible by scientists and the general public. A critical function of this task is to provide users with the best solution for browsing the vast amounts of imagery available. The image files LMMP manages range from a few gigabytes to hundreds of gigabytes in size, with new data arriving every day. Despite this ever-increasing amount of data, LMMP must make the data readily available in a timely manner for users to view and analyze. This is accomplished by tiling large images into smaller images using Hadoop, a distributed computing software platform implementation of the MapReduce framework, running on a small cluster of machines locally. Additionally, the software is implemented to use Amazon's Elastic Compute Cloud (EC2) facility. We also developed a hybrid solution to serve images to users by leveraging cloud storage using Amazon's Simple Storage Service (S3) for public data while keeping private information on our own data servers. By using cloud computing, we improve upon our local solution by reducing the need to manage our own hardware and computing infrastructure, thereby reducing costs. Further, by using a hybrid of local and cloud storage, we are able to provide data to our users more efficiently and securely. This paper examines the use of a distributed approach with Hadoop to tile images, an approach that provides significant improvements in image processing time, from hours to minutes. It describes the constraints imposed on the solution and the resulting techniques developed for the hybrid solution of a customized Hadoop infrastructure over local and cloud resources in managing this ever-growing data set, and examines the performance trade-offs of using the more plentiful resources of the cloud, such as those provided by S3, against the bandwidth limitations such use entails.
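
The core tiling idea behind the pipeline above can be sketched as a map-style operation that cuts a large raster into fixed-size, independently processable tiles. This is a hypothetical illustration, not LMMP's actual code; real jobs would read large geospatial images rather than nested lists:

```python
# Sketch of the tiling step: a raster is cut into fixed-size tiles, each
# keyed by (tile_row, tile_col), exactly the kind of independent unit a
# MapReduce job can emit and process in parallel. Tile size is assumed.

TILE = 2  # assumed tile edge length

def tile_image(raster, tile=TILE):
    """Yield ((tile_row, tile_col), tile_pixels) pairs, like map() output."""
    rows, cols = len(raster), len(raster[0])
    for r in range(0, rows, tile):
        for c in range(0, cols, tile):
            block = [row[c:c + tile] for row in raster[r:r + tile]]
            yield (r // tile, c // tile), block

raster = [[r * 4 + c for c in range(4)] for r in range(4)]  # 4x4 test image
tiles = dict(tile_image(raster))
print(len(tiles), tiles[(0, 0)])  # 4 tiles; top-left tile is [[0, 1], [4, 5]]
```

Because each tile depends only on its own pixels, the work distributes cleanly across a Hadoop cluster, which is what turns hours of serial processing into minutes.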

  9. Summary: Electron-cloud effects and fast-ion instability

    International Nuclear Information System (INIS)

    Furman, Miguel A.

    2000-01-01

    This is my summary of the talks on the electron-cloud effect and the fast-ion instability that were presented at the 8th ICFA Beam Dynamics Mini-Workshop on Two-Stream Instabilities in Particle Accelerators and Storage Rings, Santa Fe, NM, February 16-18, 2000

  10. Point-Cloud Compression for Vehicle-Based Mobile Mapping Systems Using Portable Network Graphics

    Science.gov (United States)

    Kohira, K.; Masuda, H.

    2017-09-01

    A mobile mapping system is effective for capturing dense point-clouds of roads and roadside objects. Point-clouds of urban areas, residential areas, and arterial roads are useful for maintenance of infrastructure, map creation, and automatic driving. However, the data size of point-clouds measured over large areas is enormous. A large storage capacity is required to store such point-clouds, and heavy loads are placed on the network if point-clouds are transferred across it. Therefore, it is desirable to reduce the data size of point-clouds without deterioration of quality. In this research, we propose a novel point-cloud compression method for vehicle-based mobile mapping systems. In our compression method, point-clouds are mapped onto 2D pixels using GPS time and the parameters of the laser scanner. Then the images are encoded in the Portable Network Graphics (PNG) format and compressed using the PNG algorithm. In our experiments, our method could efficiently compress point-clouds without deteriorating the quality.
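
The mapping-then-compress idea above can be sketched with standard-library pieces. Here scan-line and point indices stand in for GPS time and scanner angle, and zlib stands in for the PNG encoder (PNG's lossless compression is DEFLATE, which zlib implements); the point data is synthetic:

```python
import struct
import zlib

# Sketch: arrange scanner points on a 2D grid keyed by (scan_line, index),
# serialize the range values row by row, and compress losslessly. The
# synthetic ranges repeat across scan lines, so compression is strong.

points = [(t, i, 100.0 + 0.01 * i) for t in range(100) for i in range(100)]

grid = {}
for t, i, rng in points:
    grid[(t, i)] = rng          # 2D "image" of range values

raw = b"".join(struct.pack("<f", grid[(t, i)])
               for t in range(100) for i in range(100))
compressed = zlib.compress(raw, 9)

print(len(raw), len(compressed), len(compressed) < len(raw))
```

Ordering points by scan geometry makes neighbouring pixels similar, which is precisely why an image codec compresses the cloud far better than the unordered point list would.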

  11. POINT-CLOUD COMPRESSION FOR VEHICLE-BASED MOBILE MAPPING SYSTEMS USING PORTABLE NETWORK GRAPHICS

    Directory of Open Access Journals (Sweden)

    K. Kohira

    2017-09-01

    Full Text Available A mobile mapping system is effective for capturing dense point-clouds of roads and roadside objects. Point-clouds of urban areas, residential areas, and arterial roads are useful for maintenance of infrastructure, map creation, and automatic driving. However, the data size of point-clouds measured over large areas is enormous. A large storage capacity is required to store such point-clouds, and heavy loads are placed on the network if point-clouds are transferred across it. Therefore, it is desirable to reduce the data size of point-clouds without deterioration of quality. In this research, we propose a novel point-cloud compression method for vehicle-based mobile mapping systems. In our compression method, point-clouds are mapped onto 2D pixels using GPS time and the parameters of the laser scanner. Then the images are encoded in the Portable Network Graphics (PNG) format and compressed using the PNG algorithm. In our experiments, our method could efficiently compress point-clouds without deteriorating the quality.

  12. DICOM relay over the cloud.

    Science.gov (United States)

    Silva, Luís A Bastião; Costa, Carlos; Oliveira, José Luis

    2013-05-01

    Healthcare institutions worldwide have adopted the picture archiving and communication system (PACS) for enterprise access to images, relying on Digital Imaging and Communications in Medicine (DICOM) standards for data exchange. However, communication over a wider domain of independent medical institutions is not well standardized. A DICOM-compliant bridge was developed for extending and sharing DICOM services across healthcare institutions without requiring complex network setups or dedicated communication channels. A set of DICOM routers interconnected through a public cloud infrastructure was implemented to support medical image exchange among institutions. Despite the advantages of cloud computing, new challenges were encountered regarding data privacy, particularly when medical data are transmitted over different domains. To address this issue, a solution was introduced that creates a ciphered data channel between the entities sharing DICOM services. Two main DICOM services were implemented in the bridge: Storage and Query/Retrieve. The performance measures demonstrated that it is quite simple to exchange information and processes between several institutions. The solution can be integrated with any currently installed PACS-DICOM infrastructure, and the method works transparently with well-known cloud service providers. Cloud computing was introduced to augment enterprise PACS by providing standard medical imaging services across different institutions, offering communication privacy and enabling the creation of wider PACS scenarios with suitable technical solutions.

  13. The Risk of Polychlorinated Biphenyls Facilitating Tumors in Hawaiian Green Sea Turtles (Chelonia mydas)

    Directory of Open Access Journals (Sweden)

    Muting Yan

    2018-06-01

    Full Text Available The Hawaiian green turtle (Chelonia mydas) has been on the list of threatened species protected under the U.S. Endangered Species Act since 1978, in large part due to a severe tumor-forming disease named fibropapillomatosis. Chemical pollution is a prime suspect threatening the survival of C. mydas. In this study, PCB concentrations were determined in 43 C. mydas plasma samples archived on Tern Island. The total PCB concentration in male C. mydas (mean 1.10 ng/mL) was more than twice that of females (mean 0.43 ng/mL). The relationship between straight carapace length and PCB concentration in females was also studied and found to be negative. To investigate possible correlations between PCBs and tumor status, we measured the PCB concentration in turtles with no tumors and with moderate or severe tumor affliction. The PCB concentrations of the two afflicted groups were much higher than those of the healthy group, suggesting that PCBs may play a role in fibropapillomatosis in the Hawaiian green turtle.

  14. OpenStack Object Storage (Swift) essentials

    CERN Document Server

    Kapadia, Amar; Varma, Sreedhar

    2015-01-01

    If you are an IT administrator and you want to enter the world of cloud storage using OpenStack Swift, then this book is ideal for you. Basic knowledge of Linux and server technology is beneficial to get the most out of the book.

  15. Theory and measurement of the electron cloud effect

    CERN Document Server

    Harkay, K C

    1999-01-01

    Photoelectrons produced through the interaction of synchrotron radiation and the vacuum chamber walls can be accelerated by a charged particle beam, acquiring sufficient energy to produce secondary electrons (SEs) in collisions with the walls. If the secondary-electron yield (SEY) coefficient of the wall material is greater than one, a runaway condition can develop. In addition to the SEY, the degree of amplification depends on the beam intensity and temporal distribution. As the electron cloud builds up along a train of stored bunches, a transverse perturbation of the head bunch can be communicated to trailing bunches in a wakefield-like interaction with the cloud. The electron cloud effect is especially of concern for the high-intensity PEP-II (SLAC) and KEK B-factories and at the Large Hadron Collider (LHC) at CERN. An initiative was undertaken at the Advanced Photon Source (APS) storage ring to characterize the electron cloud in order to provide realistic limits on critical input parameters in the models ...

  16. Investigation into Cloud Computing for More Robust Automated Bulk Image Geoprocessing

    Science.gov (United States)

    Brown, Richard B.; Smoot, James C.; Underwood, Lauren; Armstrong, C. Duane

    2012-01-01

    Geospatial resource assessments frequently require timely geospatial data processing that involves large multivariate remote sensing data sets. In particular, for disasters, response requires rapid access to large data volumes, substantial storage space and high performance processing capability. The processing and distribution of this data into usable information products requires a processing pipeline that can efficiently manage the required storage, computing utilities, and data handling requirements. In recent years, with the availability of cloud computing technology, cloud processing platforms have made available a powerful new computing infrastructure resource that can meet this need. To assess the utility of this resource, this project investigates cloud computing platforms for bulk, automated geoprocessing capabilities with respect to data handling and application development requirements. This presentation is of work being conducted by Applied Sciences Program Office at NASA-Stennis Space Center. A prototypical set of image manipulation and transformation processes that incorporate sample Unmanned Airborne System data were developed to create value-added products and tested for implementation on the "cloud". This project outlines the steps involved in creating and testing of open source software developed process code on a local prototype platform, and then transitioning this code with associated environment requirements into an analogous, but memory and processor enhanced cloud platform. A data processing cloud was used to store both standard digital camera panchromatic and multi-band image data, which were subsequently subjected to standard image processing functions such as NDVI (Normalized Difference Vegetation Index), NDMI (Normalized Difference Moisture Index), band stacking, reprojection, and other similar type data processes. Cloud infrastructure service providers were evaluated by taking these locally tested processing functions, and then

  17. Cloud computing development in Armenia

    Directory of Open Access Journals (Sweden)

    Vazgen Ghazaryan

    2014-10-01

    Full Text Available Purpose – The purpose of the research is to clarify the benefits and risks, with regard to data protection and cost, that businesses can gain from the use of these new technologies for the implementation and management of an organization's information systems. Design/methodology/approach – Qualitative case study of the results obtained via interviews. Three research questions were raised: Q1: How can a company benefit from using Cloud Computing compared to other solutions?; Q2: What are possible issues that occur with Cloud Computing?; Q3: How would Cloud Computing change an organization's IT infrastructure? Findings – The calculations provided in the interview section prove the financial advantages, even though the precise degree of flexibility and performance has not been assessed. Cloud Computing offers great scalability. Another benefit that Cloud Computing offers, in addition to better performance and flexibility, is reliable and simple backup data storage, physically distributed and so almost invulnerable to damage. Although the advantages of Cloud Computing more than compensate for the difficulties associated with it, the latter must be carefully considered. Since the cloud architecture is relatively new, so far the best guarantee against all the risks it entails, from a single company's perspective, is a well-formulated service-level agreement in which the terms of service and the shared responsibility and security roles between the client and the provider are defined. Research limitations/implications – The study was carried out on the basis of two companies, which gives a deeper view, but for more widely applicable results a wider analysis is necessary. Practical implications/Originality/Value – The novelty of the research lies in the fact that existing approaches to this problem mainly focus on the technical side of computing. Research type: case study

  18. Provider-Independent Use of the Cloud

    Science.gov (United States)

    Harmer, Terence; Wright, Peter; Cunningham, Christina; Perrott, Ron

    Utility computing offers researchers and businesses the potential of significant cost-savings, making it possible for them to match the cost of their computing and storage to their demand for such resources. A utility compute provider enables the purchase of compute infrastructures on demand; when a user requires computing resources, a provider will provision a resource for them and charge them only for their period of use of that resource. There has been significant growth in the number of cloud computing resource providers, and each has a different resource usage model, application process and application programming interface (API); developing generic multi-provider applications is thus difficult and time consuming. We have developed an abstraction layer that provides a single resource usage model, user authentication model and API for compute providers, enabling cloud-provider-neutral applications to be developed. In this paper we outline the issues in using external resource providers, give examples of using a number of the most popular cloud providers and provide examples of developing provider-neutral applications. In addition, we discuss the development of the API to create a generic provisioning model based on a common architecture for cloud computing providers.
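
The abstraction-layer idea above, one provisioning interface with a thin adapter per provider, can be sketched as follows. The class and method names are invented for illustration (the real layer also unifies authentication and usage models), and the adapter is a stub rather than a call to any provider's API:

```python
from abc import ABC, abstractmethod

# Sketch: application code depends only on the abstract interface, so it
# stays provider-neutral; each provider gets a small adapter class.

class CloudProvider(ABC):
    @abstractmethod
    def provision(self, cpus: int, mem_gb: int) -> str:
        """Acquire a resource; return an opaque resource id."""

    @abstractmethod
    def release(self, resource_id: str) -> None:
        """Release a resource; billing stops here."""

class FakeEC2(CloudProvider):
    """Stand-in adapter; a real one would call the provider's API."""
    def provision(self, cpus, mem_gb):
        return f"ec2-{cpus}x{mem_gb}"
    def release(self, resource_id):
        pass

def run_job(provider: CloudProvider):
    """Provider-neutral application code: provision, use, always release."""
    rid = provider.provision(cpus=2, mem_gb=4)
    try:
        return f"ran on {rid}"
    finally:
        provider.release(rid)

print(run_job(FakeEC2()))
```

Swapping providers then means writing one new adapter, not touching the application; the try/finally mirrors the pay-per-use model, where forgetting to release a resource keeps the meter running.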

  19. Helminth fauna of Chelonia mydas (Linnaeus, 1758) in the south of Espírito Santo State in Brazil

    Directory of Open Access Journals (Sweden)

    Binoti E.

    2016-06-01

    Full Text Available Owing to inadequate knowledge about threats to sea turtles, we aimed to evaluate the helminth fauna of Chelonia mydas specimens that had died on the southern coast of Espírito Santo, Brazil, and to describe the associated pathological tissue lesions. Retrospective and prospective studies on turtle parasites were conducted and tissue samples were collected. 106 of 212 sea turtles (50 %) were parasitized, and 44.3 % (47/106) of the infected animals were in poor health condition. Seven trematode families encompassing 19 different helminth species were identified. Turtles harbored one or more species of parasites, and there was no significant association between parasitism and weakness of the animals. Trematode eggs, with or without giant cells, were observed in tissues of various organs.

  20. Selected heavy metals and selenium in the blood of black sea turtle (Chelonia mydas agassizii) from Sonora, Mexico.

    Science.gov (United States)

    Ley-Quiñónez, C P; Zavala-Norzagaray, A A; Réndon-Maldonado, J G; Espinosa-Carreón, T L; Canizales-Román, A; Escobedo-Urías, D C; Leal-Acosta, M L; Hart, C E; Aguirre, A A

    2013-12-01

    The concentrations of heavy metals (Zn, Cd, Ni, Cu, Mn) and selenium (Se) were analyzed in blood collected from 12 black turtles (Chelonia mydas agassizii) captured in Canal del Infiernillo, Punta Chueca, Mexico. The most abundant elements were Zn (63.58 μg g(-1)) and Se (7.66 μg g(-1)), and Cd was the lowest (0.99 μg g(-1)). The order of trace metal concentrations was Zn > Se > Cu > Mn > Ni > Cd. In conclusion, this information is important as a baseline for the use of blood in tissue analysis of heavy metals; however, these levels could represent recent exposure in the foraging grounds of black turtles in the Sea of Cortez.

  1. Effective ASCII-HEX steganography for secure cloud

    International Nuclear Information System (INIS)

    Afghan, S.

    2015-01-01

    Cloud computing is popular for many reasons, some of the most important being backup and recovery, cost effectiveness, nearly limitless storage, automatic software integration, easy access to information and many more. A pay-as-you-go model is followed to provide everything as a service. Data is secured using the standard security policies available at the cloud end. In spite of its many benefits, as mentioned above, cloud computing also has some security issues. The provider as well as the customer has to provide and collect data in a secure manner. Both of these issues, plus the efficient transmission of data over the cloud, are critical and need to be resolved. Sensitive data that will be processed or stored by the customer needs security while it travels over the network. Security for the customer's data at the provider end can be provided by using current security algorithms, which are not known by the customer. There is a reliability problem due to the existence of multiple boundaries in cloud resource access. ASCII and HEX security with steganography is used to propose an algorithm that stores the encrypted data/cipher text in an image file, which is then sent to the cloud end. This is done by using the CDM (Common Deployment Model). In future work, an algorithm should be proposed and implemented for the security of virtual images in cloud computing. (author)
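
The hiding step described above can be sketched in a few lines. This is a toy least-significant-bit embedding of hex-encoded text into an "image" held as a bytearray; the encryption step is stubbed out with a placeholder string, and the cover data is synthetic, so none of this is the paper's actual algorithm:

```python
# Toy sketch of ASCII/HEX steganography: hex-encoded cipher text is
# hidden in the least-significant bits of pixel values, then recovered.
# The cipher text and cover "image" below are stand-ins for real data.

def embed(pixels, message):
    """Write each bit of the message into one pixel's LSB."""
    bits = "".join(f"{b:08b}" for b in message.encode())
    out = bytearray(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | int(bit)   # overwrite LSB only
    return out

def extract(pixels, length):
    """Read length characters back out of the pixel LSBs."""
    bits = "".join(str(p & 1) for p in pixels[:length * 8])
    data = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
    return data.decode()

cipher_hex = "deadbeef"                  # stand-in for real cipher text
cover = bytearray(range(256)) * 2       # stand-in cover image (512 "pixels")
stego = embed(cover, cipher_hex)
print(extract(stego, len(cipher_hex)))
```

Because only the lowest bit of each pixel changes, the cover image is visually indistinguishable from the stego image, which is what lets the cipher text travel to the cloud unnoticed.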

  2. Eucalyptus: an open-source cloud computing infrastructure

    International Nuclear Information System (INIS)

    Nurmi, Daniel; Wolski, Rich; Grzegorczyk, Chris; Obertelli, Graziano; Soman, Sunil; Youseff, Lamia; Zagorodnov, Dmitrii

    2009-01-01

    Utility computing, elastic computing, and cloud computing are all terms that refer to the concept of dynamically provisioning processing time and storage space from a ubiquitous 'cloud' of computational resources. Such systems allow users to acquire and release the resources on demand and provide ready access to data from processing elements, while relegating the physical location and exact parameters of the resources. Over the past few years, such systems have become increasingly popular, but nearly all current cloud computing offerings are either proprietary or depend upon software infrastructure that is invisible to the research community. In this work, we present Eucalyptus, an open-source software implementation of cloud computing that utilizes compute resources that are typically available to researchers, such as clusters and workstation farms. In order to foster community research exploration of cloud computing systems, the design of Eucalyptus emphasizes modularity, allowing researchers to experiment with their own security, scalability, scheduling, and interface implementations. In this paper, we outline the design of Eucalyptus, describe our own implementations of the modular system components, and provide results from experiments that measure performance and scalability of a Eucalyptus installation currently deployed for public use. The main contribution of our work is the presentation of the first research-oriented open-source cloud computing system focused on enabling methodical investigations into the programming, administration, and deployment of systems exploring this novel distributed computing model.

  3. Big Data in the Cloud - Processing and Performance

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    **Anthony F. Voellm** is currently leading the Google Cloud Performance Team and has a wide range of experience from kernel and database engines to graphics and automated image and map extraction from satellite images. Anthony is an avid inventor with 7 technology patents issued. In his current role at Google Anthony is focused on delivering Price Performance to existing products like Google Compute Engine and Google Cloud Storage while also innovating new offerings. Anthony holds a Master of Science from George Washington University, BA in Physics and a BS in Computer Science and Mathematics from the University of Vermont.

  4. A Secure and Effective Anonymous Integrity Checking Protocol for Data Storage in Multicloud

    Directory of Open Access Journals (Sweden)

    Lingwei Song

    2015-01-01

    Full Text Available How to verify the integrity of outsourced data is an important problem in cloud storage. Most previous work focuses on three aspects: providing data dynamics, public verifiability, and privacy against verifiers with the help of a third-party auditor. In this paper, we propose an identity-based data storage and integrity verification protocol for untrusted clouds. The proposed protocol can guarantee fair results without any third-party verifying auditor. The theoretical analysis and simulation results show that our protocols are secure and efficient.

  5. An enhanced technique for mobile cloudlet offloading with reduced computation using compression in the cloud

    Science.gov (United States)

    Moro, A. C.; Nadesh, R. K.

    2017-11-01

    The cloud computing paradigm has transformed the way we do business in today's world. Cloud services have come a long way from providing just basic storage or software on demand. One of the fastest growing factors in this is mobile cloud computing. With the option of offloading now available, mobile users can offload entire applications onto cloudlets. Given the problems regarding availability and the limited storage capacity of these mobile cloudlets, it becomes difficult for the mobile user to decide when to use local memory and when to use a cloudlet. Hence, we examine a fast algorithm that decides whether the mobile user should use a cloudlet or rely on local memory, based on an offloading probability. We have partially implemented the algorithm, which decides whether a task can be carried out locally or given to a cloudlet; but since performing the complete computation is a burden on the mobile device, in this paper we look to offload it to a cloud. Furthermore, we use a file compression technique before sending the file to the cloud to further reduce the load.
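
The decision rule described above can be sketched as a threshold test on an offloading probability. The inputs (cloudlet availability, free local storage, task size), weights, and threshold below are invented for illustration; the paper's actual probability model is not specified in the abstract:

```python
# Sketch of the local-vs-cloudlet decision: compute an offloading
# probability from assumed factors and offload once it crosses a
# threshold. All weights here are illustrative, not the paper's.

def offload_probability(cloudlet_up, local_free_mb, task_mb):
    if not cloudlet_up:
        return 0.0                                    # cloudlet unavailable
    pressure = min(task_mb / max(local_free_mb, 1), 1.0)  # 1.0 = won't fit
    return 0.3 + 0.7 * pressure                       # assumed weighting

def decide(cloudlet_up, local_free_mb, task_mb, threshold=0.6):
    p = offload_probability(cloudlet_up, local_free_mb, task_mb)
    return "cloudlet" if p >= threshold else "local"

print(decide(True, local_free_mb=50, task_mb=45))   # tight on space
print(decide(True, local_free_mb=500, task_mb=45))  # plenty of room
print(decide(False, local_free_mb=10, task_mb=45))  # cloudlet down
```

A rule this cheap matters because the decision itself runs on the battery-constrained device; anything heavier is exactly the computation the paper proposes pushing to the cloud.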

  6. Translational Biomedical Informatics in the Cloud: Present and Future

    Directory of Open Access Journals (Sweden)

    Jiajia Chen

    2013-01-01

    Full Text Available Next generation sequencing and other high-throughput experimental techniques of recent decades have driven the exponential growth in publicly available molecular and clinical data. This information explosion has prepared the ground for the development of translational bioinformatics. The scale and dimensionality of data, however, pose obvious challenges in data mining, storage, and integration. In this paper we demonstrated the utility and promise of cloud computing for tackling the big data problems. We also outline our vision that cloud computing could be an enabling tool to facilitate translational bioinformatics research.

  7. Space Science Cloud: a Virtual Space Science Research Platform Based on Cloud Model

    Science.gov (United States)

    Hu, Xiaoyan; Tong, Jizhou; Zou, Ziming

    Through independent and cooperative science missions, the Strategic Pioneer Program (SPP) on Space Science, the new space science initiative in China approved by CAS and implemented by the National Space Science Center (NSSC), is dedicated to seeking new discoveries and breakthroughs in space science, thereby deepening our understanding of the universe and planet Earth. Within the framework of this program, in order to support the operations of space science missions and satisfy the demands of related e-Science research activities, NSSC is developing a virtual space science research platform based on the cloud model, namely the Space Science Cloud (SSC). To support mission demonstration, SSC integrates an interactive satellite orbit design tool, a satellite structure and payload layout design tool, a payload observation coverage analysis tool, etc., to help scientists analyze and verify space science mission designs. Another important function of SSC is supporting mission operations, which run through the space satellite data pipelines: mission operators can acquire and process observation data, then distribute the data products to other systems or publish the data and archives through SSC services. In addition, SSC provides useful data, tools and models for space researchers. Several databases in the field of space science are integrated, and an efficient retrieval system is being developed. Common tools for data visualization, deep processing (e.g., smoothing and filtering tools), analysis (e.g., an FFT analysis tool and a minimum variance analysis tool) and mining (e.g., a proton event correlation analysis tool) are also integrated to help researchers make better use of the data. The space weather models on SSC include a magnetic storm forecast model, a multi-station middle and upper atmospheric climate model, a solar energetic particle propagation model and so on. All the services mentioned above are based on the e-Science infrastructures of CAS, e.g. cloud storage and

  8. Data distribution method of workflow in the cloud environment

    Science.gov (United States)

    Wang, Yong; Wu, Junjuan; Wang, Ying

    2017-08-01

    Cloud computing provides workflow applications with the required high-efficiency computation and large storage capacity, but it also challenges the protection of trade secrets and other private data. Because privacy constraints increase data transmission time, this paper presents a new data allocation algorithm, based on the degree of collaborative data damage, that improves the existing allocation strategy in which security depends on keeping sensitive data in the private cloud while public cloud computation handles the rest. In the initial stage, a static allocation method partitions only the non-confidential data to the public cloud; in the operational phase, as new data are continuously generated, the data distribution scheme is adjusted dynamically. The experimental results show that the improved method is effective in reducing the data transmission time.
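    The static phase the abstract sketches - confidential data pinned to the private cloud, everything else eligible for the public cloud - can be shown in a few lines (a toy illustration with invented names; the paper's actual algorithm also weighs the collaborative damage degree, which is not modelled here):

```python
def static_allocation(items):
    """items: iterable of (name, is_confidential) pairs.
    Confidential items stay in the private cloud; the rest
    may be placed in the public cloud (static phase only)."""
    placement = {}
    for name, confidential in items:
        placement[name] = "private" if confidential else "public"
    return placement

datasets = [("orders", False), ("trade_secrets", True), ("logs", False)]
print(static_allocation(datasets))
# → {'orders': 'public', 'trade_secrets': 'private', 'logs': 'public'}
```

In the paper's dynamic phase, newly generated data would trigger re-running such a placement decision with updated inputs.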

  9. Security Enhancement for Data Migration in the Cloud

    Directory of Open Access Journals (Sweden)

    Jean Raphael Ngnie Sighom

    2017-06-01

    Full Text Available In today’s society, cloud computing has significantly impacted nearly every section of our lives and business structures. Cloud computing is, without any doubt, one of the strategic directions for many companies and the dominant infrastructure for enterprises as well as end users. Instead of buying IT equipment (hardware and/or software) and managing it themselves, many organizations today prefer to buy services from IT service providers. The number of service providers has increased dramatically, and the cloud is becoming the tool of choice for more cloud storage services. However, as more personal information and data are moved to the cloud, into social media sites, DropBox, Baidu WangPan, etc., data security and privacy issues are questioned. Daily, academia and industry seek an efficient way to secure data migration in the cloud. Various solution approaches and encryption techniques have been implemented. In this work, we discuss some of these approaches and evaluate the popular ones in order to find the elements that affect system performance. Finally, we propose a model that enhances data security and privacy by combining Advanced Encryption Standard-256, Information Dispersal Algorithms and Secure Hash Algorithm-512. Our protocol achieves provable security assessments and fast execution times for medium thresholds.
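    The proposed combination (AES-256 + Information Dispersal Algorithms + SHA-512) can be miniaturised with Python's standard library. Below, a toy (n, n) XOR split stands in for a real Information Dispersal Algorithm, and the AES-256 encryption step is omitted because the standard library does not provide it; only the SHA-512 integrity tag matches the paper's actual primitive:

```python
import hashlib
import secrets

def sha512_tag(data: bytes) -> str:
    # SHA-512 integrity tag, checked after reconstruction
    return hashlib.sha512(data).hexdigest()

def disperse(data: bytes, n: int = 3):
    # Toy (n, n) XOR splitting: n-1 random shares plus one share that
    # XORs to the data. All n shares are required to reconstruct.
    # A real IDA (e.g. Rabin's) tolerates missing shares; this does not.
    shares = [secrets.token_bytes(len(data)) for _ in range(n - 1)]
    last = data
    for s in shares:
        last = bytes(a ^ b for a, b in zip(last, s))
    return shares + [last]

def reconstruct(shares):
    out = shares[0]
    for s in shares[1:]:
        out = bytes(a ^ b for a, b in zip(out, s))
    return out

blob = b"patient record 42"
tag = sha512_tag(blob)          # stored alongside the shares
parts = disperse(blob)          # each share goes to a different store
restored = reconstruct(parts)
assert restored == blob and sha512_tag(restored) == tag
```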

  10. Development of the Lock Protocol for DEPSKY Storage System

    Directory of Open Access Journals (Sweden)

    NASCIMENTO, P. S.

    2015-06-01

    Full Text Available Data management in environments based on several clouds (cloud-of-clouds) should be dependable and secure. DEPSKY can assure these characteristics through mechanisms such as cryptography and data replication; however, DEPSKY does not support concurrent writing, a desirable functionality for many applications. This paper presents the development and a performance analysis of a lock algorithm for the DEPSKY storage system, along with validation and performance tests of the algorithm. The protocol allows concurrent writing through a low-contention lock mechanism that uses lock files to define who is allowed to write to a data unit.
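    The lock-file idea - a writer may modify a data unit only after atomically creating its lock file - can be sketched as follows (a single-machine toy, not DEPSKY's cloud-of-clouds protocol):

```python
import os
import tempfile

class FileLock:
    """Toy lock based on atomic lock-file creation. O_CREAT|O_EXCL
    guarantees exactly one writer can create the file, which is the
    essence of deciding who may write to a data unit."""

    def __init__(self, path):
        self.path = path

    def acquire(self) -> bool:
        try:
            fd = os.open(self.path, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
        except FileExistsError:   # another writer holds the lock
            return False
        os.close(fd)
        return True

    def release(self):
        os.remove(self.path)

lock_path = os.path.join(tempfile.mkdtemp(), "data-unit.lock")
writer_a, writer_b = FileLock(lock_path), FileLock(lock_path)
assert writer_a.acquire()          # first writer gets the lock
assert not writer_b.acquire()      # concurrent writer is rejected
writer_a.release()
assert writer_b.acquire()          # succeeds once the lock is free
```

DEPSKY's actual protocol must additionally replicate the lock file across clouds and tolerate faulty providers, which this sketch ignores.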

  11. Inferring Large-Scale Terrestrial Water Storage Through GRACE and GPS Data Fusion in Cloud Computing Environments

    Science.gov (United States)

    Rude, C. M.; Li, J. D.; Gowanlock, M.; Herring, T.; Pankratius, V.

    2016-12-01

    Surface subsidence due to depletion of groundwater can lead to permanent compaction of aquifers and damaged infrastructure. However, studies of such effects on a large scale are challenging and compute intensive because they involve fusing a variety of data sets beyond direct measurements from groundwater wells, such as gravity change measurements from the Gravity Recovery and Climate Experiment (GRACE) or surface displacements measured by GPS receivers. Our work therefore leverages Amazon cloud computing to enable these types of analyses spanning the entire continental US. Changes in groundwater storage are inferred from surface displacements measured by GPS receivers stationed throughout the country. Receivers located on bedrock are anti-correlated with changes in water levels from elastic deformation due to loading, while stations on aquifers correlate with groundwater changes due to poroelastic expansion and compaction. Correlating linearly detrended equivalent water thickness measurements from GRACE with linearly detrended and Kalman filtered vertical displacements of GPS stations located throughout the United States helps compensate for the spatial and temporal limitations of GRACE. Our results show that the majority of GPS stations are negatively correlated with GRACE in a statistically relevant way, as most GPS stations are located on bedrock in order to provide stable reference locations and measure geophysical processes such as tectonic deformations. Additionally, stations located on the Central Valley California aquifer show statistically significant positive correlations. Through the identification of positive and negative correlations, deformation phenomena can be classified as loading or poroelastic expansion due to changes in groundwater. This method facilitates further studies of terrestrial water storage on a global scale. This work is supported by NASA AIST-NNX15AG84G (PI: V. Pankratius) and Amazon.
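    The core statistical step - linearly detrending both time series and then correlating them - can be sketched in plain Python (the series below are synthetic stand-ins, not GRACE or GPS data, and the Kalman filtering step is omitted):

```python
def detrend(series):
    # Remove the least-squares linear trend from an evenly sampled series
    n = len(series)
    mx = (n - 1) / 2
    my = sum(series) / n
    sxx = sum((x - mx) ** 2 for x in range(n))
    sxy = sum((x - mx) * (y - my) for x, y in enumerate(series))
    slope = sxy / sxx
    return [y - (my + slope * (x - mx)) for x, y in enumerate(series)]

def pearson(a, b):
    # Pearson correlation coefficient of two equal-length series
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

# Synthetic seasonal water-storage signal on top of opposite linear trends:
season = [0.0, 1.0, 0.0, -1.0] * 3
grace = [0.5 * t + s for t, s in enumerate(season)]   # equivalent water thickness
gps = [-0.2 * t - s for t, s in enumerate(season)]    # vertical displacement
r = pearson(detrend(grace), detrend(gps))
assert r < -0.99   # anti-correlated after detrending, as for bedrock stations
```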

  12. Universal Keyword Classifier on Public Key Based Encrypted Multikeyword Fuzzy Search in Public Cloud.

    Science.gov (United States)

    Munisamy, Shyamala Devi; Chokkalingam, Arun

    2015-01-01

    Cloud computing has pioneered the emerging world by manifesting itself as a service through the internet, facilitating third-party infrastructure and applications. While customers have no visibility into how their data is stored on the service provider's premises, the cloud offers great benefits in lowering infrastructure costs and delivering more flexibility and simplicity in managing private data. The opportunity to use cloud services on a pay-per-use basis provides comfort for private data owners in managing costs and data. With the pervasive usage of the internet, the focus has now shifted towards effective data utilization on the cloud without compromising security. In the pursuit of increasing data utilization on public cloud storage, the key is to enable effective data access through fuzzy searching techniques. In this paper, we discuss the existing fuzzy searching techniques and focus on reducing the searching time on the cloud storage server for effective data utilization. Our proposed Asymmetric Classifier Multikeyword Fuzzy Search method provides a classifier search server that creates a universal keyword classifier for multiple-keyword requests, which greatly reduces the searching time by learning the search path pattern for all the keywords in the fuzzy keyword set. The objective of using a BTree fuzzy searchable index is to resolve typos and representation inconsistencies and also to facilitate effective data utilization.
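    One common building block of such schemes, a wildcard-based fuzzy keyword set covering edit distance 1, can be sketched as follows (illustrative; the paper's classifier and BTree index are not reproduced here):

```python
def fuzzy_set(word):
    """Wildcard-based fuzzy keyword set covering edit distance 1 -- a
    standard construction in fuzzy searchable-index schemes."""
    variants = {word}
    for i in range(len(word) + 1):
        variants.add(word[:i] + "*" + word[i:])          # wildcard inserted at i
        if i < len(word):
            variants.add(word[:i] + "*" + word[i + 1:])  # wildcard replaces char i
    return variants

def fuzzy_match(query, keyword):
    # A typo within edit distance 1 makes the two wildcard sets intersect,
    # so the server can match without seeing the exact keyword.
    return bool(fuzzy_set(query) & fuzzy_set(keyword))

assert fuzzy_match("cloud", "clouds")      # one-character typo still matches
assert not fuzzy_match("cloud", "grid")    # unrelated keyword does not
```

In a searchable-encryption setting, each wildcard variant would be indexed under a keyed token rather than in the clear.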

  13. Point cloud processing for smart systems

    Directory of Open Access Journals (Sweden)

    Jaromír Landa

    2013-01-01

    Full Text Available High population as well as economic tension emphasises the necessity of effective city management - from land use planning to urban green maintenance. The effectiveness of this management is based on precise knowledge of the city environment. Point clouds generated by mobile and terrestrial laser scanners provide precise data about objects in the scanner vicinity. From these data the state of the roads, buildings, trees and other objects important for the decision-making process can be obtained. Generally, they can support the idea of “smart” or at least “smarter” cities. Unfortunately, the point clouds do not provide this type of information automatically; it has to be extracted. This extraction is done by expert personnel or by object recognition software. As point clouds can represent large areas (streets or even cities), using expert personnel to identify the required objects can be very time-consuming and therefore cost-ineffective. Object recognition software allows us to detect and identify required objects semi-automatically or automatically. The first part of the article reviews and analyses current state-of-the-art point cloud object recognition techniques. The following part presents common formats used for point cloud storage and frequently used software tools for point cloud processing. Further, a method for extraction of geospatial information about detected objects is proposed. The method can therefore be used not only to recognize the existence and shape of certain objects, but also to retrieve their geospatial properties. These objects can later be used directly in various GIS systems for further analyses.
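    The geospatial-property extraction the article proposes can be illustrated in miniature: given the points of a detected object, compute properties such as its centroid and bounding box (function and data names are invented for illustration, not taken from the article):

```python
def geospatial_props(points):
    """points: list of (x, y, z) tuples belonging to one detected object.
    Returns the centroid and the axis-aligned bounding box -- the kind of
    geospatial properties that can be handed on to a GIS."""
    n = len(points)
    centroid = tuple(sum(p[i] for p in points) / n for i in range(3))
    bbox_min = tuple(min(p[i] for p in points) for i in range(3))
    bbox_max = tuple(max(p[i] for p in points) for i in range(3))
    return centroid, (bbox_min, bbox_max)

tree = [(1.0, 2.0, 0.0), (3.0, 2.0, 0.0), (2.0, 2.0, 6.0)]
centroid, bbox = geospatial_props(tree)
assert centroid == (2.0, 2.0, 2.0)
assert bbox == ((1.0, 2.0, 0.0), (3.0, 2.0, 6.0))
```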

  14. Electron cloud diagnostics in use at the Los Alamos PSR

    International Nuclear Information System (INIS)

    Macek, R. J.; Browman, A.; Borden, M.; Fitzgerald, D.; Wang, T. S.; Zaugg, T.; Harkay, K.; Rosenberg, R.

    2003-01-01

    A variety of electron cloud diagnostics have been deployed at the Los Alamos Proton Storage Ring (PSR) to detect, measure, and characterize the electron cloud generated in this high intensity, long bunch accumulator ring. These include a version of the ANL-developed retarding field analyzers (RFA) augmented with LANL-developed electronics, a variant of the RFA denoted as the electron sweeping diagnostic (ESD), biased collection plates, and gas pulse measuring devices. The designs and experience with the performance and applicability to PSR are discussed

  15. The direction of cloud computing for Malaysian education sector in 21st century

    Science.gov (United States)

    Jaafar, Jazurainifariza; Rahman, M. Nordin A.; Kadir, M. Fadzil A.; Shamsudin, Syadiah Nor; Saany, Syarilla Iryani A.

    2017-08-01

    In the 21st century, technology has turned the learning environment into a new form of education, making learning systems more effective and systematic. Nowadays, education institutions face many challenges in ensuring that the teaching and learning process runs smoothly and remains manageable. Some of the challenges in current education management are the lack of integrated systems, high maintenance costs, difficulty of configuration and deployment, and the complexity of storage provision. Digital learning is an instructional practice that uses technology to make the learning experience more effective and the education process more systematic and attractive. Digital learning can be considered one of the prominent applications implemented under a cloud computing environment. Cloud computing is a type of networked resource that provides on-demand services, where users can access its applications from any location and without time restrictions. It also promises to minimize maintenance costs and provides flexible data storage capacity. The aim of this article is to review the definition and types of cloud computing for improving digital learning management as required in 21st century education. The analysis of the digital learning context focuses on primary schools in Malaysia. Types of cloud applications and services in the education sector are also discussed. Finally, a gap analysis and directions for cloud computing in the education sector to face the challenges of the 21st century are suggested.

  16. Cloud Infrastructure & Applications - CloudIA

    Science.gov (United States)

    Sulistio, Anthony; Reich, Christoph; Doelitzscher, Frank

    The idea behind Cloud Computing is to deliver Infrastructure-as-a-Service and Software-as-a-Service over the Internet on an easy pay-per-use business model. To harness the potential of Cloud Computing for e-Learning and research purposes, and for small- and medium-sized enterprises, the Hochschule Furtwangen University has established a new project called Cloud Infrastructure & Applications (CloudIA). The CloudIA project is a market-oriented cloud infrastructure that leverages different virtualization technologies by supporting Service-Level Agreements for various service offerings. This paper describes the CloudIA project in detail and presents our early experiences in building a private cloud using an existing infrastructure.

  17. Electron Cloud Effect in the Linear Colliders

    International Nuclear Information System (INIS)

    Pivi, M

    2004-01-01

    Beam-induced multipacting, driven by the electric field of successive positively charged bunches, may arise from a resonant motion of electrons, generated by secondary emission, bouncing back and forth between opposite walls of the vacuum chamber. The electron-cloud effect (ECE) has been observed or is expected at many storage rings [1]. In the beam pipe of the Damping Ring (DR) of a linear collider, an electron cloud is produced initially by ionization of the residual gas and by photoelectrons from the synchrotron radiation. The cloud is then sustained by secondary electron emission. This electron cloud can reach equilibrium after the passage of only a few bunches. The electron-cloud effect may be responsible for collective effects such as fast coupled-bunch and single-bunch instabilities, emittance blow-up or incoherent tune shift when the bunch current exceeds a certain threshold, accompanied by a large number of electrons in the vacuum chamber. The ECE was identified as one of the most important R and D topics in the International Linear Collider Report [2]. Systematic studies of the possible electron-cloud effect have been initiated at SLAC for the GLC/NLC and TESLA linear colliders, with particular attention to the effect in the positron main damping ring (MDR) and the positron Low Emittance Transport, which includes the bunch compressor system (BCS), the main linac, and the beam delivery system (BDS). We present recent computer simulation results for the main features of electron-cloud generation in both machine designs, and single- and coupled-bunch instability thresholds are estimated for the GLC/NLC design.

  18. Caught in the Cloud: Privacy, Encryption, and Government Back Doors in the Web 2.0 Era

    OpenAIRE

    Soghoian, Christopher

    2017-01-01

    Over the last few years, consumers, corporations and governments have rushed to move their data to “the cloud,” adopting web-based applications and storage solutions provided by companies that include Amazon, Google, Microsoft and Yahoo. Unfortunately the shift to cloud computing needlessly exposes users to privacy invasion and fraud by hackers. Cloud based services also leave end users vulnerable to significant invasions of privacy by the government, resulting in the evisceration of traditio...

  19. Cloud Based Educational Systems And Its Challenges And Opportunities And Issues

    Directory of Open Access Journals (Sweden)

    Prantosh Kr. PAUL

    2014-01-01

    Full Text Available Cloud Computing (CC) is a set of hardware, software, networks, storage, services and interfaces that combine to deliver aspects of computing as a service. Cloud Computing (CC) uses central remote servers to maintain data and applications. Practically, Cloud Computing (CC) is an extension of grid computing with independence and smarter tools and technological gradients. Healthy Cloud Computing helps in sharing software, hardware, applications and other packages with the help of internet tools and wireless media. Cloud Computing has benefits in several fields and application domains such as Agriculture, Business and Commerce, Health Care, Hospitality and Tourism, and the Education and Training sector, among others. In education systems, it may be applicable to general regular education and to other education systems, including general and vocational training. This paper talks about the opportunities that Cloud Computing (CC) provides; however, the focus is on the challenges and issues in relation to Education, Education Systems and Training programmes.

  20. Strategies for exploiting independent cloud implementations of biometric experts in multibiometric scenarios

    OpenAIRE

    Peer, Peter; Emeršič, Žiga; Bule, Jernej; Žganec Gros, Jerneja; Štruc, Vitomir

    2015-01-01

    Cloud computing represents one of the fastest growing areas of technology and offers a new computing model for various applications and services. This model is particularly interesting for the area of biometric recognition, where scalability, processing power and storage requirements are becoming a bigger and bigger issue with each new generation of recognition technology. Next to the availability of computing resources, another important aspect of cloud computing with respect to biometrics i...

  1. Cloud Computing Benefits for Educational Institutions

    OpenAIRE

    Lakshminarayanan, Ramkumar; Kumar, Binod; Raju, M.

    2013-01-01

    Education today is becoming completely associated with the Information Technology on the content delivery, communication and collaboration. The need for servers, storage and software are highly demanding in the universities, colleges and schools. Cloud Computing is an Internet based computing, whereby shared resources, software and information, are provided to computers and devices on-demand, like the electricity grid. Currently, IaaS (Infrastructure as a Service), PaaS (Platform as a Service...

  2. A scalable and multi-purpose point cloud server (PCS) for easier and faster point cloud data management and processing

    Science.gov (United States)

    Cura, Rémi; Perret, Julien; Paparoditis, Nicolas

    2017-05-01

    In addition to more traditional geographical data such as images (rasters) and vectors, point cloud data are becoming increasingly available. Such data are appreciated for their precision and true three-Dimensional (3D) nature. However, managing point clouds can be difficult due to scaling problems and specificities of this data type. Several methods exist but are usually fairly specialised and solve only one aspect of the management problem. In this work, we propose a comprehensive and efficient point cloud management system based on a database server that works on groups of points (patches) rather than individual points. This system is specifically designed to cover the basic needs of point cloud users: fast loading, compressed storage, powerful patch and point filtering, easy data access and exporting, and integrated processing. Moreover, the proposed system fully integrates metadata (like sensor position) and can conjointly use point clouds with other geospatial data, such as images, vectors, topology and other point clouds. Point cloud (parallel) processing can be done in-base with fast prototyping capabilities. Lastly, the system is built on open source technologies; therefore it can be easily extended and customised. We test the proposed system with several billion points obtained from Lidar (aerial and terrestrial) and stereo-vision. We demonstrate loading speeds in the ˜50 million pts/h per process range, transparent-for-user and greater than 2 to 4:1 compression ratio, patch filtering in the 0.1 to 1 s range, and output in the 0.1 million pts/s per process range, along with classical processing methods, such as object detection.
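    The patch idea - storing and filtering groups of points rather than individual points - can be sketched with a simple ground-plan grid (the cell size and keying below are assumptions for illustration, not the actual PCS implementation):

```python
from collections import defaultdict

def group_into_patches(points, cell=10.0):
    """Group (x, y, z) points into square ground-plan patches keyed by
    grid cell. Patch-level filtering can then discard whole groups
    before any per-point work, which is the patch approach's payoff."""
    patches = defaultdict(list)
    for x, y, z in points:
        patches[(int(x // cell), int(y // cell))].append((x, y, z))
    return dict(patches)

pts = [(1.0, 1.0, 0.2), (2.5, 9.0, 0.1), (15.0, 3.0, 0.4)]
patches = group_into_patches(pts)
assert len(patches) == 2            # two occupied 10 m x 10 m cells
assert len(patches[(0, 0)]) == 2    # first two points share a patch
```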

  3. Efficient Cryptography for the Next Generation Secure Cloud

    Science.gov (United States)

    Kupcu, Alptekin

    2010-01-01

    Peer-to-peer (P2P) systems, and client-server type storage and computation outsourcing constitute some of the major applications that the next generation cloud schemes will address. Since these applications are just emerging, it is the perfect time to design them with security and privacy in mind. Furthermore, considering the high-churn…

  4. Eucalyptus: an open-source cloud computing infrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Nurmi, Daniel; Wolski, Rich; Grzegorczyk, Chris; Obertelli, Graziano; Soman, Sunil; Youseff, Lamia; Zagorodnov, Dmitrii, E-mail: rich@cs.ucsb.ed [Computer Science Department, University of California, Santa Barbara, CA 93106 (United States) and Eucalyptus Systems Inc., 130 Castilian Dr., Goleta, CA 93117 (United States)

    2009-07-01

    Utility computing, elastic computing, and cloud computing are all terms that refer to the concept of dynamically provisioning processing time and storage space from a ubiquitous 'cloud' of computational resources. Such systems allow users to acquire and release the resources on demand and provide ready access to data from processing elements, while relegating the physical location and exact parameters of the resources. Over the past few years, such systems have become increasingly popular, but nearly all current cloud computing offerings are either proprietary or depend upon software infrastructure that is invisible to the research community. In this work, we present Eucalyptus, an open-source software implementation of cloud computing that utilizes compute resources that are typically available to researchers, such as clusters and workstation farms. In order to foster community research exploration of cloud computing systems, the design of Eucalyptus emphasizes modularity, allowing researchers to experiment with their own security, scalability, scheduling, and interface implementations. In this paper, we outline the design of Eucalyptus, describe our own implementations of the modular system components, and provide results from experiments that measure performance and scalability of a Eucalyptus installation currently deployed for public use. The main contribution of our work is the presentation of the first research-oriented open-source cloud computing system focused on enabling methodical investigations into the programming, administration, and deployment of systems exploring this novel distributed computing model.

  5. Secure public cloud platform for medical images sharing.

    Science.gov (United States)

    Pan, Wei; Coatrieux, Gouenou; Bouslimi, Dalel; Prigent, Nicolas

    2015-01-01

    Cloud computing promises medical imaging services offering large storage and computing capabilities for limited costs. In this data outsourcing framework, one of the greatest issues to deal with is data security. To do so, we propose to secure a public cloud platform devoted to medical image sharing by defining and deploying a security policy so as to control various security mechanisms. This policy stands on a risk assessment we conducted so as to identify security objectives with a special interest for digital content protection. These objectives are addressed by means of different security mechanisms like access and usage control policy, partial-encryption and watermarking.

  6. Gone in Six Characters: Short URLs Considered Harmful for Cloud Services

    OpenAIRE

    Georgiev, Martin; Shmatikov, Vitaly

    2016-01-01

    Modern cloud services are designed to encourage and support collaboration. To help users share links to online documents, maps, etc., several services, including cloud storage providers such as Microsoft OneDrive and mapping services such as Google Maps, directly integrate URL shorteners that convert long, unwieldy URLs into short URLs, consisting of a domain such as 1drv.ms or goo.gl and a short token. In this paper, we demonstrate that the space of 5- and 6-character tokens included in shor...
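    The paper's premise is easy to check arithmetically: the space of 5- and 6-character tokens over [a-zA-Z0-9] is small enough to enumerate (the request rate below is a made-up figure for illustration only):

```python
import string

# 5- and 6-character tokens over [a-zA-Z0-9], as used by common shorteners
alphabet = len(string.ascii_letters + string.digits)
assert alphabet == 62
space_5 = alphabet ** 5        # 916,132,832 tokens
space_6 = alphabet ** 6        # 56,800,235,584 tokens

# At a hypothetical 1,000 requests/second, exhaustively scanning the
# 5-character token space would take on the order of days:
days = space_5 / 1000 / 86400
print(f"{space_5:,} tokens, ~{days:.0f} days at 1k req/s")
```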

  7. Grids, Clouds and Virtualization

    CERN Document Server

    Cafaro, Massimo

    2011-01-01

    Research into grid computing has been driven by the need to solve large-scale, increasingly complex problems for scientific applications. Yet the applications of grid computing for business and casual users did not begin to emerge until the development of the concept of cloud computing, fueled by advances in virtualization techniques, coupled with the increased availability of ever-greater Internet bandwidth. The appeal of this new paradigm is mainly based on its simplicity, and the affordable price for seamless access to both computational and storage resources. This timely text/reference int

  8. Volunteered Cloud Computing for Disaster Management

    Science.gov (United States)

    Evans, J. D.; Hao, W.; Chettri, S. R.

    2014-12-01

    Disaster management relies increasingly on interpreting earth observations and running numerical models, which require significant computing capacity - usually on short notice and at irregular intervals. Peak computing demand during event detection, hazard assessment, or incident response may exceed agency budgets; however some of it can be met through volunteered computing, which distributes subtasks to participating computers via the Internet. This approach has enabled large projects in mathematics, basic science, and climate research to harness the slack computing capacity of thousands of desktop computers. This capacity is likely to diminish as desktops give way to battery-powered mobile devices (laptops, smartphones, tablets) in the consumer market; but as cloud computing becomes commonplace, it may offer significant slack capacity -- if its users are given an easy, trustworthy mechanism for participating. Such a "volunteered cloud computing" mechanism would also offer several advantages over traditional volunteered computing: tasks distributed within a cloud have fewer bandwidth limitations; granular billing mechanisms allow small slices of "interstitial" computing at no marginal cost; and virtual storage volumes allow in-depth, reversible machine reconfiguration. Volunteered cloud computing is especially suitable for "embarrassingly parallel" tasks, including ones requiring large data volumes: examples in disaster management include near-real-time image interpretation, pattern / trend detection, or large model ensembles. In the context of a major disaster, we estimate that cloud users (if suitably informed) might volunteer hundreds to thousands of CPU cores across a large provider such as Amazon Web Services. To explore this potential, we are building a volunteered cloud computing platform and targeting it to a disaster management context. Using a lightweight, fault-tolerant network protocol, this platform helps cloud users join parallel computing projects.

  9. Formation of Massive Molecular Cloud Cores by Cloud-cloud Collision

    OpenAIRE

    Inoue, Tsuyoshi; Fukui, Yasuo

    2013-01-01

    Recent observations of molecular clouds around rich massive star clusters including NGC3603, Westerlund 2, and M20 revealed that the formation of massive stars could be triggered by a cloud-cloud collision. By using three-dimensional, isothermal, magnetohydrodynamics simulations with the effect of self-gravity, we demonstrate that massive, gravitationally unstable, molecular cloud cores are formed behind the strong shock waves induced by the cloud-cloud collision. We find that the massive mol...

  10. Actividad reproductiva de Chelonia mydas (Testudines: Cheloniidae) en Isla de Aves, Venezuela (2001-2008) / Reproductive activity of Chelonia mydas (Testudines: Cheloniidae) in Isla de Aves, Venezuela (2001-2008)

    Directory of Open Access Journals (Sweden)

    Vicente Vera

    2012-06-01

    Full Text Available The second major nesting site for the green turtle (Chelonia mydas, Linnaeus 1758) in the Caribbean is Isla de Aves, an island protected as a Wildlife Refuge since 1972, located 650km Northeast of La Guaira, Venezuela. Monitoring of the nesting population started in 1972 and became more continuous after 1978, when a Scientific-Naval Station was established and scientific observations began. Since historical data show that female captures had severely affected population levels on this island before 1978, this study aims to describe recent reproductive activity. During the nesting seasons of 2001-2008, except 2003 and 2004, nesting females were measured and tagged using metal flipper tags. A total of 458 nights were sampled, during which 5 154 female emergences were observed, with a maximum of 53 in a single night. Non-observed emergences were calculated by fitting the temporal distribution of observed emergences to a normal curve. The total of estimated emergences varied from 637.1±106.6 in 2001 to 2 853±42.5 in 2008 (ANOVA F(6.5df=60.37, p

  11. Applications integration in a hybrid cloud computing environment: modelling and platform

    Science.gov (United States)

    Li, Qing; Wang, Ze-yuan; Li, Wei-hua; Li, Jun; Wang, Cheng; Du, Rui-yang

    2013-08-01

    With the development of application service providers and cloud computing, more and more small- and medium-sized business enterprises use software services and even infrastructure services provided by professional information service companies to replace all or part of their information systems (ISs). These information service companies provide applications, such as data storage, computing processes, document sharing and even management information system services, as public resources to support the business process management of their customers. However, no cloud computing service vendor can satisfy the full functional IS requirements of an enterprise. As a result, enterprises often have to simultaneously use systems distributed in different clouds alongside their intra-enterprise ISs. Thus, this article presents a framework to integrate applications deployed in public clouds with intra-enterprise ISs. A run-time platform is developed, and a cross-computing-environment process modelling technique is also developed to improve the feasibility of ISs under hybrid cloud computing environments.

  12. Send your data into the cloud and make it… vaporize

    CERN Multimedia

    Computer Security Team

    2011-01-01

    “Cloud computing” --- the term is as nebulous as real clouds.   Basically, it means storing data somewhere on the Internet. This certainly has advantages, since this data will be available anytime from anywhere. For example, the Google mailbox is available from everywhere; “Dropbox” provides a central storage for any type of files; “ZAPR” and “TeamViewer”, once installed, allow you to share your local files by just sending around links, or give third parties full remote access to your PC, respectively. In addition, there is a growing number of cloud synchronisation services (e.g. “iCloud”/”MobileMe”, “Firefox Sync”, “Dropbox”) which provide (semi-)automatic back-ups of all local files of a laptop, PC or mobile phone. But hold on. What actually is transferred into the cloud? Personal files like bank statements? Passwords, especially CE...

  13. CONCEPTS AND CHARACTERISTICS OF CLOUD ORIENTED LEARNING ENVIRONMENT OF SCHOOL

    Directory of Open Access Journals (Sweden)

    Svitlana G. Lytvynova

    2014-04-01

    Full Text Available The article deals with the basic concepts and characteristics of the cloud oriented learning environment (COLE) of a secondary school. It examines the concepts of "cloud oriented learning environment" and "mobility training", the requirements for a COLE, the goals of its creation, its structural components, and its deployment and maintenance models. Four cloud storage services are compared; the subjects and objects of the COLE are described; the meaning of the spatial-semantic, content-methodical and communication-organizational components is clarified; and the benefits and features of cloud computing are defined. It is found that the COLE creates conditions for active cooperation, provides mobility for learning process participants and virtualization of objects, and is available anywhere and at any time. It supports the development of creativity and innovation, critical thinking and the ability to solve problems, as well as communicative, cooperative, life and career skills, the ability to work with data and media, and the ICT competence of both students and teachers.

  14. Practical Architectures for Deployment of Searchable Encryption in a Cloud Environment

    Directory of Open Access Journals (Sweden)

    Sarah Louise Renwick

    2017-11-01

    Full Text Available Public cloud service providers provide an infrastructure that gives businesses and individuals access to computing power and storage space on a pay-as-you-go basis. This allows these entities to bypass the usual costs associated with having their own data centre, such as hardware, construction, air conditioning and security costs, making it a cost-effective solution for data storage. If the data being stored is of a sensitive nature, encrypting it prior to outsourcing it to a public cloud is a good method of ensuring its confidentiality. With the data encrypted, however, searching over it becomes infeasible. In this paper, we examine different architectures for supporting search over encrypted data and discuss some of the challenges that need to be overcome if these techniques are to be engineered into practical systems.
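
    One simple architecture for search over encrypted data (an illustrative toy, not a scheme from the paper) replaces each keyword with a keyed HMAC token: the server matches queries by exact lookup on opaque tokens, never learning the keywords, at the cost of leaking search and access patterns. All names and data here are invented.

```python
import hashlib
import hmac

def keyword_token(key: bytes, keyword: str) -> str:
    """Deterministic keyed token for a keyword; the server never sees plaintext."""
    return hmac.new(key, keyword.lower().encode(), hashlib.sha256).hexdigest()

def build_index(key: bytes, docs: dict) -> dict:
    """Client-side: map each keyword token to the ids of documents containing it."""
    index = {}
    for doc_id, text in docs.items():
        for word in set(text.lower().split()):
            index.setdefault(keyword_token(key, word), set()).add(doc_id)
    return index

def search(index: dict, key: bytes, keyword: str) -> set:
    """Client derives the token; the server does an exact lookup on it."""
    return index.get(keyword_token(key, keyword), set())

key = b"client-secret-key"   # held by the data owner only
docs = {1: "cloud storage costs", 2: "encrypted cloud search", 3: "sea turtles"}
index = build_index(key, docs)   # the index (not the key) is outsourced
assert search(index, key, "cloud") == {1, 2}
```

    Because the tokens are deterministic, equal keywords produce equal tokens; this is what makes server-side matching possible and is also the source of the pattern leakage the paper's more sophisticated architectures try to reduce.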

  15. Histology and Immunohistochemistry of the Cardiac Ventricular Structure in the Green Turtle (Chelonia mydas).

    Science.gov (United States)

    Braz, J K F S; Freitas, M L; Magalhães, M S; Oliveira, M F; Costa, M S M O; Resende, N S; Clebis, N K; Silva, N B; Moura, C E B

    2016-08-01

    This study describes the implications of cardiac ventricular microscopy in Chelonia mydas for its ability to dive. For this work, 11 specimens of the marine turtle species C. mydas found dead on the coast of Rio Grande do Norte (Northeast Brazil) were used. After necropsy, fragments of the cardiac ventricular wall were fixed in 10% buffered formaldehyde solution for 24 h and then subjected to routine processing for light and scanning electron microscopy (SEM). The ventricle in this species is formed by the epicardium, myocardium and endocardium. The subepicardial layer consists of highly vascularised connective tissue that emits septa to reinforce the myocardium surface. There is an abundant and diffuse subepicardial nerve plexus, demonstrated by immunostaining. The thickness of the spongy myocardium and the nature of its trabeculae varied between the heart chambers. The endocardium shows no characteristic elements of the heart conduction system. The valves have a hyaline cartilage skeleton, coated by dense irregular connective tissue characterised by elastic fibres. These findings in green turtle ventricular microscopy are related to hypoxia resistance during diving. © 2015 Blackwell Verlag GmbH.

  16. Food preferences and Hg distribution in Chelonia mydas assessed by stable isotopes

    International Nuclear Information System (INIS)

    Bezerra, M.F.; Lacerda, L.D.; Rezende, C.E.; Franco, M.A.L.; Almeida, M.G.; Macêdo, G.R.; Pires, T.T.; Rostán, G.; Lopez, G.G.

    2015-01-01

    Mercury (Hg) is a highly toxic pollutant that puts several marine animals at risk, including green turtles (Chelonia mydas). The green turtle is a globally endangered sea turtle species that occurs in Brazilian coastal waters in a number of life-stage classes (i.e., foraging juveniles and nesting adults). We assessed total Hg concentrations and isotopic signatures (δ13C and δ15N) in muscle, kidney, liver and scute of juvenile green turtles and their food items from two foraging grounds with different levels of urban and industrial development. We found similar food preferences in specimens from both areas but variable Hg levels in tissues, reflecting the influence of local Hg backgrounds in food items. Some juvenile green turtles from the highly industrialized foraging ground presented liver Hg levels among the highest ever reported for this species. Our results suggest that juvenile foraging green turtles are exposed to Hg burdens from local anthropogenic activities in coastal areas. - Highlights: • We report major diet items for foraging green turtles from northeastern Brazil. • We compare Hg levels between industrialized and relatively pristine foraging grounds. • High local Hg background levels increase Hg exposure in foraging green turtles. • Even an herbivorous diet can result in high tissue Hg concentrations. - Hg levels in scutes of foraging green turtles correlated with internal Hg burdens and were influenced by local sources of pollution in two tropical foraging grounds.

  17. Measurements of electron cloud growth and mitigation in dipole, quadrupole, and wiggler magnets

    Energy Technology Data Exchange (ETDEWEB)

    Calvey, J.R., E-mail: jrc97@cornell.edu; Hartung, W.; Li, Y.; Livezey, J.A.; Makita, J.; Palmer, M.A.; Rubin, D.

    2015-01-11

    Retarding field analyzers (RFAs), which provide a localized measurement of the electron cloud, have been installed throughout the Cornell Electron Storage Ring (CESR) in different magnetic field environments. This paper describes the RFA designs developed for dipole, quadrupole, and wiggler field regions, and provides an overview of measurements made in each environment. The effectiveness of electron cloud mitigations, including coatings, grooves, and clearing electrodes, is assessed with the RFA measurements.

  18. Long Read Alignment with Parallel MapReduce Cloud Platform

    Directory of Open Access Journals (Sweden)

    Ahmed Abdulhakim Al-Absi

    2015-01-01

    Full Text Available Genomic sequence alignment is an important technique for decoding genome sequences in bioinformatics. Next-generation sequencing technologies produce genomic data with longer reads. Cloud platforms are adopted to address the problems arising from the storage and analysis of large genomic data. Existing gene sequencing tools for cloud platforms predominantly consider short-read gene sequences and adopt the Hadoop MapReduce framework for computation. However, serial execution of the map and reduce phases is a problem in such systems. Therefore, in this paper, we introduce the Burrows-Wheeler Aligner’s Smith-Waterman Alignment on Parallel MapReduce (BWASW-PMR) cloud platform for long sequence alignment. The proposed cloud platform adopts the widely accepted and accurate BWA-SW algorithm for long sequence alignment. A custom MapReduce platform is developed to overcome the drawbacks of the Hadoop framework. A parallel execution strategy for the MapReduce phases and an optimization of the Smith-Waterman algorithm are considered. Performance evaluation shows an average speed-up of 6.7 for BWASW-PMR compared with the state-of-the-art Bwasw-Cloud, and an average reduction of 30% in map-phase makespan is reported across all experiments. Optimization of Smith-Waterman reduces execution time by 91.8%. The experimental study proves the efficiency of BWASW-PMR for aligning long genomic sequences on cloud platforms.
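
    The alignment kernel named in this record is the Smith-Waterman algorithm. A minimal pure-Python version of its local-alignment scoring recurrence (with illustrative scoring parameters, not those used by BWA-SW or BWASW-PMR) sketches what each alignment task computes:

```python
def smith_waterman(a: str, b: str, match=2, mismatch=-1, gap=-1) -> int:
    """Return the best local-alignment score between sequences a and b.

    Classic O(len(a) * len(b)) dynamic program: each cell extends a match/
    mismatch diagonally or a gap from above/left, clamped at zero so the
    alignment can restart anywhere (this is what makes it *local*).
    """
    rows, cols = len(a) + 1, len(b) + 1
    H = [[0] * cols for _ in range(rows)]
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            diag = H[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            H[i][j] = max(0, diag, H[i - 1][j] + gap, H[i][j - 1] + gap)
            best = max(best, H[i][j])
    return best

# Textbook example: best local alignment of A-CACACTA / AGCACAC-A
# scores 7 matches (+2 each) and 2 gaps (-1 each) = 12.
assert smith_waterman("ACACACTA", "AGCACACA") == 12
```

    The quadratic cost of this recurrence on long reads is precisely why the record parallelizes it across MapReduce workers.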

  19. Long Read Alignment with Parallel MapReduce Cloud Platform

    Science.gov (United States)

    Al-Absi, Ahmed Abdulhakim; Kang, Dae-Ki

    2015-01-01

    Genomic sequence alignment is an important technique for decoding genome sequences in bioinformatics. Next-generation sequencing technologies produce genomic data with longer reads. Cloud platforms are adopted to address the problems arising from the storage and analysis of large genomic data. Existing gene sequencing tools for cloud platforms predominantly consider short-read gene sequences and adopt the Hadoop MapReduce framework for computation. However, serial execution of the map and reduce phases is a problem in such systems. Therefore, in this paper, we introduce the Burrows-Wheeler Aligner's Smith-Waterman Alignment on Parallel MapReduce (BWASW-PMR) cloud platform for long sequence alignment. The proposed cloud platform adopts the widely accepted and accurate BWA-SW algorithm for long sequence alignment. A custom MapReduce platform is developed to overcome the drawbacks of the Hadoop framework. A parallel execution strategy for the MapReduce phases and an optimization of the Smith-Waterman algorithm are considered. Performance evaluation shows an average speed-up of 6.7 for BWASW-PMR compared with the state-of-the-art Bwasw-Cloud, and an average reduction of 30% in map-phase makespan is reported across all experiments. Optimization of Smith-Waterman reduces execution time by 91.8%. The experimental study proves the efficiency of BWASW-PMR for aligning long genomic sequences on cloud platforms. PMID:26839887

  20. Combined phenomena of beam-beam and beam-electron cloud interactions in circular e^{+}e^{-} colliders

    Directory of Open Access Journals (Sweden)

    Kazuhito Ohmi

    2002-10-01

    Full Text Available An electron cloud causes various effects in high-intensity positron storage rings. The positron beam and the electron cloud can be considered a typical two-stream system with a certain plasma frequency. Beam-beam interaction is another important effect in high-luminosity circular colliders, and the two colliding beams can likewise be considered a two-stream system with another plasma frequency. We study the combined phenomena of the beam-electron-cloud and beam-beam interactions from the viewpoint of two coexisting two-stream effects with two plasma frequencies.

  1. Globally distributed software defined storage (proposal)

    Science.gov (United States)

    Shevel, A.; Khoruzhnikov, S.; Grudinin, V.; Sadov, O.; Kairkanov, A.

    2017-10-01

    The volume of data produced in HEP is growing, and the volume of data to be held for a long time is growing as well. Large volumes of data - big data - are distributed around the planet, so methods and approaches to organize and manage globally distributed data storage are required. Distributed storage has several examples aimed at personal needs, such as owncloud.org, pydio.com, seafile.com and sparkleshare.org. At the enterprise level there are a number of systems, such as SWIFT (the distributed storage system that is part of OpenStack), CEPH and the like, which are mostly object storage. When the resources of several data centers are integrated, the organization of data links becomes a very important issue, especially if several parallel data links between data centers are used. The situation in data centers and on data links may vary from hour to hour, which means each part of the distributed data storage has to be able to rearrange its usage of data links and storage servers in each data center. In addition, different requirements may appear for each customer of the distributed storage. The above topics are planned to be discussed in this data storage proposal.

  2. Hydrodynamic effect of a satellite transmitter on a juvenile green turtle (Chelonia mydas)

    Science.gov (United States)

    Watson; Granger

    1998-09-01

    Wind tunnel tests were performed to measure the effect of a satellite transmitter on a juvenile green turtle (Chelonia mydas). A full-scale turtle model was constructed from an 11.5 kg specimen with a 48 cm carapace length, and a transmitter model was constructed from a Telonics ST-6. The turtle model was tested in a wind tunnel with and without the transmitter, which was mounted on the forward, topmost part of the carapace. Drag, lift and pitch moment were measured for several speeds and flow angles, and the data were scaled for application to the marine environment. At small flow angles representative of straight-line swimming, the transmitter increased drag by 27-30 %, reduced lift by less than 10 % and increased the pitch moment by 11-42 %. On the basis of the drag data at zero angle of attack, it is estimated that the backpack will reduce swimming speed by 11 %, assuming that the turtle produces the same thrust with the unit attached. The drag data are also used to estimate the effect of a transmitter on the swimming energetics of an adult green turtle. Design guidelines are included to minimize the adverse forces and moments caused by the transmitter.

  3. Security and privacy in the clouds: a bird's eye view

    NARCIS (Netherlands)

    Pieters, Wolter; Gutwirth, Serge; Poullet, Yves; De Hert, Paul; Leenes, Ronald

    2011-01-01

    Over the last few years, something called "cloud computing" has become a major theme in computer science and information security. Essentially, it concerns delivering information technology as a service, by enabling the renting of software, computing power and storage. In this contribution, we give a

  4. Primer reporte de Cricocephalus albus (Digenea: Pronocephalidae) en el Perú, parásito de la tortuga verde del Pacífico Este (Chelonia mydas agassizii)

    Directory of Open Access Journals (Sweden)

    Luis A. Gomez-Puerta

    2017-07-01

    Full Text Available Cricocephalus albus (Digenea: Pronocephalidae) is recorded for the first time in Peru, in the East Pacific green turtle (Chelonia mydas agassizii). The parasites were collected during the necropsy of a green turtle stranded in the Virrilá estuary, located in the province of Sechura, Department of Piura, Peru. This paper gives a brief description of C. albus and discusses its hosts and geographical distribution.

  5. Mitigation of the electron-cloud effect in the PSR and SNS proton storage rings by tailoring the bunch profile

    CERN Document Server

    Pivi, M T

    2003-01-01

    For the storage ring of the Spallation Neutron Source (SNS) at Oak Ridge, and for the Proton Storage Ring (PSR) at Los Alamos, both with intense and very long bunches, the electron cloud develops primarily by the mechanism of trailing-edge multipacting. We show, by means of simulations for the PSR, how the resonant nature of this mechanism may be effectively broken by tailoring the longitudinal bunch profile at fixed bunch charge, resulting in a significant decrease in the electron-cloud effect. We briefly discuss the experimental difficulties expected in the implementation of this cure.

  6. MITIGATION OF THE ELECTRON-CLOUD EFFECT IN THE PSR AND SNS PROTON STORAGE RINGS BY TAILORING THE BUNCH PROFILE

    International Nuclear Information System (INIS)

    Pivi, Mauro T F

    2003-01-01

    For the storage ring of the Spallation Neutron Source (SNS) at Oak Ridge, and for the Proton Storage Ring (PSR) at Los Alamos, both with intense and very long bunches, the electron cloud develops primarily by the mechanism of trailing-edge multipacting. We show, by means of simulations for the PSR, how the resonant nature of this mechanism may be effectively broken by tailoring the longitudinal bunch profile at fixed bunch charge, resulting in a significant decrease in the electron-cloud effect. We briefly discuss the experimental difficulties expected in the implementation of this cure.

  7. Hybrid Pluggable Processing Pipeline (HyP3): Programmatic Access to Cloud-Based Processing of SAR Data

    Science.gov (United States)

    Weeden, R.; Horn, W. B.; Dimarchi, H.; Arko, S. A.; Hogenson, K.

    2017-12-01

    A problem often faced by Earth science researchers is the question of how to scale algorithms that were developed against a few datasets up to regional or global scales. This problem only gets worse as we look to a future with larger and larger datasets becoming available. One significant hurdle can be having the processing and storage resources available for such a task, not to mention the administration of those resources. As a processing environment, the cloud offers nearly unlimited potential for compute and storage, with limited administration required. The goal of the Hybrid Pluggable Processing Pipeline (HyP3) project was to demonstrate the utility of the Amazon cloud to process large amounts of data quickly and cost-effectively. Principally built by three undergraduate students at the ASF DAAC, the HyP3 system relies on core Amazon cloud services such as Lambda, Relational Database Service (RDS), Elastic Compute Cloud (EC2), Simple Storage Service (S3), and Elastic Beanstalk. HyP3 provides an Application Programming Interface (API) through which users can programmatically interact with the HyP3 system, allowing them to monitor and control processing jobs running in HyP3 and to retrieve the generated HyP3 products when completed. This presentation will focus on the development techniques and enabling technologies that were used in developing the HyP3 system. Data and process flow, from new subscription through to order completion, will be shown, highlighting the benefits of the cloud for each step. Because the HyP3 system can be accessed directly from a user's Python scripts, powerful applications leveraging SAR products can be put together fairly easily. This is the true power of HyP3: allowing people to programmatically leverage the power of the cloud.

  8. Buildings and Terrain of Urban Area Point Cloud Segmentation based on PCL

    International Nuclear Information System (INIS)

    Liu, Ying; Zhong, Ruofei

    2014-01-01

    One current problem in laser radar point data classification is the segmentation of buildings and urban terrain; this paper proposes a point cloud segmentation method based on the PCL library. PCL is a large cross-platform open-source C++ programming library that implements a large number of efficient point cloud data structures and generic algorithms for point cloud retrieval, filtering, segmentation, registration, feature extraction, surface reconstruction, visualization, etc. Because laser radar point clouds are characterized by large data volumes and asymmetric distribution, this paper proposes organizing the data with a kd-tree structure; a Voxel Grid filter is then used for point cloud resampling, i.e., to reduce the amount of point cloud data while preserving the shape characteristics of the cloud. Using the Euclidean Cluster Extraction class of the PCL segmentation module, Euclidean clustering separates the three-dimensional point clouds of buildings and ground. The experimental results show that this method avoids the need for multiple copies of the data, saves program storage space through calls to PCL library methods and classes, shortens program compile time and improves the running speed of the program.
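
    The Euclidean clustering step can be sketched language-agnostically. PCL itself is C++ and accelerates neighbor queries with a kd-tree; this brute-force Python version (all names and points invented) only illustrates the underlying flood-fill idea: two points share a cluster if a chain of neighbors closer than the tolerance connects them.

```python
from math import dist

def euclidean_clusters(points, tolerance):
    """Group points into clusters by flood fill over the 'within tolerance'
    neighbor relation (O(n^2) here; PCL uses a kd-tree to avoid this)."""
    unvisited = set(range(len(points)))
    clusters = []
    while unvisited:
        seed = unvisited.pop()
        cluster, frontier = {seed}, [seed]
        while frontier:
            p = frontier.pop()
            near = {q for q in unvisited if dist(points[p], points[q]) <= tolerance}
            unvisited -= near     # claim the neighbors before expanding them
            cluster |= near
            frontier.extend(near)
        clusters.append(sorted(cluster))
    return clusters

# Two well-separated groups of 3D points: a chain of three, and a pair.
pts = [(0, 0, 0), (0.4, 0, 0), (0.8, 0, 0), (10, 0, 0), (10.3, 0, 0)]
clusters = euclidean_clusters(pts, tolerance=0.5)
assert sorted(clusters) == [[0, 1, 2], [3, 4]]
```

    Note that points 0 and 2 end up in the same cluster even though they are 0.8 apart: connectivity through point 1 is what defines cluster membership.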

  9. Implementation of a solution Cloud Computing with MapReduce model

    International Nuclear Information System (INIS)

    Baya, Chalabi

    2014-01-01

    In recent years, large-scale computer systems have emerged to meet the demands of high-volume storage, supercomputing, and applications using very large data sets. The emergence of cloud computing offers the potential for the analysis and processing of large data sets. MapReduce is the most popular programming model used to support the development of such applications. It was initially designed by Google for its large-scale datacenters, to provide Web search services with rapid response and high availability. In this paper we test the K-means clustering algorithm in a cloud computing environment. The algorithm is implemented on MapReduce and was chosen for its characteristics, which are representative of many iterative data analysis algorithms. We then modify the CloudSim framework to simulate the MapReduce execution of K-means clustering on different cloud configurations, depending on their size and the characteristics of the target platforms. The experiments show that the implementation of K-means clustering gives good results, especially for large data sets, and that the cloud infrastructure has an influence on these results.
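
    K-means fits MapReduce naturally because one iteration splits into a map phase (assign each point to its nearest centroid) and a reduce phase (recompute each centroid as the mean of its assigned points). A minimal single-process sketch of that decomposition (toy data and names, not the paper's CloudSim setup):

```python
from math import dist

def map_phase(points, centroids):
    """Map: emit (nearest-centroid-index, point) pairs."""
    return [(min(range(len(centroids)), key=lambda k: dist(p, centroids[k])), p)
            for p in points]

def reduce_phase(pairs, old_centroids):
    """Reduce: recompute each centroid as the mean of its assigned points."""
    groups = {i: [] for i in range(len(old_centroids))}
    for i, p in pairs:
        groups[i].append(p)
    return [tuple(sum(c) / len(pts) for c in zip(*pts)) if pts else old_centroids[i]
            for i, pts in sorted(groups.items())]

points = [(0, 0), (0, 1), (10, 10), (10, 11)]
centroids = [(1.0, 1.0), (9.0, 9.0)]     # illustrative starting centroids
for _ in range(5):                       # a few map/reduce iterations
    centroids = reduce_phase(map_phase(points, centroids), centroids)
assert centroids == [(0.0, 0.5), (10.0, 10.5)]
```

    In a real MapReduce job the map calls run in parallel over data shards and the framework groups the emitted pairs by key before the reducers run; only the handful of centroids needs to be broadcast between iterations, which is why the algorithm scales well.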

  10. Universal Keyword Classifier on Public Key Based Encrypted Multikeyword Fuzzy Search in Public Cloud

    Directory of Open Access Journals (Sweden)

    Shyamala Devi Munisamy

    2015-01-01

    Full Text Available Cloud computing has pioneered the emerging world by manifesting itself as a service through the internet, facilitating third-party infrastructure and applications. While customers have no visibility into how their data is stored on the service provider's premises, cloud computing offers great benefits in lowering infrastructure costs and delivering more flexibility and simplicity in managing private data. The opportunity to use cloud services on a pay-per-use basis provides comfort for private data owners in managing costs and data. With the pervasive usage of the internet, the focus has now shifted towards effective data utilization on the cloud without compromising security. In the pursuit of increasing data utilization on public cloud storage, the key is to make data access effective through fuzzy searching techniques. In this paper, we discuss the existing fuzzy searching techniques and focus on reducing the searching time on the cloud storage server for effective data utilization. Our proposed Asymmetric Classifier Multikeyword Fuzzy Search method provides a classifier search server that creates a universal keyword classifier for multiple-keyword requests, which greatly reduces the searching time by learning the search path pattern for all the keywords in the fuzzy keyword set. The objective of using a BTree fuzzy searchable index is to resolve typos and representation inconsistencies and to facilitate effective data utilization.
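
    A common building block behind fuzzy keyword search is a precomputed fuzzy keyword set: wildcard variants that cover every single-character edit of a word, so a one-letter typo in a query still hits an index entry. The sketch below (invented names and data, not the paper's BTree index) shows the idea: two words are within edit distance 1 exactly when their wildcard sets intersect.

```python
def wildcard_set(word):
    """Edit-distance-1 wildcard variants: '*' stands for one inserted,
    substituted, or deleted character (wildcard-based fuzzy keyword set)."""
    variants = {word}
    for i in range(len(word) + 1):
        variants.add(word[:i] + "*" + word[i:])          # insertion slot
        if i < len(word):
            variants.add(word[:i] + "*" + word[i + 1:])  # substitution slot
    return variants

def build_fuzzy_index(keywords):
    """Index every variant of every keyword, once, at setup time."""
    index = {}
    for kw in keywords:
        for v in wildcard_set(kw):
            index.setdefault(v, set()).add(kw)
    return index

def fuzzy_search(index, query):
    """Any shared wildcard variant means the query is within edit distance 1."""
    hits = set()
    for v in wildcard_set(query):
        hits |= index.get(v, set())
    return hits

index = build_fuzzy_index({"cloud", "storage"})
assert "cloud" in fuzzy_search(index, "clowd")   # one-letter typo still matches
```

    The wildcard trick keeps the index small (linear in keyword length per keyword) compared with enumerating every concrete misspelling; storing the variants in an ordered structure such as a B-tree is what the record's searchable index adds on top.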

  11. Building Resilient Cloud Over Unreliable Commodity Infrastructure

    OpenAIRE

    Kedia, Piyus; Bansal, Sorav; Deshpande, Deepak; Iyer, Sreekanth

    2012-01-01

    Cloud Computing has emerged as a successful computing paradigm for efficiently utilizing managed compute infrastructure such as high speed rack-mounted servers, connected with high speed networking, and reliable storage. Usually such infrastructure is dedicated, physically secured and has reliable power and networking infrastructure. However, much of our idle compute capacity is present in unmanaged infrastructure like idle desktops, lab machines, physically distant server machines, and lapto...

  12. Organizational principles of cloud storage to support collaborative biomedical research.

    Science.gov (United States)

    Kanbar, Lara J; Shalish, Wissam; Robles-Rubio, Carlos A; Precup, Doina; Brown, Karen; Sant'Anna, Guilherme M; Kearney, Robert E

    2015-08-01

    This paper describes organizational guidelines and an anonymization protocol for the management of sensitive information in interdisciplinary, multi-institutional studies with multiple collaborators. This protocol is flexible, automated, and suitable for use in cloud-based projects as well as for publication of supplementary information in journal papers. A sample implementation of the anonymization protocol is illustrated for an ongoing study dealing with Automated Prediction of EXtubation readiness (APEX).

  13. Stable isotopes in barnacles as a tool to understand green sea turtle (Chelonia mydas) regional movement patterns

    Science.gov (United States)

    Detjen, M.; Sterling, E.; Gómez, A.

    2015-12-01

    Sea turtles are migratory animals that travel long distances between their feeding and breeding grounds. Traditional methods for researching sea turtle migratory behavior have important disadvantages, and the development of alternatives would enhance our ability to monitor and manage these globally endangered species. Here we report on the isotope signatures in green sea-turtle (Chelonia mydas) barnacles (Platylepas sp.) and discuss their potential relevance as tools with which to study green sea turtle migration and habitat use patterns. We analyzed oxygen (δ18O) and carbon (δ13C) isotope ratios in barnacle calcite layers from specimens collected from green turtles captured at the Palmyra Atoll National Wildlife Refuge (PANWR) in the central Pacific. Carbon isotopes were not informative in this study. However, the oxygen isotope results suggest likely regional movement patterns when mapped onto a predictive oxygen isotope map of the Pacific. Barnacle proxies could therefore complement other methods in understanding regional movement patterns, informing more effective conservation policy that takes into account connectivity between populations.

  14. Relationship between cloud radiative forcing, cloud fraction and cloud albedo, and new surface-based approach for determining cloud albedo

    OpenAIRE

    Y. Liu; W. Wu; M. P. Jensen; T. Toto

    2011-01-01

    This paper focuses on three interconnected topics: (1) the quantitative relationship between surface shortwave cloud radiative forcing, cloud fraction, and cloud albedo; (2) a surface-based approach for measuring cloud albedo; (3) multiscale (diurnal, annual and inter-annual) variations and covariations of surface shortwave cloud radiative forcing, cloud fraction, and cloud albedo. An analytical expression is first derived to quantify the relationship between cloud radiative forcing, cloud fractio...

  15. Batch Attribute-Based Encryption for Secure Clouds

    Directory of Open Access Journals (Sweden)

    Chen Yang

    2015-10-01

    Full Text Available Cloud storage is widely used by organizations due to its advantage of allowing universal access at low cost. Attribute-based encryption (ABE) is a kind of public key encryption suitable for cloud storage. The secret key of each user and the ciphertext are associated with an access policy and an attribute set, respectively; a key holder can decrypt a ciphertext only if the associated attributes match the predetermined access policy, which allows fine-grained access control to be enforced on outsourced files. One issue in existing ABE schemes is that they are designed for the users of a single organization. When one wants to share data with the users of different organizations, the owner needs to encrypt the messages to the receivers of one organization and then repeat this process for another organization. This situation deteriorates as more and more mobile devices use cloud services, since the ABE encryption process is time-consuming and may quickly exhaust the power supplies of mobile devices. In this paper, we propose a batch attribute-based encryption (BABE) approach to address this problem in a provably secure way. With our approach, the data owner can outsource data in batches to the users of different organizations simultaneously. The data owner is allowed to decide the receiving organizations and the attributes required for decryption. Theoretical and experimental analyses show that our approach is more efficient than traditional encryption implementations in computation and communication.

  16. COMPREHENSIVE REVIEW OF AES AND RSA SECURITY ALGORITHMS IN CLOUD COMPUTING

    OpenAIRE

    Shubham Kansal, Harkiran Kaur

    2017-01-01

    Cloud computing is referred to as a revolutionary approach that has changed IT and business integration. It has benefits for almost every type of IT requirement: it can be used by enterprises to cut their IT costs, and by individuals as a storage solution with disaster recovery. One major problem that exists with cloud computing in the present scenario is the security and privacy of the data. Encryption is the most important part of the security if you own a priva...

  17. Task 28: Web Accessible APIs in the Cloud Trade Study

    Science.gov (United States)

    Gallagher, James; Habermann, Ted; Jelenak, Aleksandar; Lee, Joe; Potter, Nathan; Yang, Muqun

    2017-01-01

    This study explored three candidate architectures for serving NASA Earth Science Hierarchical Data Format Version 5 (HDF5) data via Hyrax running on Amazon Web Services (AWS). We studied the cost and performance of each architecture using several representative use cases. The objectives of the project are: (1) conduct a trade study to identify one or more high-performance integrated solutions for storing and retrieving NASA HDF5 and Network Common Data Format Version 4 (netCDF4) data in a cloud (web object store) environment, the target environment being Amazon Web Services (AWS) Simple Storage Service (S3); (2) conduct the level of software development needed to properly evaluate solutions in the trade study and to obtain the benchmarking metrics required as input into a government decision on potential follow-on prototyping; (3) develop a cloud cost model for the preferred data storage solution (or solutions) that accounts for different granulation and aggregation schemes as well as cost and performance trades.

  18. Computational biology in the cloud: methods and new insights from computing at scale.

    Science.gov (United States)

    Kasson, Peter M

    2013-01-01

    The past few years have seen both explosions in the size of biological data sets and the proliferation of new, highly flexible on-demand computing capabilities. The sheer amount of information available from genomic and metagenomic sequencing, high-throughput proteomics, and experimental and simulation datasets on molecular structure and dynamics affords an opportunity for greatly expanded insight, but it creates new challenges of scale for the computation, storage, and interpretation of petascale data. Cloud computing resources have the potential to help solve these problems by offering a utility model of computing and storage: near-unlimited capacity, the ability to burst usage, and cheap and flexible payment models. Effective use of cloud computing on large biological datasets requires dealing with non-trivial problems of scale and robustness, since performance-limiting factors can change substantially when a dataset grows by a factor of 10,000 or more. New computing paradigms are thus often needed. The use of cloud platforms also creates new opportunities to share data, reduce duplication, and provide easy reproducibility by making the datasets and computational methods easily available.

  19. The JASMIN Cloud: specialised and hybrid to meet the needs of the Environmental Sciences Community

    Science.gov (United States)

    Kershaw, Philip; Lawrence, Bryan; Churchill, Jonathan; Pritchard, Matt

    2014-05-01

    Cloud computing provides enormous opportunities for the research community. The large public cloud providers offer near-limitless scaling capability. However, adapting the cloud to scientific workloads is not without its problems: the commodity nature of public cloud infrastructure can be at odds with the specialist requirements of the research community, and issues such as trust, ownership of data, WAN bandwidth and costing models are additional barriers to more widespread adoption. Alongside the application of the public cloud for scientific applications, a number of private cloud initiatives are underway in the research community, of which the JASMIN Cloud is one example. Here, cloud service models are effectively super-imposed over more established services such as data centres, compute cluster facilities and Grids. These have the potential to deliver the specialist infrastructure needed by the science community coupled with the benefits of a cloud service model. The JASMIN facility, based at the Rutherford Appleton Laboratory, was established in 2012 to support the data analysis requirements of the climate and Earth Observation community. In its first year of operation, the 5 PB of available storage capacity was filled and the hosted compute capability was used extensively. JASMIN has modelled the concept of a centralised large-volume data analysis facility. Key characteristics have enabled its success: peta-scale fast disk connected via low-latency networks to compute resources, and the use of virtualisation for effective management of the resources for a range of users. A second phase is now underway, funded through NERC's (Natural Environment Research Council) Big Data initiative. This will see a significant expansion of the available resources, with a doubling of disk-based storage to 12 PB and a tenfold increase in compute capacity to over 3000 processing cores. This expansion is accompanied by a broadening in the scope for JASMIN, as a service available to

  20. DATA STORAGE & LOAD BALANCING IN CLOUD COMPUTING USING CONTAINER CLUSTERING

    OpenAIRE

    Trapti Gupta & Abhishek Dwivedi

    2017-01-01

    At the moment, cloud containers are a hot topic in the IT world in general, and security in particular. The world's top technology companies, including Microsoft, Google and Facebook, all use them. Although it's still early days, containers are seeing increasing use in production environments. Containers promise a streamlined, easy-to-deploy and secure method of implementing specific infrastructure requirements, and they also offer an alternative to virtual machines. The key thing t...

  1. A compressive sensing based secure watermark detection and privacy preserving storage framework.

    Science.gov (United States)

    Qia Wang; Wenjun Zeng; Jun Tian

    2014-03-01

    Privacy is a critical issue when the data owners outsource data storage or processing to a third party computing service, such as the cloud. In this paper, we identify a cloud computing application scenario that requires simultaneously performing secure watermark detection and privacy preserving multimedia data storage. We then propose a compressive sensing (CS)-based framework using secure multiparty computation (MPC) protocols to address such a requirement. In our framework, the multimedia data and secret watermark pattern are presented to the cloud for secure watermark detection in a CS domain to protect the privacy. During CS transformation, the privacy of the CS matrix and the watermark pattern is protected by the MPC protocols under the semi-honest security model. We derive the expected watermark detection performance in the CS domain, given the target image, watermark pattern, and the size of the CS matrix (but without the CS matrix itself). The correctness of the derived performance has been validated by our experiments. Our theoretical analysis and experimental results show that secure watermark detection in the CS domain is feasible. Our framework can also be extended to other collaborative secure signal processing and data-mining applications in the cloud.
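
The core property this framework relies on, that correlations are approximately preserved under a random CS projection (since E[(Φa)·(Φb)] = a·b), can be illustrated with a toy correlation detector. This is a hedged sketch in NumPy, not the paper's MPC protocol; all names and parameter values here are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 4096, 1024                      # signal length, number of CS measurements

x = rng.standard_normal(n)             # host signal (e.g. a flattened image block)
w = rng.standard_normal(n)
w /= np.linalg.norm(w)                 # unit-norm watermark pattern
y = x + 3.0 * w                        # watermarked signal, embedding strength 3

phi = rng.standard_normal((m, n)) / np.sqrt(m)   # random CS measurement matrix

# Correlation detection statistic evaluated entirely in the measurement domain;
# the projection approximately preserves inner products, so the watermark's
# contribution to the correlation survives.
stat_marked = (phi @ y) @ (phi @ w)
stat_clean = (phi @ x) @ (phi @ w)
print(stat_marked - stat_clean)        # close to the embedding strength, 3
```

The gap between the two statistics is 3·‖Φw‖², which concentrates near the embedding strength for moderate m, so a simple threshold separates marked from unmarked signals without ever leaving the CS domain.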

  2. Cloud computing for genomic data analysis and collaboration.

    Science.gov (United States)

    Langmead, Ben; Nellore, Abhinav

    2018-04-01

    Next-generation sequencing has made major strides in the past decade. Studies based on large sequencing data sets are growing in number, and public archives for raw sequencing data have been doubling in size every 18 months. Leveraging these data requires researchers to use large-scale computational resources. Cloud computing, a model whereby users rent computers and storage from large data centres, is a solution that is gaining traction in genomics research. Here, we describe how cloud computing is used in genomics for research and large-scale collaborations, and argue that its elasticity, reproducibility and privacy features make it ideally suited for the large-scale reanalysis of publicly available archived data, including privacy-protected data.

  3. CREW: A Cloud based Revenue model for Rural Entrepreneurial Women in India

    OpenAIRE

    Hannah Monisha .J; Rhymend Uthariaraj .V

    2012-01-01

    The new paradigm of cloud computing is the next logical step in distributed systems, supporting the sharing and coordinated use of resources independent of their location and type. Cloud computing allows you to unite pools of servers, storage systems and networks into a single enormous virtual resource pool that can be used for a single resource-intensive task. This technology has the potential to address the unmet needs of Indian villagers, from education to market access. In India, Self ...

  4. Comparison of electron cloud mitigating coatings using retarding field analyzers

    Energy Technology Data Exchange (ETDEWEB)

    Calvey, J.R., E-mail: jrc97@cornell.edu; Hartung, W.; Li, Y.; Livezey, J.A.; Makita, J.; Palmer, M.A.; Rubin, D.

    2014-10-01

    In 2008, the Cornell Electron Storage Ring (CESR) was reconfigured to serve as a test accelerator (CESRTA) for next generation lepton colliders, in particular for the ILC damping ring. A significant part of this program has been the installation of diagnostic devices to measure and quantify the electron cloud effect, a potential limiting factor in these machines. One such device is the Retarding Field Analyzer (RFA), which provides information on the local electron cloud density and energy distribution. Several different styles of RFAs have been designed, tested, and deployed throughout the CESR ring. They have been used to study the growth of the cloud in different beam conditions, and to evaluate the efficacy of different mitigation techniques. This paper will provide an overview of RFA results obtained in a magnetic field free environment.

  5. Designing Cloud Infrastructure for Big Data in E-government

    Directory of Open Access Journals (Sweden)

    Jelena Šuh

    2015-03-01

    The development of new information services and technologies, especially in the domains of mobile communications, the Internet of things, and social media, has led to the appearance of large quantities of unstructured data. Pervasive computing also affects e-government systems, where big data emerges and cannot be processed and analyzed in a traditional manner due to its complexity, heterogeneity and size. The subject of this paper is the design of cloud infrastructure for big data storage and processing in e-government. The goal is to analyze the potential of cloud computing for big data infrastructure, and to propose a model for effectively storing, processing and analyzing big data in e-government. The paper provides an overview of current concepts relevant to cloud infrastructure design in support of big data. The second part of the paper gives a model of cloud infrastructure based on the concepts of software-defined networks and multi-tenancy. The final goal is to support projects in the field of big data in e-government.

  6. Efficient and secure outsourcing of genomic data storage.

    Science.gov (United States)

    Sousa, João Sá; Lefebvre, Cédric; Huang, Zhicong; Raisaro, Jean Louis; Aguilar-Melchor, Carlos; Killijian, Marc-Olivier; Hubaux, Jean-Pierre

    2017-07-26

    Cloud computing is becoming the preferred solution for efficiently dealing with the increasing amount of genomic data. Yet, outsourcing the storage and processing of sensitive information, such as genomic data, comes with important concerns related to privacy and security. This calls for new sophisticated techniques that ensure data protection from untrusted cloud providers while still enabling researchers to obtain useful information. We present a novel privacy-preserving algorithm for fully outsourcing the storage of large genomic data files to a public cloud while enabling researchers to efficiently search for variants of interest. In order to protect data and query confidentiality from possible leakage, our solution exploits optimal encoding for genomic variants and combines it with homomorphic encryption and private information retrieval. Our proposed algorithm is implemented in C++ and was evaluated on real data as part of the 2016 iDash Genome Privacy-Protection Challenge. Results show that our solution outperforms the state-of-the-art solutions and enables researchers to search over millions of encrypted variants in a few seconds. As opposed to prior beliefs that sophisticated privacy-enhancing technologies (PETs) are impractical for real operational settings, our solution demonstrates that, in the case of genomic data, PETs are very efficient enablers.

  7. Simulation of the electron cloud density in BEPC II

    International Nuclear Information System (INIS)

    Liu Yudong; Guo Zhiyuan; Wang Jiuqing

    2004-01-01

    Electron Cloud Instability (ECI) may take place in a positron storage ring when the machine is operated with a multi-bunch positron beam. A simulation program has been developed according to the actual shape of the vacuum chamber in BEPC II. With the code, the authors can calculate the electron density in the chamber for different antechamber lengths and different secondary electron yields. Through simulation, the possibility of placing clearing electrodes in the chamber to reduce the electron density in its central region is investigated. The simulation provides meaningful and important results for the BEPC II project and for electron cloud instability research.

  8. Transverse blowup along bunch train caused by electron cloud in BEPC

    International Nuclear Information System (INIS)

    Liu Yudong; Guo Zhiyuan; Qin Qing; Wang Jiuqing; Zhao Zheng

    2006-01-01

    Electron cloud instability (ECI) may take place in a storage ring when the machine is operated with a multi-bunch positively charged beam. Transverse blowup due to the electron cloud has been observed in several machines and is considered a major limiting factor in the development of high-current, high-luminosity electron-positron colliders. Using a streak camera, transverse blowup along the bunch train was first observed in an experiment at the Beijing Electron-Positron Collider (BEPC), and simulation results were compared with the observation. (authors)

  9. CLOUD COMPUTING: AN AMBIGUOUS WEATHER FORCAST IN THE MARKET

    Directory of Open Access Journals (Sweden)

    Евгений Юрьевич Бакалдин

    2013-08-01

    Cloud computing is a fairly old idea by IT standards which, however, has only recently gained wide popularity and use. Moreover, according to most experts, the real "finest hour" of cloud computing is yet to come. Nevertheless, the situation with "clouds" in the global and domestic markets is not entirely cloudless. In this article, apart from a brief review of the history and essence of cloud computing, we focus on the immediate prospects for the development of cloud computing in Russia and worldwide, and on the main problems of its improvement and implementation, including problems that are legal rather than technical in nature. The article concludes that, at present, the use of cloud technology in all spheres, from simple document storage by individual users to large business systems, is a compromise between broad capabilities on the one hand and certain risks on the other. The same holds from the economic point of view: the balance between the apparent attractiveness of investment in clouds and the financial risk of such investments has not yet been found. In preparing this article we used research data produced by the research companies Gartner and Forrester Research, as well as reports by Russian experts in this field. DOI: http://dx.doi.org/10.12731/2218-7405-2013-6-7

  10. Cloud-based adaptive exon prediction for DNA analysis.

    Science.gov (United States)

    Putluri, Srinivasareddy; Zia Ur Rahman, Md; Fathima, Shaik Yasmeen

    2018-02-01

    Cloud computing offers significant research and economic benefits to healthcare organisations. Cloud services provide a safe place for storing and managing large amounts of sensitive data. Under the conventional flow of gene information, gene sequence laboratories send out raw and inferred information via the Internet to several sequence libraries. DNA sequencing storage costs can be minimised by use of a cloud service. In this study, the authors put forward a novel genomic informatics system using Amazon Cloud Services, where genomic sequence information is stored and accessed for processing. True identification of exon regions in a DNA sequence is a key task in bioinformatics, which helps in disease identification and drug design. The three-base periodicity property of exons forms the basis of all exon identification techniques. Adaptive signal processing techniques were found to be promising in comparison with several other methods. Several adaptive exon predictors (AEPs) are developed using the variable normalised least mean square algorithm and its maximum normalised variants to reduce computational complexity. Finally, performance evaluation of the various AEPs is done based on measures such as sensitivity, specificity and precision, using standard genomic datasets taken from the National Center for Biotechnology Information genomic sequence database.
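
The adaptive predictors described above are built on normalised least-mean-square (NLMS) filtering. As a rough illustration of that building block, here is a minimal plain-NLMS sketch, not the authors' variable or maximum-normalised variants; all parameter values and the demo setup are assumptions:

```python
import numpy as np

def nlms_filter(x, d, num_taps=4, mu=0.5, eps=1e-8):
    """Normalised LMS: adapt weights w so the filter output tracks d.

    x -- input sequence (in exon prediction, e.g. a binary indicator
         sequence marking occurrences of one DNA base)
    d -- desired sequence; the error e is small wherever d is
         predictable from recent samples of x
    """
    w = np.zeros(num_taps)
    y = np.zeros(len(d))
    e = np.zeros(len(d))
    for n in range(num_taps, len(d)):
        u = x[n - num_taps:n][::-1]              # most recent samples first
        y[n] = w @ u
        e[n] = d[n] - y[n]
        w += (mu / (eps + u @ u)) * e[n] * u     # step normalised by input power
    return y, e, w
```

In an exon-prediction setting the filter is driven so that its output or error power highlights regions exhibiting the three-base periodicity; here it is shown only as the generic NLMS kernel that such predictors share.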

  11. A dispersion safety factor for LNG vapor clouds

    Energy Technology Data Exchange (ETDEWEB)

    Vílchez, Juan A. [TIPs – Trámites, Informes y Proyectos, SL, Llenguadoc 10, 08030 Barcelona (Spain); Villafañe, Diana [Centre d’Estudis del Risc Tecnològic (CERTEC), Universitat Politècnica de Catalunya, Diagonal 647, 08028 Barcelona, Catalonia (Spain); Casal, Joaquim, E-mail: joaquim.casal@upc.edu [Centre d’Estudis del Risc Tecnològic (CERTEC), Universitat Politècnica de Catalunya, Diagonal 647, 08028 Barcelona, Catalonia (Spain)

    2013-02-15

    Highlights: ► We proposed a new parameter: the dispersion safety factor (DSF). ► DSF is the ratio between the distance reached by the LFL and that reached by the visible cloud. ► The results for the DSF agree well with the evidence from large scale experiments. ► Two expressions have been proposed to calculate DSF as a function of H_R. ► The DSF may help in indicating the danger of ignition of a LNG vapor cloud. -- Abstract: The growing importance of liquefied natural gas (LNG) to global energy demand has increased interest in the possible hazards associated with its storage and transportation. Concerning the event of an LNG spill, a study was performed on the relationship between the distance at which the lower flammability limit (LFL) concentration occurs and that corresponding to the visible contour of LNG vapor clouds. A parameter called the dispersion safety factor (DSF) has been defined as the ratio between these two lengths, and two expressions are proposed to estimate it. During an emergency, the DSF can be a helpful parameter to indicate the danger of cloud ignition and flash fire.
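
In symbols (our notation, not necessarily that of the authors), the definition in the abstract reads:

```latex
\mathrm{DSF} = \frac{x_{\mathrm{LFL}}}{x_{\mathrm{vis}}}
```

where x_LFL is the downwind distance at which the concentration falls to the lower flammability limit and x_vis is the distance reached by the visible cloud contour; a DSF greater than 1 means the flammable region extends beyond the visible cloud.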

  12. A dispersion safety factor for LNG vapor clouds

    International Nuclear Information System (INIS)

    Vílchez, Juan A.; Villafañe, Diana; Casal, Joaquim

    2013-01-01

    Highlights: ► We proposed a new parameter: the dispersion safety factor (DSF). ► DSF is the ratio between the distance reached by the LFL and that reached by the visible cloud. ► The results for the DSF agree well with the evidence from large scale experiments. ► Two expressions have been proposed to calculate DSF as a function of H_R. ► The DSF may help in indicating the danger of ignition of a LNG vapor cloud. -- Abstract: The growing importance of liquefied natural gas (LNG) to global energy demand has increased interest in the possible hazards associated with its storage and transportation. Concerning the event of an LNG spill, a study was performed on the relationship between the distance at which the lower flammability limit (LFL) concentration occurs and that corresponding to the visible contour of LNG vapor clouds. A parameter called the dispersion safety factor (DSF) has been defined as the ratio between these two lengths, and two expressions are proposed to estimate it. During an emergency, the DSF can be a helpful parameter to indicate the danger of cloud ignition and flash fire.

  13. Cryptography in the Cloud Computing: the Current State and Logical Tasks

    OpenAIRE

    Sergey Nikolaevich Kyazhin; Andrey Vladimirovich Moiseev

    2013-01-01

    The current state of cloud computing (CC) information security is analysed, and logical problems of storage and data transmission security in CC are identified. Cryptographic methods of data security in CC, in particular lightweight cryptography and cryptography based on bilinear pairings, are described.

  14. Compact toroidal energy storage device with relativistically densified electrons through the use of travelling magnetic waves

    International Nuclear Information System (INIS)

    Peter, W.; Faehl, R.J.

    1983-01-01

    A new concept for a small, compact, multimegajoule energy storage device utilizing a relativistically densified electron beam circulating in a torus is presented. The electron cloud is produced through inductive charge injection by a travelling magnetic wave circulating around the torus. Parameters are given for two representative toroidal energy storage devices, of 1 m and 32 m in radius respectively, which could store more than 4 × 10^17 electrons and 30 MJ of energy. The concept exploits the fact that the large electric and magnetic fields produced by a partially space-charge-neutralized intense relativistic electron beam can become many orders of magnitude greater than the externally applied field confining the beam. In the present approach, the electron cloud densification can be achieved gradually by permitting multiple traversals of the magnetic wave around the torus. The magnetic mirror force acts on the orbital magnetic electron dipole moment and completely penetrates the entire electron cloud. As the electrons gain relativistic energies, the beam can be continuously densified at the front of the travelling wave, where the magnetic field is rising with time. The use of a travelling magnetic wave to accelerate the electron cloud, and the large electric field of the accelerated cloud, form the basis for a high beam intensity and hence high energy storage. Technical considerations and several potential applications, including the driving of a powerful gyrotron, are discussed.
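
A quick back-of-envelope check of the quoted figures (our arithmetic, not from the paper): 30 MJ shared among 4 × 10^17 electrons corresponds to roughly 470 MeV per stored electron, consistent with the relativistic energies the abstract describes.

```python
E_total_J = 30e6                 # stored energy quoted in the abstract, in joules
N_electrons = 4e17               # number of stored electrons quoted in the abstract
J_PER_EV = 1.602176634e-19       # elementary charge: joules per electron-volt

energy_per_electron_eV = E_total_J / N_electrons / J_PER_EV
print(f"{energy_per_electron_eV:.3g} eV")   # about 4.68e8 eV, i.e. ~470 MeV
```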

  15. Cloud-Top Entrainment in Stratocumulus Clouds

    Science.gov (United States)

    Mellado, Juan Pedro

    2017-01-01

    Cloud entrainment, the mixing between cloudy and clear air at the boundary of clouds, constitutes one paradigm for the relevance of small scales in the Earth system: By regulating cloud lifetimes, meter- and submeter-scale processes at cloud boundaries can influence planetary-scale properties. Understanding cloud entrainment is difficult given the complexity and diversity of the associated phenomena, which include turbulence entrainment within a stratified medium, convective instabilities driven by radiative and evaporative cooling, shear instabilities, and cloud microphysics. Obtaining accurate data at the required small scales is also challenging, for both simulations and measurements. During the past few decades, however, high-resolution simulations and measurements have greatly advanced our understanding of the main mechanisms controlling cloud entrainment. This article reviews some of these advances, focusing on stratocumulus clouds, and indicates remaining challenges.

  16. Security and privacy in the clouds: a bird's eye view

    NARCIS (Netherlands)

    Pieters, Wolter

    2010-01-01

    Over the last years, something called “cloud computing” has become a major theme in computer science and information security. Essentially, it concerns delivering information technology as a service, by enabling the renting of software, computing power and storage. In this contribution, we

  17. 3d object segmentation of point clouds using profiling techniques

    African Journals Online (AJOL)

    Administrator

    optimization attempts to physically store the point cloud so that storage, retrieval and visualisation … Ideally three stacks should be sufficient, but in practice four or five are used. … The authors would like to acknowledge that this paper is based on a paper presented at … Theory, Processing and Application, 5 pages.

  18. Cloud type comparisons of AIRS, CloudSat, and CALIPSO cloud height and amount

    Directory of Open Access Journals (Sweden)

    B. H. Kahn

    2008-03-01

    The precision of the two-layer cloud height fields derived from the Atmospheric Infrared Sounder (AIRS is explored and quantified for a five-day set of observations. Coincident profiles of vertical cloud structure by CloudSat, a 94 GHz profiling radar, and the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO, are compared to AIRS for a wide range of cloud types. Bias and variability in cloud height differences are shown to have dependence on cloud type, height, and amount, as well as whether CloudSat or CALIPSO is used as the comparison standard. The CloudSat-AIRS biases and variability range from −4.3 to 0.5±1.2–3.6 km for all cloud types. Likewise, the CALIPSO-AIRS biases range from 0.6–3.0±1.2–3.6 km (−5.8 to −0.2±0.5–2.7 km for clouds ≥7 km (<7 km. The upper layer of AIRS has the greatest sensitivity to Altocumulus, Altostratus, Cirrus, Cumulonimbus, and Nimbostratus, whereas the lower layer has the greatest sensitivity to Cumulus and Stratocumulus. Although the bias and variability generally decrease with increasing cloud amount, the ability of AIRS to constrain cloud occurrence, height, and amount is demonstrated across all cloud types for many geophysical conditions. In particular, skill is demonstrated for thin Cirrus, as well as some Cumulus and Stratocumulus, cloud types that infrared sounders typically struggle to quantify. Furthermore, some improvements in the AIRS Version 5 operational retrieval algorithm are demonstrated. However, limitations in AIRS cloud retrievals are also revealed, including the existence of spurious Cirrus near the tropopause and low cloud layers within Cumulonimbus and Nimbostratus clouds. Likely causes of spurious clouds are identified and the potential for further improvement is discussed.

  19. The application of cloud computing to scientific workflows: a study of cost and performance.

    Science.gov (United States)

    Berriman, G Bruce; Deelman, Ewa; Juve, Gideon; Rynge, Mats; Vöckler, Jens-S

    2013-01-28

    The current model of transferring data from data centres to desktops for analysis will soon be rendered impractical by the accelerating growth in the volume of science datasets. Processing will instead often take place on high-performance servers co-located with data. Evaluations of how new technologies such as cloud computing would support such a new distributed computing model are urgently needed. Cloud computing is a new way of purchasing computing and storage resources on demand through virtualization technologies. We report here the results of investigations of the applicability of commercial cloud computing to scientific computing, with an emphasis on astronomy, including investigations of what types of applications can be run cheaply and efficiently on the cloud, and an example of an application well suited to the cloud: processing a large dataset to create a new science product.

  20. GATECloud.net: a platform for large-scale, open-source text processing on the cloud.

    Science.gov (United States)

    Tablan, Valentin; Roberts, Ian; Cunningham, Hamish; Bontcheva, Kalina

    2013-01-28

    Cloud computing is increasingly being regarded as a key enabler of the 'democratization of science', because on-demand, highly scalable cloud computing facilities enable researchers anywhere to carry out data-intensive experiments. In the context of natural language processing (NLP), algorithms tend to be complex, which makes their parallelization and deployment on cloud platforms a non-trivial task. This study presents a new, unique, cloud-based platform for large-scale NLP research: GATECloud.net. It enables researchers to carry out data-intensive NLP experiments by harnessing the vast, on-demand compute power of the Amazon cloud. Important infrastructural issues are dealt with by the platform, completely transparently for the researcher: load balancing, efficient data upload and storage, deployment on the virtual machines, security and fault tolerance. We also include a cost-benefit analysis and usage evaluation.

  1. A comparison of shock-cloud and wind-cloud interactions: effect of increased cloud density contrast on cloud evolution

    Science.gov (United States)

    Goldsmith, K. J. A.; Pittard, J. M.

    2018-05-01

    The similarities, or otherwise, of a shock or wind interacting with a cloud of density contrast χ = 10 were explored in a previous paper. Here, we investigate such interactions with clouds of higher density contrast. We compare the adiabatic hydrodynamic interaction of a Mach 10 shock with a spherical cloud of χ = 10³ with that of a cloud embedded in a wind with identical parameters to the post-shock flow. We find that initially there are only minor morphological differences between the shock-cloud and wind-cloud interactions, compared to when χ = 10. However, once the transmitted shock exits the cloud, the development of a turbulent wake and fragmentation of the cloud differs between the two simulations. On increasing the wind Mach number, we note the development of a thin, smooth tail of cloud material, which is then disrupted by the fragmentation of the cloud core and subsequent 'mass-loading' of the flow. We find that the normalized cloud mixing time (t_mix) is shorter at higher χ. However, a strong Mach number dependence on t_mix and the normalized cloud drag time, t'_drag, is not observed. Mach-number-dependent values of t_mix and t'_drag from comparable shock-cloud interactions converge towards the Mach-number-independent time-scales of the wind-cloud simulations. We find that high-χ clouds can be accelerated up to 80-90 per cent of the wind velocity and travel large distances before being significantly mixed. However, complete mixing is not achieved in our simulations and at late times the flow remains perturbed.

  2. Cloud Computing, Tieto Cloud Server Model

    OpenAIRE

    Suikkanen, Saara

    2013-01-01

    The purpose of this study is to find out what cloud computing is. To be able to make wise decisions when moving to the cloud or considering it, companies need to understand what the cloud consists of: which model suits the company best, what should be taken into account before moving to the cloud, what the cloud broker's role is, and what a SWOT analysis of the cloud shows. To be able to answer customer requirements and business demands, IT companies should develop and produce new service models. IT house T...

  3. Green turtles (Chelonia mydas) foraging at Arvoredo Island in Southern Brazil: genetic characterization and mixed stock analysis through mtDNA control region haplotypes

    Directory of Open Access Journals (Sweden)

    Maíra Carneiro Proietti

    2009-01-01

    We analyzed mtDNA control region sequences of green turtles (Chelonia mydas) from Arvoredo Island, a foraging ground in southern Brazil, and identified eight haplotypes. Of these, CM-A8 (64%) and CM-A5 (22%) were dominant, the remainder presenting low frequencies (<0.05). Mixed Stock Analysis, incorporating eleven Atlantic and one Mediterranean rookery as possible sources of individuals, indicated Ascension and Aves islands as the main contributing stocks to the Arvoredo aggregation (68.01% and 22.96%, respectively). These results demonstrate the extensive relationships between Arvoredo Island and other Atlantic foraging and breeding areas. Such an understanding provides a framework for establishing adequate management and conservation strategies for this endangered species.

  4. Model Infrastruktur dan Manajemen Platform Server Berbasis Cloud Computing

    Directory of Open Access Journals (Sweden)

    Mulki Indana Zulfa

    2017-11-01

    Cloud computing is a new technology that is still growing very rapidly. This technology makes the Internet the main medium for the remote management of data and applications. Cloud computing allows users to run an application without having to think about infrastructure and platforms. Other technical aspects, such as memory, storage, and backup and restore, can be handled very easily. This research models the infrastructure and management of the computer platform in the computer network of the Faculty of Engineering, University of Jenderal Soedirman. The first stage in this research is a literature study, reviewing the implementation models of previous research. The results are then combined with a new approach to existing resources and implemented directly on the existing server network. The results show that the implementation of cloud computing technology is able to replace the existing network platform.

  5. Eucalyptus Cloud to Remotely Provision e-Governance Applications

    Directory of Open Access Journals (Sweden)

    Sreerama Prabhu Chivukula

    2011-01-01

    Remote rural areas are constrained by the lack of a reliable power supply, which is essential for setting up advanced IT infrastructure such as servers or storage; therefore, cloud computing comprising Infrastructure-as-a-Service (IaaS) is well suited to provide such IT infrastructure in remote rural areas. Additional cloud layers of Platform-as-a-Service (PaaS) and Software-as-a-Service (SaaS) can be added above IaaS. A cluster-based IaaS cloud can be set up by using the open-source middleware Eucalyptus in the data centres of NIC. Data centres of the central and state governments can be integrated with State Wide Area Networks and NICNET to form the e-governance grid of India. Web service repositories at the centre, state, and district level can be built over the national e-governance grid of India. Using the Globus Toolkit, we can achieve stateful web services with speed and security. Adding the cloud layer over the e-governance grid will make a grid-cloud environment possible through Globus Nimbus. Service delivery can be in terms of web services delivered through heterogeneous client devices. Data mining using Weka4WS and DataMiningGrid can produce meaningful knowledge discovery from data. In this paper, a plan of action is provided for the implementation of the above proposed architecture.

  6. The structure of the clouds distributed operating system

    Science.gov (United States)

    Dasgupta, Partha; Leblanc, Richard J., Jr.

    1989-01-01

    A novel system architecture, based on the object model, is the central structuring concept used in the Clouds distributed operating system. This architecture makes Clouds attractive over a wide class of machines and environments. Clouds is a native operating system, designed and implemented at Georgia Tech, and runs on a set of general-purpose computers connected via a local area network. The system architecture of Clouds is composed of a system-wide global set of persistent (long-lived) virtual address spaces, called objects, that contain persistent data and code. The object concept is implemented at the operating system level, thus presenting a single-level storage view to the user. Lightweight threads carry computational activity through the code stored in the objects. The persistent objects and threads give rise to a programming environment composed of shared permanent memory, dispensing with the need for hardware-derived concepts such as file systems and message systems. Though the hardware may be distributed and may have disks and networks, Clouds provides applications with a logically centralized system, based on a shared, structured, single-level store. The current design of Clouds uses a minimalist philosophy with respect to both the kernel and the operating system: the kernel and the operating system support a bare minimum of functionality. Clouds also adheres to the concept of separation of policy and mechanism. Most low-level operating system services are implemented above the kernel and most high-level services are implemented at the user level. From the measured performance of the kernel mechanisms, we are able to demonstrate that efficient implementations of the object model are feasible on commercially available hardware. Clouds provides a rich environment for conducting research in distributed systems.
Some of the topics addressed in this paper include distributed programming environments, consistency of persistent data
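    The single-level-store idea described above can be sketched in a few lines. The following is a toy illustration only (the class, store layout, and `invoke` helper are hypothetical, not the Clouds implementation): objects are persistent bundles of code and data, lightweight threads carry computation into them, and no separate file or message API is visible to the application.

    ```python
    import os
    import shelve
    import tempfile
    import threading

    class Counter:
        """A persistent object: code plus long-lived data."""
        def __init__(self):
            self.value = 0

        def increment(self):
            self.value += 1

    # The "single-level store": all objects live here, by name.
    store_path = os.path.join(tempfile.mkdtemp(), "objects")

    def invoke(name, method):
        """A thread enters the object `name` and runs `method`;
        changes persist automatically, with no explicit file I/O."""
        with shelve.open(store_path) as store:
            obj = store.get(name, Counter())
            getattr(obj, method)()
            store[name] = obj

    # Two sequential invocations, one carried by a lightweight thread.
    t = threading.Thread(target=invoke, args=("counter", "increment"))
    t.start(); t.join()
    invoke("counter", "increment")

    with shelve.open(store_path) as store:
        print(store["counter"].value)  # 2
    ```

    The point of the sketch is the programming model: the application names objects and methods, and persistence is a property of the store rather than something the caller manages.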

  7. Toward a Big Data Science: A challenge of "Science Cloud"

    Science.gov (United States)

    Murata, Ken T.; Watanabe, Hidenobu

    2013-04-01

    cloud, which we named OneSpaceNet (OSN), is the first open cloud system for scientists who are going to carry out their informatics for their own science. The science cloud is not for simple uses. Many functions are expected of the science cloud, such as data standardization, data collection and crawling, a large and distributed data storage system, security and reliability, databases and meta-databases, data stewardship, long-term data preservation, data rescue and preservation, data mining, parallel processing, data publication and provision, the semantic web, 3D and 4D visualization, outreach and inreach, and capacity building. The figure (not shown here) is a schematic picture of the NICT science cloud. Both types of data, from observation and from simulation, are stored in the storage system of the science cloud. It should be noted that there are two types of observational data. One comes from archive sites outside the cloud: data downloaded through the Internet into the cloud. The other comes from equipment directly connected to the science cloud; such setups are often called sensor clouds. In the present talk, we first introduce the NICT science cloud. We then demonstrate the efficiency of the science cloud, showing several scientific results which we achieved with this cloud system. Through these discussions and demonstrations, the potential performance of the science cloud will be revealed for any research field.

  8. Can Nuclear Installations and Research Centres Adopt Cloud Computing Platform?

    International Nuclear Information System (INIS)

    Pichan, A.; Lazarescu, M.; Soh, S.T.

    2015-01-01

    Cloud computing is arguably one of the most significant recent advances in information technology. It produces transformative changes in the history of computing and presents many promising technological and economic opportunities. The pay-per-use model, the computing power, abundance of storage, skilled resources, fault tolerance and the economy of scale it offers provide significant advantages to enterprises adopting the cloud platform for their business needs. However, customers, especially those dealing with national security, high-end scientific research institutions, and critical national infrastructure service providers (like power and water), remain very reluctant to move their business systems to the cloud. One of the main concerns is the question of information security in the cloud and the threat of the unknown. Cloud Service Providers (CSPs) indirectly encourage this perception by not letting their customers see what is behind their virtual curtain. Jurisdiction (information assets being stored elsewhere), data duplication, multi-tenancy, virtualisation and the decentralized nature of data processing are default characteristics of cloud computing. Therefore the traditional approach of enforcing and implementing security controls remains a big challenge and largely depends upon the service provider. The other major challenge and open issue is the ability to perform digital forensic investigations in the cloud in case of security breaches. Traditional approaches to evidence collection and recovery are no longer practical, as they rely on unrestricted access to the relevant systems and user data, something that is not available in the cloud model. This continues to fuel insecurity among cloud customers. In this paper we analyze the cyber security and digital forensics challenges, issues and opportunities for nuclear facilities adopting cloud computing. 
We also discuss the due diligence process and applicable industry best practices which shall be

  9. A User-Customized Virtual Network Platform for NaaS Cloud

    Directory of Open Access Journals (Sweden)

    Lei Xiao

    2016-01-01

    Full Text Available Public cloud providers currently treat computing and storage resources as the user's main demand, making it difficult for users to deploy complex networks in the public cloud. This paper proposes a virtual cloud platform with the network as the core demand of the user, which can provide the user with the capacity for free network architecture as well as all kinds of virtual resources. Networks are isolated by port groups of the virtual distributed switch, and data forwarding and access control between different network segments are implemented by virtual machines loading a soft-routing system. This paper also studies the management interface of the network architecture and a uniform way to connect to the remote desktops of virtual resources on the web, hoping to provide some new ideas for the Network as a Service model.
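    The segment-isolation and soft-routing idea in this abstract can be illustrated with a small sketch. All names, addresses, and the whitelist below are hypothetical, invented for illustration: hosts are assigned to isolated segments (the port-group role), and a software router decides which inter-segment flows are permitted.

    ```python
    # Hypothetical segment layout: host IP -> segment (port-group isolation).
    SEGMENTS = {
        "10.0.1.5": "web",
        "10.0.2.7": "app",
        "10.0.3.9": "db",
    }

    # Hypothetical access-control whitelist of allowed (src, dst) segment pairs.
    ALLOWED = {("web", "app"), ("web", "db"), ("app", "db")}

    def soft_route(src_ip, dst_ip):
        """Return True if the soft router forwards traffic between the
        two hosts' segments; same-segment traffic never reaches the
        router, since it is handled by switch-level isolation."""
        src, dst = SEGMENTS[src_ip], SEGMENTS[dst_ip]
        if src == dst:
            raise ValueError("same-segment traffic is switched, not routed")
        return (src, dst) in ALLOWED

    print(soft_route("10.0.1.5", "10.0.3.9"))  # True  (web -> db allowed)
    print(soft_route("10.0.3.9", "10.0.1.5"))  # False (db -> web blocked)
    ```

    In the paper's design the forwarding decision is made by a virtual machine loading a soft-routing system; the sketch only captures the policy shape, not the data plane.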

  10. Phylogenetic Variation of Chelonid Alphaherpesvirus 5 (ChHV5) in Populations of Green Turtles Chelonia mydas along the Queensland Coast, Australia.

    Science.gov (United States)

    Ariel, E; Nainu, F; Jones, K; Juntunen, K; Bell, I; Gaston, J; Scott, J; Trocini, S; Burgess, G W

    2017-09-01

    Sea turtle fibropapillomatosis (FP) is a disease marked by the proliferation of benign but debilitating cutaneous and occasional visceral tumors, likely caused by chelonid alphaherpesvirus 5 (ChHV5). This study presents a phylogeny of ChHV5 strains found on the east coast of Queensland, Australia, and a validation of previously unused primers. Two different primer sets (gB-1534 and gB-813) were designed to target a region including part of the UL27 glycoprotein B (gB) gene and part of UL28 of ChHV5. Sequences obtained from FP tumors found on juvenile green turtles Chelonia mydas in Queensland grouped into distinct clusters. The clusters reflect the collection sites on the east coast of Queensland with a definitive north-south trend. Received October 22, 2016; accepted May 7, 2017.

  11. +Cloud: An Agent-Based Cloud Computing Platform

    OpenAIRE

    González, Roberto; Hernández de la Iglesia, Daniel; de la Prieta Pintado, Fernando; Gil González, Ana Belén

    2017-01-01

    Cloud computing is revolutionizing the services provided through the Internet, and is continually adapting itself in order to maintain the quality of its services. This study presents the platform +Cloud, which proposes a cloud environment for storing information and files by following the cloud paradigm. This study also presents Warehouse 3.0, a cloud-based application that has been developed to validate the services provided by +Cloud.

  12. Large-Scale, Parallel, Multi-Sensor Atmospheric Data Fusion Using Cloud Computing

    Science.gov (United States)

    Wilson, B. D.; Manipon, G.; Hua, H.; Fetzer, E. J.

    2013-12-01

    NASA's Earth Observing System (EOS) is an ambitious facility for studying global climate change. The mandate now is to combine measurements from the instruments on the 'A-Train' platforms (AIRS, AMSR-E, MODIS, MISR, MLS, and CloudSat) and other Earth probes to enable large-scale studies of climate change over decades. Moving to multi-sensor, long-duration analyses of important climate variables presents serious challenges for large-scale data mining and fusion. For example, one might want to compare temperature and water vapor retrievals from one instrument (AIRS) to another (MODIS), and to a model (MERRA), stratify the comparisons using a classification of the 'cloud scenes' from CloudSat, and repeat the entire analysis over 10 years of data. To efficiently assemble such datasets, we are utilizing Elastic Computing in the Cloud and parallel map/reduce-based algorithms. However, these are data-intensive computing problems, so data transfer times and storage costs (for caching) are key issues. SciReduce is a Hadoop-like parallel analysis system, programmed in parallel Python, that is designed from the ground up for Earth science. SciReduce executes inside VMware images and scales to any number of nodes in the Cloud. Unlike Hadoop, SciReduce operates on bundles of named numeric arrays, which can be passed in memory or serialized to disk in netCDF4 or HDF5. Figure 1 shows the architecture of the full computational system, with SciReduce at the core. Multi-year datasets are automatically 'sharded' by time and space across a cluster of nodes so that years of data (millions of files) can be processed in a massively parallel way. Input variables (arrays) are pulled on demand into the Cloud using OPeNDAP URLs or other subsetting services, thereby minimizing the size of the cached input and intermediate datasets. We are using SciReduce to automate the production of multiple versions of a ten-year A-Train water vapor climatology under a NASA MEASURES grant. We will
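    The map/reduce-over-sharded-arrays model described above can be sketched in plain Python. This is a minimal illustration under assumed names (the shard layout, variable name, and helper functions are invented for the example, not SciReduce's API): each time shard maps to a partial result, and the reduce step combines partials into a per-year climatology.

    ```python
    from collections import defaultdict

    # Hypothetical shards: each is a bundle of named numeric arrays,
    # 'sharded' by time as in the SciReduce description.
    shards = [
        {"year": 2003, "water_vapor": [1.0, 2.0, 3.0]},
        {"year": 2003, "water_vapor": [3.0, 4.0]},
        {"year": 2004, "water_vapor": [2.0, 2.0]},
    ]

    def map_shard(shard):
        """Map step: reduce one shard to a partial (sum, count) keyed by year."""
        values = shard["water_vapor"]
        return shard["year"], (sum(values), len(values))

    def reduce_partials(pairs):
        """Reduce step: merge partial (sum, count) pairs into yearly means."""
        totals = defaultdict(lambda: [0.0, 0])
        for year, (s, n) in pairs:
            totals[year][0] += s
            totals[year][1] += n
        return {year: s / n for year, (s, n) in totals.items()}

    climatology = reduce_partials(map(map_shard, shards))
    print(climatology)  # {2003: 2.6, 2004: 2.0}
    ```

    Because each shard reduces to a small partial result, the map calls can run on separate nodes and only the partials need to travel over the network, which is the property that makes the cached-input sizes manageable.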

  13. Moving image analysis to the cloud: A case study with a genome-scale tomographic study

    Energy Technology Data Exchange (ETDEWEB)

    Mader, Kevin [4Quant Ltd., Switzerland & Institute for Biomedical Engineering at University and ETH Zurich (Switzerland); Stampanoni, Marco [Institute for Biomedical Engineering at University and ETH Zurich, Switzerland & Swiss Light Source at Paul Scherrer Institut, Villigen (Switzerland)

    2016-01-28

    Over the last decade, the time required to measure a terabyte of microscopic imaging data has gone from years to minutes. This shift has moved many of the challenges away from experimental design and measurement to scalable storage, organization, and analysis. As many scientists and scientific institutions lack training and competencies in these areas, major bottlenecks have arisen and led to substantial delays and gaps between measurement, understanding, and dissemination. We present in this paper a framework for analyzing large 3D datasets using cloud-based computational and storage resources. We demonstrate its applicability by showing the setup and costs associated with the analysis of a genome-scale study of bone microstructure. We then evaluate the relative advantages and disadvantages associated with local versus cloud infrastructures.

  14. Moving image analysis to the cloud: A case study with a genome-scale tomographic study

    International Nuclear Information System (INIS)

    Mader, Kevin; Stampanoni, Marco

    2016-01-01

    Over the last decade, the time required to measure a terabyte of microscopic imaging data has gone from years to minutes. This shift has moved many of the challenges away from experimental design and measurement to scalable storage, organization, and analysis. As many scientists and scientific institutions lack training and competencies in these areas, major bottlenecks have arisen and led to substantial delays and gaps between measurement, understanding, and dissemination. We present in this paper a framework for analyzing large 3D datasets using cloud-based computational and storage resources. We demonstrate its applicability by showing the setup and costs associated with the analysis of a genome-scale study of bone microstructure. We then evaluate the relative advantages and disadvantages associated with local versus cloud infrastructures

  15. Cloud-based preoperative planning for total hip arthroplasty: a study of accuracy, efficiency, and compliance.

    Science.gov (United States)

    Maratt, Joseph D; Srinivasan, Ramesh C; Dahl, William J; Schilling, Peter L; Urquhart, Andrew G

    2012-08-01

    As digital radiography becomes more prevalent, several systems for digital preoperative planning have become available. The purpose of this study was to evaluate the accuracy and efficiency of an inexpensive, cloud-based digital templating system; its accuracy was comparable with that of acetate templating. However, cloud-based templating was substantially faster and more convenient than acetate templating or locally installed software. Although this is a practical solution for this particular medical application, regulatory changes are necessary before the tremendous advantages of cloud-based storage and computing can be realized in medical research and clinical practice. Copyright 2012, SLACK Incorporated.

  16. Cloud computing models and their application in LTE based cellular systems

    NARCIS (Netherlands)

    Staring, A.J.; Karagiannis, Georgios

    2013-01-01

    As cloud computing emerges as the next novel concept in computer science, it becomes clear that the model applied in large data storage systems, used to resolve issues arising from increasing demand, could also be used to resolve the very high bandwidth requirements on access network, core

  17. Cryptography in the Cloud Computing: the Current State and Logical Tasks

    Directory of Open Access Journals (Sweden)

    Sergey Nikolaevich Kyazhin

    2013-09-01

    Full Text Available The current state of cloud computing (CC) information security is analysed, and the key problems of secure storage and data transmission in CC are identified. Cryptographic methods of data security in CC, in particular lightweight cryptography and cryptography based on bilinear pairings, are described.

  18. RAIN: A Bio-Inspired Communication and Data Storage Infrastructure.

    Science.gov (United States)

    Monti, Matteo; Rasmussen, Steen

    2017-01-01

    We summarize the results and perspectives from a companion article, where we presented and evaluated an alternative architecture for data storage in distributed networks. We name the bio-inspired architecture RAIN, and it offers a file storage service that, in contrast with current centralized cloud storage, has privacy by design, is open source, is more secure, is scalable, is more sustainable, has community ownership, is inexpensive, and is potentially faster, more efficient, and more reliable. We propose that a RAIN-style architecture could form the backbone of the Internet of Things that will likely integrate multiple current and future infrastructures ranging from online services and cryptocurrency to parts of government administration.

  19. Silicon Photonics Cloud (SiCloud)

    DEFF Research Database (Denmark)

    DeVore, P. T. S.; Jiang, Y.; Lynch, M.

    2015-01-01

    Silicon Photonics Cloud (SiCloud.org) is the first silicon photonics interactive web tool. Here we report new features of this tool, including mode propagation parameters and mode distribution galleries for user-specified waveguide dimensions and wavelengths.

  20. A hybrid cloud read aligner based on MinHash and kmer voting that preserves privacy

    Science.gov (United States)

    Popic, Victoria; Batzoglou, Serafim

    2017-05-01

    Low-cost clouds can alleviate the compute and storage burden of the genome sequencing data explosion. However, moving personal genome data analysis to the cloud can raise serious privacy concerns. Here, we devise a method named Balaur, a privacy preserving read mapper for hybrid clouds based on locality sensitive hashing and kmer voting. Balaur can securely outsource a substantial fraction of the computation to the public cloud, while being highly competitive in accuracy and speed with non-private state-of-the-art read aligners on short read data. We also show that the method is significantly faster than the state of the art in long read mapping. Therefore, Balaur can enable institutions handling massive genomic data sets to shift part of their analysis to the cloud without sacrificing accuracy or exposing sensitive information to an untrusted third party.
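    The MinHash component of the approach described above can be illustrated with a small sketch. This is not the Balaur implementation; the sequences, kmer length, and hash construction below are invented for illustration. The idea is that a read and a reference window with similar kmer sets will share many minimal hash values, so comparing short signatures estimates their Jaccard similarity.

    ```python
    import hashlib

    def kmers(seq, k=4):
        """All length-k substrings of a sequence, as a set."""
        return {seq[i:i + k] for i in range(len(seq) - k + 1)}

    def minhash_signature(kmer_set, num_hashes=16):
        """One salted hash per signature slot; keep the minimum hash
        value of the set in each slot."""
        return [
            min(int(hashlib.sha1(f"{salt}:{km}".encode()).hexdigest(), 16)
                for km in kmer_set)
            for salt in range(num_hashes)
        ]

    def similarity(sig_a, sig_b):
        """Fraction of matching slots estimates Jaccard similarity."""
        return sum(a == b for a, b in zip(sig_a, sig_b)) / len(sig_a)

    read = "ACGTACGTTGCA"
    ref_close = "ACGTACGTTGCT"   # one substitution relative to the read
    ref_far = "TTTTGGGGCCCC"     # unrelated sequence

    s_read = minhash_signature(kmers(read))
    close_sim = similarity(s_read, minhash_signature(kmers(ref_close)))
    far_sim = similarity(s_read, minhash_signature(kmers(ref_far)))
    print(close_sim > far_sim)
    ```

    In a privacy-preserving setting such as Balaur's, only these compact hashed signatures need to leave the private side, which is what lets a substantial fraction of the comparison work move to the public cloud.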