WorldWideScience

Sample records for nonprecipitating cloud systems

  1. Cloud residues and interstitial aerosols from non-precipitating clouds over an industrial and urban area in northern China

    Science.gov (United States)

    Li, Weijun; Li, Peiren; Sun, Guode; Zhou, Shengzhen; Yuan, Qi; Wang, Wenxing

    2011-05-01

    Most studies of aerosol-cloud interactions have been conducted in remote locations; few have investigated the characterization of cloud condensation nuclei (CCN) over highly polluted urban and industrial areas. The present work, based on samples collected at Mt. Tai, a site in northern China affected by nearby urban and industrial air pollutant emissions, illuminates CCN properties in a polluted atmosphere. High-resolution transmission electron microscopy (TEM) was used to obtain the size, composition, and mixing state of individual cloud residues and interstitial aerosols. Most of the cloud residues displayed distinct rims which were found to consist of soluble organic matter (OM). Nearly all (91.7%) cloud residues were attributed to sulfate-related salts (the remainder was mostly coarse crustal dust particles with nitrate coatings). Half the salt particles were internally mixed with two or more refractory particles (e.g., soot, fly ash, crustal dust, CaSO4, and OM). A comparison between cloud residues and interstitial particles shows that the former contained more salts and were of larger particle size than the latter. In addition, a relatively high number scavenging ratio of 0.54 was observed during cloud formation. Therefore, the mixtures of salts with OM account for most of the cloud-nucleating ability of the entire aerosol population in the polluted air of northern China. We advocate that both size and composition - the two influential, controlling factors for aerosol activation - should be built into all regional climate models of China.
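    The reported number scavenging ratio follows directly from the counts of cloud residues and interstitial particles. A minimal sketch of the computation (the function name and example counts are ours, purely illustrative):

```python
def number_scavenging_ratio(n_residues, n_interstitial):
    """Fraction of the aerosol number population incorporated into
    cloud droplets (residues) rather than left interstitial."""
    return n_residues / (n_residues + n_interstitial)

# e.g. 54 residues and 46 interstitial particles per unit volume
# yield the ratio of 0.54 quoted in the abstract
```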

  2. Wind field measurement in the nonprecipitous regions surrounding storms by an airborne pulsed Doppler lidar system, appendix A

    Science.gov (United States)

    Bilbro, J. W.; Vaughan, W. W.

    1980-01-01

    Coherent Doppler lidar appears to hold great promise in contributing to the basic store of knowledge concerning flow field characteristics in the nonprecipitous regions surrounding severe storms. The Doppler lidar, through its ability to measure clear air returns, augments the conventional Doppler radar system, which is most useful in the precipitous regions of the storm. A brief description of the Doppler lidar severe storm measurement system is provided along with the technique to be used in performing the flow field measurements. The application of the lidar is addressed, and the planned measurement program is outlined.

  3. Benchmarking Cloud Storage Systems

    OpenAIRE

    Wang, Xing

    2014-01-01

    With the rise of cloud computing, many cloud storage systems like Dropbox, Google Drive and Mega have been built to provide decentralized and reliable file storage. It is thus of prime importance to know their features, performance, and the best way to make use of them. In this context, we introduce BenchCloud, a tool designed as part of this thesis to conveniently and efficiently benchmark any cloud storage system. First, we provide a study of six commonly-used cloud storage systems to ident...

  4. Military clouds: utilization of cloud computing systems at the battlefield

    Science.gov (United States)

    Süleyman, Sarıkürk; Volkan, Karaca; İbrahim, Kocaman; Ahmet, Şirzai

    2012-05-01

    Cloud computing is known as a novel information technology (IT) concept, which involves facilitated and rapid access to networks, servers, storage media, applications and services via the Internet with minimum hardware requirements. Use of information systems and technologies at the battlefield is not new. Information superiority is a force multiplier and is crucial to mission success. Recent advances in information systems and technologies provide new means to decision makers and users in order to gain information superiority. These developments in information technologies lead to a new term, which is known as network centric capability. Similar to network centric capable systems, cloud computing systems are operational today. In the near future, extensive use of military clouds at the battlefield is predicted. Integrating cloud computing logic into network centric applications will increase the flexibility, cost-effectiveness, efficiency and accessibility of network-centric capabilities. In this paper, cloud computing and network centric capability concepts are defined. Some commercial cloud computing products and applications are mentioned. Network centric capable applications are covered. Cloud computing supported battlefield applications are analyzed. The effects of cloud computing systems on network centric capability and on the information domain in future warfare are discussed. Battlefield opportunities and novelties which might be introduced to network centric capability by cloud computing systems are explored. The role of military clouds in future warfare is proposed in this paper. It was concluded that military clouds will be indispensable components of the future battlefield. Military clouds have the potential of improving network centric capabilities, increasing situational awareness at the battlefield and facilitating the settlement of information superiority.

  5. A stratiform cloud parameterization for general circulation models

    International Nuclear Information System (INIS)

    Ghan, S.J.; Leung, L.R.; Chuang, C.C.; Penner, J.E.; McCaa, J.

    1994-01-01

    The crude treatment of clouds in general circulation models (GCMs) is widely recognized as a major limitation in applying these models to predictions of global climate change. The purpose of this project is to develop in GCMs a stratiform cloud parameterization that expresses clouds in terms of bulk microphysical properties and their subgrid variability. Various cloud variables and their interactions are summarized. Precipitating cloud species are distinguished from non-precipitating species, and the liquid phase is distinguished from the ice phase. The size of the non-precipitating cloud particles (which influences both the cloud radiative properties and the conversion of non-precipitating cloud species to precipitating species) is determined by predicting both the mass and number concentrations of each species
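    Predicting both mass and number concentrations fixes a characteristic particle size, which in turn controls the radiative properties and the conversion to precipitating species. A minimal sketch of that diagnostic step (constants and function names are ours, not part of the parameterization itself):

```python
import math

def mean_volume_radius(q_c, n_c, rho_w=1000.0):
    """Characteristic droplet radius (m) diagnosed from cloud water
    mass mixing ratio q_c (kg per kg of air) and droplet number
    concentration n_c (per kg of air); rho_w is liquid water density."""
    if n_c <= 0.0:
        return 0.0
    vol = q_c / (n_c * rho_w)                        # mean droplet volume, m^3
    return (3.0 * vol / (4.0 * math.pi)) ** (1.0 / 3.0)
```

    For typical stratiform values (q_c = 1 g/kg, n_c = 10^8 per kg of air) this gives a radius of roughly 13 micrometres.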

  6. Security for cloud storage systems

    CERN Document Server

    Yang, Kan

    2014-01-01

    Cloud storage is an important service of cloud computing, which offers data owners a service for hosting their data in the cloud. This new paradigm of data hosting and data access services introduces two major security concerns. The first is the protection of data integrity: data owners may not fully trust the cloud server and may worry that data stored in the cloud could be corrupted or even removed. The second is data access control: data owners may worry that some dishonest servers will, for profit, grant data access to users who are not permitted, and thus they can no longer rely on the servers
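    The first concern, data integrity, is commonly addressed by having the owner keep a digest of each object and re-check it on retrieval. A minimal sketch of that idea (an illustration only; the book's protocols go much further, e.g. auditing integrity remotely without downloading the data):

```python
import hashlib

def store_digest(blob):
    """Owner side: record a SHA-256 digest before uploading the blob."""
    return hashlib.sha256(blob).hexdigest()

def integrity_ok(local_digest, retrieved_blob):
    """On download, verify the cloud copy still matches the digest."""
    return hashlib.sha256(retrieved_blob).hexdigest() == local_digest
```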

  7. Advanced cloud fault tolerance system

    Science.gov (United States)

    Sumangali, K.; Benny, Niketa

    2017-11-01

    Cloud computing has become a prevalent on-demand service on the internet for storing, managing and processing data. A pitfall that accompanies cloud computing is the possibility of failures in the cloud. To overcome these failures, a fault tolerance mechanism is required to abstract faults away from users. We have proposed a fault-tolerant architecture which combines proactive and reactive fault tolerance. This architecture essentially increases the reliability and availability of the cloud. In the future, we would like to compare evaluations of our proposed architecture with existing architectures and further improve it.

  8. Cloud Computing for Standard ERP Systems

    DEFF Research Database (Denmark)

    Schubert, Petra; Adisa, Femi

    Cloud Computing is a topic that has gained momentum in the last years. Current studies show that an increasing number of companies is evaluating the promised advantages and considering making use of cloud services. In this paper we investigate the phenomenon of cloud computing and its importance for the operation of ERP systems. We argue that the phenomenon of cloud computing could lead to a decisive change in the way business software is deployed in companies. Our reference framework contains three levels (IaaS, PaaS, SaaS) and clarifies the meaning of public, private and hybrid clouds. The three levels of cloud computing and their impact on ERP systems operation are discussed. From the literature we identify areas for future research and propose a research agenda.

  9. A stratiform cloud parameterization for General Circulation Models

    International Nuclear Information System (INIS)

    Ghan, S.J.; Leung, L.R.; Chuang, C.C.; Penner, J.E.; McCaa, J.

    1994-01-01

    The crude treatment of clouds in General Circulation Models (GCMs) is widely recognized as a major limitation in the application of these models to predictions of global climate change. The purpose of this project is to develop a parameterization for stratiform clouds in GCMs that expresses stratiform clouds in terms of bulk microphysical properties and their subgrid variability. In this parameterization, precipitating cloud species are distinguished from non-precipitating species, and the liquid phase is distinguished from the ice phase. The size of the non-precipitating cloud particles (which influences both the cloud radiative properties and the conversion of non-precipitating cloud species to precipitating species) is determined by predicting both the mass and number concentrations of each species

  10. Osprey: Operating system for predictable clouds

    NARCIS (Netherlands)

    Sacha, Jan; Napper, Jeff; Mullender, Sape J.; McKie, Jim

    2012-01-01

    Cloud computing is currently based on hardware virtualization wherein a host operating system provides a virtual machine interface nearly identical to that of physical hardware to guest operating systems. Full transparency allows backward compatibility with legacy software but introduces

  11. Data backup security in cloud storage system

    OpenAIRE

    Atayan, Boris Gennadievich; National Polytechnic University of Armenia; Baghdasaryan, Tatevik Araevna; National Polytechnic University of Armenia

    2016-01-01

    A cloud backup system is proposed which provides means for effective creation, secure storage and restoration of backups in the cloud. For data archiving, the system uses a new, efficient SGBP file format based on the DEFLATE compression algorithm. The proposed format enables fast creation of archives that can contain significant amounts of data. Modern approaches to backup archive protection are described in the paper. The SGBP format is also compared to the heavily used ZIP format (both Z...

  12. Security, privacy and trust in cloud systems

    CERN Document Server

    Nepal, Surya

    2013-01-01

    The book compiles technologies for enhancing and provisioning security, privacy and trust in cloud systems based on Quality of Service requirements. It is a timely contribution to a field that is gaining considerable research interest and momentum, and it provides comprehensive coverage of technologies related to cloud security, privacy and trust. In particular, the book includes - Cloud security fundamentals and related technologies to date, with a comprehensive coverage of evolution, current landscape, and future roadmap. - A smooth organization with introductory, advanced and specialist content

  13. A cost modelling system for cloud computing

    OpenAIRE

    Ajeh, Daniel; Ellman, Jeremy; Keogh, Shelagh

    2014-01-01

    An advance in technology unlocks new opportunities for organizations to increase their productivity, efficiency and process automation while also reducing the cost of doing business. The emergence of cloud computing addresses these prospects through the provision of agile systems that are scalable, flexible, reliable and cost-effective. Cloud computing has made hosting and deployment of computing resources cheaper and easier, with no up-front charges but pay per-use flexible payme...

  14. Comparing and Merging Observation Data from Ka-Band Cloud Radar, C-Band Frequency-Modulated Continuous Wave Radar and Ceilometer Systems

    Directory of Open Access Journals (Sweden)

    Liping Liu

    2017-12-01

    A field experiment in South China was undertaken to improve understanding of cloud and precipitation properties. Measurements of the vertical structures of non-precipitating and precipitating clouds were obtained using passive and active remote sensing equipment: a Ka-band cloud radar (CR) system, a C-band frequency-modulated continuous wave vertically pointing radar (CVPR), a microwave radiometer and a laser ceilometer (CEIL). CR plays a key role in high-level cloud observation, whereas CVPR is important for observing low- and mid-level clouds and heavy precipitation. CEIL helps diminish "clear-sky" effects in the planetary boundary layer. The experiment took place in Longmen, Guangdong Province, China from May to September of 2016. This study focuses on evaluating the ability of the two radars to deliver consistent observation data and develops an algorithm to merge the CR, CVPR and CEIL data. Cloud echo base, thickness, frequency of observed cloud types and vertical distributions of reflectivity are analyzed in the radar data. Comparisons between the collocated data sets show that reflectivity biases among the CR's three operating modes are less than 2 dB. With attenuation correction, the averaged difference between CR and CVPR reflectivity can be reduced from 4.82 dB to 3.57 dB. No systematic biases were observed between velocity data collected in the three CR modes and CVPR. The corrected CR reflectivity and velocity data were then merged with the CVPR and CEIL data to fill in the gaps during heavy precipitation periods and to reduce the effects of Bragg scattering and fog on cloud observations in the boundary layer. Meanwhile, the merging of velocity data with different Nyquist velocities and resolutions diminishes velocity folding to provide fine-grained information about cloud and precipitation dynamics. The three daily periods in which low-level clouds tended to occur were at sunrise, noon and sunset and large
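    The calibration behind the quoted reduction from 4.82 dB to 3.57 dB amounts to estimating and removing a mean reflectivity offset over collocated gates. A simplified sketch (names and sample values are ours; the paper's attenuation correction itself depends on range and precipitation):

```python
def mean_bias_db(z_cr, z_cvpr):
    """Average CR-minus-CVPR reflectivity difference (dB)
    over collocated range gates."""
    diffs = [a - b for a, b in zip(z_cr, z_cvpr)]
    return sum(diffs) / len(diffs)

def remove_bias(z_cr, bias_db):
    """Shift the CR reflectivities so the two radars agree on average."""
    return [z - bias_db for z in z_cr]
```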

  15. PC-Cluster based Storage System Architecture for Cloud Storage

    OpenAIRE

    Yee, Tin Tin; Naing, Thinn Thu

    2011-01-01

    The design and architecture of a cloud storage system plays a vital role in cloud computing infrastructure, improving storage capacity as well as cost-effectiveness. Usually, a cloud storage system provides users with efficient, elastic storage space. One of the challenges of a cloud storage system is that it is difficult to balance providing huge elastic storage capacity against the expensive investment it requires. In order to solve this issue in the cloud storage infrastructure, low ...

  16. Trust Management System for Opportunistic Cloud Services

    DEFF Research Database (Denmark)

    Kuada, Eric

    2013-01-01

    We have over the past three years been working on the feasibility of Opportunistic Cloud Services (OCS) for enterprises. OCS is about enterprises strategically contributing and utilizing spare IT resources as cloud services. One of the major challenges that such a platform faces is data security and trust management issues. This paper presents a trust management system for OCS platforms. It models the concept of trust and applies it to OCS platforms. The trust model and the trust management system are verified through the simulation of the computation of the trust values with Infrastructure...

  17. Temporally rendered automatic cloud extraction (TRACE) system

    Science.gov (United States)

    Bodrero, Dennis M.; Yale, James G.; Davis, Roger E.; Rollins, John M.

    1999-10-01

    Smoke/obscurant testing requires that 2D cloud extent be extracted from visible and thermal imagery. These data are used alone or in combination with 2D data from other aspects to make 3D calculations of cloud properties, including dimensions, volume, centroid, travel, and uniformity. Determining cloud extent from imagery has historically been a time-consuming manual process. To reduce the time and cost associated with smoke/obscurant data processing, automated methods to extract cloud extent from imagery were investigated. The TRACE system described in this paper was developed and implemented at U.S. Army Dugway Proving Ground, UT by the Science and Technology Corporation--Acuity Imaging Incorporated team with Small Business Innovation Research funding. TRACE uses dynamic background subtraction and 3D fast Fourier transform as primary methods to discriminate the smoke/obscurant cloud from the background. TRACE has been designed to run on a PC-based platform using Windows. The PC-Windows environment was chosen for portability, to give TRACE the maximum flexibility in terms of its interaction with peripheral hardware devices such as video capture boards, removable media drives, network cards, and digital video interfaces. Video for Windows provides all of the necessary tools for the development of the video capture utility in TRACE and allows for interchangeability of video capture boards without any software changes. TRACE is designed to take advantage of future upgrades in all aspects of its component hardware. A comparison of cloud extent determined by TRACE with the manual method is included in this paper.
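    The core of dynamic background subtraction can be sketched in a few lines: maintain an exponentially weighted estimate of the cloud-free scene and flag pixels that depart from it. This is a generic illustration, not TRACE's actual implementation; the parameters alpha and thresh are ours:

```python
def update_background(bg, frame, alpha=0.05):
    """Exponentially weighted running estimate of the cloud-free scene."""
    return [(1.0 - alpha) * b + alpha * f for b, f in zip(bg, frame)]

def cloud_mask(bg, frame, thresh=10.0):
    """Flag pixels whose brightness departs from the background by
    more than thresh; these pixels form the 2D cloud extent."""
    return [abs(f - b) > thresh for b, f in zip(bg, frame)]
```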

  18. Point cloud processing for smart systems

    Directory of Open Access Journals (Sweden)

    Jaromír Landa

    2013-01-01

    High population as well as economic pressure emphasises the necessity of effective city management – from land use planning to urban green maintenance. Management effectiveness is based on precise knowledge of the city environment. Point clouds generated by mobile and terrestrial laser scanners provide precise data about objects in the scanner vicinity. From these data the state of roads, buildings, trees and other objects important for the decision-making process can be obtained. Generally, they can support the idea of "smart", or at least "smarter", cities. Unfortunately, point clouds do not provide this type of information automatically; it has to be extracted, either by expert personnel or by object recognition software. As point clouds can represent large areas (streets or even cities), using expert personnel to identify the required objects can be very time-consuming and therefore cost-ineffective. Object recognition software allows us to detect and identify required objects semi-automatically or automatically. The first part of the article reviews and analyses the current state of the art in point cloud object recognition techniques. The following part presents common formats used for point cloud storage and frequently used software tools for point cloud processing. Further, a method for extraction of geospatial information about detected objects is proposed. The method can thus be used not only to recognize the existence and shape of certain objects, but also to retrieve their geospatial properties. These objects can later be used directly in various GIS systems for further analyses.

  19. A MAS-Based Cloud Service Brokering System to Respond Security Needs of Cloud Customers

    Directory of Open Access Journals (Sweden)

    Jamal Talbi

    2017-03-01

    Cloud computing is becoming a key factor in computer science and an important technology for many organizations to deliver different types of services. The companies which provide services to customers are called cloud service providers. Cloud users (CUs) are growing in number and require secure, reliable and trustworthy cloud service providers (CSPs) from the market, so it is a challenge for a new customer to choose a highly secure provider. This paper presents a cloud service brokering system that analyzes and ranks secure cloud service providers among the available providers. The model uses autonomous and flexible agents in a multi-agent system (MAS) that have intelligent behavior and suitable tools for helping the brokering system assess the security risks of a group of cloud providers, decide on the most secure provider, and justify the business needs of users in terms of security and reliability.

  20. Comparison of cloud customer relationship management systems

    OpenAIRE

    Remic, Anja

    2014-01-01

    This thesis shows what cloud computing looks like. A customer relationship management (CRM) system consisting of four types will be described. Each type is represented by its most important properties and by which type would suit which company. Relationships between customers and companies are very important because of the long-lasting cooperation between them and the regular income it brings. It is interesting how a company operates without a CRM system and how a company that uses it da...

  1. National electronic medical records integration on cloud computing system.

    Science.gov (United States)

    Mirza, Hebah; El-Masri, Samir

    2013-01-01

    Few healthcare providers have an advanced level of Electronic Medical Record (EMR) adoption; others have a low level, and most have no EMR at all. Cloud computing is a new, emerging technology that has been used in other industries with great success. Despite its great features, cloud computing has not yet been widely utilized in the healthcare industry. This study presents an innovative healthcare cloud computing system for integrating Electronic Health Records (EHR). The proposed cloud system applies cloud computing technology to the EHR system to present a comprehensive, integrated EHR environment.

  2. Design Private Cloud of Oil and Gas SCADA System

    Directory of Open Access Journals (Sweden)

    Liu Miao

    2014-05-01

    SCADA (Supervisory Control and Data Acquisition) is a computer-based supervisory control system. SCADA systems are very important to oil and gas pipeline engineering. Cloud computing is fundamentally altering the expectations for how and when computing, storage and networking resources should be allocated, managed and consumed. In order to increase the resource utilization, reliability and availability of oil and gas pipeline SCADA systems, a SCADA system based on cloud computing is proposed in this paper. The paper introduces the system framework of the SCADA system based on cloud computing and the implementation details of the private cloud platform for the SCADA system.

  3. Moving ERP Systems to the Cloud - Data Security Issues

    Directory of Open Access Journals (Sweden)

    Pablo Saa

    2017-08-01

    This paper brings to light data security issues and concerns for organizations moving their Enterprise Resource Planning (ERP) systems to the cloud. Cloud computing has become the new trend in how organizations conduct business and has enabled them to innovate and compete in a dynamic environment through new and innovative business models. The growing popularity and success of the cloud has led to the emergence of cloud-based Software-as-a-Service (SaaS) ERP systems, a new alternative approach to traditional on-premise ERP systems. Cloud-based ERP has a myriad of benefits for organizations. However, infrastructure engineers need to address data security issues before moving their enterprise applications to the cloud. Cloud-based ERP raises specific concerns about the confidentiality and integrity of the data stored in the cloud. Such concerns that affect the adoption of cloud-based ERP depend on the size of the organization. Small to medium enterprises (SMEs) gain the maximum benefits from cloud-based ERP as many of the concerns around data security are not relevant to them. On the contrary, larger organizations are more cautious in moving their mission-critical enterprise applications to the cloud. A hybrid solution where organizations can choose to keep their sensitive applications on-premise while leveraging the benefits of the cloud is proposed in this paper as an effective solution that is gaining momentum and popularity for large organizations.

  4. Research on cloud-based remote measurement and analysis system

    Science.gov (United States)

    Gao, Zhiqiang; He, Lingsong; Su, Wei; Wang, Can; Zhang, Changfan

    2015-02-01

    The promising potential of cloud computing and its convergence with technologies such as cloud storage, cloud push and mobile computing allows for the creation and delivery of new types of cloud services. Following the principles of cloud computing, this paper presents a cloud-based remote measurement and analysis system. The system mainly consists of three parts: a signal acquisition client, a web server deployed on the cloud service, and a remote client. It is implemented as a website developed using ASP.NET and Flex RIA technology, which resolves the trade-off between the two monitoring modes, B/S and C/S. The platform, deployed on the cloud server, supplies condition monitoring and data analysis services to customers over the Internet. The signal acquisition device is responsible for collecting data (sensor data, audio, video, etc.) and regularly pushing the monitoring data to the cloud storage database. Data acquisition equipment in this system needs only data collection and networking functions, as in a smartphone or smart sensor. The system's scale can adjust dynamically according to the number of applications and users, so it does not waste resources. As a representative case study, we developed a prototype system based on the Ali cloud service, using a rotor test rig as the research object. Experimental results demonstrate that the proposed system architecture is feasible.

  5. Flexible solution for interoperable cloud healthcare systems.

    Science.gov (United States)

    Vida, Mihaela Marcella; Lupşe, Oana Sorina; Stoicu-Tivadar, Lăcrămioara; Bernad, Elena

    2012-01-01

    It is extremely important for the healthcare domain to have standardized communication, because it will improve the quality of information and, in the end, the resulting benefits will improve the quality of patients' lives. The standards proposed for use are HL7 CDA and CCD. For better access to medical data, a solution based on cloud computing (CC) is investigated. CC is a technology that supports flexibility, seamless care, and reduced costs of the medical act. To ensure interoperability between healthcare information systems, a solution creating a Web Custom Control is presented. The control shows the database tables and fields used to configure the two standards. This control will facilitate the work of medical staff and hospital administrators, because they can easily configure the local system and prepare it for communication with other systems. The resulting information will have higher quality and will provide knowledge that supports better patient management and diagnosis.

  6. Dynamic electronic institutions in agent oriented cloud robotic systems.

    Science.gov (United States)

    Nagrath, Vineet; Morel, Olivier; Malik, Aamir; Saad, Naufal; Meriaudeau, Fabrice

    2015-01-01

    The dot-com bubble burst in the year 2000, followed by a swift movement towards resource virtualization and the cloud computing business model. Cloud computing emerged not as a new form of computing or network technology but as a mere remoulding of existing technologies to suit a new business model. Cloud robotics is understood as the adaptation of cloud computing ideas for robotic applications. Current efforts in cloud robotics concentrate on developing robots that utilize the computing and service infrastructure of the cloud, without debating the underlying business model. HTM5 is an OMG MDA-based meta-model for agent-oriented development of cloud robotic systems. The trade view of HTM5 promotes peer-to-peer trade amongst software agents. HTM5 agents represent various cloud entities and implement their business logic on cloud interactions. Trade in a peer-to-peer cloud robotic system is based on relationships and contracts amongst several agent subsets. Electronic institutions are associations of heterogeneous intelligent agents which interact with each other following predefined norms. In dynamic electronic institutions, the process of formation, reformation and dissolution of institutions is automated, leading to run-time adaptations in groups of agents. DEIs in agent-oriented cloud robotic ecosystems bring order and group intellect. This article presents DEI implementations through the HTM5 methodology.

  7. Dynamic Extensions of Batch Systems with Cloud Resources

    International Nuclear Information System (INIS)

    Hauth, T; Quast, G; Büge, V; Scheurer, A; Kunze, M; Baun, C

    2011-01-01

    Compute clusters use Portable Batch Systems (PBS) to distribute workload among individual cluster machines. To extend standard batch systems to Cloud infrastructures, a new service monitors the number of queued jobs and keeps track of the price of available resources. This meta-scheduler dynamically adapts the number of Cloud worker nodes according to the requirement profile. Two different worker node topologies are presented and tested on the Amazon EC2 Cloud service.
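    The scaling decision of such a meta-scheduler can be sketched as: if the current price of resources is acceptable, provision enough worker nodes for the queue, up to a cap. All names and the sizing rule below are ours, a simplification of the scheduler the paper describes:

```python
def target_worker_nodes(queued_jobs, jobs_per_node, max_nodes,
                        spot_price, price_limit):
    """Number of cloud worker nodes to keep running, given the
    batch queue length and the current price of resources."""
    if spot_price > price_limit:
        return 0                                  # too expensive: let jobs queue
    needed = -(-queued_jobs // jobs_per_node)     # ceiling division
    return min(needed, max_nodes)
```

    A monitoring loop would call this periodically and start or terminate Cloud instances to match the returned target.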

  8. On Cloud-Based Engineering of Dependable Systems

    OpenAIRE

    Alajrami, Sami

    2014-01-01

    The cloud computing paradigm is being adopted by many organizations in different application domains as it is cost effective and offers a virtually unlimited pool of resources. Engineering critical systems can benefit from clouds in attaining all dependability means: fault tolerance, fault prevention, fault removal and fault forecasting. Our research aims to investigate the potential of supporting engineering of dependable software systems with cloud computing and proposes an open, extensible...

  9. Predictive Control of Networked Multiagent Systems via Cloud Computing.

    Science.gov (United States)

    Liu, Guo-Ping

    2017-01-18

    This paper studies the design and analysis of networked multiagent predictive control systems via cloud computing. A cloud predictive control scheme for networked multiagent systems (NMASs) is proposed to achieve consensus and stability simultaneously and to compensate for network delays actively. The design of the cloud predictive controller for NMASs is detailed. The analysis of the cloud predictive control scheme gives the necessary and sufficient conditions of stability and consensus of closed-loop networked multiagent control systems. The proposed scheme is verified to characterize the dynamical behavior and control performance of NMASs through simulations. The outcome provides a foundation for the development of cooperative and coordinative control of NMASs and its applications.
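    Active delay compensation in networked predictive control typically works by transmitting a whole sequence of predicted control inputs and letting the actuator pick the entry that matches the measured delay. A minimal sketch of that selection step (a generic illustration, not the paper's controller design):

```python
def select_control(predicted_inputs, delay_steps):
    """Actuator side: given the sequence of control inputs predicted
    at send time, apply the one corresponding to 'now', i.e. offset
    by the number of sampling periods the packet spent in transit."""
    idx = min(delay_steps, len(predicted_inputs) - 1)
    return predicted_inputs[idx]
```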

  10. Moving ERP Systems to the Cloud - Data Security Issues

    OpenAIRE

    Pablo Saa; Andrés Cueva Costales; Oswaldo Moscoso-Zea; Sergio Lujan-Mora

    2017-01-01

    This paper brings to light data security issues and concerns for organizations by moving their Enterprise Resource Planning (ERP) systems to the cloud. Cloud computing has become the new trend of how organizations conduct business and has enabled them to innovate and compete in a dynamic environment through new and innovative business models. The growing popularity and success of the cloud has led to the emergence of cloud-based Software-as-a-Service (SaaS) ERP systems, a new alternative appr...

  11. Coupled fvGCM-GCE Modeling System, 3D Cloud-Resolving Model and Cloud Library

    Science.gov (United States)

    Tao, Wei-Kuo

    2005-01-01

    Recent GEWEX Cloud System Study (GCSS) model comparison projects have indicated that cloud-resolving models (CRMs) agree with observations better than traditional single-column models in simulating various types of clouds and cloud systems from different geographic locations. Current and future NASA satellite programs can provide cloud, precipitation, aerosol and other data at very fine spatial and temporal scales. Using these satellite data to improve the understanding of the physical processes responsible for the variation in global and regional climate and hydrological systems requires a coupled global circulation model (GCM) and cloud-scale model (termed a super-parameterization or multi-scale modeling framework, MMF). The use of a GCM will enable global coverage, and the use of a CRM will allow for better and more sophisticated physical parameterization. NASA satellite and field campaign cloud-related datasets can provide initial conditions as well as validation for both the MMF and CRMs. A seed fund is available at NASA Goddard to build an MMF based on the 2D Goddard Cumulus Ensemble (GCE) model and the Goddard finite volume general circulation model (fvGCM). A prototype MMF is being developed, and production runs will be conducted at the beginning of 2005. In this talk, I will present: (1) a brief review of the GCE model and its applications to precipitation processes, (2) the Goddard MMF and the major differences between the two existing MMFs (CSU MMF and Goddard MMF), (3) a cloud library generated by the Goddard MMF and the 3D GCE model, and (4) a brief discussion of developing the GCE model into a global cloud simulator.

  12. ATLAS Tier-2 monitoring system for the German cloud

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, Joerg; Quadt, Arnulf; Weber, Pavel [II. Physikalisches Institut, Georg-August-Universitaet, Goettingen (Germany)

    2011-07-01

    The ATLAS tier centers in Germany provide their computing resources for the ATLAS experiment. The stable and sustainable operation of this so-called DE-cloud relies heavily on effective monitoring of the Tier-1 center GridKa and its associated Tier-2 centers. Central and local grid information services constantly collect and publish status information from many computing resources and sites. The cloud monitoring system discussed in this presentation evaluates the information related to different cloud resources and provides a coherent and comprehensive view of the cloud. The main monitoring areas covered by the tool are data transfers, cloud software installation, site batch systems, and Service Availability Monitoring (SAM). The cloud monitoring system consists of an Apache-based Python application, which retrieves the information and publishes it on a generated HTML web page. This results in an easy-to-use web interface for the limited number of sites in the cloud, with fast and efficient access to the required information, from a high-level summary for the whole cloud down to detailed diagnostics for individual site services. This approach provides efficient identification of correlated site problems and simplifies administration at both the cloud and site level.
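
    The roll-up logic of such a monitor can be sketched as follows; the service names, report format, and rendering here are assumptions for illustration, not the actual tool's data model.

```python
# Hypothetical roll-up of per-site service states into a cloud summary.
SERVICES = ("transfers", "software", "batch", "SAM")

def summarize(site_reports):
    """A site is OK only if every monitored area reports 'ok'."""
    return {site: all(states.get(s) == "ok" for s in SERVICES)
            for site, states in site_reports.items()}

def render_html(summary):
    """Render the cloud-level summary as a simple HTML table."""
    rows = "".join(
        f"<tr><td>{site}</td><td>{'OK' if ok else 'FAIL'}</td></tr>"
        for site, ok in sorted(summary.items()))
    return f"<table>{rows}</table>"

reports = {
    "GridKa": {"transfers": "ok", "software": "ok", "batch": "ok", "SAM": "ok"},
    "Goettingen": {"transfers": "ok", "software": "error", "batch": "ok", "SAM": "ok"},
}
summary = summarize(reports)
print(summary["GridKa"], summary["Goettingen"])  # → True False
```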

  13. IRAS constraints on a cold cloud around the solar system

    International Nuclear Information System (INIS)

    Aumann, H.H.; Good, J.C.

    1990-01-01

    IRAS 60- and 100-micron observations of G-stars in the solar neighborhood indicate that the typical G star is surrounded by a cold cloud. The assumption that the sun is archetypical requires that a cloud of typical G-star extent and temperature surrounds our solar system. IRAS ecliptic plane scans, which are dominated by a 40-deg wide band of zodiacal dust, asteroid debris trails, and the Galactic plane, are consistent with a larger than typical G-star cold cloud. Consistency with the typical G star and the direct observations constrains the width of the cold cloud perpendicular to the ecliptic plane to be larger than 5 deg. The 100-150 AU radius of this cloud is larger than, but not inconsistent with, the inner boundary of a cloud of comets postulated previously at a radius of 50 AU based on Neptune orbital perturbations and models of short period comets. 17 refs

  14. Securing Cloud Computing from Different Attacks Using Intrusion Detection Systems

    Directory of Open Access Journals (Sweden)

    Omar Achbarou

    2017-03-01

    Full Text Available Cloud computing is a new way of integrating a set of old technologies to implement a new paradigm that creates an avenue for users to access shared and configurable resources through the internet on demand. This system shares many characteristics with distributed systems and likewise relies on networking. Security is therefore the biggest issue for this system, because cloud computing services are based on sharing; a cloud computing environment thus requires intrusion detection systems (IDSs) to protect each machine against attacks. The aim of this work is to present a classification of attacks threatening the availability, confidentiality and integrity of cloud resources and services. Furthermore, we provide a literature review of attacks related to the identified categories. Additionally, this paper introduces related intrusion detection models to identify and prevent these types of attacks.

  15. G-Cloud Monitor: A Cloud Monitoring System for Factory Automation for Sustainable Green Computing

    Directory of Open Access Journals (Sweden)

    Hwa-Young Jeong

    2014-11-01

    Full Text Available Green and cloud computing (G-cloud) are new trends in all areas of computing. The G-cloud provides an efficient function that enables users to access their programs, systems and platforms at any time and from any place. Green computing can also yield greener technology by reducing power consumption for sustainable environments. Furthermore, in order to apply user needs to system development, user characteristics are regarded as among the most important factors to be considered in product industries. In this paper, we propose a cloud monitoring system to observe and manage manufacturing systems/factory automation for sustainable green computing. The monitoring system utilizes resources in the G-cloud environment and can hence reduce the amount of system resources and devices required, such as system power and processes. In addition, we propose adding a user profile to the monitoring system in order to provide a user-friendly function. That is, this function allows system configurations to be automatically matched to the individual's requirements, thus increasing efficiency.

  16. Cloud Standby: Disaster Recovery of Distributed Systems in the Cloud

    OpenAIRE

    Lenk , Alexander; Tai , Stefan

    2014-01-01

    International audience; Disaster recovery planning and securing business processes against outages have been essential parts of running a company for decades. As IT systems have become more important, and especially since more and more revenue is generated over the Internet, securing the IT systems that support the business processes against outages is essential. Using fully operational standby sites with periodically updated standby systems is a well-known approach to prepare against disasters. ...

  17. Design Private Cloud of Oil and Gas SCADA System

    OpenAIRE

    Liu Miao; Mancang Yuan; Guodong Li

    2014-01-01

    A SCADA (Supervisory Control and Data Acquisition) system is a computer-based supervisory control system. SCADA systems are very important to oil and gas pipeline engineering. Cloud computing is fundamentally altering the expectations for how and when computing, storage and networking resources should be allocated, managed and consumed. In order to increase the resource utilization, reliability and availability of the oil and gas pipeline SCADA system, a SCADA system based on cloud computing is proposed...

  18. Secure Web System in a Cloud Environment

    OpenAIRE

    Pokherl, Bibesh

    2013-01-01

    The advent of cloud computing has brought many benefits for users based on its essential characteristics. Users are attracted by its pay-per-use service model and rapidly deploy their applications in the cloud, scaling by means of virtualization technology without investing in their own IT infrastructure. These applications can be accessed through web-based technology, such as web browsers or mobile apps. However, security becomes a major challenge when user's data and applications are stored in a ...

  19. The structure of the clouds distributed operating system

    Science.gov (United States)

    Dasgupta, Partha; Leblanc, Richard J., Jr.

    1989-01-01

    A novel system architecture, based on the object model, is the central structuring concept used in the Clouds distributed operating system. This architecture makes Clouds attractive over a wide class of machines and environments. Clouds is a native operating system, designed and implemented at Georgia Tech, and runs on a set of general-purpose computers connected via a local area network. The system architecture of Clouds is composed of a system-wide global set of persistent (long-lived) virtual address spaces, called objects, that contain persistent data and code. The object concept is implemented at the operating system level, thus presenting a single-level storage view to the user. Lightweight threads carry computational activity through the code stored in the objects. The persistent objects and threads give rise to a programming environment composed of shared permanent memory, dispensing with the need for hardware-derived concepts such as file systems and message systems. Though the hardware may be distributed and may have disks and networks, Clouds provides applications with a logically centralized system, based on a shared, structured, single-level store. The current design of Clouds uses a minimalist philosophy with respect to both the kernel and the operating system. That is, the kernel and the operating system support a bare minimum of functionality. Clouds also adheres to the concept of separation of policy and mechanism. Most low-level operating system services are implemented above the kernel, and most high-level services are implemented at the user level. From the measured performance of the kernel mechanisms, we are able to demonstrate that efficient implementations of the object model are feasible on commercially available hardware. Clouds provides a rich environment for conducting research in distributed systems. Some of the topics addressed in this paper include distributed programming environments, consistency of persistent data

  20. The Research of Dr. Joanne Simpson: Fifty Years Investigating Hurricanes, Tropical Clouds and Cloud Systems

    Science.gov (United States)

    Tao, W. -K.; Halverson, J.; Adler, R.; Garstang, M.; Houze, R., Jr.; LeMone, M.; Pielke, R., Sr.; Woodley, W.; O'C.Starr, David (Technical Monitor)

    2001-01-01

    This AMS Meteorological Monographs volume is dedicated to Dr. Joanne Simpson for her many pioneering research efforts in tropical meteorology during her fifty-year career. Dr. Simpson's major areas of scientific research involved the "hot tower" hypothesis and its role in hurricanes, the structure and maintenance of trade winds, air-sea interaction, and observations of and mechanisms for hurricanes and waterspouts. She was also a pioneer in cloud modeling with the first one-dimensional model, and she had the first cumulus model on a computer. She also played a major role in planning and leading observational experiments on convective cloud systems. The launch of the Tropical Rainfall Measuring Mission (TRMM) satellite, a joint U.S.-Japan project, in November of 1997 made it possible for quantitative measurements of tropical rainfall to be obtained on a continuous basis over the entire global tropics. Dr. Simpson was the TRMM Project Scientist from 1986 until its launch in 1997. Her efforts during this crucial period ensured that the mission was both well planned scientifically and well engineered, as well as within budget. In this paper, nine specific accomplishments of Dr. Simpson's fifty-year career will be described and discussed: (1) the hot tower hypothesis, (2) hurricanes, (3) airflow and clouds over heated islands, (4) cloud models, (5) trade winds and their role in cumulus development, (6) air-sea interaction, (7) cloud-cloud interactions and mergers, (8) waterspouts, and (9) TRMM science.

  1. Health Information System in a Cloud Computing Context.

    Science.gov (United States)

    Sadoughi, Farahnaz; Erfannia, Leila

    2017-01-01

    Healthcare as a worldwide industry is experiencing a period of growth based on health information technology. The capabilities of cloud systems make them an option for advancing eHealth goals. The main objective of the present study was to evaluate the advantages and limitations of implementing health information systems in a cloud-computing context; the study was conducted as a systematic review in 2016. Science Direct, Scopus, Web of Science, IEEE, PubMed and Google Scholar were searched according to the study criteria. Among 308 articles initially found, 21 articles entered the final analysis. All the studies had considered cloud computing a positive tool to help advance health technology, but none had dwelt on its limitations and threats. Electronic health record systems have been mostly studied in the fields of implementation, design, and presentation of models and prototypes. According to this research, the main advantages of cloud-based health information systems fall into the following groups: economic benefits and advantages of information management. The main limitations of implementing cloud-based health information systems fall into four groups: security, legal, technical, and human restrictions. Compared to earlier studies, the present research had the advantage of dealing with the issue of health information systems on a cloud platform. The high frequency of studies on the implementation of cloud-based health information systems reveals the health industry's interest in the application of this technology. Security was discussed in most studies due to the sensitivity of health information. In this investigation, some mechanisms and solutions were discussed concerning the mentioned systems, which would provide a suitable area for future scientific research on this issue. The limitations and solutions discussed in this systematic study would help healthcare managers and decision makers

  2. A cloud masking algorithm for EARLINET lidar systems

    Science.gov (United States)

    Binietoglou, Ioannis; Baars, Holger; D'Amico, Giuseppe; Nicolae, Doina

    2015-04-01

    Cloud masking is an important first step in any aerosol lidar processing chain, as most data processing algorithms can only be applied to cloud-free observations. Up to now, the selection of a cloud-free time interval for data processing has typically been performed manually, and this is one of the outstanding problems for automatic processing of lidar data in networks such as EARLINET. In this contribution we present initial developments of a cloud masking algorithm that permits the selection of appropriate time intervals for lidar data processing based on uncalibrated lidar signals. The algorithm is based on a signal normalization procedure using the range of observed values of lidar returns, designed to work with different lidar systems with minimal user input. This normalization procedure can be applied to measurement periods of only a few hours, even if no suitable cloud-free interval exists, and thus can be used even when only a short period of lidar measurements is available. Clouds are detected based on a combination of criteria, including the magnitude of the normalized lidar signal and time-space edge detection performed using the Sobel operator. In this way the algorithm avoids misclassifying strong aerosol layers as clouds. Cloud detection is performed at the highest available temporal and vertical resolution of the lidar signals, allowing the effective detection of low-level clouds (e.g. cumulus humilis). Special attention is given to suppressing false cloud detection due to signal noise, which can affect the algorithm's performance, especially during daytime. In this contribution we present the details of the algorithm and the effect of lidar characteristics (space-time resolution, available wavelengths, signal-to-noise ratio) on detection performance, and highlight the current strengths and limitations of the algorithm using lidar scenes from different lidar systems at different locations across Europe.
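
    The two-criteria idea (strong normalized signal AND a sharp space-time edge) can be sketched on a synthetic curtain; np.gradient stands in for the Sobel operator, and the thresholds are illustrative, not EARLINET's.

```python
import numpy as np

def cloud_mask(curtain, intensity_thresh=0.8, edge_thresh=0.05):
    """Toy cloud mask over a (time, range) curtain of lidar returns:
    a bin is flagged as cloud only when the normalized signal is strong
    AND sits on a sharp edge, so bright but smooth aerosol layers are
    not misclassified as clouds."""
    # normalize into [0, 1] using the observed range of values
    s = (curtain - curtain.min()) / (curtain.max() - curtain.min() + 1e-12)
    gt, gr = np.gradient(s)   # time/range gradients (Sobel stand-in)
    edges = np.hypot(gt, gr)  # edge strength
    return (s > intensity_thresh) & (edges > edge_thresh)

# synthetic curtain: weak aerosol background plus one sharp cloud blob
curtain = 0.1 * np.ones((20, 30))
curtain[8:12, 10:14] = 5.0
mask = cloud_mask(curtain)
print(mask.any(), mask[0, 0])  # → True False
```

    Only the blob boundary satisfies both criteria here; a smooth aerosol layer would fail the edge test even where it is bright.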

  3. Cloud model construct for transaction-based cooperative systems ...

    African Journals Online (AJOL)

    Cloud model construct for transaction-based cooperative systems. ... procure cutting edge Information Technology infrastructure are some of the problems faced ... Results also reveal that credit cooperatives will benefit from the model by taking ...

  4. Enhanced toxic cloud knockdown spray system for decontamination applications

    Science.gov (United States)

    Betty, Rita G [Rio Rancho, NM; Tucker, Mark D [Albuquerque, NM; Brockmann, John E [Albuquerque, NM; Lucero, Daniel A [Albuquerque, NM; Levin, Bruce L [Tijeras, NM; Leonard, Jonathan [Albuquerque, NM

    2011-09-06

    Methods and systems for knockdown and neutralization of toxic clouds of aerosolized chemical or biological warfare (CBW) agents and toxic industrial chemicals using a non-toxic, non-corrosive aqueous decontamination formulation.

  5. Customized products and cloud service information system development research

    Directory of Open Access Journals (Sweden)

    Hung Chien-Wen

    2017-01-01

    Full Text Available This study presents a cloud service customized product information system to enable businesses to provide customized product marketing on the Internet to meet consumer demand for customized products. The cloud service information system development strategic framework proposed in this study contains three elements: (1) e-commerce services, (2) promotion-type modules, and (3) cloud services for customized promotional products. In this study, a mining cloud information system to detect customer behavior is proposed. Association rules from relational database design are utilized to mine consumer behavior and generate cross-selling proposals for customer products and marketing for a retail mall in Taiwan. The study is composed of several parts, as follows: market segmentation and the application of data exploration techniques, namely association rules (Association Rule Mining) and sequence-like exploration (Sequential Pattern Mining); efficient analysis of customers and consumer behavior; identification of candidates for promotional products; and the use of cloud service delivery and target evaluation to select candidate promotional products for production. However, in addition to cloud service customized promotional products, the quantity of promotional product sales varies for different customers. We strive to achieve increased customer loyalty and profits through the use of active cloud service customized promotional products.
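
    The association-rule step can be sketched with a minimal pairwise miner; the baskets, thresholds, and rule form below are illustrative, and a production system would use Apriori or FP-growth over the transaction database.

```python
from itertools import combinations
from collections import Counter

def pair_rules(baskets, min_support=2, min_conf=0.6):
    """Minimal association-rule miner over market baskets: count item
    pairs, keep those meeting the support threshold, and emit A -> B
    rules whose confidence support(A,B)/support(A) clears min_conf."""
    item_count = Counter()
    pair_count = Counter()
    for basket in baskets:
        items = sorted(set(basket))
        item_count.update(items)
        pair_count.update(combinations(items, 2))
    rules = {}
    for (a, b), n in pair_count.items():
        if n < min_support:
            continue
        if n / item_count[a] >= min_conf:
            rules[(a, b)] = n / item_count[a]   # rule a -> b
        if n / item_count[b] >= min_conf:
            rules[(b, a)] = n / item_count[b]   # rule b -> a
    return rules

baskets = [["tea", "mug"], ["tea", "mug", "sugar"], ["tea", "sugar"], ["mug"]]
rules = pair_rules(baskets)
print(("tea", "mug") in rules)  # → True: tea appears 3x, 2x with mug
```

    A cross-selling proposal then simply recommends B to a customer whose basket contains A for every surviving rule A -> B.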

  6. Cloud Absorption Radiometer Autonomous Navigation System - CANS

    Science.gov (United States)

    Kahle, Duncan; Gatebe, Charles; McCune, Bill; Hellwig, Dustan

    2013-01-01

    CAR (cloud absorption radiometer) acquires spatial reference data from host aircraft navigation systems. This poses various problems during CAR data reduction, including navigation data format, accuracy of position data, accuracy of airframe inertial data, and navigation data rate. By incorporating its own navigation system, which includes GPS (Global Positioning System), roll-axis inertia and rates, and three-axis acceleration, CANS expedites data reduction and increases the accuracy of the CAR end data product. CANS provides a self-contained navigation system for the CAR, using inertial reference and GPS positional information. The intent of the software application was to correct the sensor with respect to aircraft roll in real time based upon inputs from a precision navigation sensor. In addition, the navigation information (including GPS position), attitude data, and sensor position details are all streamed to a remote system for recording and later analysis. CANS comprises a commercially available inertial navigation system with integral GPS capability (Attitude Heading Reference System, AHRS) integrated into the CAR support structure and data system. The unit is attached to the bottom of the tripod support structure. The related GPS antenna is located on the P-3 radome immediately above the CAR. The AHRS unit provides an RS-232 data stream containing global position and inertial attitude and velocity data to the CAR, which is recorded concurrently with the CAR data. This independence from aircraft navigation input provides position and inertial state data that account for very small changes in aircraft attitude and position, sensed at the CAR location as opposed to aircraft state sensors typically installed close to the aircraft center of gravity. More accurate positional data enables quicker CAR data reduction with better resolution. The CANS software operates in two modes: initialization/calibration and operational.
In the initialization/calibration mode

  7. A Meta-Heuristic Load Balancer for Cloud Computing Systems

    OpenAIRE

    Sliwko, L.; Getov, Vladimir

    2015-01-01

    This paper introduces a strategy to allocate services on a cloud system without overloading the nodes, maintaining system stability at minimum cost. We specify an abstract model of cloud resource utilization, including multiple types of resources as well as considerations for service migration costs. A prototype meta-heuristic load balancer is demonstrated, and experimental results are presented and discussed. We also propose a novel genetic algorithm, where the population is seeded ...
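
    The allocation problem that the meta-heuristic searches can be sketched with a greedy baseline; the node capacities and service demands below are made-up numbers, and a genetic algorithm would explore placements this heuristic cannot reach.

```python
def allocate(services, node_capacity):
    """Greedy baseline for service allocation: place each service, largest
    demand first, on the least-loaded node that can still hold it without
    exceeding its capacity (i.e., without overloading any node)."""
    load = {n: 0 for n in node_capacity}
    placement = {}
    for svc, demand in sorted(services.items(), key=lambda kv: -kv[1]):
        candidates = [n for n in load if load[n] + demand <= node_capacity[n]]
        if not candidates:
            return None  # a meta-heuristic would keep searching here
        best = min(candidates, key=lambda n: load[n])
        load[best] += demand
        placement[svc] = best
    return placement

placement = allocate({"web": 4, "db": 3, "cache": 2}, {"n1": 5, "n2": 5})
print(placement)  # → {'web': 'n1', 'db': 'n2', 'cache': 'n2'}
```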

  8. Cloud Computing Based E-Learning System

    Science.gov (United States)

    Al-Zoube, Mohammed; El-Seoud, Samir Abou; Wyne, Mudasser F.

    2010-01-01

    Cloud computing technologies although in their early stages, have managed to change the way applications are going to be developed and accessed. These technologies are aimed at running applications as services over the internet on a flexible infrastructure. Microsoft office applications, such as word processing, excel spreadsheet, access database…

  9. Cloud System Evolution in the Trades—CSET

    Science.gov (United States)

    Albrecht, B. A.; Zuidema, P.; Bretherton, C. S.; Wood, R.; Ghate, V. P.

    2015-12-01

    The Cloud System Evolution in the Trades (CSET) study was designed to describe and explain the evolution of the boundary layer aerosol, cloud, and thermodynamic structures along trajectories within the North Pacific trade winds. The observational component of this study centered on 7 round trips made by the NSF NCAR Gulfstream V (GV) between Sacramento, CA and Kona, Hawaii between 1 July and 15 August 2015. The CSET observing strategy used a Lagrangian approach to sample aerosol, cloud, and boundary layer properties upwind from the transition zone over the North Pacific and to resample these areas two days later. GFS forecast trajectories were used to plan the outbound flight to Hawaii, and updated forecast trajectories then helped set the return flight plan two days later. Two key elements of the CSET observing system were the newly developed HIAPER Cloud Radar (HCR) and the HIAPER Spectral Resolution Lidar (HSRL). Together they provided unprecedented characterizations of aerosol, cloud and precipitation structures. A full suite of probes on the aircraft was used for in situ measurements of aerosol, cloud, precipitation, and turbulence properties during the low-level aircraft profiling portions of the flights. A wide range of boundary layer structures and aerosol, cloud, and precipitation conditions was observed during CSET. The cloud systems sampled included solid stratocumulus infused with smoke from Canadian wildfires, mesoscale (100-200 km) cloud-precipitation complexes, and patches of shallow cumuli in environments with accumulation-mode aerosol concentrations of less than 50 cm⁻³. Ultra-clean layers (UCLs, with accumulation-mode concentrations of less than 10 cm⁻³) were observed frequently near the top of the boundary layer and were often associated with shallow, gray (optically thin) layered clouds, features that are the subject of focused investigations by the CSET science team. The extent of aerosol, cloud, drizzle and boundary layer sampling that was

  10. An Architecture for Cross-Cloud System Management

    Science.gov (United States)

    Dodda, Ravi Teja; Smith, Chris; van Moorsel, Aad

    The emergence of the cloud computing paradigm promises flexibility and adaptability through on-demand provisioning of compute resources. As the utilization of cloud resources extends beyond a single provider, for business as well as technical reasons, the issue of effectively managing such resources comes to the fore. Different providers expose different interfaces to their compute resources, utilizing varied architectures and implementation technologies. This heterogeneity poses a significant system management problem and can limit the extent to which the benefits of cross-cloud resource utilization can be realized. We address this problem through the definition of an architecture to facilitate the management of compute resources from different cloud providers in a homogeneous manner. This preserves the flexibility and adaptability promised by the cloud computing paradigm, whilst enabling the benefits of cross-cloud resource utilization to be realized. The practical efficacy of the architecture is demonstrated through an implementation utilizing compute resources managed through different interfaces on the Amazon Elastic Compute Cloud (EC2) service. Additionally, we provide empirical results highlighting the performance differential of these different interfaces, and discuss the impact of this performance differential on efficiency and profitability.
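
    The homogenization idea can be sketched as an adapter layer; the provider classes below are stand-ins for illustration, not real EC2 or other SDK bindings.

```python
from abc import ABC, abstractmethod

class ComputeProvider(ABC):
    """Uniform management interface over heterogeneous provider APIs."""
    @abstractmethod
    def start_instance(self, image: str) -> str: ...

class FakeEC2(ComputeProvider):
    def start_instance(self, image):
        return f"ec2:{image}"   # a real adapter would call the EC2 API

class FakeOpenStack(ComputeProvider):
    def start_instance(self, image):
        return f"nova:{image}"  # a real adapter would call the OpenStack API

class CrossCloudManager:
    """Dispatches uniform operations to per-provider adapters, so the
    caller never sees the providers' differing interfaces."""
    def __init__(self, providers):
        self.providers = providers

    def start(self, provider, image):
        return self.providers[provider].start_instance(image)

mgr = CrossCloudManager({"aws": FakeEC2(), "openstack": FakeOpenStack()})
print(mgr.start("aws", "ubuntu"), mgr.start("openstack", "ubuntu"))
# → ec2:ubuntu nova:ubuntu
```

    Adding a provider means writing one adapter class; the management layer and its callers are untouched, which is the point of the homogeneous interface.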

  11. Making Cloud-based Systems Elasticity Testing Reproducible

    OpenAIRE

    Albonico , Michel; Mottu , Jean-Marie; Sunyé , Gerson; Alvares , Frederico

    2017-01-01

    International audience; Elastic cloud infrastructures vary computational resources at runtime (elasticity), a process that is error-prone, which makes elasticity testing crucial for those systems. Elasticity errors are detected by tests that should run deterministically many times throughout development. However, reproducing elasticity tests requires several features not supported natively by the main cloud providers, such as Amazon EC2. We identify three requirements that we c...

  12. Assessing the impact of the Kuroshio Current on vertical cloud structure using CloudSat data

    Directory of Open Access Journals (Sweden)

    A. Yamauchi

    2018-06-01

    Full Text Available This study analyzed CloudSat satellite data to determine how the warm ocean Kuroshio Current affects the vertical structure of clouds. Rainfall intensity around the middle troposphere (6 km in height) over the Kuroshio was greater than that over surrounding areas. The drizzle clouds over the Kuroshio have a higher frequency of occurrence of geometrically thin (0.5–3 km) clouds and thicker (7–10 km) clouds compared to those around the Kuroshio. Moreover, the frequency of occurrence of precipitating clouds with a geometric thickness of 7 to 10 km increased over the Kuroshio. Stronger updrafts over the Kuroshio maintain large droplets higher in the upper part of the cloud layer, and the maximum radar reflectivity within a cloud layer in non-precipitating and drizzle clouds over the Kuroshio is higher than that around the Kuroshio.
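
    The geometric-thickness statistics can be sketched from a single reflectivity profile; the bin size, detection threshold, and profile below are illustrative (CloudSat range bins are roughly 240 m, rounded here to 0.25 km).

```python
def cloud_layers(profile, bin_km=0.25, thresh=-25.0):
    """Find contiguous cloudy layers in one radar reflectivity profile
    (dBZ per range bin, None = no echo) and return their geometric
    thicknesses in km: consecutive bins above the threshold form a layer."""
    layers, run = [], 0
    for dbz in profile:
        if dbz is not None and dbz > thresh:
            run += 1
        elif run:
            layers.append(run * bin_km)
            run = 0
    if run:
        layers.append(run * bin_km)
    return layers

# profile with a thin low layer (4 bins ~ 1 km) and a deep one (12 bins ~ 3 km)
profile = [None] * 3 + [-10] * 4 + [None] * 5 + [-5] * 12 + [None] * 2
print(cloud_layers(profile))  # → [1.0, 3.0]
```

    Binning such thicknesses over many profiles inside and outside the current's footprint yields the frequency-of-occurrence comparison described in the abstract.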

  13. Radiative budget and cloud radiative effect over the Atlantic from ship-based observations

    Directory of Open Access Journals (Sweden)

    J. Kalisch

    2012-10-01

    Full Text Available The aim of this study is to determine cloud-type resolved cloud radiative budgets and cloud radiative effects from surface measurements of broadband radiative fluxes over the Atlantic Ocean. Furthermore, based on simultaneous observations of the state of the cloudy atmosphere, a radiative closure study has been performed by means of the ECHAM5 single column model in order to identify the model's ability to realistically reproduce the effects of clouds on the climate system.

    An extensive database of radiative and atmospheric measurements has been established along five meridional cruises of the German research icebreaker Polarstern. Besides pyranometers and pyrgeometers for downward broadband solar and thermal radiative fluxes, a sky imager and a microwave radiometer were utilized to determine cloud fraction and cloud type on the one hand, and temperature and humidity profiles as well as the liquid water path for warm non-precipitating clouds on the other.

    Averaged over all cruise tracks, we obtain a total net (solar + thermal) radiative flux of 144 W m⁻² that is dominated by the solar component. In general, the solar contribution is large for cirrus clouds and small for stratus clouds. No significant meridional dependencies were found for the surface radiation budgets and cloud effects. The strongest surface longwave cloud effects were observed in the presence of low-level clouds. Clouds with a high optical density induce strong negative solar radiative effects under high solar altitudes. The mean surface net cloud radiative effect is −33 W m⁻².

    For the purpose of quickly estimating the mean surface longwave, shortwave and net cloud effects in moderate, subtropical and tropical climate regimes, a new parameterisation was created, considering the total cloud amount and the solar zenith angle.

    The ECHAM5 single column model provides a surface net cloud effect that is more

  14. The Cc1 Project – System For Private Cloud Computing

    Directory of Open Access Journals (Sweden)

    J Chwastowski

    2012-01-01

    Full Text Available The main features of the Cloud Computing system developed at IFJ PAN are described. The project is financed from the structural resources provided by the European Commission and the Polish Ministry of Science and Higher Education (Innovative Economy, National Cohesion Strategy). The system delivers a solution for carrying out computer calculations on a Private Cloud computing infrastructure. It consists of an intuitive Web-based user interface, a module for user and resource administration, and a standard EC2 interface implementation. Thanks to the distributed character of the system, it allows for the integration of a geographically distant federation of computer clusters within a uniform user environment.

  15. A cloud-based system for automatic glaucoma screening.

    Science.gov (United States)

    Fengshou Yin; Damon Wing Kee Wong; Ying Quan; Ai Ping Yow; Ngan Meng Tan; Gopalakrishnan, Kavitha; Beng Hai Lee; Yanwu Xu; Zhuo Zhang; Jun Cheng; Jiang Liu

    2015-08-01

    In recent years, there has been increasing interest in the use of automatic computer-based systems for the detection of eye diseases, including glaucoma. However, these systems are usually standalone software with basic functions only, limiting their usage on a large scale. In this paper, we introduce an online cloud-based system for automatic glaucoma screening through the use of medical image-based pattern classification technologies. It is designed in a hybrid cloud pattern to offer both accessibility and enhanced security. Raw data, including the patient's medical condition and fundus image, and the resultant medical reports are collected and distributed through the public cloud tier. In the private cloud tier, automatic analysis and assessment of colour retinal fundus images are performed. The ubiquitous anywhere-access nature of the system through the cloud platform facilitates a more efficient and cost-effective means of glaucoma screening, allowing the disease to be detected earlier and enabling more efficient intervention and disease management.

  16. Use of cloud storage in medical information systems

    Directory of Open Access Journals (Sweden)

    Юлія Валеріївна Антонова-Рафі

    2016-06-01

    Full Text Available The aim of this work was to determine the applicability of cloud systems for the development and creation of medical information systems, and for the solution of the medical and management tasks and challenges faced by present-day polyclinics and inpatient hospitals. The result of the work is that the main advantages of the use of cloud technologies have been defined in comparison with the classic approach to the creation of medical information systems, along with the possible problems connected with the implementation of clouds in medicine.

  17. Cloud Computing Application on Transport Dispatching Informational Support Systems

    Directory of Open Access Journals (Sweden)

    Dmitry Olegovich Gusenitsa

    2015-05-01

    Full Text Available Transport dispatching informational support systems have received widespread attention due to their high information density, strong coherence, and applicable visualization features. Nevertheless, because of the large volume of data, complex integration requirements, the need for information exchange between different users, the time costs of developing and implementing informational support systems, problems associated with the compatibility of various data formats, security protocols, and high maintenance cost, the opportunities for applying such systems are significantly reduced. This article reviews the possibility of creating a cloud data storage system for a transport dispatching informational support system (TDIS using modern computer technology to meet the challenges of mass data processing and information security and to reduce operational costs. The system is expected to make full use of the advantages offered by cloud storage technology. An integrated cloud will increase the amount of data available to the system, relax processing-speed requirements, and reduce the overall cost of system implementation. Creation and integration of cloud storage is one of the most important areas of TDIS development, stimulating and promoting the further development of TDIS to ensure the requirements of its users.

  18. Snore related signals processing in a private cloud computing system.

    Science.gov (United States)

    Qian, Kun; Guo, Jian; Xu, Huijie; Zhu, Zhaomeng; Zhang, Gongxuan

    2014-09-01

    Snore related signals (SRS) have been demonstrated in recent years to carry important information about the obstruction site and degree in the upper airway of Obstructive Sleep Apnea-Hypopnea Syndrome (OSAHS) patients. To make this acoustic signal analysis method more accurate and robust, big SRS data processing is inevitable. As an emerging concept and technology, cloud computing has motivated numerous researchers and engineers to exploit applications in both academic and industrial fields, and it has the potential to support large-scale work in biomedical engineering. Considering the security and transfer requirements of biomedical data, we designed a system based on private cloud computing to process SRS. We then ran comparative experiments, processing a 5-hour audio recording of an OSAHS patient on a personal computer, a server, and a private cloud computing system, to demonstrate the efficiency of the proposed infrastructure.

  19. Monstrous Ice Cloud System in Titan's Present South Polar Stratosphere

    Science.gov (United States)

    Anderson, Carrie; Samuelson, Robert; McLain, Jason; Achterberg, Richard; Flasar, F. Michael; Milam, Stefanie

    2015-11-01

    During southern autumn when sunlight was still available, Cassini's Imaging Science Subsystem discovered a cloud around 300 km near Titan's south pole (West, R. A. et al., AAS/DPS Abstracts, 45, #305.03, 2013); the cloud was later determined by Cassini's Visible and InfraRed Mapping Spectrometer to contain HCN ice (de Kok et al., Nature, 514, pp 65-67, 2014). This cloud has proven to be only the tip of an extensive ice cloud system contained in Titan's south polar stratosphere, as seen through the night-vision goggles of Cassini's Composite InfraRed Spectrometer (CIRS). As the sun sets and the gloom of southern winter approaches, evidence is beginning to accumulate from CIRS far-IR spectra that a massive system of nitrile ice clouds is developing in Titan's south polar stratosphere. Even during the depths of northern winter, nothing like the strength of this southern system was evident in corresponding north polar regions. From the long slant paths that are available from limb-viewing CIRS far-IR spectra, we have the first definitive detection of the ν6 band of cyanoacetylene (HC3N) ice in Titan's south polar stratosphere. In addition, we also see a strong blend of nitrile ice lattice vibration features around 160 cm-1. From these data we are able to derive ice abundances. The most prominent (and still chemically unidentified) ice emission feature, the Haystack (at 220 cm-1), is also observed. We establish the vertical distributions of the ice cloud systems associated with both the 160 cm-1 feature and the Haystack. The ultimate aim is to refine the physical and possibly the chemical relationships between the two. Transmittance thin film spectra of nitrile ice mixtures obtained in our Spectroscopy for Planetary ICes Environments (SPICE) laboratory are used to support these analyses.

  20. Application research of cloud computing in emergency system platform of nuclear accidents

    International Nuclear Information System (INIS)

    Zhang Yan; Yue Huiguo; Lin Quanyi; Yue Feng

    2013-01-01

    This paper describes the key technologies of cloud computing, its service types, and implementation methods. In connection with the upgrade demands of the nuclear accident emergency system platform, the paper also proposes an application design for a private cloud computing platform and analyzes the safety of the cloud platform and the characteristics of cloud disaster recovery. (authors)

  1. Inner solar system material discovered in the Oort cloud.

    Science.gov (United States)

    Meech, Karen J; Yang, Bin; Kleyna, Jan; Hainaut, Olivier R; Berdyugina, Svetlana; Keane, Jacqueline V; Micheli, Marco; Morbidelli, Alessandro; Wainscoat, Richard J

    2016-04-01

    We have observed C/2014 S3 (PANSTARRS), a recently discovered object on a cometary orbit coming from the Oort cloud that is physically similar to an inner main belt rocky S-type asteroid. Recent dynamical models successfully reproduce the key characteristics of our current solar system; some of these models require significant migration of the giant planets, whereas others do not. These models provide different predictions on the presence of rocky material expelled from the inner solar system in the Oort cloud. C/2014 S3 could be the key to verifying these predictions of the migration-based dynamical models. Furthermore, this object displays a very faint, weak level of comet-like activity, five to six orders of magnitude less than that of typical ice-rich comets on similar orbits coming from the Oort cloud. Because of its nearly tailless appearance, we are calling C/2014 S3 a Manx object. Various arguments convince us that this activity is produced by sublimation of volatile ice, that is, normal cometary activity. The activity implies that C/2014 S3 has retained a tiny fraction of the water that is expected to have been present at its formation distance in the inner solar system. We may be looking at fresh inner solar system Earth-forming material that was ejected from the inner solar system and preserved for billions of years in the Oort cloud.

  2. Databases in Cloud - Solutions for Developing Renewable Energy Informatics Systems

    Directory of Open Access Journals (Sweden)

    Adela BARA

    2017-08-01

    Full Text Available The paper presents the data model of a decision support prototype developed for generation monitoring, forecasting, and advanced analysis in the renewable energy field. The solutions considered for developing this system include databases in the cloud, XML integration, spatial data representation, and multidimensional modeling. This material shows the advantages of cloud databases and spatial data representation and their implementation in Oracle Database 12c. It also contains a data integration part and a multidimensional analysis. The presentation of output data is made using dashboards.

  3. Using cloud technologies to complement environmental information systems

    International Nuclear Information System (INIS)

    Schlachter, Thorsten; Duepmeier, Clemens; Weidemann, Rainer

    2013-01-01

    Cloud services can help to close the gap between available and published data by providing infrastructure, storage, services, or even whole applications. Within this paper we present some fundamental ideas on the use of cloud services for the construction of powerful services in order to toughen up environmental information systems for the needs of state-of-the-art web, portal, and mobile technologies. We include use cases for the provision of environmental information as well as for the collection of user-generated data. (orig.)

  4. The Methodology of Expert Audit in the Cloud Computing System

    Directory of Open Access Journals (Sweden)

    Irina Vladimirovna Mashkina

    2013-12-01

    Full Text Available The problem of information security audit in cloud computing systems is discussed. The described methodology of expert audit allows estimating not only the value of the information security risk level but also its operative value. Fuzzy cognitive maps and an artificial neural network are used to solve this problem.

  5. Information Systems Education: The Case for the Academic Cloud

    Science.gov (United States)

    Mew, Lionel

    2016-01-01

    This paper discusses how cloud computing can be leveraged to add value to academic programs in information systems and other fields by improving financial sustainment models for institutional technology and academic departments, relieving the strain on overworked technology support resources, while adding richness and improving pedagogical…

  6. Cloud-Enhanced Robotic System for Smart City Crowd Control

    Directory of Open Access Journals (Sweden)

    Akhlaqur Rahman

    2016-12-01

    Full Text Available Cloud robotics in smart cities is an emerging paradigm that enables autonomous robotic agents to communicate and collaborate with a cloud computing infrastructure. It complements the Internet of Things (IoT) by creating an expanded network where robots offload data-intensive computation to the ubiquitous cloud to ensure quality of service (QoS). However, offloading for robots is significantly complex due to their unique characteristics of mobility, skill-learning, data collection, and decision-making capabilities. In this paper, a generic cloud robotics framework is proposed to realize the smart city vision while taking into consideration its various complexities. Specifically, we present an integrated framework for a crowd control system where cloud-enhanced robots are deployed to perform necessary tasks. The task offloading is formulated as a constrained optimization problem capable of handling any task flow that can be characterized by a Directed Acyclic Graph (DAG). We consider two scenarios of minimizing energy and time, respectively, and develop a genetic algorithm (GA)-based approach to identify the optimal task offloading decisions. The performance comparison with two benchmarks shows that our GA scheme achieves the desired energy and time performance. We also show the adaptability of our algorithm by varying the values for bandwidth and movement. The results suggest their impact on offloading. Finally, we present a multi-task flow optimal path sequence problem that highlights how the robot can plan its task completion via movements that expend the minimum energy. This integrates path planning with offloading for robotics. To the best of our knowledge, this is the first attempt to evaluate cloud-based task offloading for a smart city crowd control system.
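
The GA-based offloading decision described above can be illustrated with a minimal sketch. This is not the authors' implementation: the per-task energy costs are made-up numbers, the genome is a simple offload/local bitstring, and the DAG precedence constraints are omitted for brevity; only the selection, crossover, and mutation loop that minimizes total energy is shown.

```python
import random

# Hypothetical energy costs per task: run locally vs. offload to the cloud
# (cloud cost includes transfer). A genome is a bit list: 1 = offload.
LOCAL = [5.0, 8.0, 3.0, 9.0, 4.0]
CLOUD = [2.0, 3.5, 4.0, 2.5, 6.0]

def energy(genome):
    """Total energy of an offloading decision (lower is better)."""
    return sum(CLOUD[i] if g else LOCAL[i] for i, g in enumerate(genome))

def ga(pop_size=20, generations=50, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in LOCAL] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=energy)
        elite = pop[: pop_size // 2]              # selection: keep the best half
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, len(LOCAL))    # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.1:                # occasional bit-flip mutation
                i = rng.randrange(len(child))
                child[i] ^= 1
            children.append(child)
        pop = elite + children
    best = min(pop, key=energy)
    return best, energy(best)

best, cost = ga()
print(best, cost)
```

Because the toy fitness is separable per task, the elitist GA converges quickly here; the real problem in the abstract is harder because DAG dependencies couple the decisions.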

  7. The research of the availability at cloud service systems

    Science.gov (United States)

    Demydov, Ivan; Klymash, Mykhailo; Kharkhalis, Zenoviy; Strykhaliuk, Bohdan; Komada, Paweł; Shedreyeva, Indira; Targeusizova, Aliya; Iskakova, Aigul

    2017-08-01

    This paper is devoted to the numerical investigation of availability in cloud service systems. Criteria and constraint calculations were performed, and the obtained results were analyzed for the synthesis of distributed service platforms based on a cloud service-oriented architecture, such as variations of the availability and system performance indices over a defined set of main parameters. The synthesis method was numerically generalized to account for the type of service workload, characterized statistically by the Hurst parameter for each integrated service to be implemented within the service delivery platform; the platform is synthesized by structural matching of virtual machines, combining elementary servicing components into a best-of-breed solution. The restrictions following from Amdahl's Law show the necessity of clustering cloud networks, which makes it possible to break a complex dynamic network into separate segments; this simplifies access to the resources of virtual machines and, in general, to the "clouds", simplifies the complex topological structure, and enhances overall system performance. Overall, the proposed approaches and obtained results numerically justify and algorithmically describe the process of structural and functional synthesis of efficient distributed service platforms which, during configuration and exploitation, can respond to a dynamic environment in terms of a comprehensive range of services and the pulsing workload of nomadic users.
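
Amdahl's Law, invoked above as the motivation for clustering cloud networks, can be made concrete with a short calculation; the 95% parallel fraction below is an arbitrary illustrative value, not a figure from the paper.

```python
def amdahl_speedup(parallel_fraction, n):
    """Amdahl's Law: speedup of a workload with the given parallelizable
    fraction when spread over n processing units."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n)

# With 95% of the work parallelizable, speedup saturates near 20x
# (= 1 / serial fraction) no matter how many VMs the cloud adds:
for n in (10, 100, 1000):
    print(n, round(amdahl_speedup(0.95, n), 2))
```

The saturation at 1/(1 - p) is the quantitative reason that adding nodes to one large cloud eventually stops helping, and why partitioning the system into smaller, more independent segments can pay off.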

  8. Services Recommendation System based on Heterogeneous Network Analysis in Cloud Computing

    OpenAIRE

    Junping Dong; Qingyu Xiong; Junhao Wen; Peng Li

    2014-01-01

    Resources are provided mainly in the form of services in cloud computing. In the distributed environment of cloud computing, how to find the needed services efficiently and accurately is the most urgent problem. In cloud computing, services are the intermediary of the cloud platform; services connect many service providers and requesters and construct a complex heterogeneous network. The traditional recommendation systems only consider the functional and non-functi...

  9. An Overview of Cloud Computing in Distributed Systems

    Science.gov (United States)

    Divakarla, Usha; Kumari, Geetha

    2010-11-01

    Cloud computing is the emerging trend in the field of distributed computing; it evolved from grid computing and distributed computing. The cloud plays an important role in large organizations by maintaining huge volumes of data with limited resources. The cloud also helps in resource sharing through specific virtual machines provided by the cloud service provider. This paper gives an overview of cloud organization and some of the basic security issues pertaining to the cloud.

  10. Cloud Computing Platform for an Online Model Library System

    Directory of Open Access Journals (Sweden)

    Mingang Chen

    2013-01-01

    Full Text Available The rapid development of the digital content industry calls for online model libraries. For the efficiency, user experience, and reliability merits of the model library, this paper designs a Web 3D model library system based on a cloud computing platform. Taking into account complex models, which cause difficulties in real-time 3D interaction, we adopt model simplification and size-adaptive adjustment methods to make interaction with the system more efficient. Meanwhile, a cloud-based architecture is developed to ensure the reliability and scalability of the system. The 3D model library system is intended to be accessible by online users with good interactive experiences. The feasibility of the solution has been tested by experiments.

  11. Simulation models for the evolution of cloud systems. I. Introduction and preliminary simulations

    International Nuclear Information System (INIS)

    Pumphrey, W.A.; Scalo, J.M.

    1983-01-01

    The evolution of systems of interacting gas clouds is investigated, with application to protogalaxies in galaxy clusters, proto-globular clusters in galaxies, and protostellar fragments in interstellar clouds. The evolution of these systems can be parameterized in terms of three dimensionless quantities: the number of clouds, the volume filling factor of clouds, and the fraction of the mass of the system in clouds. We discuss the range of parameter space in which direct cloud collisions, tidal encounters, interactions of clouds with ambient gas, cloud collapse, cloud orbital motion due to the gravitational acceleration of the rest of the system, and cumulative long-range gravitational scatterings are important. All of these processes except for long-range gravitational scattering and probably tidal encounters are competitive for the systems of interest. The evolution of the mass spectrum and velocity distribution of clouds in self-gravitating systems should be dominated by direct collisions for high-mass clouds and by drag, accretion, or ablation for small-mass clouds. We tentatively identify the critical mass at which the collision time scale equals the collapse time scale with the low-mass turnovers observed in the mass spectrum of stars in open clusters, and predict that rich galaxy clusters should exhibit variations in the faint end of the luminosity function if these clusters form by fragmentation. If collisions perturb the attempted collapse of clouds, the low-mass "stars" should form before high-mass stars.

  12. Evolving the Land Information System into a Cloud Computing Service

    Energy Technology Data Exchange (ETDEWEB)

    Houser, Paul R. [CREW Services LLC, Ellicott City, MD (United States)

    2015-02-17

    The Land Information System (LIS) was developed to use advanced flexible land surface modeling and data assimilation frameworks to integrate extremely large satellite- and ground-based observations with advanced land surface models to produce continuous high-resolution fields of land surface states and fluxes. The resulting fields are extremely useful for drought and flood assessment, agricultural planning, disaster management, weather and climate forecasting, water resources assessment, and the like. We envisioned transforming the LIS modeling system into a scientific cloud computing-aware web and data service that clients could easily set up and configure for use in addressing large water management issues. The focus of this Phase 1 project was to determine the scientific, technical, and commercial merit and feasibility of the proposed LIS-cloud innovations that are currently barriers to broad LIS applicability. We (a) quantified the barriers to broad LIS utility and commercialization (high performance computing, big data, user interface, and licensing issues); (b) designed the proposed LIS-cloud web service, model-data interface, database services, and user interfaces; (c) constructed a prototype LIS user interface including abstractions for simulation control, visualization, and data interaction; (d) used the prototype to conduct a market analysis and survey to determine potential market size and competition; (e) identified LIS software licensing and copyright limitations and developed solutions; and (f) developed a business plan for development and marketing of the LIS-cloud innovation. While some significant feasibility issues were found in the LIS licensing, overall a high degree of LIS-cloud technical feasibility was found.

  13. Integrated Geo Hazard Management System in Cloud Computing Technology

    Science.gov (United States)

    Hanifah, M. I. M.; Omar, R. C.; Khalid, N. H. N.; Ismail, A.; Mustapha, I. S.; Baharuddin, I. N. Z.; Roslan, R.; Zalam, W. M. Z.

    2016-11-01

    Geo-hazards can reduce environmental health and cause huge economic losses, especially in mountainous areas. In order to mitigate geo-hazards effectively, cloud computing technology is introduced for managing a geo-hazard database. Cloud computing technology and its services are capable of providing stakeholders with geo-hazard information in near real time for effective environmental management and decision-making. The UNITEN Integrated Geo Hazard Management System consists of network management and operation to monitor geo-hazard disasters, especially landslides, in our study area at the Kelantan River Basin and the boundary between Hulu Kelantan and Hulu Terengganu. The system provides an easily managed, flexible measuring system whose data management operates autonomously and can be controlled remotely by commands that collect data using "cloud" computing. This paper aims to document the above relationship by identifying the special features and needs associated with effective geo-hazard database management using a "cloud system". This system will later be used as part of the development activities, with the aim of minimizing the frequency of geo-hazards and the risk in the research area.

  14. Expert System and Heuristics Algorithm for Cloud Resource Scheduling

    Directory of Open Access Journals (Sweden)

    Mamatha E

    2017-03-01

    Full Text Available Rule-based scheduling algorithms have been widely used in cloud computing systems, and there is still plenty of room to improve their performance. This paper proposes an expert system that allocates cloud resources using a rule-based algorithm, measuring the performance of the system by letting it adopt new rules based on feedback. The performance of each action helps to make better allocation of resources, improving quality of service, scalability, and flexibility. The performance measure is based on how the allocation of resources is dynamically optimized and how well the resources are utilized; the aim is to maximize resource utilization. The data and resources are given to the algorithm, which allocates the data to resources, and an output is obtained based on the action that occurred. Once an action is completed, its performance is measured, including how the resources were allocated and how efficiently they worked. In addition to performance, resource allocation in the cloud environment is also considered.
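
A rule-based allocator of the kind described above can be sketched roughly as an ordered list of condition/action rules scanned until one fires. Everything below is hypothetical for illustration (resource classes, free-core capacities, and the rules themselves), not the paper's expert system.

```python
# Free CPU cores per resource class (hypothetical numbers).
resources = {"small": 4, "medium": 8, "large": 16}

# Ordered rules: first matching condition with sufficient capacity wins.
rules = [
    (lambda job: job["cores"] <= 2, "small"),
    (lambda job: job["cores"] <= 6, "medium"),
    (lambda job: True, "large"),      # catch-all rule
]

def allocate(job):
    """Scan the rule base in order and allocate on the first viable match."""
    for condition, target in rules:
        if condition(job) and resources[target] >= job["cores"]:
            resources[target] -= job["cores"]
            return target
    raise RuntimeError("no resource satisfies any rule")

print(allocate({"cores": 2}))   # small
print(allocate({"cores": 4}))   # medium
print(allocate({"cores": 12}))  # large
```

The feedback loop the abstract mentions would correspond to measuring utilization after each allocation and reordering or rewriting the rule list; that adaptive step is omitted here.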

  15. Efficient proof of ownership for cloud storage systems

    Science.gov (United States)

    Zhong, Weiwei; Liu, Zhusong

    2017-08-01

    Cloud storage systems use deduplication technology to save disk space and bandwidth, but the use of this technology has attracted targeted security attacks: an attacker can deceive the server into granting ownership of a file merely by obtaining the hash value of the original file. In order to solve these security problems and meet the different security requirements of files in a cloud storage system, an efficient and information-theoretically secure proof-of-ownership scheme supporting file rating is proposed. File rating is implemented with the K-means algorithm, and random-seed techniques and a pre-calculation method are used to achieve a safe and efficient proof-of-ownership scheme. The scheme is information-theoretically secure and achieves better performance in the most sensitive areas of client-side I/O and computation.
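
The attack mentioned above (claiming ownership with only the file's hash) is commonly countered by challenging the client on randomly chosen blocks of the file. The sketch below is a generic spot-check protocol, not the paper's K-means-based scheme; the block size and challenge parameters are arbitrary.

```python
import hashlib
import os
import random

BLOCK = 1024  # bytes per block (arbitrary)

def blocks(data):
    return [data[i:i + BLOCK] for i in range(0, len(data), BLOCK)]

def challenge(num_blocks, k, seed):
    """Server side: pick k random block indices as the challenge."""
    rng = random.Random(seed)
    return rng.sample(range(num_blocks), k)

def respond(data, indices):
    """Client side: prove possession by hashing the challenged blocks."""
    bs = blocks(data)
    return [hashlib.sha256(bs[i]).hexdigest() for i in indices]

def verify(data, indices, response):
    """Server side: recompute the hashes and compare."""
    return respond(data, indices) == response

data = os.urandom(10 * BLOCK)
idx = challenge(10, 3, seed=42)
proof = respond(data, idx)
print(verify(data, idx, proof))               # honest owner passes
print(verify(data[:BLOCK] * 10, idx, proof))  # wrong file content fails
```

Because the challenged indices are unpredictable, knowing only a whole-file digest is not enough to answer; the client must actually hold the blocks.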

  16. Encounters of The Solar System With Molecular Clouds

    International Nuclear Information System (INIS)

    Wickramasinghe, J. T.

    2008-01-01

    The solar system has penetrated about 5 to 10 giant molecular clouds over its history, and passes within 5 parsecs of a star-forming nebula every 100 million years or so. Numerical simulations of the effect of such encounters in perturbing the Oort cloud of comets are carried out using standard n-body computational techniques. It is found that the ingress of comets into the inner planetary system during such encounters increases by factors of ∼100 over the average. During an encounter the impact rate of comets onto Earth increases by a comparable factor. The distribution of ages of impact craters on the Earth is shown to be consistent with predictions from the model.

  17. On the Task of Information Systems Constructed with Use of Technologies of Cloud Calculations

    Directory of Open Access Journals (Sweden)

    Darya Sergeevna Simonenkova

    2013-06-01

    Full Text Available The subject of the research is the consideration of an information system constructed with the use of cloud computing technologies, using the example of one of the most popular cloud platforms, Amazon Web Services.

  18. Cloud Based Metalearning System for Predictive Modeling of Biomedical Data

    Directory of Open Access Journals (Sweden)

    Milan Vukićević

    2014-01-01

    Full Text Available The rapid growth and storage of biomedical data have enabled many opportunities for predictive modeling and improvement of healthcare processes. On the other hand, analysis of such large amounts of data is a difficult and computationally intensive task for most existing data mining algorithms. This problem is addressed by proposing a cloud-based system that integrates a metalearning framework for ranking and selecting the best predictive algorithms for the data at hand with open-source big data technologies for the analysis of biomedical data.

  19. Mobile Connectivity and Security Issues for Cloud Informatic Systems

    OpenAIRE

    Cosmin Cătălin Olteanu

    2015-01-01

    The main purpose of the paper is to illustrate the importance of new software tools that can be used with mobile devices to make them more secure for day-to-day use of business software. Many companies are using mobile applications to access some components of ERPs or CRMs remotely. Even the newly arrived cloud informatic systems use more remote devices than ever. This is why we need to secure these mobile applications.

  20. Mobile Connectivity and Security Issues for Cloud Informatic Systems

    Directory of Open Access Journals (Sweden)

    Cosmin Cătălin Olteanu

    2015-05-01

    Full Text Available The main purpose of the paper is to illustrate the importance of new software tools that can be used with mobile devices to make them more secure for day-to-day use of business software. Many companies are using mobile applications to access some components of ERPs or CRMs remotely. Even the newly arrived cloud informatic systems use more remote devices than ever. This is why we need to secure these mobile applications.

  1. Ezilla Cloud Service with Cassandra Database for Sensor Observation System

    OpenAIRE

    Kuo-Yang Cheng; Yi-Lun Pan; Chang-Hsing Wu; His-En Yu; Hui-Shan Chen; Weicheng Huang

    2012-01-01

    The main mission of Ezilla is to provide a friendly interface to access virtual machines and quickly deploy a high-performance computing environment. Ezilla has been developed by the Pervasive Computing Team at the National Center for High-performance Computing (NCHC). Ezilla integrates cloud middleware, virtualization technology, and a Web-based Operating System (WebOS) to form a virtual computer in a distributed computing environment. In order to upgrade the dataset and sp...

  2. A QUALITY BASED ENHANCEMENT OF USER DATA PROTECTION VIA FUZZY RULE BASED SYSTEMS IN CLOUD ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    R Poorva Devi

    2016-04-01

    Full Text Available So far in cloud computing, individual customers have accessed and consumed an enormous amount of services through the web, offered by cloud service providers (CSPs). Although the cloud provides security-as-a-service to its clients, people are still afraid to use services from cloud vendors. Many solutions, security components, and measurements have addressed the cloud security issue, but only a 79.2% security outcome has been obtained by scientists, researchers, and the rest of the cloud-based academic community. To overcome the problem of cloud security, the proposed model, "Quality-based enhancement of user data protection via fuzzy rule-based systems in the cloud environment", helps cloud clients access cloud resources through remote monitoring and management (RMMM); the services currently requested and consumed by cloud users can be better analyzed with a managed service provider (MSP) rather than a traditional CSP. Normally, people try to secure their own private data by applying key management and cryptographic computations, which again leads to security problems. In order to provide a good-quality security target result, fuzzy rule-based systems (constraint and conclusion segments) are used in the cloud environment. By using this technique, users may obtain an efficient security outcome through the Apache CloudStack cloud simulation tool.
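
The constraint-and-conclusion structure of a fuzzy rule-based system can be illustrated with a toy risk estimator. The membership functions, the two rules, and the defuzzification below are invented for illustration; they are not the paper's model.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def risk(anomaly, sensitivity):
    """Two illustrative rules, Mamdani-style (min for AND, then combine):
    R1: IF anomaly is high AND data is sensitive THEN risk is high
    R2: IF anomaly is low THEN risk is low"""
    high_anom = tri(anomaly, 0.4, 1.0, 1.6)
    low_anom = tri(anomaly, -0.6, 0.0, 0.6)
    sensitive = tri(sensitivity, 0.4, 1.0, 1.6)
    r1 = min(high_anom, sensitive)   # activation of "risk is high"
    r2 = low_anom                    # activation of "risk is low"
    # Crude defuzzification: weighted average of rule outputs (high=1, low=0).
    total = r1 + r2
    return 0.5 if total == 0 else r1 / total

print(round(risk(0.9, 0.8), 3))  # high anomaly, sensitive data -> 1.0
print(round(risk(0.1, 0.8), 3))  # low anomaly -> 0.0
```

The "constraint" segment corresponds to the rule antecedents (the fuzzified inputs), and the "conclusion" segment to the consequents combined in the defuzzification step.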

  3. Locally Minimum Storage Regenerating Codes in Distributed Cloud Storage Systems

    Institute of Scientific and Technical Information of China (English)

    Jing Wang; Wei Luo; Wei Liang; Xiangyang Liu; Xiaodai Dong

    2017-01-01

    In distributed cloud storage systems, there inevitably exist multiple node failures at the same time. The existing methods of regenerating codes, including minimum storage regenerating (MSR) codes and minimum bandwidth regenerating (MBR) codes, mainly repair one single or several failed nodes, and are unable to meet the repair needs of distributed cloud storage systems. In this paper, we present locally minimum storage regenerating (LMSR) codes to recover multiple failed nodes at the same time. Specifically, the nodes in distributed cloud storage systems are divided into multiple local groups, and in each local group (4, 2) or (5, 3) MSR codes are constructed. Moreover, the grouping method of storage nodes and the repairing process of failed nodes in local groups are studied. Theoretical analysis shows that LMSR codes can achieve the same storage overhead as MSR codes. Furthermore, we verify by means of simulation that, compared with MSR codes, LMSR codes can reduce the repair bandwidth and disk I/O overhead effectively.

  4. Tool-based Risk Assessment of Cloud Infrastructures as Socio-Technical Systems

    DEFF Research Database (Denmark)

    Nidd, Michael; Ivanova, Marieta Georgieva; Probst, Christian W.

    2015-01-01

    Assessing risk in cloud infrastructures is difficult. Typical cloud infrastructures contain potentially thousands of nodes that are highly interconnected and dynamic. Another important component is the set of human actors who get access to data and computing infrastructure. The cloud infrastructure...... exercise for cloud infrastructures using the socio-technical model developed in the TRESPASS project; after showing how to model typical components of a cloud infrastructure, we show how attacks are identified on this model and discuss their connection to risk assessment. The technical part of the model...... is extracted automatically from the configuration of the cloud infrastructure, which is especially important for systems so dynamic and complex....

  5. Clone-based Data Index in Cloud Storage Systems

    Directory of Open Access Journals (Sweden)

    He Jing

    2016-01-01

    Full Text Available Storage systems have been challenged by the development of cloud computing. Traditional data indexes cannot satisfy the requirements of cloud computing because of their huge index volumes and the required quick response time. Meanwhile, because of the increasing size of the data index and its dynamic characteristics, the previous approaches, which rebuild the index or fully back it up before the data changes, cannot satisfy the needs of today's big data indexing. To solve these problems, we propose a double-layer index structure that overcomes the throughput limitation of a single-point server. Then, a clone-based B+ tree structure is proposed to achieve high performance and adapt to dynamic environments. The experimental results show that our clone-based solution has high efficiency.
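
A double-layer index of the kind proposed can be sketched as a small routing layer over per-shard structures. In the sketch below, sorted key boundaries route each key to a shard, with plain dicts standing in for the per-shard clone-based B+ trees; the boundaries and keys are made up for illustration.

```python
import bisect

class DoubleLayerIndex:
    """Top layer: a range-to-shard routing table (small, easy to replicate).
    Bottom layer: per-shard key/value stores standing in for B+ trees."""

    def __init__(self, boundaries):
        self.boundaries = boundaries                        # e.g. [100, 200] -> 3 shards
        self.shards = [dict() for _ in range(len(boundaries) + 1)]

    def _shard(self, key):
        # First layer lookup: binary search over the range boundaries.
        return bisect.bisect_right(self.boundaries, key)

    def put(self, key, value):
        self.shards[self._shard(key)][key] = value

    def get(self, key):
        # Second layer lookup happens only on the one responsible shard.
        return self.shards[self._shard(key)].get(key)

dli = DoubleLayerIndex([100, 200])
dli.put(42, "a")
dli.put(150, "b")
dli.put(999, "c")
print(dli.get(150))         # "b"
print(len(dli.shards[0]))   # 1: only key 42 lives below the first boundary
```

The point of the split is that the tiny top layer can be replicated everywhere, so no single index server sees every lookup; each bottom-layer shard serves only its own key range.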

  6. A secure EHR system based on hybrid clouds.

    Science.gov (United States)

    Chen, Yu-Yi; Lu, Jun-Chao; Jan, Jinn-Ke

    2012-10-01

    Application services rendering remote medical services and electronic health records (EHR) have become a hot topic, stimulating increased interest in this subject in recent years. Information and communication technologies have been applied to the medical services and healthcare area for a number of years to resolve problems in medical management. Sharing EHR information can provide professional medical programs with consultancy, evaluation, and tracing services, and can certainly improve accessibility for the public receiving medical services or medical information at remote sites. With the widespread use of EHR, building a secure EHR sharing environment has attracted a lot of attention in both the healthcare industry and the academic community. The cloud computing paradigm is one of the popular health IT infrastructures for facilitating EHR sharing and EHR integration. In this paper, we propose an EHR sharing and integration system in healthcare clouds and analyze the arising security and privacy issues in access and management of EHRs.

  7. Efficient operating system level virtualization techniques for cloud resources

    Science.gov (United States)

    Ansu, R.; Samiksha; Anju, S.; Singh, K. John

    2017-11-01

    Cloud computing is an advancing technology which provides Infrastructure, Platform and Software as services. Virtualization and utility computing are the keys of cloud computing. The number of cloud users is increasing day by day, so it is the need of the hour to make resources available on demand to satisfy user requirements. The technique in which resources, namely storage, processing power, memory and network or I/O, are abstracted is known as virtualization. Various virtualization techniques are available for executing operating systems: full system virtualization and paravirtualization. In full virtualization, the whole hardware architecture is duplicated virtually, and no modifications are required in the guest OS, as the OS deals with the VM hypervisor directly. In paravirtualization, the guest OS must be modified to run in parallel with other operating systems; for the guest OS to access the hardware, the host OS must provide a Virtual Machine Interface. OS virtualization has many advantages, such as transparent application migration, server consolidation, online OS maintenance and improved security. This paper briefs both virtualization techniques and discusses the issues in OS-level virtualization.

  8. The Impact of Cloud Computing on Information Systems Agility

    Directory of Open Access Journals (Sweden)

    Mohamed Sawas

    2015-09-01

    Full Text Available As businesses are encountering frequent harsh economic conditions, concepts such as outsourcing, agile and lean management, change management and cost reduction are constantly gaining more attention. This is because these concepts are all aimed at saving on budgets and facing unexpected changes. Latest technologies like cloud computing promise to turn IT, that has always been viewed as a cost centre, into a source of saving money and driving flexibility and agility to the business. The purpose of this paper is to first compile a set of attributes that govern the agility benefits added to information systems by cloud computing and then develop a survey-based instrument to measure these agility benefits. Our research analysis employs non-probability sampling based on a combination of convenience and judgment. This approach was used to obtain a representative sample of participants from potential companies belonging to various industries such as oil & gas, banking, private, government and semi-governmental organizations. This research will enable decision makers to measure agility enhancements and hence compare the agility of Information Systems before and after deploying cloud computing.

  9. Managing the move to the cloud – analyzing the risks and opportunities of cloud-based accounting information systems

    OpenAIRE

    Asatiani, Aleksandre; Penttinen, Esko

    2015-01-01

    The accounting industry is being disrupted by the introduction of cloud-based accounting information systems (AIS) that allow for a more efficient allocation of work between the accountant and the client company. In cloud-based AIS, the accountant and the client company as well as third parties such as auditors can simultaneously work on the data in real time. This, in turn, enables a much more granular division of work between the parties. This teaching case considers Kluuvin Apteekki, a sma...

  10. Consequences of the Solar System passage through dense interstellar clouds

    Directory of Open Access Journals (Sweden)

    A. G. Yeghikyan

    2003-06-01

    Full Text Available Several consequences of the passage of the solar system through dense interstellar molecular clouds are discussed. These clouds, dense (more than 100 cm^-3), cold (10–50 K) and extended (larger than 1 pc), are characterized by a gas-to-dust mass ratio of about 100, by a specific power-law grain size spectrum (grain radii usually cover the range 0.001–3 micron) and by an average dust-to-gas number density ratio of about 10^-12. Frequently these clouds contain small-scale (10–100 AU) condensations with gas concentrations ranging up to 10^5 cm^-3. At their chance passage over the solar system they exert pressures very much enhanced with respect to today’s standards. Under these conditions it will occur that the Earth is exposed directly to the interstellar flow. It is shown first that even close to the Sun, at 1 AU, the cloud’s matter is only partly ionized and should mainly interact with the solar wind by charge exchange processes. Dust particles of the cloud serve as a source of neutrals, generated by the solar UV irradiation of dust grains, causing the evaporation of icy materials. The release of neutral atoms from dust grains is then followed by strong influences on the solar wind plasma flow. The behavior of the neutral gas inflow parameters is investigated by a 2-D hydrodynamic approach to model the interaction processes. Because of a reduction of the heliospheric dimension down to 1 AU, direct influence of the cloud’s matter on the terrestrial environment and atmosphere could be envisaged.

    Key words. Interplanetary physics (heliopause and solar wind termination); interplanetary dust; interstellar gas

  11. Consequences of the Solar System passage through dense interstellar clouds

    Directory of Open Access Journals (Sweden)

    A. G. Yeghikyan

    Full Text Available Several consequences of the passage of the solar system through dense interstellar molecular clouds are discussed. These clouds, dense (more than 100 cm^-3), cold (10–50 K) and extended (larger than 1 pc), are characterized by a gas-to-dust mass ratio of about 100, by a specific power-law grain size spectrum (grain radii usually cover the range 0.001–3 micron) and by an average dust-to-gas number density ratio of about 10^-12. Frequently these clouds contain small-scale (10–100 AU) condensations with gas concentrations ranging up to 10^5 cm^-3. At their chance passage over the solar system they exert pressures very much enhanced with respect to today’s standards. Under these conditions it will occur that the Earth is exposed directly to the interstellar flow. It is shown first that even close to the Sun, at 1 AU, the cloud’s matter is only partly ionized and should mainly interact with the solar wind by charge exchange processes. Dust particles of the cloud serve as a source of neutrals, generated by the solar UV irradiation of dust grains, causing the evaporation of icy materials. The release of neutral atoms from dust grains is then followed by strong influences on the solar wind plasma flow. The behavior of the neutral gas inflow parameters is investigated by a 2-D hydrodynamic approach to model the interaction processes. Because of a reduction of the heliospheric dimension down to 1 AU, direct influence of the cloud’s matter on the terrestrial environment and atmosphere could be envisaged.

    Key words. Interplanetary physics (heliopause and solar wind termination); interplanetary dust; interstellar gas

  12. Analyzing the Applicability of Airline Booking Systems for Cloud Computing Offerings

    Science.gov (United States)

    Watzl, Johannes; Felde, Nils Gentschen; Kranzlmuller, Dieter

    This paper introduces revenue management systems for Cloud computing offerings on the Infrastructure as a Service level. One of the main fields revenue management systems are deployed in is the airline industry. At the moment, the predominant part of the Cloud providers use static pricing models. In this work, a mapping of Cloud resources to flights in different categories and classes is presented together with a possible strategy to make use of these models in the emerging area of Cloud computing. The latter part of this work then describes a first step towards an inter-cloud brokering and trading platform by deriving requirements for a potential architectural design.
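
    One concrete way such a fare-class mapping could work is Littlewood's rule, a basic two-class revenue-management result from the airline world. The sketch below applies it, purely by way of assumption, to two hypothetical cloud capacity classes; the prices and demand parameters are invented for illustration and do not come from the paper:

```python
from statistics import NormalDist

# Littlewood's rule: protect y* units of capacity for the high-price class,
# where y* satisfies P(D_high > y*) = price_lo / price_hi, with D_high the
# (here Normally modeled) demand for the high-price class.

def protection_level(price_hi, price_lo, mu, sigma):
    """Capacity to reserve for the high-price class under Normal(mu, sigma) demand."""
    z = NormalDist().inv_cdf(1 - price_lo / price_hi)
    return mu + sigma * z

# Hypothetical mapping: on-demand instances at $0.40/h (full fare) versus
# discounted/spot capacity at $0.10/h, with on-demand demand ~ Normal(60, 15):
y = protection_level(0.40, 0.10, mu=60, sigma=15)
# Roughly 70 instances would be held back from the discount class.
```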

  13. Collaborative Cloud Manufacturing: Design of Business Model Innovations Enabled by Cyberphysical Systems in Distributed Manufacturing Systems

    Directory of Open Access Journals (Sweden)

    Erwin Rauch

    2016-01-01

    Full Text Available Collaborative cloud manufacturing, as a concept of distributed manufacturing, allows different opportunities for changing the logic of generating and capturing value. Cyberphysical systems and the technologies behind them are the enablers for new business models which have the potential to be disruptive. This paper introduces the topics of distributed manufacturing as well as cyberphysical systems. Furthermore, the main business model clusters of distributed manufacturing systems are described, including collaborative cloud manufacturing. The paper aims to provide support for developing business model innovations based on collaborative cloud manufacturing. Therefore, three business model architecture types of a differentiated business logic are discussed, taking into consideration the parameters which have an influence and the design of the business model and its architecture. As a result, new business models can be developed systematically and new ideas can be generated to boost the concept of collaborative cloud manufacturing within all sustainable business models.

  14. Architecting Cloud-Enabled Systems: A Systematic Survey of Challenges and Solutions

    DEFF Research Database (Denmark)

    Chauhan, Aufeef; Babar, Muhammad Ali; Benatallah, Boualem

    2016-01-01

    architectures for cloud-based systems. Our key conclusions are that a large number of primary studies focus on middleware services aimed at achieving scalability, performance, response time and efficient resource optimization. Architecting cloud-based systems presents unique challenges as the systems...... to be designed range from pervasive embedded systems and enterprise applications to smart devices with Internet of Things (IoTs). We also conclude that there is a huge potential of research on architecting cloud-based systems in areas related to green computing, energy efficient systems, mobile cloud computing...

  15. Enhancing Security by System-Level Virtualization in Cloud Computing Environments

    Science.gov (United States)

    Sun, Dawei; Chang, Guiran; Tan, Chunguang; Wang, Xingwei

    Many trends are opening up the era of cloud computing, which will reshape the IT industry. Virtualization techniques have become an indispensable ingredient for almost all cloud computing systems. Through virtual environments, a cloud provider is able to run varieties of operating systems as needed by each cloud user. Virtualization can improve reliability, security, and availability of applications by using consolidation, isolation, and fault tolerance. In addition, it is possible to balance workloads by using live migration techniques. In this paper, the definition of cloud computing is given, and then the service and deployment models are introduced. An analysis of security issues and challenges in the implementation of cloud computing is presented. Moreover, a system-level virtualization case is established to enhance the security of cloud computing environments.

  16. Applied Ontology Engineering in Cloud Services, Networks and Management Systems

    CERN Document Server

    Serrano Orozco, J Martín

    2012-01-01

    Metadata standards in today’s ICT sector are proliferating at unprecedented levels, while automated information management systems collect and process exponentially increasing quantities of data. With interoperability and knowledge exchange identified as a core challenge in the sector, this book examines the role ontology engineering can play in providing solutions to the problems of information interoperability and linked data. At the same time as introducing basic concepts of ontology engineering, the book discusses methodological approaches to formal representation of data and information models, thus facilitating information interoperability between heterogeneous, complex and distributed communication systems. In doing so, the text advocates the advantages of using ontology engineering in telecommunications systems. In addition, it offers a wealth of guidance and best-practice techniques for instances in which ontology engineering is applied in cloud services, computer networks and management systems. ...

  17. The cloud-phase feedback in the Super-parameterized Community Earth System Model

    Science.gov (United States)

    Burt, M. A.; Randall, D. A.

    2016-12-01

    Recent comparisons of observations and climate model simulations by I. Tan and colleagues have suggested that the Wegener-Bergeron-Findeisen (WBF) process tends to be too active in climate models, making too much cloud ice, and resulting in an exaggerated negative cloud-phase feedback on climate change. We explore the WBF process and its effect on shortwave cloud forcing in present-day and future climate simulations with the Community Earth System Model, and its super-parameterized counterpart. Results show that SP-CESM has much less cloud ice and a weaker cloud-phase feedback than CESM.

  18. Current Trends in Cloud Computing A Survey of Cloud Computing Systems

    OpenAIRE

    Harjit Singh

    2012-01-01

    Cloud computing that has become an increasingly important trend, is a virtualization technology that uses the internet and central remote servers to offer the sharing of resources that include infrastructures, software, applications and business processes to the market environment to fulfill the elastic demand. In today’s competitive environment, the service vitality, elasticity, choices and flexibility offered by this scalable technology are too attractive that makes the cloud computing to i...

  19. A Framework and Improvements of the Korea Cloud Services Certification System.

    Science.gov (United States)

    Jeon, Hangoo; Seo, Kwang-Kyu

    2015-01-01

    Cloud computing service is an evolving paradigm that affects a large part of the ICT industry and provides new opportunities for ICT service providers such as the deployment of new business models and the realization of economies of scale by increasing efficiency of resource utilization. However, despite benefits of cloud services, there are some obstacles to adopt such as lack of assessing and comparing the service quality of cloud services regarding availability, security, and reliability. In order to adopt the successful cloud service and activate it, it is necessary to establish the cloud service certification system to ensure service quality and performance of cloud services. This paper proposes a framework and improvements of the Korea certification system of cloud service. In order to develop it, the critical issues related to service quality, performance, and certification of cloud service are identified and the systematic framework for the certification system of cloud services and service provider domains are developed. Improvements of the developed Korea certification system of cloud services are also proposed.

  20. A Framework and Improvements of the Korea Cloud Services Certification System

    Directory of Open Access Journals (Sweden)

    Hangoo Jeon

    2015-01-01

    Full Text Available Cloud computing service is an evolving paradigm that affects a large part of the ICT industry and provides new opportunities for ICT service providers such as the deployment of new business models and the realization of economies of scale by increasing efficiency of resource utilization. However, despite benefits of cloud services, there are some obstacles to adopt such as lack of assessing and comparing the service quality of cloud services regarding availability, security, and reliability. In order to adopt the successful cloud service and activate it, it is necessary to establish the cloud service certification system to ensure service quality and performance of cloud services. This paper proposes a framework and improvements of the Korea certification system of cloud service. In order to develop it, the critical issues related to service quality, performance, and certification of cloud service are identified and the systematic framework for the certification system of cloud services and service provider domains are developed. Improvements of the developed Korea certification system of cloud services are also proposed.

  1. A Framework and Improvements of the Korea Cloud Services Certification System

    Science.gov (United States)

    Jeon, Hangoo

    2015-01-01

    Cloud computing service is an evolving paradigm that affects a large part of the ICT industry and provides new opportunities for ICT service providers such as the deployment of new business models and the realization of economies of scale by increasing efficiency of resource utilization. However, despite benefits of cloud services, there are some obstacles to adopt such as lack of assessing and comparing the service quality of cloud services regarding availability, security, and reliability. In order to adopt the successful cloud service and activate it, it is necessary to establish the cloud service certification system to ensure service quality and performance of cloud services. This paper proposes a framework and improvements of the Korea certification system of cloud service. In order to develop it, the critical issues related to service quality, performance, and certification of cloud service are identified and the systematic framework for the certification system of cloud services and service provider domains are developed. Improvements of the developed Korea certification system of cloud services are also proposed. PMID:26125049

  2. Evaluation of cloud properties in the NOAA/NCEP global forecast system using multiple satellite products

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Hyelim [University of Maryland, Department of Atmospheric and Oceanic Science, College Park, MD (United States); Li, Zhanqing [University of Maryland, Department of Atmospheric and Oceanic Science, College Park, MD (United States); Beijing Normal University, State Key Laboratory of Earth Surface Processes and Resource Ecology, GCESS, Beijing (China)

    2012-12-15

    Knowledge of cloud properties and their vertical structure is important for meteorological studies due to their impact on both the Earth's radiation budget and adiabatic heating within the atmosphere. The objective of this study is to evaluate bulk cloud properties and vertical distribution simulated by the US National Oceanic and Atmospheric Administration National Centers for Environmental Prediction Global Forecast System (GFS) using three global satellite products. Cloud variables evaluated include the occurrence and fraction of clouds in up to three layers, cloud optical depth, liquid water path, and ice water path. Cloud vertical structure data are retrieved from both active (CloudSat/CALIPSO) and passive sensors and are subsequently compared with GFS model results. In general, the GFS model captures the spatial patterns of hydrometeors reasonably well and follows the general features seen in satellite measurements, but large discrepancies exist in low-level cloud properties. More boundary layer clouds over the interior continents were generated by the GFS model whereas satellite retrievals showed more low-level clouds over oceans. Although the frequencies of global multi-layer clouds from observations are similar to those from the model, latitudinal variations show discrepancies in terms of structure and pattern. The modeled cloud optical depth over storm track region and subtropical region is less than that from the passive sensor and is overestimated for deep convective clouds. The distributions of ice water path (IWP) agree better with satellite observations than do liquid water path (LWP) distributions. Discrepancies in LWP/IWP distributions between observations and the model are attributed to differences in cloud water mixing ratio and mean relative humidity fields, which are major control variables determining the formation of clouds. (orig.)

  3. Role of mixed precipitating cloud systems on the typhoon rainfall

    Directory of Open Access Journals (Sweden)

    C. J. Pan

    2010-01-01

    Full Text Available L-band wind profiler data are utilized to diagnose the vertical structure of typhoon precipitating cloud systems in Taiwan. For several typhoons, a pronounced bright band (BB) around 5 km is commonly observed. Since strong convection within the typhoon circulation may disturb and/or disrupt the melting layer, the BB does not appear persistently. Hence, an understanding of the vertical structure of the BB region is important because it holds extensive hydrometeor information on the type of precipitation and its variability. Wind profiler observations suggest that mixtures of convective and stratiform (embedded type) clouds are mostly associated with typhoons. In one typhoon case, the BB appeared around 5.5 km with embedded precipitation, a BB height about 1 km higher than for ordinary showery precipitation. This is evident from long-term observations by the wind profiler and the Tropical Rainfall Measuring Mission. The Doppler velocity profiles show hydrometeors (ice/snow) at 6 km but liquid below 5 km for typhoons and 4 km for showery precipitation. In the BB region, melting-particle accelerations of 5.8 m s^-1 km^-1 and 3.2 m s^-1 km^-1 are observed for typhoon and showery precipitation, respectively.

  4. Role of mixed precipitating cloud systems on the typhoon rainfall

    Directory of Open Access Journals (Sweden)

    C. J. Pan

    2010-01-01

    Full Text Available L-band wind profiler data are utilized to diagnose the vertical structure of typhoon precipitating cloud systems in Taiwan. For several typhoons, a pronounced bright band (BB) around 5 km is commonly observed. Since strong convection within the typhoon circulation may disturb and/or disrupt the melting layer, the BB does not appear persistently. Hence, an understanding of the vertical structure of the BB region is important because it holds extensive hydrometeor information on the type of precipitation and its variability. Wind profiler observations suggest that mixtures of convective and stratiform (embedded type) clouds are mostly associated with typhoons. In one typhoon case, the BB appeared around 5.5 km with embedded precipitation, a BB height about 1 km higher than for ordinary showery precipitation. This is evident from long-term observations by the wind profiler and the Tropical Rainfall Measuring Mission. The Doppler velocity profiles show hydrometeors (ice/snow) at 6 km but liquid below 5 km for typhoons and 4 km for showery precipitation. In the BB region, melting-particle accelerations of 5.8 m s^-1 km^-1 and 3.2 m s^-1 km^-1 are observed for typhoon and showery precipitation, respectively.

  5. The atomic hydrogen cloud in the saturnian system

    Science.gov (United States)

    Tseng, W.-L.; Johnson, R. E.; Ip, W.-H.

    2013-09-01

    The importance of Titan's H torus shaped by solar radiation pressure and of hydrogen atoms flowing out of Saturn's atmosphere in forming the broad hydrogen cloud in Saturn's magnetosphere is still debated. Since the Saturnian system also contains a water product torus which originates from the Enceladus plumes, the icy ring particles, and the inner icy satellites, as well as Titan's H2 torus, we have carried out a global investigation of the atomic hydrogen cloud taking into account all sources. We show that the velocity and angle distributions of the hot H ejected from Saturn's atmosphere following electron-impact dissociation of H2 are modified by collisions with the ambient atmospheric H2 and H. This in turn affects the morphology of the escaping hydrogen from Saturn, as does the morphology of the ionospheric electron distribution. Although an exact agreement with the Cassini observations is not obtained, our simulations show that H directly escaping from Titan is the dominant contributor in the outer magnetosphere. Of the total number of H observed by Cassini from 1 to 5 R_S, ∼5.7×10^34, our simulations suggest ∼20% is from dissociation in the Enceladus torus, ∼5-10% is from dissociation of H2 in the atmosphere of the main rings, and ∼50% is from Titan's H torus, implying that ∼20% comes from Saturn's atmosphere.
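
    A quick back-of-the-envelope split of the quoted total across these sources (the fractions and the ∼5.7×10^34 total are taken from the abstract; the main-ring contribution is taken at the midpoint of its 5-10% range):

```python
# Split of the ~5.7e34 H atoms observed by Cassini between 1 and 5 R_S,
# using the fractions quoted in the abstract.
total_H = 5.7e34
fractions = {
    "Enceladus torus dissociation": 0.20,
    "main-ring H2 dissociation":    0.075,  # midpoint of the 5-10% range
    "Titan H torus":                0.50,
    "Saturn atmosphere":            0.20,
}
counts = {src: f * total_H for src, f in fractions.items()}
# Titan's torus alone accounts for ~2.85e34 atoms.
```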

  6. A Cloud Computing Based Patient Centric Medical Information System

    Science.gov (United States)

    Agarwal, Ankur; Henehan, Nathan; Somashekarappa, Vivek; Pandya, A. S.; Kalva, Hari; Furht, Borko

    This chapter discusses an emerging concept of a cloud computing based Patient Centric Medical Information System framework that will allow various authorized users to securely access patient records from various Care Delivery Organizations (CDOs) such as hospitals, urgent care centers, doctors, laboratories, and imaging centers, among others, from any location. Such a system must seamlessly integrate all patient records, including images such as CT scans and MRIs, which can easily be accessed from any location and reviewed by any authorized user. In such a scenario, the storage and transmission of medical records will have to be conducted in a totally secure and safe environment with a very high standard of data integrity, protecting patient privacy and complying with all Health Insurance Portability and Accountability Act (HIPAA) regulations.

  7. An expert fitness diagnosis system based on elastic cloud computing.

    Science.gov (United States)

    Tseng, Kevin C; Wu, Chia-Chuan

    2014-01-01

    This paper presents an expert diagnosis system based on cloud computing. It classifies a user's fitness level based on supervised machine learning techniques. This system is able to learn and make customized diagnoses according to the user's physiological data, such as age, gender, and body mass index (BMI). In addition, an elastic algorithm based on Poisson distribution is presented to allocate computation resources dynamically. It predicts the required resources in the future according to the exponential moving average of past observations. The experimental results show that Naïve Bayes is the best classifier with the highest accuracy (90.8%) and that the elastic algorithm is able to capture tightly the trend of requests generated from the Internet and thus assign corresponding computation resources to ensure the quality of service.
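
    The exponential-moving-average forecast the abstract mentions can be sketched as follows; the smoothing factor and the request series are illustrative assumptions, not values from the paper:

```python
# Exponential moving average (EMA) of past request counts, in the spirit of
# the elastic resource allocation described above.

def ema_forecast(observations, alpha=0.3):
    """Fold each observation into the EMA in order and return the final estimate."""
    est = observations[0]
    for x in observations[1:]:
        est = alpha * x + (1 - alpha) * est
    return est

requests_per_min = [40, 44, 52, 61, 58, 70]
expected = ema_forecast(requests_per_min)
# expected ~= 57.4; provision ceil(expected / capacity_per_vm) VMs next interval.
```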

  8. An Expert Fitness Diagnosis System Based on Elastic Cloud Computing

    Directory of Open Access Journals (Sweden)

    Kevin C. Tseng

    2014-01-01

    Full Text Available This paper presents an expert diagnosis system based on cloud computing. It classifies a user’s fitness level based on supervised machine learning techniques. This system is able to learn and make customized diagnoses according to the user’s physiological data, such as age, gender, and body mass index (BMI). In addition, an elastic algorithm based on Poisson distribution is presented to allocate computation resources dynamically. It predicts the required resources in the future according to the exponential moving average of past observations. The experimental results show that Naïve Bayes is the best classifier with the highest accuracy (90.8%) and that the elastic algorithm is able to capture tightly the trend of requests generated from the Internet and thus assign corresponding computation resources to ensure the quality of service.

  9. Applying Data Mining Techniques to Improve Information Security in the Cloud: A Single Cache System Approach

    OpenAIRE

    Amany AlShawi

    2016-01-01

    Presently, the popularity of cloud computing is gradually increasing day by day. The purpose of this research was to enhance the security of the cloud using techniques such as data mining with specific reference to the single cache system. From the findings of the research, it was observed that the security in the cloud could be enhanced with the single cache system. For future purposes, an Apriori algorithm can be applied to the single cache system. This can be applied by all cloud providers...

  10. Cloud Based Educational Systems And Its Challenges And Opportunities And Issues

    Directory of Open Access Journals (Sweden)

    Prantosh Kr. PAUL

    2014-01-01

    Full Text Available Cloud Computing (CC) is a set of hardware, software, networks, storage, services and interfaces that combine to deliver aspects of computing as a service. Cloud Computing uses central remote servers to maintain data and applications. Practically, CC is an extension of grid computing with independence and smarter tools and technological gradients. Cloud Computing helps in the sharing of software, hardware, applications and other packages with the help of internet tools and wireless media. Cloud Computing has benefits in several fields and application domains such as agriculture, business and commerce, health care, hospitality and tourism, and education and training. In education systems, it may be applicable to general regular education and to other education systems, including general and vocational training. This paper discusses the opportunities that Cloud Computing provides; however, the focus is on challenges and issues in relation to education, education systems and training programmes.

  11. The Clouds distributed operating system - Functional description, implementation details and related work

    Science.gov (United States)

    Dasgupta, Partha; Leblanc, Richard J., Jr.; Appelbe, William F.

    1988-01-01

    Clouds is an operating system in a novel class of distributed operating systems providing the integration, reliability, and structure that makes a distributed system usable. Clouds is designed to run on a set of general purpose computers that are connected via a medium-to-high speed local area network. The system structuring paradigm chosen for the Clouds operating system, after substantial research, is an object/thread model. All instances of services, programs and data in Clouds are encapsulated in objects. The concept of persistent objects does away with the need for file systems, and replaces it with a more powerful concept, namely the object system. The facilities in Clouds include integration of resources through location transparency; support for various types of atomic operations, including conventional transactions; advanced support for achieving fault tolerance; and provisions for dynamic reconfiguration.

  12. OpenID connect as a security service in Cloud-based diagnostic imaging systems

    Science.gov (United States)

    Ma, Weina; Sartipi, Kamran; Sharghi, Hassan; Koff, David; Bak, Peter

    2015-03-01

    The evolution of cloud computing is driving the next generation of diagnostic imaging (DI) systems. Cloud-based DI systems are able to deliver better services to patients without being constrained to their own physical facilities. However, privacy and security concerns have consistently been regarded as the major obstacle to the adoption of cloud computing in healthcare domains. Furthermore, the traditional computing models and interfaces employed by DI systems are not ready for accessing diagnostic images through mobile devices. RESTful interfaces are an ideal technology for provisioning both mobile services and cloud computing. OpenID Connect, which combines OpenID and OAuth, is an emerging REST-based federated identity solution. It is one of the most promising open standards to potentially become the de facto standard for securing cloud computing and mobile applications, and has been called the "Kerberos of the Cloud". We introduce OpenID Connect as an identity and authentication service in cloud-based DI systems and propose enhancements that allow this technology to be incorporated within a distributed enterprise environment. The objective of this study is to offer solutions for secure radiology image sharing among a DI-r (Diagnostic Imaging Repository), heterogeneous PACS (Picture Archiving and Communication Systems), and mobile clients in the cloud ecosystem. By using OpenID Connect as an open-source identity and authentication service, deploying the DI-r and PACS to private or community clouds should achieve a security level equivalent to that of the traditional computing model.
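
The claim checks a relying party such as a cloud-based DI system performs on a received ID token can be sketched in a few lines. The issuer URL and client ID below are hypothetical placeholders, and the sketch assumes the token's signature has already been verified:

```python
import time

# Hypothetical issuer and client identifier, for illustration only.
EXPECTED_ISSUER = "https://idp.example-hospital.org"
CLIENT_ID = "di-viewer-app"

def validate_id_token_claims(claims, now=None):
    """Minimal ID-token claim checks an OpenID Connect relying party
    performs after signature verification: issuer, audience, expiry."""
    now = time.time() if now is None else now
    if claims.get("iss") != EXPECTED_ISSUER:
        return False                      # token from the wrong identity provider
    aud = claims.get("aud")
    audiences = aud if isinstance(aud, list) else [aud]
    if CLIENT_ID not in audiences:
        return False                      # token issued for another client
    if claims.get("exp", 0) <= now:
        return False                      # token has expired
    return True

good = {"iss": EXPECTED_ISSUER, "aud": CLIENT_ID, "exp": 9999999999}
expired = {"iss": EXPECTED_ISSUER, "aud": CLIENT_ID, "exp": 1}
print(validate_id_token_claims(good))     # True
print(validate_id_token_claims(expired))  # False
```

OpenID Connect Core additionally mandates nonce and, for some flows, at_hash checks, which are omitted here for brevity.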

  13. Feasibility and demonstration of a cloud-based RIID analysis system

    Energy Technology Data Exchange (ETDEWEB)

    Wright, Michael C., E-mail: wrightmc@ornl.gov [Oak Ridge National Laboratory, Oak Ridge, TN 37831 (United States); Hertz, Kristin L.; Johnson, William C. [Sandia National Laboratories, Livermore, CA 94551 (United States); Sword, Eric D.; Younkin, James R. [Oak Ridge National Laboratory, Oak Ridge, TN 37831 (United States); Sadler, Lorraine E. [Sandia National Laboratories, Livermore, CA 94551 (United States)

    2015-06-01

    A significant limitation in the operational utility of handheld and backpack radioisotope identifiers (RIIDs) is the inability of their onboard algorithms to accurately and reliably identify the isotopic sources of the measured gamma-ray energy spectrum. A possible solution is to move the spectral analysis computations to an external device, the cloud, where significantly greater capabilities are available. The implementation and demonstration of a prototype cloud-based RIID analysis system have shown this type of system to be feasible with currently available communication and computational technology. A system study has shown that the potential user community could derive significant benefits from an appropriately implemented cloud-based analysis system and has identified the design and operational characteristics required by the users and stakeholders for such a system. A general description of the hardware and software necessary to implement reliable cloud-based analysis, the value of the cloud expressed by the user community, and the aspects of the cloud implemented in the demonstrations are discussed. - Highlights: • A prototype cloud-based RIID analysis system was implemented and demonstrated. • A cloud-based system was shown to be feasible with currently available technology. • A system study identified the operational characteristics required by the users. • The system study showed that the user community could derive significant benefit. • An architecture was defined for field testing by users in relevant environments.

  14. Feasibility and demonstration of a cloud-based RIID analysis system

    International Nuclear Information System (INIS)

    Wright, Michael C.; Hertz, Kristin L.; Johnson, William C.; Sword, Eric D.; Younkin, James R.; Sadler, Lorraine E.

    2015-01-01

    A significant limitation in the operational utility of handheld and backpack radioisotope identifiers (RIIDs) is the inability of their onboard algorithms to accurately and reliably identify the isotopic sources of the measured gamma-ray energy spectrum. A possible solution is to move the spectral analysis computations to an external device, the cloud, where significantly greater capabilities are available. The implementation and demonstration of a prototype cloud-based RIID analysis system have shown this type of system to be feasible with currently available communication and computational technology. A system study has shown that the potential user community could derive significant benefits from an appropriately implemented cloud-based analysis system and has identified the design and operational characteristics required by the users and stakeholders for such a system. A general description of the hardware and software necessary to implement reliable cloud-based analysis, the value of the cloud expressed by the user community, and the aspects of the cloud implemented in the demonstrations are discussed. - Highlights: • A prototype cloud-based RIID analysis system was implemented and demonstrated. • A cloud-based system was shown to be feasible with currently available technology. • A system study identified the operational characteristics required by the users. • The system study showed that the user community could derive significant benefit. • An architecture was defined for field testing by users in relevant environments.

  15. Analysis of the security and privacy requirements of cloud-based electronic health records systems.

    Science.gov (United States)

    Rodrigues, Joel J P C; de la Torre, Isabel; Fernández, Gonzalo; López-Coronado, Miguel

    2013-08-21

    The Cloud Computing paradigm offers eHealth systems the opportunity to enhance the features and functionality that they offer. However, moving patients' medical information to the Cloud implies several risks in terms of the security and privacy of sensitive health records. In this paper, the risks of hosting Electronic Health Records (EHRs) on the servers of third-party Cloud service providers are reviewed. To protect the confidentiality of patient information and facilitate the process, some suggestions for health care providers are made. Moreover, security issues that Cloud service providers should address in their platforms are considered. This shows that, before moving patient health records to the Cloud, security and privacy concerns must be considered by both health care providers and Cloud service providers, and the security requirements of a generic Cloud service provider are analyzed. To study the latest in Cloud-based computing solutions, bibliographic material was obtained mainly from Medline sources. Furthermore, direct contact was made with several Cloud service providers. Some of the security issues that should be considered by both Cloud service providers and their health care customers are role-based access, network security mechanisms, data encryption, digital signatures, and access monitoring. Furthermore, to guarantee the safety of the information and comply with privacy policies, the Cloud service provider must be compliant with various certifications and third-party requirements, such as SAS70 Type II, PCI DSS Level 1, ISO 27001, and the US Federal Information Security Management Act (FISMA). Storing sensitive information such as EHRs in the Cloud means that precautions must be taken to ensure the safety and confidentiality of the data. A relationship built on trust with the Cloud service provider is essential to ensure a transparent process. Cloud service providers must make certain that all security mechanisms are in place to avoid unauthorized access.

  16. Development of Cloud-Based UAV Monitoring and Management System.

    Science.gov (United States)

    Itkin, Mason; Kim, Mihui; Park, Younghee

    2016-11-15

    Unmanned aerial vehicles (UAVs) are an emerging technology with the potential to revolutionize commercial industries and the public domain outside of the military. UAVs would be able to speed up rescue and recovery operations from natural disasters and can be used for autonomous delivery systems (e.g., Amazon Prime Air). An increase in the number of active UAV systems in dense urban areas is attributed to an influx of UAV hobbyists and commercial multi-UAV systems. As airspace for UAV flight becomes more limited, it is important to monitor and manage many UAV systems using modern collision avoidance techniques. In this paper, we propose a cloud-based web application that provides real-time flight monitoring and management for UAVs. For each connected UAV, detailed UAV sensor readings from the accelerometer, GPS sensor, ultrasonic sensor and visual position cameras are provided along with status reports from the smaller internal components of UAVs (i.e., motor and battery). The dynamic map overlay visualizes active flight paths and current UAV locations, allowing the user to monitor all aircraft easily. Our system detects and prevents potential collisions by automatically adjusting UAV flight paths and then alerting users to the change. We develop our proposed system and demonstrate its feasibility and performance through simulation.
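
As a rough illustration of the collision-avoidance idea described above, a minimal sketch might compare pairwise UAV distances against a minimum separation and command one aircraft to climb when the threshold is violated. The separation distance and climb step are hypothetical values, not taken from the paper:

```python
import math

# Hypothetical minimum separation in metres; a real system would derive
# this from regulations, vehicle dynamics, and sensor uncertainty.
MIN_SEPARATION_M = 30.0

def adjust_for_collisions(uavs, climb_m=10.0):
    """Naive separation check: if two UAVs are closer than the minimum
    separation, command the second one to climb. Returns the ids of
    UAVs whose flight paths were adjusted."""
    adjusted = []
    for i in range(len(uavs)):
        for j in range(i + 1, len(uavs)):
            a, b = uavs[i], uavs[j]
            d = math.dist((a["x"], a["y"], a["z"]),
                          (b["x"], b["y"], b["z"]))
            if d < MIN_SEPARATION_M:
                b["z"] += climb_m          # deconflict vertically
                adjusted.append(b["id"])
    return adjusted

fleet = [{"id": "u1", "x": 0.0, "y": 0.0, "z": 50.0},
         {"id": "u2", "x": 10.0, "y": 0.0, "z": 50.0}]
print(adjust_for_collisions(fleet))  # ['u2'] (u2 climbs to 60.0 m)
```

A production system would of course resolve conflicts continuously along predicted trajectories rather than on instantaneous positions.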

  17. Development of Cloud-Based UAV Monitoring and Management System

    Directory of Open Access Journals (Sweden)

    Mason Itkin

    2016-11-01

    Full Text Available Unmanned aerial vehicles (UAVs) are an emerging technology with the potential to revolutionize commercial industries and the public domain outside of the military. UAVs would be able to speed up rescue and recovery operations from natural disasters and can be used for autonomous delivery systems (e.g., Amazon Prime Air). An increase in the number of active UAV systems in dense urban areas is attributed to an influx of UAV hobbyists and commercial multi-UAV systems. As airspace for UAV flight becomes more limited, it is important to monitor and manage many UAV systems using modern collision avoidance techniques. In this paper, we propose a cloud-based web application that provides real-time flight monitoring and management for UAVs. For each connected UAV, detailed UAV sensor readings from the accelerometer, GPS sensor, ultrasonic sensor and visual position cameras are provided along with status reports from the smaller internal components of UAVs (i.e., motor and battery). The dynamic map overlay visualizes active flight paths and current UAV locations, allowing the user to monitor all aircraft easily. Our system detects and prevents potential collisions by automatically adjusting UAV flight paths and then alerting users to the change. We develop our proposed system and demonstrate its feasibility and performance through simulation.

  18. Development of Cloud-Based UAV Monitoring and Management System

    Science.gov (United States)

    Itkin, Mason; Kim, Mihui; Park, Younghee

    2016-01-01

    Unmanned aerial vehicles (UAVs) are an emerging technology with the potential to revolutionize commercial industries and the public domain outside of the military. UAVs would be able to speed up rescue and recovery operations from natural disasters and can be used for autonomous delivery systems (e.g., Amazon Prime Air). An increase in the number of active UAV systems in dense urban areas is attributed to an influx of UAV hobbyists and commercial multi-UAV systems. As airspace for UAV flight becomes more limited, it is important to monitor and manage many UAV systems using modern collision avoidance techniques. In this paper, we propose a cloud-based web application that provides real-time flight monitoring and management for UAVs. For each connected UAV, detailed UAV sensor readings from the accelerometer, GPS sensor, ultrasonic sensor and visual position cameras are provided along with status reports from the smaller internal components of UAVs (i.e., motor and battery). The dynamic map overlay visualizes active flight paths and current UAV locations, allowing the user to monitor all aircraft easily. Our system detects and prevents potential collisions by automatically adjusting UAV flight paths and then alerting users to the change. We develop our proposed system and demonstrate its feasibility and performance through simulation. PMID:27854267

  19. Seamless personal health information system in cloud computing.

    Science.gov (United States)

    Chung, Wan-Young; Fong, Ee May

    2014-01-01

    Noncontact ECG measurement has gained popularity in recent years owing to its noninvasive nature and its convenience for daily use. This approach does not require any direct contact between the patient's skin and a sensor for physiological signal measurement. The noncontact ECG measurement is integrated with a mobile healthcare system for health status monitoring. The mobile phone acts as the personal health information system, displaying health status and tracking body mass index (BMI). Beyond that, it plays an important role in medical guidance, providing a medical knowledge database that includes a symptom checker and health and fitness guidance. The system also features some unique medical functions that cater to the daily needs of patients or users, including regular medication reminders, alert alarms, medical guidance, and appointment scheduling. Lastly, we demonstrate the mobile healthcare system with a web application for extended use: health data are stored in the cloud on a web server with database storage. This allows easy remote health status monitoring and thus promotes a cost-effective personal healthcare system.

  20. Evolution towards a Cloud Deployed Business Support System

    Directory of Open Access Journals (Sweden)

    Ioan DRAGAN

    2015-01-01

    Full Text Available Although less known outside strictly specialized environments, Business Support Systems (BSS) are highly complex, and the subject of their installation in cloud implementations is seldom addressed. This paper presents a short history of BSS evolution, starting from basic voice and messaging services and ending with 4G and M2M services, presenting new features and the challenges they bring. Moreover, we present, as a baseline for future developments, a study based on direct interviews with representatives of telecom operators about their vision of possible future BSS solutions, depending on the services they will provide. This area of investigation poses a number of challenges that require collaboration between providers and operators; in this context, we have established a framework of requirements which will be handled and studied individually.

  1. Architecture-driven Migration of Legacy Systems to Cloud-enabled Software

    DEFF Research Database (Denmark)

    Ahmad, Aakash; Babar, Muhammad Ali

    2014-01-01

    of legacy systems to cloud computing. The framework leverages software reengineering concepts that aim to recover the architecture from legacy source code. The framework then exploits software evolution concepts to support architecture-driven migration of legacy systems to cloud-based architectures.... The Legacy-to-Cloud Migration Horseshoe comprises four processes: (i) architecture migration planning, (ii) architecture recovery and consistency, (iii) architecture transformation and (iv) architecture-based development of cloud-enabled software. We aim to discover, document and apply the migration...

  2. Assessment of physical server reliability in multi cloud computing system

    Science.gov (United States)

    Kalyani, B. J. D.; Rao, Kolasani Ramchand H.

    2018-04-01

    Business organizations nowadays operate with more than one cloud provider. Spreading cloud deployment across multiple service providers creates room for competitive pricing that reduces the burden on an enterprise's spending budget. To assess the software reliability of a multi-cloud application, a layered software reliability assessment paradigm is considered, with three levels of abstraction: the application layer, the virtualization layer, and the server layer. The reliability of each layer is assessed separately, and the results are combined to obtain the reliability of the multi-cloud computing application. In this paper, we focus on how to assess the reliability of the server layer, present the required algorithms, and explore the steps in the assessment of server reliability.
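
A minimal sketch of the layered combination idea, under the simplifying assumption that the three layers form a series system with independent failures (the paper's exact combination rule may differ):

```python
# Layer reliabilities are probabilities in [0, 1]. In a series system
# the application works only if every layer works, so reliabilities
# multiply; replicas on independent providers fail only if all fail.

def multi_cloud_reliability(r_app, r_virt, r_server):
    """Reliability of one application instance: the application,
    virtualization, and server layers must all work."""
    return r_app * r_virt * r_server

def across_providers(per_provider):
    """Reliability of an application replicated across independent
    providers: it survives unless every replica fails."""
    failure = 1.0
    for r in per_provider:
        failure *= (1.0 - r)
    return 1.0 - failure

single = multi_cloud_reliability(0.99, 0.98, 0.95)
print(round(single, 4))  # 0.9217
print(round(across_providers([single, single]), 4))
```

The numbers here are illustrative; the point is that replicating across providers raises availability well above that of any single deployment.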

  3. Registration system of cloud campus by using android smart tablet.

    Science.gov (United States)

    Kamada, Shin; Ichimura, Takumi; Shigeyasu, Tetsuya; Takemoto, Yasuhiko

    2014-01-01

    The Near Field Communication (NFC) standard covers communication protocols and data exchange formats. NFC technology is one of the radio-frequency identification (RFID) standards. In Japan, the FeliCa card is a popular way to identify a unique ID. We developed an attendance management system (AMS) as an Android application that runs on a smart tablet with NFC. Typically, an AMS in a university is fixed to the wall, and each student touches or slides his/her own card against the dedicated equipment. Because a teacher can use his/her own smart tablet and/or smartphone, the attendance records can be viewed anytime and anywhere. Moreover, we developed a collection system between a PC and several tablets using Android Beam. All personal data are encrypted, and the files can be transferred over an NFC Bluetooth handover between a Linux PC and a smart tablet. By mining the collected records, chronic non-attenders can be discovered early by the educational affairs section. In this paper, a registration system for the cloud campus system using a personal smartphone with NFC is developed. The system makes it possible to offer university courses that are open to the general public.

  4. Toward GEOS-6, A Global Cloud System Resolving Atmospheric Model

    Science.gov (United States)

    Putman, William M.

    2010-01-01

    NASA is committed to observing and understanding the weather and climate of our home planet through the use of multi-scale modeling systems and space-based observations. Global climate models have evolved to take advantage of the influx of multi- and many-core computing technologies and the availability of large clusters of multi-core microprocessors. GEOS-6 is a next-generation cloud-system-resolving atmospheric model that will place NASA at the forefront of scientific exploration of our atmosphere and climate. Model simulations with GEOS-6 will produce a realistic representation of our atmosphere on the scale of typical satellite observations, bringing visual comprehension of model results to a new level among climate enthusiasts. In preparation for GEOS-6, the agency's flagship Earth System Modeling Framework has been enhanced to support cutting-edge high-resolution global climate and weather simulations. Improvements include a cubed-sphere grid that exposes parallelism, a non-hydrostatic finite-volume dynamical core, and algorithms designed for co-processor technologies, among others. GEOS-6 represents a fundamental advancement in the capability of global Earth system models. The ability to directly compare global simulations at the resolution of spaceborne satellite images will lead to algorithm improvements and better utilization of space-based observations within the GEOS data assimilation system.

  5. Using AMDD method for Database Design in Mobile Cloud Computing Systems

    OpenAIRE

    Silviu Claudiu POPA; Mihai-Constantin AVORNICULUI; Vasile Paul BRESFELEAN

    2013-01-01

    The development of wireless telecommunication technologies gave birth to new kinds of e-commerce, the so-called Mobile e-Commerce or m-Commerce. Mobile Cloud Computing (MCC) represents a new IT research area that combines mobile computing and cloud computing techniques. Behind a cloud mobile commerce system there is a database containing all the information necessary for transactions. By means of the Agile Model Driven Development (AMDD) method, we are able to achieve many benefits that smoo...

  6. Security of Heterogeneous Content in Cloud Based Library Information Systems Using an Ontology Based Approach

    Directory of Open Access Journals (Sweden)

    Mihai DOINEA

    2014-01-01

    Full Text Available As in any domain that involves the use of software, library information systems take advantage of cloud computing. The paper highlights the main aspects of cloud-based systems, describing some public solutions provided by the most important players on the market. Topics related to content security in cloud-based services are tackled in order to emphasize the requirements that must be met by these types of systems. A cloud-based implementation of a Library Information System is presented, and some adjacent tools used together with it to provide digital content and metadata links are described. In a cloud-based Library Information System, security is approached by means of ontologies. Aspects such as content security in terms of digital rights are presented, and a methodology for security optimization is proposed.

  7. Satellite Cloud and Radiative Property Processing and Distribution System on the NASA Langley ASDC OpenStack and OpenShift Cloud Platform

    Science.gov (United States)

    Nguyen, L.; Chee, T.; Palikonda, R.; Smith, W. L., Jr.; Bedka, K. M.; Spangenberg, D.; Vakhnin, A.; Lutz, N. E.; Walter, J.; Kusterer, J.

    2017-12-01

    Cloud Computing offers new opportunities for large-scale scientific data producers to utilize Infrastructure-as-a-Service (IaaS) and Platform-as-a-Service (PaaS) IT resources to process and deliver data products in an operational environment where timely delivery, reliability, and availability are critical. The NASA Langley Research Center Atmospheric Science Data Center (ASDC) is building and testing a private and public facing cloud for users in the Science Directorate to utilize as an everyday production environment. The NASA SatCORPS (Satellite ClOud and Radiation Property Retrieval System) team processes and derives near real-time (NRT) global cloud products from operational geostationary (GEO) satellite imager datasets. To deliver these products, we will utilize the public facing cloud and OpenShift to deploy a load-balanced webserver for data storage, access, and dissemination. The OpenStack private cloud will host data ingest and computational capabilities for SatCORPS processing. This paper will discuss the SatCORPS migration towards, and usage of, the ASDC Cloud Services in an operational environment. Detailed lessons learned from use of prior cloud providers, specifically the Amazon Web Services (AWS) GovCloud and the Government Cloud administered by the Langley Managed Cloud Environment (LMCE) will also be discussed.

  8. Automated cloud tracking system for the Akatsuki Venus Climate Orbiter data

    Science.gov (United States)

    Ogohara, Kazunori; Kouyama, Toru; Yamamoto, Hiroki; Sato, Naoki; Takagi, Masahiro; Imamura, Takeshi

    2012-02-01

    The Japanese Venus Climate Orbiter, Akatsuki, is cruising toward a second approach to Venus, its first Venus orbit insertion (VOI) having failed. At present, we focus on the next VOI opportunity and the scientific observations to follow. In the present study, we have constructed an automated cloud tracking system for processing the data obtained by Akatsuki. In this system, correction of the satellite's pointing is essential for improving the accuracy of the cloud motion vectors derived by cloud tracking. Attitude errors of the satellite are reduced by fitting an ellipse to the limb of the imaged Venus disk. Next, longitude-latitude distributions of brightness (cloud patterns) are calculated to simplify deriving the cloud motion vectors. The grid points are distributed at regular intervals in the longitude-latitude coordinate system. After applying a solar zenith correction and a high-pass filter to the derived longitude-latitude distributions of brightness, the cloud features are tracked using pairs of images. As a result, we obtain cloud motion vectors on equally spaced longitude-latitude grid points. These processes are pipelined and automated, and are applied to all data obtained by the combinations of cameras and filters onboard Akatsuki. Several tests show that the cloud motion vectors are determined with sufficient accuracy. We expect that the longitude-latitude data sets created by the automated cloud tracking system will contribute to Venus meteorology.
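
The tracking step itself can be illustrated with a toy template-matching routine: the brightness map from the first image is compared against the second at every candidate offset, and the best-matching offset gives the cloud motion vector. This is a simplified stand-in for the correlation method used in the actual pipeline, operating on a small synthetic brightness grid:

```python
def best_shift(img1, img2, max_shift=3):
    """Return the (dy, dx) shift that minimizes the sum of squared
    brightness differences between img1 and the shifted img2,
    with periodic (wrap-around) boundaries for simplicity."""
    h, w = len(img1), len(img1[0])
    best, best_err = (0, 0), float("inf")
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            err = 0.0
            for y in range(h):
                for x in range(w):
                    y2, x2 = (y + dy) % h, (x + dx) % w
                    err += (img1[y][x] - img2[y2][x2]) ** 2
            if err < best_err:
                best, best_err = (dy, dx), err
    return best

# img2 is img1 shifted 1 row down and 2 columns right (wrapping),
# mimicking a cloud pattern advected between two exposures.
img1 = [[(y * 7 + x * 13) % 11 for x in range(8)] for y in range(8)]
img2 = [[img1[(y - 1) % 8][(x - 2) % 8] for x in range(8)] for y in range(8)]
print(best_shift(img1, img2))  # (1, 2)
```

Dividing the recovered pixel displacement by the time between images, and converting pixels to longitude-latitude distance, yields the wind vector at that grid point.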

  9. The ARM-GCSS Intercomparison Study of Single-Column Models and Cloud System Models

    International Nuclear Information System (INIS)

    Cederwall, R.T.; Rodriques, D.J.; Krueger, S.K.; Randall, D.A.

    1999-01-01

    The Single-Column Model (SCM) Working Group (WG) and the Cloud Working Group (CWG) in the Atmospheric Radiation Measurement (ARM) Program have begun a collaboration with the GEWEX Cloud System Study (GCSS) WGs. The forcing data sets derived from the special ARM radiosonde measurements made during the SCM Intensive Observation Periods (IOPs), the wealth of cloud and related data sets collected by the ARM Program, and the ARM infrastructure support of the SCM WG are of great value to GCSS. In return, GCSS brings the efforts of an international group of cloud system modelers to bear on ARM data sets and ARM-related scientific questions. The first major activity of the ARM-GCSS collaboration is a model intercomparison study involving SCMs and cloud system models (CSMs), also known as cloud-resolving or cloud-ensemble models. The SCM methodologies developed in the ARM Program have matured to the point where an intercomparison will help identify the strengths and weaknesses of various approaches. CSM simulations will bring much additional information about clouds to evaluate cloud parameterizations used in the SCMs. CSMs and SCMs have been compared successfully in previous GCSS intercomparison studies for tropical conditions. The ARM Southern Great Plains (SGP) site offers an opportunity for GCSS to test their models in continental, mid-latitude conditions. The Summer 1997 SCM IOP has been chosen since it provides a wide range of summertime weather events that will be a challenging test of these models.

  10. Threshold-based queuing system for performance analysis of cloud computing system with dynamic scaling

    International Nuclear Information System (INIS)

    Shorgin, Sergey Ya.; Pechinkin, Alexander V.; Samouylov, Konstantin E.; Gaidamaka, Yuliya V.; Gudkova, Irina A.; Sopin, Eduard S.

    2015-01-01

    Cloud computing is a promising technology for managing and improving the utilization of computing center resources to deliver various computing and IT services. For the purpose of energy saving, there is no need to operate many servers under light loads, and they are switched off. On the other hand, some servers should be switched on under heavy loads to prevent very long delays. Thus, waiting times and system operating cost can be maintained at an acceptable level by dynamically adding or removing servers. One more factor that should be taken into account is the significant server setup costs and activation times. For better energy efficiency, a cloud computing system should not react to instantaneous increases or decreases in load. That is the main motivation for using queuing systems with hysteresis for cloud computing system modelling. In the paper, we provide a model of a cloud computing system in terms of a multiple-server, threshold-based, infinite-capacity queuing system with hysteresis and noninstantaneous server activation. For the proposed model, we develop a method for computing steady-state probabilities that allows a number of performance measures to be estimated.

  11. Threshold-based queuing system for performance analysis of cloud computing system with dynamic scaling

    Energy Technology Data Exchange (ETDEWEB)

    Shorgin, Sergey Ya.; Pechinkin, Alexander V. [Institute of Informatics Problems, Russian Academy of Sciences (Russian Federation); Samouylov, Konstantin E.; Gaidamaka, Yuliya V.; Gudkova, Irina A.; Sopin, Eduard S. [Telecommunication Systems Department, Peoples’ Friendship University of Russia (Russian Federation)

    2015-03-10

    Cloud computing is a promising technology for managing and improving the utilization of computing center resources to deliver various computing and IT services. For the purpose of energy saving, there is no need to operate many servers under light loads, and they are switched off. On the other hand, some servers should be switched on under heavy loads to prevent very long delays. Thus, waiting times and system operating cost can be maintained at an acceptable level by dynamically adding or removing servers. One more factor that should be taken into account is the significant server setup costs and activation times. For better energy efficiency, a cloud computing system should not react to instantaneous increases or decreases in load. That is the main motivation for using queuing systems with hysteresis for cloud computing system modelling. In the paper, we provide a model of a cloud computing system in terms of a multiple-server, threshold-based, infinite-capacity queuing system with hysteresis and noninstantaneous server activation. For the proposed model, we develop a method for computing steady-state probabilities that allows a number of performance measures to be estimated.
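
The hysteresis idea described above can be sketched as a simple two-threshold controller: a server is activated only when the queue exceeds the high threshold and deactivated only when it falls below the low one, so brief load spikes do not trigger costly setups. The thresholds and server limits below are illustrative values, not taken from the paper:

```python
# Illustrative hysteresis thresholds on queue length.
LOW, HIGH = 5, 20

def scale(active_servers, queue_length, min_servers=1, max_servers=8):
    """One control step of hysteresis-based scaling."""
    if queue_length > HIGH and active_servers < max_servers:
        return active_servers + 1   # heavy load: activate a server
    if queue_length < LOW and active_servers > min_servers:
        return active_servers - 1   # light load: deactivate a server
    return active_servers           # inside the hysteresis band: no change

servers = 2
for q in [25, 25, 12, 3, 3]:        # observed queue lengths over time
    servers = scale(servers, q)
print(servers)  # 2 -> 3 -> 4 -> 4 -> 3 -> 2, final value 2
```

Note that the queue length of 12, lying between the two thresholds, changes nothing; that dead band is exactly what prevents oscillatory switching under fluctuating load.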

  12. Evolution of ERP Systems in the Cloud: A Study on System Updates

    Directory of Open Access Journals (Sweden)

    Elise Bjelland

    2018-06-01

    Full Text Available Cloud-based enterprise resource planning (ERP) systems emerged around the new millennium, and since then there has been a lack of research regarding the evolution and update processes of these systems. From the users' perspective, updates in a traditional on-premise ERP system are carried out at their own request, while cloud-based ERPs are updated compulsorily. Through an established ERP lifecycle framework, this study investigates how the update process is conducted in a cloud ERP context, from both the users' and the vendor's perspectives. A multiple case study was conducted in Norway at 10 client organizations, as well as at a cloud ERP vendor. Our main findings suggest that the vendor and the users view the update process differently. The main challenges of the update process from the users' perspective are the size and timing of updates, a lack of information and communication during the process, and the discontinuation of certain functionalities. The main advantages are that all users always have the same version of the system, and that users need not spend time updating the system or following the ERP market, which lets them focus on their core competences instead.

  13. Monitoring cirrus cloud and tropopause height over Hanoi using a compact lidar system

    International Nuclear Information System (INIS)

    Bui Van Hai; Dinh Van Trung; Nguyen Xuan Tuan; Dao Duy Thang; Nguyen Thanh Binh

    2012-01-01

    Cirrus clouds in the upper troposphere and the lower stratosphere have attracted great attention due to their important role in, and impact on, the atmospheric radiative balance. Because cirrus clouds are located high in the atmosphere, their study requires a high-resolution remote sensing technique, not only for detection but also for the characterization of their properties. The lidar technique, with its inherent high sensitivity and resolution, has become an indispensable tool for studying and improving our understanding of cirrus clouds. Using the lidar technique we can simultaneously measure the cloud height and thickness and follow their temporal evolution. In this paper we describe the development of a compact and highly sensitive lidar system with the aim of remotely monitoring, for the first time, the cirrus clouds over Hanoi (21°01′42″ N, 105°51′12″ E). From the lidar data collected during 2011, we derive the mean cloud height, the location of the cloud top, the mean cloud thickness, and their temporal evolution. We then compare the location of the cloud top with the position of the tropopause determined from radiosonde data and find that the distance between the cloud top and the tropopause remains fairly stable, indicating that the top of cirrus clouds is generally a good tracer of the tropopause. We find that the cirrus clouds are generally located at heights between 11.2 and 15 km, with an average height of 13.4 km. Their thickness is between 0.3 and 3.8 km, with an average value of 1.7 km. We also compare the properties of these cirrus clouds with those observed at other locations around the world using the lidar technique. (author)
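
The basic range measurement underlying these cloud-height retrievals follows from the round-trip travel time of the laser pulse, range = c·Δt/2. A sketch under the simplifying assumption of a vertically pointing lidar:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def range_from_delay(delta_t_s):
    """One-way distance to the scatterer, given the round-trip
    delay between pulse emission and echo detection."""
    return C * delta_t_s / 2.0

# A return arriving ~89.4 microseconds after the pulse corresponds to
# a scatterer near 13.4 km, the mean cirrus height reported above.
delay = 2 * 13_400.0 / C
print(round(range_from_delay(delay) / 1000.0, 1))  # 13.4
```

Cloud base, top, and hence geometric thickness follow from the first and last range bins of the enhanced backscatter in each profile.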

  14. Construction of Urban Design Support System using Cloud Computing Type Virtual Reality and Case Study

    OpenAIRE

    Zhenhan, Lei; Shunta, Shimizu; Natuska, Ota; Yuji, Ito; Yuesong, Zhang

    2017-01-01

    This paper contributes a design support system based on cloud-computing-type virtual reality (cloud-based VR) for urban planning and urban design. A platform for cloud-based VR technology, i.e. a VR-Cloud server, is used to open a VR dataset to public collaboration over the Internet. The digital attributes representing the design scheme include the land use zone, building regulations, urban design style, and other design details of architectural design, landscape, and traf...

  15. Point-Cloud Compression for Vehicle-Based Mobile Mapping Systems Using Portable Network Graphics

    Science.gov (United States)

    Kohira, K.; Masuda, H.

    2017-09-01

    A mobile mapping system is effective for capturing dense point-clouds of roads and roadside objects. Point-clouds of urban areas, residential areas, and arterial roads are useful for maintenance of infrastructure, map creation, and automatic driving. However, the data size of point-clouds measured over large areas is enormous. A large storage capacity is required to store such point-clouds, and heavy loads are placed on the network when point-clouds are transferred through it. Therefore, it is desirable to reduce the data size of point-clouds without deterioration of quality. In this research, we propose a novel point-cloud compression method for vehicle-based mobile mapping systems. In our compression method, point-clouds are mapped onto 2D pixels using GPS time and the parameters of the laser scanner. Then, the images are encoded in the Portable Network Graphics (PNG) format and compressed using the PNG algorithm. In our experiments, our method could efficiently compress point-clouds without deteriorating their quality.
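The described pipeline, mapping points onto 2D pixels and then compressing losslessly, can be sketched as follows. For self-containment this sketch compresses a 16-bit range grid with zlib's DEFLATE, the same algorithm PNG uses internally; the grid geometry and point layout are assumptions, not the paper's exact parameters:

```python
import struct
import zlib

# Sketch (assumed layout): each point is (scan_line, angle_step, range_mm),
# where scan lines come from GPS time and angle steps from the scanner
# mirror position. Ranges go into a 16-bit 2D grid and are compressed with
# DEFLATE, the algorithm inside PNG.

ROWS, COLS = 64, 128  # hypothetical scanner geometry

def rasterize(points):
    grid = [[0] * COLS for _ in range(ROWS)]  # 0 marks "no return"
    for line, step, rng in points:
        grid[line][step] = rng
    return grid

def compress(grid):
    raw = b"".join(struct.pack("<%dH" % COLS, *row) for row in grid)
    return zlib.compress(raw, 9)

def decompress(blob):
    raw = zlib.decompress(blob)
    vals = struct.unpack("<%dH" % (ROWS * COLS), raw)
    return [list(vals[r * COLS:(r + 1) * COLS]) for r in range(ROWS)]

points = [(10, 5, 4321), (10, 6, 4330), (32, 99, 60000)]
grid = rasterize(points)
blob = compress(grid)
restored = decompress(blob)
assert restored == grid  # lossless round trip, as the method requires
print(len(blob), "compressed bytes for", ROWS * COLS * 2, "raw bytes")
```

Because neighboring ranges along a scan line are similar, such grids compress well, which is the intuition behind the PNG-based method.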

  16. POINT-CLOUD COMPRESSION FOR VEHICLE-BASED MOBILE MAPPING SYSTEMS USING PORTABLE NETWORK GRAPHICS

    Directory of Open Access Journals (Sweden)

    K. Kohira

    2017-09-01

    Full Text Available A mobile mapping system is effective for capturing dense point-clouds of roads and roadside objects. Point-clouds of urban areas, residential areas, and arterial roads are useful for maintenance of infrastructure, map creation, and automatic driving. However, the data size of point-clouds measured over large areas is enormous. A large storage capacity is required to store such point-clouds, and heavy loads are placed on the network when point-clouds are transferred through it. Therefore, it is desirable to reduce the data size of point-clouds without deterioration of quality. In this research, we propose a novel point-cloud compression method for vehicle-based mobile mapping systems. In our compression method, point-clouds are mapped onto 2D pixels using GPS time and the parameters of the laser scanner. Then, the images are encoded in the Portable Network Graphics (PNG) format and compressed using the PNG algorithm. In our experiments, our method could efficiently compress point-clouds without deteriorating their quality.

  17. The potential of cloud point system as a novel two-phase partitioning system for biotransformation.

    Science.gov (United States)

    Wang, Zhilong

    2007-05-01

    Although extractive biotransformation in two-phase partitioning systems (such as the water-organic solvent two-phase system, the aqueous two-phase system, the reverse micelle system, and room-temperature ionic liquids) has been studied extensively, this has not yet resulted in widespread industrial application. Based on a discussion of the main obstacles, the exploitation of a cloud point system, already applied in the separation field as cloud point extraction, as a novel two-phase partitioning system for biotransformation is reviewed through an analysis of some topical examples. At the end of the review, process control and downstream processing in the application of this novel two-phase partitioning system for biotransformation are also briefly discussed.

  18. CLOUD COMPUTING BASED INFORMATION SYSTEMS -PRESENT AND FUTURE

    Directory of Open Access Journals (Sweden)

    Maximilian ROBU

    2012-12-01

    Full Text Available The current economic crisis and the global recession have affected the IT market as well. A solution came from the Cloud Computing area by optimizing IT budgets and eliminating different types of expenses (servers, licenses, and so on). Cloud Computing is an exciting and interesting phenomenon because of its relative novelty and exploding growth. Because of its rise in popularity and usage, Cloud Computing has established its role as a research topic. However, the tendency is to focus on the technical aspects of Cloud Computing, thus leaving the potential that this technology offers unexplored. With the help of this technology new market players arise, and they manage to break the traditional value chain of service provision. The main focus of this paper is the business aspects of the Cloud. In particular we will talk about the economic aspects of using Cloud Computing (when, why, and how to use it) and the impacts on the infrastructure; the legal issues that come from using Cloud Computing; and the scalability and partially unclear legislation.

  19. Cloud Computing in the Curricula of Schools of Computer Science and Information Systems

    Science.gov (United States)

    Lawler, James P.

    2011-01-01

    The cloud continues to be a developing area of information systems. Evangelistic literature in the practitioner field indicates benefit for business firms but disruption for technology departments of the firms. Though the cloud currently is immature in methodology, this study defines a model program by which computer science and information…

  20. Secure system for personal finances on the cloud

    OpenAIRE

    Quintana i Vidal, Xavier

    2015-01-01

    This document contains the report of the accomplishment of the Master's Final Project, which aims to create a Cloud for the secure storage of electronic invoices.

  1. Electric field measuring and display system. [for cloud formations

    Science.gov (United States)

    Wojtasinski, R. J.; Lovall, D. D. (Inventor)

    1974-01-01

    An apparatus is described for monitoring the electric fields of cloud formations within a particular area. It utilizes capacitor plates that are alternately shielded from the clouds to generate an alternating signal corresponding to the intensity of the electric field of the clouds. A synchronizing signal is produced to control sampling of the alternating signal. The samples are fed through a filter, converted into digital form by an analogue-to-digital converter, and subsequently fed to a transmitter for transmission to the control station for recording.

  2. Social Empowerment of Intellectually Impaired through a Cloud Mobile System

    Directory of Open Access Journals (Sweden)

    Laura Freina

    2015-11-01

    Full Text Available There is no unique definition of “empowerment”; nevertheless, the idea that it involves, on the one hand, people having control over their own lives and, on the other, some social aspects seems to be a common characteristic. Most authors recognize three levels of empowerment: the individual, group, and community levels, which are interconnected, so that changes at one level influence the others. Enhancing individual competence and self-esteem has a direct effect on the control of one’s own life and, in turn, on the social components of empowerment. In this paper we present Smart Angel, a project that aims at creating a network involving families, caregivers, experts, and tutors, as well as the final users and their friends, based on a mobile cloud system supporting both everyday living and urban mobility for people with medium-mild intellectual disabilities, with particular attention to Down syndrome. The system can be seen as a tool that empowers its users to be more independent, increasing their opportunities to play an active role in their lives and to participate actively in the community.

  3. A cloud system for mobile medical services of traditional Chinese medicine.

    Science.gov (United States)

    Hu, Nian-Ze; Lee, Chia-Ying; Hou, Mark C; Chen, Ying-Ling

    2013-12-01

    Many medical centers in Taiwan have started to provide Traditional Chinese Medicine (TCM) services for hospitalized patients. Due to the complexity of TCM modalities and the increasing need to provide TCM services for patients in different wards at distantly separated locations within the hospital, it is becoming difficult to manage the situation in the traditional way. A computerized system with mobile capability can therefore provide a practical solution to this challenge. This study develops a cloud system equipped with mobile devices to integrate electronic medical records, facilitate communication between medical workers, and improve the quality of TCM services for hospitalized patients in a medical center. The system developed in the study includes mobile devices running the Android operating system and a PC serving as a cloud server. All the devices use the same TCM management system developed by the study. A database website is set up for information sharing. The cloud system allows users to access and update patients' medical information, which is of great help to medical workers in verifying patients' identification and giving proper treatments. The information can then be wirelessly transmitted between medical personnel through the cloud system. Several quantitative and qualitative evaluation indexes are developed to measure the effectiveness of the cloud system on the quality of the TCM service. The cloud system is tested and verified on a sample of hospitalized patients receiving acupuncture treatment at the Lukang Branch of Changhua Christian Hospital (CCH) in Taiwan. The result shows a great improvement in the operating efficiency of the TCM service, in that a significant saving in labor time can be attributed to the cloud system. In addition, the cloud system makes it easy to confirm a patient's identity by taking a picture of the patient upon any medical treatment. 
The result also shows that the cloud system

  4. Analysis of cloud-based solutions on EHRs systems in different scenarios.

    Science.gov (United States)

    Fernández-Cardeñosa, Gonzalo; de la Torre-Díez, Isabel; López-Coronado, Miguel; Rodrigues, Joel J P C

    2012-12-01

    Nowadays, with the growth of wireless connectivity, people can access resources hosted in the Cloud almost everywhere. In this context, organizations can take advantage of this fact, in terms of e-Health, by deploying Cloud-based solutions for e-Health services. In this paper, two Cloud-based solutions for different scenarios of Electronic Health Records (EHRs) management systems are proposed. We researched articles published in Medline between 2005 and 2011 about the implementation of e-Health services based on the Cloud. In order to analyze the best scenario for the deployment of Cloud Computing, two solutions, for a large hospital and for a network of primary care health centers, have been studied. An economic estimation of the implementation cost for both scenarios has been done with the Amazon calculator tool. As a result of this analysis, two solutions are suggested depending on the scenario: for a large hospital, a typical Cloud solution in which only the needed services are hired; for several primary care centers, the implementation of a network interconnecting these centers with a single Cloud environment. Finally, deploying a hybrid solution is considered, in which EHRs with images are hosted in the hospital or primary care centers and the rest are migrated to the Cloud.

  5. Upper tropospheric cloud systems determined from IR Sounders and their influence on the atmosphere

    Science.gov (United States)

    Stubenrauch, Claudia; Protopapadaki, Sofia; Feofilov, Artem; Velasco, Carola Barrientos

    2017-02-01

    Covering about 30% of the Earth, upper tropospheric clouds play a key role in the climate system by modulating the Earth's energy budget and heat transport. Infrared sounders reliably identify cirrus down to an IR optical depth of 0.1. Recently LMD has built global cloud climate data records from AIRS and IASI observations, covering the periods 2003-2015 and 2008-2015, respectively. Upper tropospheric clouds often form mesoscale systems. Their organization and properties are studied by (1) distinguishing cloud regimes within 2° × 2° regions and (2) applying a spatial composite technique on adjacent cloud pressures, which estimates the horizontal extent of the mesoscale cloud systems. The convective core, cirrus anvil, and thin cirrus of these systems are then distinguished by their emissivity. Compared to other studies of tropical mesoscale convective systems, our data also include the thinner anvil parts, which make up about 30% of the area of tropical mesoscale convective systems. Once the horizontal and vertical structure of these upper tropospheric cloud systems is known, we can estimate their radiative effects in terms of top-of-atmosphere and surface radiative fluxes and by computing their heating rates.
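The spatial-composite step, grouping adjacent cloudy cells into mesoscale systems and splitting each system by emissivity, resembles connected-component labeling on a gridded cloud mask. A sketch with illustrative emissivity thresholds (assumptions, not the values used in the study):

```python
# Sketch of the spatial-composite idea: group adjacent cloudy grid cells
# into systems (4-connected components), then classify cells by emissivity.
# Threshold values are illustrative, not those of the study.

def label_systems(mask):
    """Label 4-connected cloudy regions via iterative flood fill."""
    rows, cols = len(mask), len(mask[0])
    labels = [[0] * cols for _ in range(rows)]
    current = 0
    for i in range(rows):
        for j in range(cols):
            if mask[i][j] and not labels[i][j]:
                current += 1
                stack = [(i, j)]
                while stack:
                    r, c = stack.pop()
                    if 0 <= r < rows and 0 <= c < cols \
                            and mask[r][c] and not labels[r][c]:
                        labels[r][c] = current
                        stack += [(r - 1, c), (r + 1, c),
                                  (r, c - 1), (r, c + 1)]
    return labels, current

def classify(emissivity):
    """Illustrative classes: convective core, cirrus anvil, thin cirrus."""
    if emissivity > 0.98:
        return "core"
    if emissivity > 0.5:
        return "anvil"
    return "thin cirrus"

mask = [[1, 1, 0, 0],
        [0, 1, 0, 1],
        [0, 0, 0, 1]]
labels, n_systems = label_systems(mask)
print(n_systems)  # 2 separate cloud systems
print(classify(0.99), classify(0.7), classify(0.2))
```

The horizontal extent of each labeled system then follows from counting its cells, which is the quantity the composite technique estimates.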

  6. Information Security Management as a Bridge in Cloud Systems from Private to Public Organizations

    Directory of Open Access Journals (Sweden)

    Myeonggil Choi

    2015-08-01

    Full Text Available Cloud computing has made it possible for private companies to make rapid changes in their computing environments. However, in the public sector, security issues hinder institutions from adopting cloud computing. To solve these security challenges, in this paper, we propose a methodology for information security management, which quantitatively classifies the importance of information in cloud systems in the public sector. In this study, we adopt a Delphi approach to establish the classification criteria of the proposed methodology in an objective and systematic manner. Further, through a case study of a public corporation, we try to validate the usefulness of the proposed methodology. The results of this study will help public institutions to consider introducing cloud computing and to manage cloud systems effectively and securely.

  7. Atmospheric System Research Marine Low Clouds Workshop Report, January 27-29,2016

    Energy Technology Data Exchange (ETDEWEB)

    Jensen, M. [Brookhaven National Laboratory (BNL), Upton, NY (United States); Wang, J. [Brookhaven National Laboratory (BNL), Upton, NY (United States); Wood, R. [Brookhaven National Laboratory (BNL), Upton, NY (United States)

    2016-06-01

    Marine low clouds are a major determinant of the Earth's albedo and are a major source of uncertainty in how the climate responds to changing greenhouse gas levels and anthropogenic aerosol. Marine low clouds are particularly difficult to simulate accurately in climate models, and their remote locations present a significant observational challenge. A complex set of interacting controlling processes determine the coverage, condensate loading, and microphysical and radiative properties of marine low clouds. Marine low clouds are sensitive to atmospheric aerosol in several ways. Interactions at microphysical scales involve changes in the concentration of cloud droplets and precipitation, which induce cloud dynamical impacts including changes in entrainment and mesoscale organization. Marine low clouds are also impacted by atmospheric heating changes due to absorbing aerosols. The response of marine low clouds to aerosol perturbations depends strongly upon the unperturbed aerosol-cloud state, which necessitates greater understanding of processes controlling the budget of aerosol in the marine boundary layer. Entrainment and precipitation mediate the response of low clouds to aerosols but these processes also play leading roles in controlling the aerosol budget. The U.S. Department of Energy Atmospheric Radiation Measurement (ARM) Climate Research Facility and Atmospheric System Research (ASR) program are making major recent investments in observational data sets from fixed and mobile sites dominated by marine low clouds. This report provides specific action items for how these measurements can be used together with process modeling to make progress on understanding and quantifying the key cloud and aerosol controlling processes in the next 5-10 years. Measurements of aerosol composition and its variation with particle size are needed to advance a quantitative, process-level understanding of marine boundary-layer aerosol budget. Quantitative precipitation estimates

  8. ELASTIC CLOUD COMPUTING ARCHITECTURE AND SYSTEM FOR HETEROGENEOUS SPATIOTEMPORAL COMPUTING

    Directory of Open Access Journals (Sweden)

    X. Shi

    2017-10-01

    Full Text Available Spatiotemporal computation implements a variety of different algorithms. When big data are involved, a desktop computer or standalone application may not be able to complete the computation task due to limited memory and computing power. Now that a variety of hardware accelerators and computing platforms are available to improve the performance of geocomputation, different algorithms may behave differently on different computing infrastructures and platforms. Some are perfect for implementation on a cluster of graphics processing units (GPUs), while GPUs may not be useful for certain kinds of spatiotemporal computation. The same holds when utilizing a cluster of Intel's many-integrated-core (MIC) or Xeon Phi processors, as well as Hadoop or Spark platforms, to handle big spatiotemporal data. Furthermore, considering the energy efficiency requirement in general computation, a Field Programmable Gate Array (FPGA) may be a better solution for energy efficiency when its computational performance is similar to or better than that of GPUs and MICs. It is expected that an elastic cloud computing architecture and system that integrates GPUs, MICs, and FPGAs could be developed and deployed to support spatiotemporal computing over heterogeneous data types and computational problems.

  9. Elastic Cloud Computing Architecture and System for Heterogeneous Spatiotemporal Computing

    Science.gov (United States)

    Shi, X.

    2017-10-01

    Spatiotemporal computation implements a variety of different algorithms. When big data are involved, a desktop computer or standalone application may not be able to complete the computation task due to limited memory and computing power. Now that a variety of hardware accelerators and computing platforms are available to improve the performance of geocomputation, different algorithms may behave differently on different computing infrastructures and platforms. Some are perfect for implementation on a cluster of graphics processing units (GPUs), while GPUs may not be useful for certain kinds of spatiotemporal computation. The same holds when utilizing a cluster of Intel's many-integrated-core (MIC) or Xeon Phi processors, as well as Hadoop or Spark platforms, to handle big spatiotemporal data. Furthermore, considering the energy efficiency requirement in general computation, a Field Programmable Gate Array (FPGA) may be a better solution for energy efficiency when its computational performance is similar to or better than that of GPUs and MICs. It is expected that an elastic cloud computing architecture and system that integrates GPUs, MICs, and FPGAs could be developed and deployed to support spatiotemporal computing over heterogeneous data types and computational problems.

  10. A secure online image trading system for untrusted cloud environments.

    Science.gov (United States)

    Munadi, Khairul; Arnia, Fitri; Syaryadhi, Mohd; Fujiyoshi, Masaaki; Kiya, Hitoshi

    2015-01-01

    In conventional image trading systems, images are usually stored unprotected on a server, rendering them vulnerable to untrusted server providers and malicious intruders. This paper proposes a conceptual image trading framework that enables secure storage and retrieval over Internet services. The process involves three parties: an image publisher, a server provider, and an image buyer. The aim is to facilitate secure storage and retrieval of original images for commercial transactions, while preventing untrusted server providers and unauthorized users from gaining access to the true contents. The framework exploits the Discrete Cosine Transform (DCT) coefficients and the moment invariants of images. Original images are visually protected in the DCT domain and stored on a repository server. Small representations of the original images, called thumbnails, are generated and made publicly accessible for browsing. When a buyer is interested in a thumbnail, he/she sends a query to retrieve the visually protected image. The thumbnails and protected images are matched using the DC component of the DCT coefficients and the moment invariant feature. After the matching process, the server returns the corresponding protected image to the buyer. However, the image remains visually protected unless a key is granted. Our target application is the online market, where publishers sell their stock images over the Internet using public cloud servers.
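The matching step works because the DC component of the DCT survives visual protection. A pure-Python sketch: compute an 8x8 2D DCT, "protect" the block by scrambling the signs of the AC coefficients (an illustrative stand-in for the paper's protection scheme, not its actual method), and observe that the DC term, and hence the match key, is unchanged:

```python
import math
import random

# Sketch of the matching idea: visual protection alters the AC terms of a
# block's DCT, but the DC coefficient (proportional to the block mean) is
# preserved, so a thumbnail and a protected image can still be matched on
# it. The sign-scrambling "protection" here is illustrative only.

N = 8

def dct2(block):
    """Naive O(N^4) 2D DCT-II with orthonormal scaling."""
    out = [[0.0] * N for _ in range(N)]
    for u in range(N):
        for v in range(N):
            s = sum(block[x][y]
                    * math.cos((2 * x + 1) * u * math.pi / (2 * N))
                    * math.cos((2 * y + 1) * v * math.pi / (2 * N))
                    for x in range(N) for y in range(N))
            cu = math.sqrt(1 / N) if u == 0 else math.sqrt(2 / N)
            cv = math.sqrt(1 / N) if v == 0 else math.sqrt(2 / N)
            out[u][v] = cu * cv * s
    return out

random.seed(0)
block = [[random.randint(0, 255) for _ in range(N)] for _ in range(N)]
coeffs = dct2(block)

# "Protect" the block: randomly flip the sign of every AC coefficient.
protected = [[(c if (u, v) == (0, 0) else c * random.choice([-1, 1]))
              for v, c in enumerate(row)]
             for u, row in enumerate(coeffs)]

assert protected[0][0] == coeffs[0][0]  # DC survives, matching still works
print("DC coefficient:", round(coeffs[0][0], 2))
```

With orthonormal scaling the DC term equals the block sum divided by N, so it doubles as a compact intensity fingerprint for matching.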

  11. Network systems and cloud applications in livestock farming

    Directory of Open Access Journals (Sweden)

    Daniel Herd

    2014-10-01

    Full Text Available The use of automation technology and of sensors for animal monitoring is growing, and with it the volume of data from livestock farming. The challenges for data analysis and simple information presentation are increasing. Examples from science and practice show possible solutions. In doing so, equipment from different manufacturers must be coupled and data evaluated in a targeted manner. While scientifically oriented projects usually involve systems from different manufacturers, e.g. to strengthen communication and cooperation and to answer complex questions, manufacturer-specific projects tend to avoid this, since there the concrete user benefit is the priority. Selected examples show that mobile applications are implemented and used as early-warning systems for health changes in herds or for equipment control. Overall, it is clear that data evaluation and data use are shifting to the cloud. With these cloud systems, the spectrum of data evaluation expands to include complex algorithms and mobile services (apps, web-based advisory services, or social networks).

  12. Moving towards Virtual Learning Clouds from Traditional Learning: Higher Educational Systems in India

    Directory of Open Access Journals (Sweden)

    Vasanthi Muniasamy

    2014-10-01

    Full Text Available E-Learning has become an increasingly popular learning approach in higher education institutions due to the rapid growth of Communication and Information Technology (CIT). In recent years it has been integrated in many university programs and is one of the new learning trends. However, many Indian universities have not implemented this novel technology in their educational systems. E-Learning is not intended to replace the traditional classroom setting, but to provide new opportunities and a new virtual environment for interaction and communication between students and teachers. E-Learning through the Cloud is now becoming an interesting and very useful revolutionary technology in the field of education. An E-Learning system usually requires a huge amount of hardware and software resources. Due to the cost, many universities in India do not want to implement E-Learning technology in their educational systems and cannot afford such investments. Cloud virtual learning is the only solution to this problem. This paper presents the benefits of using cloud technology in E-Learning systems, including working modes, services, and models. We also discuss the cloud computing educational environment and how higher education may take advantage of clouds, not only in terms of cost but also in terms of security, flexibility, portability, efficiency, and reliability. We also present some educational clouds introduced by popular cloud providers.

  13. HammerCloud: A Stress Testing System for Distributed Analysis

    International Nuclear Information System (INIS)

    Ster, Daniel C van der; García, Mario Úbeda; Paladin, Massimo; Elmsheuser, Johannes

    2011-01-01

    Distributed analysis of LHC data is an I/O-intensive activity which places large demands on the internal network, storage, and local disks at remote computing facilities. Commissioning and maintaining a site to provide an efficient distributed analysis service is therefore a challenge which can be aided by tools that help evaluate a variety of infrastructure designs and configurations. HammerCloud is one such tool; it is a stress-testing service which is used by central operations teams, regional coordinators, and local site admins to (a) submit an arbitrary number of analysis jobs to a number of sites, (b) maintain, at a steady state, a predefined number of jobs running at the sites under test, (c) produce web-based reports summarizing the efficiency and performance of the sites under test, and (d) present a web interface for historical test results to both evaluate progress and compare sites. HammerCloud was built around the distributed analysis framework Ganga, exploiting its API for grid job management. HammerCloud has been employed by the ATLAS experiment for continuous testing of many sites worldwide, and also during large-scale computing challenges such as STEP'09 and UAT'09, where the scale of the tests exceeded 10,000 concurrently running and 1,000,000 total jobs over multi-day periods. In addition, HammerCloud is being adopted by the CMS experiment; the plugin structure of HammerCloud allows the execution of CMS jobs using their official tool (CRAB).
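HammerCloud's steady-state mode, keeping a predefined number of jobs running at a site and topping up as jobs finish, can be sketched with simulated jobs. The real service submits grid jobs through Ganga's API; the names, cycle structure, and completion probability here are illustrative assumptions:

```python
import random

# Simplified sketch of a steady-state stress test: keep `target` jobs
# running at a site, topping up as jobs finish each cycle. Jobs are
# simulated; the real tool manages grid jobs via the Ganga framework.

random.seed(42)

def run_steady_state(target=50, cycles=20, finish_prob=0.3):
    running = submitted = completed = 0
    for _ in range(cycles):
        # Some running jobs finish this cycle.
        done = sum(1 for _ in range(running) if random.random() < finish_prob)
        running -= done
        completed += done
        # Top back up to the steady-state target.
        topup = target - running
        running += topup
        submitted += topup
    return submitted, completed, running

submitted, completed, running = run_steady_state()
assert running == 50  # the steady-state target is restored every cycle
print(submitted, "submitted,", completed, "completed,", running, "running")
```

The invariant submitted = completed + running holds throughout, which is what makes the per-site efficiency reports in (c) straightforward to aggregate.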

  14. Applying Data Mining Techniques to Improve Information Security in the Cloud: A Single Cache System Approach

    Directory of Open Access Journals (Sweden)

    Amany AlShawi

    2016-01-01

    Full Text Available The popularity of cloud computing is increasing day by day. The purpose of this research was to enhance the security of the cloud using techniques such as data mining, with specific reference to the single cache system. The findings of the research show that security in the cloud can be enhanced with the single cache system. In future work, an Apriori algorithm can be applied to the single cache system. This can be applied by all cloud providers, vendors, data distributors, and others. Further, data objects entered into the single cache system can be extended into 12 components. Database and SPSS modelers can be used to implement the same.
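The Apriori algorithm suggested as future work mines frequent itemsets from transactions. A generic sketch run over hypothetical cache-access "transactions" (sets of data objects requested together); the object names and support threshold are illustrative, not from the paper:

```python
from itertools import combinations

# Generic Apriori frequent-itemset mining, sketched over hypothetical
# cache-access transactions. Illustrates the algorithm the paper proposes
# applying to the single cache system.

def apriori(transactions, min_support=2):
    items = sorted({i for t in transactions for i in t})
    frequent = {}
    k, candidates = 1, [frozenset([i]) for i in items]
    while candidates:
        counts = {c: sum(1 for t in transactions if c <= t)
                  for c in candidates}
        level = {c: n for c, n in counts.items() if n >= min_support}
        frequent.update(level)
        # Generate (k+1)-candidates whose every k-subset is frequent.
        keys = sorted(level, key=sorted)
        candidates = []
        for a, b in combinations(keys, 2):
            union = a | b
            if len(union) == k + 1 and all(
                    frozenset(s) in level for s in combinations(union, k)):
                candidates.append(union)
        candidates = list(set(candidates))
        k += 1
    return frequent

logs = [{"obj1", "obj2", "obj3"}, {"obj1", "obj2"},
        {"obj2", "obj3"}, {"obj1", "obj3"}]
freq = apriori(logs)
print(sorted(sorted(s) for s in freq))
```

Frequently co-requested object sets found this way could, for instance, inform prefetching or anomaly detection in a cache, though the paper leaves the application unspecified.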

  15. Information Risks Analysis in the Cloud Computing System on the basis of Intellectual Technologies

    Directory of Open Access Journals (Sweden)

    Alina Yurievna Sentsova

    2013-02-01

    Full Text Available In this article, the possibility of applying fuzzy cognitive maps to form the sample data sets of an artificial neural network used for estimating information security risks in cloud computing systems is considered.

  16. NUCAPS: NOAA Unique Combined Atmospheric Processing System Cloud-Cleared Radiances (CCR)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset consists of Cloud-Cleared Radiances (CCRs) from the NOAA Unique Combined Atmospheric Processing System (NUCAPS). NUCAPS was developed by the NOAA/NESDIS...

  17. Cloud-Based Social Media Visual Analytics Disaster Response System, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose a next-generation cloud-based social media visual analytics disaster response system that will enable decision-makers and first-responders to obtain...

  18. Is evaporative cooling important for shallow clouds?

    Science.gov (United States)

    Gentine, P.; Park, S. B.; Davini, P.; D'Andrea, F.

    2017-12-01

    We investigate and test, using large-eddy simulations, the hypothesis that evaporative cooling might not be crucial for shallow clouds. Results from various shallow convection and stratocumulus LES experiments show that the influence of evaporative cooling is secondary compared to turbulent mixing, which dominates the buoyancy reversal. In shallow cumulus, subsiding shells are not due to evaporative cooling but rather reflect a vortical structure, with a positive buoyancy anomaly in the core due to condensation. Disabling evaporative cooling has negligible impact on this vortical structure and on buoyancy reversal. Similarly, in non-precipitating stratocumuli evaporative cooling is negligible compared to other factors, especially turbulent mixing and pressure effects. These results emphasize that it may not be critical to include evaporative cooling in parameterizations of shallow clouds and that it does not alter entrainment.

  19. OpenID Connect as a security service in cloud-based medical imaging systems.

    Science.gov (United States)

    Ma, Weina; Sartipi, Kamran; Sharghigoorabi, Hassan; Koff, David; Bak, Peter

    2016-04-01

    The evolution of cloud computing is driving the next generation of medical imaging systems. However, privacy and security concerns have consistently been regarded as the major obstacles to the adoption of cloud computing by healthcare domains. OpenID Connect, combining OpenID and OAuth, is an emerging representational-state-transfer-based federated identity solution. It is one of the most adopted open standards and could potentially become the de facto standard for securing cloud computing and mobile applications; it has been called the "Kerberos of the cloud." We introduce OpenID Connect as an authentication and authorization service in cloud-based diagnostic imaging (DI) systems, and propose enhancements that allow this technology to be incorporated within distributed enterprise environments. The objective of this study is to offer solutions for the secure sharing of medical images among diagnostic imaging repositories (DI-r) and heterogeneous picture archiving and communication systems (PACS), as well as Web-based and mobile clients, in the cloud ecosystem. The main objective is to use the OpenID Connect open-source single sign-on and authorization service in a user-centric manner, while ensuring that deploying DI-r and PACS to private or community clouds provides security levels equivalent to the traditional computing model.
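OpenID Connect conveys the authenticated identity in a JWT "ID token". A minimal stdlib sketch of issuing and verifying an HS256-signed token; production deployments typically use RS256 with the provider's published keys and a vetted JWT library, so the secret, claims, and helper names here are illustrative assumptions only:

```python
import base64
import hashlib
import hmac
import json

# Minimal sketch: an OpenID Connect ID token is a JWT. We build and verify
# an HS256-signed token with only the stdlib. Real systems should use a
# vetted library and the provider's published (RS256) keys.

def b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def b64url_decode(s: str) -> bytes:
    return base64.urlsafe_b64decode(s + "=" * (-len(s) % 4))

def sign_token(claims: dict, secret: bytes) -> str:
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = b64url(json.dumps(claims).encode())
    signing_input = f"{header}.{payload}".encode()
    sig = b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return f"{header}.{payload}.{sig}"

def verify_token(token: str, secret: bytes) -> dict:
    header, payload, sig = token.split(".")
    expected = hmac.new(secret, f"{header}.{payload}".encode(),
                        hashlib.sha256).digest()
    if not hmac.compare_digest(b64url_decode(sig), expected):
        raise ValueError("bad signature")
    return json.loads(b64url_decode(payload))

secret = b"shared-secret"  # hypothetical client secret
claims = {"iss": "https://idp.example", "sub": "user42", "aud": "pacs-client"}
token = sign_token(claims, secret)
assert verify_token(token, secret)["sub"] == "user42"
print("token verified")
```

A relying party such as a PACS client would additionally check the `iss`, `aud`, and expiry claims before trusting the identity, per the OpenID Connect specification.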

  20. Cloud service performance evaluation: status, challenges, and opportunities – a survey from the system modeling perspective

    Directory of Open Access Journals (Sweden)

    Qiang Duan

    2017-05-01

    Full Text Available With the rapid advancement of Cloud computing and networking technologies, a wide spectrum of Cloud services has been developed by various providers and utilized by numerous organizations as indispensable ingredients of their information systems. Cloud service performance has a significant impact on the performance of the future information infrastructure. Thorough evaluation of Cloud service performance is crucial and beneficial to both service providers and consumers, thus forming an active research area. Some key technologies for Cloud computing, such as virtualization and the Service-Oriented Architecture (SOA), bring special challenges to service performance evaluation. A tremendous amount of effort has been put in by the research community to address these challenges, and exciting progress has been made. Among the work on Cloud performance analysis, evaluation approaches developed with a system modeling perspective play an important role. However, related works have been reported in different sections of the literature, thus lacking a big picture that shows the latest status of this area. The objective of this article is to present a survey that reflects the state of the art of Cloud service performance evaluation from the system modeling perspective. This article also examines open issues and challenges to the surveyed evaluation approaches and identifies possible opportunities for future research in this important field.
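Among system-modeling approaches to service performance, queueing models are a staple. As one illustrative example (not drawn from the survey), the Erlang-C formula for an M/M/c queue gives the probability that an arriving request must wait, treating a cloud service as c identical servers; all parameter values below are hypothetical:

```python
import math

# Queueing models are a staple of system-modeling performance evaluation.
# This sketch computes the Erlang-C waiting probability and mean queueing
# delay for a cloud service modeled as an M/M/c queue. Parameters are
# illustrative, not taken from the survey.

def erlang_c(arrival_rate, service_rate, servers):
    """P(wait > 0) for an M/M/c queue; requires utilization < 1."""
    a = arrival_rate / service_rate          # offered load (Erlangs)
    rho = a / servers                        # per-server utilization
    if rho >= 1:
        raise ValueError("unstable system: utilization >= 1")
    num = (a ** servers / math.factorial(servers)) / (1 - rho)
    den = sum(a ** k / math.factorial(k) for k in range(servers)) + num
    return num / den

def mean_wait(arrival_rate, service_rate, servers):
    """Mean time in queue (not counting service) for an M/M/c queue."""
    pw = erlang_c(arrival_rate, service_rate, servers)
    return pw / (servers * service_rate - arrival_rate)

# Example: 8 VM instances, 100 req/s arriving, each VM serving 15 req/s.
print("P(wait):", round(erlang_c(100, 15, 8), 3))
print("mean wait (s):", round(mean_wait(100, 15, 8), 4))
```

Such closed-form models let a provider size an elastic pool (pick the smallest c meeting a delay target) before resorting to simulation or measurement.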

  1. Cloud Computing and its Challenges and Benefits in the Bank System

    Directory of Open Access Journals (Sweden)

    Bogdan NEDELCU

    2015-07-01

    Full Text Available The purpose of this article is to highlight the current situation of Cloud Computing systems. There is a tendency for enterprises and banks to seek such databases, so the article tries to answer the question: "Is Cloud Computing safe?" Answering this question requires an analysis of the security system (its strengths and weaknesses), accompanied by arguments for and against this trend and suggestions for improvement that can increase customers' confidence in the future.

  2. The Cloud-Aerosol Transport System (CATS): A New Lidar for Aerosol and Cloud Profiling from the International Space Station

    Science.gov (United States)

    Welton, Ellsworth J.; McGill, Matthew J.; Yorks, John E.; Hlavka, Dennis L.; Hart, William D.; Palm, Stephen P.; Colarco, Peter R.

    2012-01-01

    Spaceborne lidar profiling of aerosol and cloud layers has been successfully implemented during a number of prior missions, including LITE, ICESat, and CALIPSO. Each successive mission has added increased capability and further expanded the role of these unique measurements in a wide variety of applications ranging from climate, to air quality, to special event monitoring (i.e., volcanic plumes). Many researchers have come to rely on the availability of profile data from CALIPSO, especially data coincident with measurements from other A-Train sensors. The CALIOP lidar on CALIPSO continues to operate well as it enters its fifth year of operations. However, active instruments have more limited lifetimes than their passive counterparts, and we are faced with a potential gap in lidar profiling from space if the CALIOP lidar fails before a new mission is operational. The ATLID lidar on EarthCARE is not expected to launch until 2015 or later, and the lidar component of NASA's proposed Aerosols, Clouds, and Ecosystems (ACE) mission would not launch until after 2020. Here we present a new aerosol and cloud lidar that was recently selected to provide profiling data from the International Space Station (ISS) starting in 2013. The Cloud-Aerosol Transport System (CATS) is a three-wavelength (1064, 532, 355 nm) elastic backscatter lidar with HSRL capability at 532 nm. Depolarization measurements will be made at all wavelengths. The primary objective of CATS is to continue the CALIPSO aerosol and cloud profile data record, ideally with overlap between both missions and EarthCARE. In addition, the near-real-time (NRT) data capability of the ISS will enable CATS to support operational applications such as aerosol and air quality forecasting and special event monitoring. The HSRL channel will provide a demonstration of technology and a data testbed for direct extinction retrievals in support of ACE mission development. An overview of the instrument and mission will be provided.

  3. NAFFS: network attached flash file system for cloud storage on portable consumer electronics

    Science.gov (United States)

    Han, Lin; Huang, Hao; Xie, Changsheng

    Cloud storage technology has become a research hotspot in recent years, but existing cloud storage services are mainly designed for data storage needs over stable, high-speed Internet connections. Mobile Internet connections are often unstable and relatively slow; these native features of the mobile Internet limit the use of cloud storage on portable consumer electronics. The Network Attached Flash File System (NAFFS) presents the idea of taking a portable device's built-in NAND flash memory as the front-end cache of a virtualized cloud storage device. Modern portable devices with Internet connections have more than 1 GB of built-in NAND flash, which is quite enough for daily data storage. The data transfer rate of a NAND flash device is much higher than that of mobile Internet connections [1], and its non-volatile nature makes it very suitable as the cache device for Internet cloud storage on portable devices, which often have unstable power supplies and intermittent Internet connections. In the present work, NAFFS is evaluated with several benchmarks, and its performance is compared with traditional network-attached file systems, such as NFS. Our evaluation results indicate that NAFFS achieves an average access speed of 3.38 MB/s, which is about 3 times faster than directly accessing cloud storage over a mobile Internet connection, and offers a more stable interface than directly using a cloud storage API. Unstable Internet connections and sudden power-off conditions are tolerated, and no data in the cache is lost in such situations.
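The front-end-cache idea described above amounts to a write-back cache: writes always land in local flash first and are pushed to the cloud only when the link is up. A minimal sketch, with dict-like stand-ins for the flash and cloud stores (class and method names are illustrative, not NAFFS's actual interface):

```python
class FlashCachedCloudStore:
    """Write-back cache sketch: local NAND flash fronts a cloud store."""

    def __init__(self, cloud_store):
        self.cloud = cloud_store   # dict-like remote store
        self.flash = {}            # local non-volatile cache
        self.dirty = set()         # keys written locally, not yet uploaded

    def write(self, key, data):
        # Writes land in flash first, so they survive loss of connectivity.
        self.flash[key] = data
        self.dirty.add(key)

    def read(self, key):
        # Serve from flash when possible; on a miss, fetch and cache.
        if key in self.flash:
            return self.flash[key]
        data = self.cloud[key]
        self.flash[key] = data
        return data

    def sync(self):
        # When the link is up, push dirty entries back to the cloud.
        for key in list(self.dirty):
            self.cloud[key] = self.flash[key]
            self.dirty.discard(key)
```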

  4. Feasibility and demonstration of a cloud-based RIID analysis system

    Science.gov (United States)

    Wright, Michael C.; Hertz, Kristin L.; Johnson, William C.; Sword, Eric D.; Younkin, James R.; Sadler, Lorraine E.

    2015-06-01

    A significant limitation in the operational utility of handheld and backpack radioisotope identifiers (RIIDs) is the inability of their onboard algorithms to accurately and reliably identify the isotopic sources of the measured gamma-ray energy spectrum. A possible solution is to move the spectral analysis computations to an external device, the cloud, where significantly greater capabilities are available. The implementation and demonstration of a prototype cloud-based RIID analysis system have shown this type of system to be feasible with currently available communication and computational technology. A system study has shown that the potential user community could derive significant benefits from an appropriately implemented cloud-based analysis system and has identified the design and operational characteristics required by the users and stakeholders for such a system. A general description of the hardware and software necessary to implement reliable cloud-based analysis, the value of the cloud expressed by the user community, and the aspects of the cloud implemented in the demonstrations are discussed.

  5. Simulation of a Feedback System for the Attenuation of e-Cloud Driven Instability

    International Nuclear Information System (INIS)

    Vay, J.-L.; Furman, M.A.; Fox, J.; Rivetta, C.; de Maria, R.; Rumolo, G.

    2009-01-01

    Electron clouds impose limitations on current accelerators that may be more severe for future machines, unless adequate measures of mitigation are taken. Recently, it has been proposed to use feedback systems operating at high frequency (in the GHz range) to damp single-bunch transverse coherent oscillations that may otherwise be amplified during the interaction of the beam with ambient electron clouds. We have used the simulation package WARP-POSINST and the code Headtail to study the growth rate and frequency patterns in space-time of the electron cloud driven beam breakup instability in the CERN SPS accelerator with, or without, an idealized feedback model for damping the instability.

  6. Evaluation of the Huawei UDS cloud storage system for CERN specific data

    International Nuclear Information System (INIS)

    Resines, M Zotes; Hughes, J; Wang, L; Heikkila, S S; Duellmann, D; Adde, G; Toebbicke, R

    2014-01-01

    Cloud storage is an emerging architecture aiming to provide increased scalability and access performance, compared to more traditional solutions. CERN is evaluating this promise using Huawei UDS and OpenStack SWIFT storage deployments, focusing on the needs of high-energy physics. Both deployed setups implement S3, one of the protocols that are emerging as a standard in the cloud storage market. A set of client machines is used to generate I/O load patterns to evaluate the storage system performance. The presented read and write test results indicate scalability in both the metadata and data perspectives. Further, the Huawei UDS cloud storage is shown to be able to recover from a major failure of losing 16 disks. Both cloud storage systems are finally demonstrated to function as back-end storage for a filesystem, which is used to deliver high energy physics software.
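The load-generation setup described above can be sketched as a simple sequential PUT/GET benchmark. The sketch assumes only a generic client exposing `put(key, data)` and `get(key)` rather than any specific S3 library, and reports aggregate throughput:

```python
import time

def run_load(client, n_objects, payload):
    """Issue sequential PUTs then GETs against an S3-like client and
    return aggregate throughput in MB/s for each phase."""
    t0 = time.perf_counter()
    for i in range(n_objects):
        client.put(f"bench/obj-{i}", payload)
    put_s = time.perf_counter() - t0

    t0 = time.perf_counter()
    for i in range(n_objects):
        assert client.get(f"bench/obj-{i}") == payload  # verify round-trip
    get_s = time.perf_counter() - t0

    mb = n_objects * len(payload) / 1e6
    return {"put_MBps": mb / put_s, "get_MBps": mb / get_s}
```

In a real evaluation many such loops would run in parallel from several client machines, with varying object sizes, to expose the metadata- and data-scalability behaviour the abstract refers to.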

  7. Evaluation of the Huawei UDS cloud storage system for CERN specific data

    Science.gov (United States)

    Zotes Resines, M.; Heikkila, S. S.; Duellmann, D.; Adde, G.; Toebbicke, R.; Hughes, J.; Wang, L.

    2014-06-01

    Cloud storage is an emerging architecture aiming to provide increased scalability and access performance, compared to more traditional solutions. CERN is evaluating this promise using Huawei UDS and OpenStack SWIFT storage deployments, focusing on the needs of high-energy physics. Both deployed setups implement S3, one of the protocols that are emerging as a standard in the cloud storage market. A set of client machines is used to generate I/O load patterns to evaluate the storage system performance. The presented read and write test results indicate scalability in both the metadata and data perspectives. Further, the Huawei UDS cloud storage is shown to be able to recover from a major failure of losing 16 disks. Both cloud storage systems are finally demonstrated to function as back-end storage for a filesystem, which is used to deliver high energy physics software.

  8. A batch system for HEP applications on a distributed IaaS cloud

    International Nuclear Information System (INIS)

    Gable, I; Agarwal, A; Anderson, M; Armstrong, P; Fransham, K; Harris, C; Leavett-Brown, D; Paterson, M; Penfold-Brown, D; Sobie, R J; Vliet, M; Charbonneau, A; Impey, R; Podaima, W

    2011-01-01

    The emergence of academic and commercial Infrastructure-as-a-Service (IaaS) clouds is opening access to new resources for the HEP community. In this paper we describe a system we have developed for creating a single dynamic batch environment spanning multiple IaaS clouds of different types (e.g. Nimbus, OpenNebula, Amazon EC2). A HEP user interacting with the system submits a job description file with a pointer to their VM image. VM images can either be created by users directly or provided to the users. We have created a new software component called Cloud Scheduler that detects waiting jobs and boots the required user VM on any one of the available cloud resources. As the user VMs appear, they are attached to the job queues of a central Condor job scheduler, which then submits the jobs to the VMs. The number of VMs available to the user is expanded and contracted dynamically depending on the number of user jobs. We present the motivation and design of the system with particular emphasis on Cloud Scheduler. We show that the system provides the ability to exploit academic and commercial cloud sites in a transparent fashion.
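One pass of a Cloud-Scheduler-style loop can be sketched as follows: match the VM images demanded by waiting jobs against running VMs, boot what is missing while capacity remains, and retire VMs no job needs. This is a simplified illustration of the expand/contract behaviour described above, not the actual Cloud Scheduler code:

```python
def schedule_step(waiting_jobs, running_vms, free_slots):
    """One scheduling pass (illustrative sketch).

    waiting_jobs: VM image names requested by queued jobs.
    running_vms:  image names of VMs already booted.
    free_slots:   how many more VMs the clouds can host right now.
    Returns (images to boot, images whose VMs to shut down).
    """
    boot, shutdown = [], []
    demand, supply = {}, {}
    for image in waiting_jobs:
        demand[image] = demand.get(image, 0) + 1
    for image in running_vms:
        supply[image] = supply.get(image, 0) + 1
    # Expand: boot VMs for images with more waiting jobs than running VMs.
    for image, want in demand.items():
        missing = want - supply.get(image, 0)
        while missing > 0 and free_slots > 0:
            boot.append(image)
            free_slots -= 1
            missing -= 1
    # Contract: retire VMs whose image no waiting job requires.
    for image, have in supply.items():
        if image not in demand:
            shutdown.extend([image] * have)
    return boot, shutdown
```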

  9. Applying analytic hierarchy process to assess healthcare-oriented cloud computing service systems.

    Science.gov (United States)

    Liao, Wen-Hwa; Qiu, Wan-Li

    2016-01-01

    Numerous differences exist between the healthcare industry and other industries. Difficulties in the business operation of the healthcare industry have continually increased because of the volatility and importance of health care, changes to and requirements of health insurance policies, and the statuses of healthcare providers, which are typically considered not-for-profit organizations. Moreover, because of the financial risks associated with constant changes in healthcare payment methods and constantly evolving information technology, healthcare organizations must continually adjust their business operation objectives; therefore, cloud computing presents both a challenge and an opportunity. As a response to aging populations and the prevalence of the Internet in fast-paced contemporary societies, cloud computing can be used to facilitate the task of balancing the quality and costs of health care. To evaluate cloud computing service systems for use in health care, providing decision makers with a comprehensive assessment method for prioritizing decision-making factors is highly beneficial. Hence, this study applied the analytic hierarchy process, compared items related to cloud computing and health care, executed a questionnaire survey, and then classified the critical factors influencing healthcare cloud computing service systems on the basis of statistical analyses of the questionnaire results. The results indicate that the primary factor affecting the design or implementation of optimal cloud computing healthcare service systems is cost effectiveness, with the secondary factors being practical considerations such as software design and system architecture.
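Computationally, AHP reduces to extracting priority weights from a pairwise-comparison matrix, conventionally via its principal eigenvector. A minimal plain-Python sketch using power iteration (the criteria values are illustrative, not this study's survey data):

```python
def ahp_weights(M, iters=100):
    """Priority weights from an AHP pairwise-comparison matrix M
    (M[i][j] = how much more important criterion i is than j),
    computed as the normalised principal eigenvector via power iteration."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    return w
```

For example, if "cost effectiveness" is judged 3 times as important as "system architecture", the matrix `[[1, 3], [1/3, 1]]` yields weights 0.75 and 0.25.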

  10. Towards Constraint-based High Performance Cloud System in the Process of Cloud Computing Adoption in an Organization

    OpenAIRE

    Simalango, Mikael Fernandus; Kang, Mun-Young; Oh, Sangyoon

    2010-01-01

    Cloud computing is penetrating into various domains and environments, from theoretical computer science to economy, from marketing hype to educational curriculum and from R&D lab to enterprise IT infrastructure. Yet, the currently developing state of cloud computing leaves several issues to address and also affects cloud computing adoption by organizations. In this paper, we explain how the transition into the cloud can occur in an organization and describe the mechanism for transforming lega...

  11. A City Parking Integration System Combined with Cloud Computing Technologies and Smart Mobile Devices

    Science.gov (United States)

    Yeh, Her-Tyan; Chen, Bing-Chang; Wang, Bo-Xun

    2016-01-01

    The current study applied cloud computing technology and smart mobile devices combined with a streaming server for parking lots to plan a city parking integration system. It is also equipped with a parking search system, parking navigation system, parking reservation service, and car retrieval service. With this system, users can quickly find…

  12. EMERGING SCOPE OF MEDICAL LABORATORIES SYSTEMS USING CLOUD COMPUTING FROM END-USER PERSPECTIVE

    OpenAIRE

    RAFİ, Zeeshan; DAĞ, Hasan; AYDIN, Mehmet N.

    2016-01-01

    In today’s world, rapid and reliable information extraction has become everybody’s need, and cloud computing is one of the emerging technology solutions that answers it. This technology provides users many opportunities to produce rapid and cost-effective solutions. This study helps in understanding the scope of cloud computing as a solution in the field of medical laboratory systems. A study has been conducted to determine the need of the services require...

  13. Cloud-Based Collaborative Decision Making: Design Considerations and Architecture of the GRUPO-MOD System

    OpenAIRE

    Heiko Thimm

    2012-01-01

    The complexity of many decision problems of today’s globalized world requires new innovative solutions that are built upon proven decision support technology and also recent advancements in the area of information and communication technology (ICT) such as Cloud Computing and Mobile Communication. A combination of the cost-effective Cloud Computing approach with extended group decision support system technology bears several interesting unprecedented opportunities for the development of suc...

  14. A cloud-based X73 ubiquitous mobile healthcare system: design and implementation.

    Science.gov (United States)

    Ji, Zhanlin; Ganchev, Ivan; O'Droma, Máirtín; Zhang, Xin; Zhang, Xueji

    2014-01-01

    Based on the user-centric paradigm for next generation networks, this paper describes a ubiquitous mobile healthcare (uHealth) system based on the ISO/IEEE 11073 personal health data (PHD) standards (X73) and cloud computing techniques. A number of design issues associated with the system implementation are outlined. The system includes middleware on the user side, which provides a plug-and-play environment for heterogeneous wireless sensors and mobile terminals utilizing different communication protocols, and a distributed "big data" processing subsystem in the cloud. The design and implementation of this system are envisaged as an efficient solution for the next generation of uHealth systems.

  15. The hipster approach for improving cloud system efficiency

    OpenAIRE

    Nishtala, Rajiv; Carpenter, Paul Matthew; Petrucci, Vinicius; Martorell Bofill, Xavier

    2017-01-01

    In 2013, U.S. data centers accounted for 2.2% of the country’s total electricity consumption, a figure that is projected to increase rapidly over the next decade. Many important data center workloads in cloud computing are interactive, and they demand strict levels of quality-of-service (QoS) to meet user expectations, making it challenging to optimize power consumption along with increasing performance demands. This article introduces Hipster, a technique that combines heuristics and rein...

  16. Improving aerosol interaction with clouds and precipitation in a regional chemical weather modeling system

    Science.gov (United States)

    Zhou, C.; Zhang, X.; Gong, S.; Wang, Y.; Xue, M.

    2016-01-01

    A comprehensive aerosol-cloud-precipitation interaction (ACI) scheme has been developed under a China Meteorological Administration (CMA) chemical weather modeling system, GRAPES/CUACE (Global/Regional Assimilation and PrEdiction System, CMA Unified Atmospheric Chemistry Environment). Calculated by a sectional aerosol activation scheme based on the size and mass information from CUACE and the thermal-dynamic and humid states from the weather model GRAPES at each time step, the cloud condensation nuclei (CCN) are interactively fed online into a two-moment cloud scheme (WRF Double-Moment 6-class scheme - WDM6) and a convective parameterization to drive cloud physics and precipitation formation processes. The modeling system has been applied to study the ACI for January 2013, when several persistent haze-fog events and eight precipitation events occurred. The results show that aerosols interacting with the WDM6 scheme in GRAPES/CUACE obviously increase the total cloud water, liquid water content, and cloud droplet number concentrations, while decreasing the mean diameter of cloud droplets, with the magnitude of the changes varying by case and region. These interactive microphysical properties of clouds improve the calculation of their collection growth rates in some regions and hence the precipitation rate and distribution in the model, showing 24 to 48 % enhancements of the threat score for 6 h precipitation in almost all regions. The aerosols interacting with the WDM6 scheme also reduce the regional mean temperature bias by 3 °C during certain precipitation events, but the monthly mean bias is reduced by only about 0.3 °C.
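The core of a sectional activation scheme is deciding, bin by bin, which aerosol particles become CCN at the ambient supersaturation. As a hedged illustration (this is a generic κ-Köhler activation criterion, not the CUACE scheme itself; the κ and Kelvin-parameter values are typical textbook magnitudes):

```python
import math

def activated_ccn(bins, s_ambient, kappa=0.3, A=2.1e-9):
    """Count activated CCN across size bins (κ-Köhler sketch).

    bins:      dry diameter [m] -> number concentration for that bin.
    s_ambient: ambient supersaturation (fraction, e.g. 0.005 = 0.5 %).
    kappa:     hygroscopicity parameter (illustrative value).
    A:         Kelvin parameter [m] (illustrative value near 0 °C).
    A bin activates when its critical supersaturation,
    s_c = sqrt(4 A^3 / (27 kappa d^3)), falls below s_ambient.
    """
    n_ccn = 0.0
    for d_dry, number in bins.items():
        s_crit = math.sqrt(4.0 * A**3 / (27.0 * kappa * d_dry**3))
        if s_crit <= s_ambient:
            n_ccn += number
    return n_ccn
```

Larger or more hygroscopic particles activate at lower supersaturation, which is why the size- and composition-resolved CUACE information matters for the CCN fed into WDM6.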

  17. Aerosol effects on cloud water amounts were successfully simulated by a global cloud-system resolving model.

    Science.gov (United States)

    Sato, Yousuke; Goto, Daisuke; Michibata, Takuro; Suzuki, Kentaroh; Takemura, Toshihiko; Tomita, Hirofumi; Nakajima, Teruyuki

    2018-03-07

    Aerosols affect climate by modifying cloud properties through their role as cloud condensation nuclei or ice nuclei, called aerosol-cloud interactions. In most global climate models (GCMs), the aerosol-cloud interactions are represented by empirical parameterisations, in which the mass of cloud liquid water (LWP) is assumed to increase monotonically with increasing aerosol loading. Recent satellite observations, however, have yielded contradictory results: LWP can decrease with increasing aerosol loading. This difference implies that GCMs overestimate the aerosol effect, but the reasons for the difference are not obvious. Here, we reproduce satellite-observed LWP responses using a global simulation with explicit representations of cloud microphysics, instead of the parameterisations. Our analyses reveal that the decrease in LWP originates from the response of evaporation and condensation processes to aerosol perturbations, which are not represented in GCMs. The explicit representation of cloud microphysics in global scale modelling reduces the uncertainty of climate prediction.

  18. Cloud Computing

    CERN Document Server

    Antonopoulos, Nick

    2010-01-01

    Cloud computing has recently emerged as a subject of substantial industrial and academic interest, though its meaning and scope are hotly debated. For some researchers, clouds are a natural evolution towards the full commercialisation of grid systems, while others dismiss the term as a mere re-branding of existing pay-per-use technologies. From either perspective, 'cloud' is now the label of choice for accountable pay-per-use access to third-party applications and computational resources on a massive scale. Clouds support patterns of less predictable resource use for applications and services.

  19. Best practices for implementing, testing and using a cloud-based communication system in a disaster situation.

    Science.gov (United States)

    Makowski, Dale

    2016-01-01

    This paper sets out the basics for approaching the selection and implementation of a cloud-based communication system to support a business continuity programme, including:

    • consideration of how a cloud-based communication system can enhance a business continuity programme;
    • descriptions of some of the more popular features of a cloud-based communication system;
    • options to evaluate when selecting a cloud-based communication system;
    • considerations for how to design a system to be most effective for an organisation;
    • best practices for how to conduct the initial load of data to a cloud-based communication system;
    • best practices for how to conduct an initial validation of the data loaded to a cloud-based communication system;
    • considerations for how to keep contact information in the cloud-based communication system current and accurate;
    • best practices for conducting ongoing system testing;
    • considerations for how to conduct user training;
    • review of other potential uses of a cloud-based communication system; and
    • review of other tools and features many cloud-based communication systems may offer.

  20. The implications of dust ice nuclei effect on cloud top temperature in a complex mesoscale convective system.

    Science.gov (United States)

    Li, Rui; Dong, Xue; Guo, Jingchao; Fu, Yunfei; Zhao, Chun; Wang, Yu; Min, Qilong

    2017-10-23

    Mineral dust is the most important natural source of atmospheric ice nuclei (IN), which may significantly mediate the properties of ice clouds through heterogeneous nucleation and lead to crucial impacts on the hydrological and energy cycles. The potential dust IN effect on cloud top temperature (CTT) in a well-developed mesoscale convective system (MCS) was studied using both satellite observations and cloud resolving model (CRM) simulations. We combined satellite observations from a passive spectrometer, active cloud radar, and lidar with wind field simulations from the CRM to identify where ice cloud mixed with dust particles. For a given ice water path, the CTT of dust-mixed cloud is warmer than that of relatively pristine cloud. The probability distribution function (PDF) of CTT for dust-mixed clouds is shifted toward the warmer end and shows two peaks, at about -45 °C and -25 °C. The PDF for relatively pristine clouds shows only one peak, at -55 °C. Cloud simulations with different microphysical schemes agreed well with each other and showed better agreement with satellite observations in pristine clouds, but they showed large discrepancies in dust-mixed clouds. Some microphysical schemes failed to predict the warm peak of CTT related to heterogeneous ice formation.

  1. Benefits and challenges of cloud ERP systems – A systematic literature review

    Directory of Open Access Journals (Sweden)

    Mohamed A. Abd Elmonem

    2016-12-01

    Full Text Available Enterprise Resource Planning (ERP) systems provide extensive benefits and facilities to the whole enterprise. ERP systems help the enterprise to share and transfer data and information across all functional units inside and outside the enterprise. Sharing data and information between enterprise departments helps in many aspects and serves different objectives. Cloud computing is a computing model which takes place over the Internet and provides scalability, reliability, availability, and low-cost computing resources. Implementing and running ERP systems over the cloud offers great advantages and benefits, in spite of its many difficulties and challenges. In this paper, we follow the Systematic Literature Review (SLR) research method to explore the benefits and challenges of implementing ERP systems over a cloud environment.

  2. Criteria for the evaluation of a cloud-based hospital information system outsourcing provider.

    Science.gov (United States)

    Low, Chinyao; Hsueh Chen, Ya

    2012-12-01

    As cloud computing technology has proliferated rapidly worldwide, there has been a trend toward adopting cloud-based hospital information systems (CHISs). This study examines the critical criteria for selecting the CHISs outsourcing provider. The fuzzy Delphi method (FDM) is used to evaluate the primary indicator collected from 188 useable responses at a working hospital in Taiwan. Moreover, the fuzzy analytic hierarchy process (FAHP) is employed to calculate the weights of these criteria and establish a fuzzy multi-criteria model of CHISs outsourcing provider selection from 42 experts. The results indicate that the five most critical criteria related to CHISs outsourcing provider selection are (1) system function, (2) service quality, (3) integration, (4) professionalism, and (5) economics. This study may contribute to understanding how cloud-based hospital systems can reinforce content design and offer a way to compete in the field by developing more appropriate systems.
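The fuzzy Delphi screening step mentioned above can be sketched as follows: aggregate each criterion's expert ratings into a triangular fuzzy number (minimum, geometric mean, maximum), defuzzify by the centroid, and keep criteria scoring above a threshold. This follows one common FDM variant, not necessarily the exact procedure of this study; the criteria and scores are illustrative:

```python
def fuzzy_delphi_screen(ratings, threshold):
    """Screen criteria with a common fuzzy Delphi method variant.

    ratings:   criterion name -> list of expert scores (e.g. 1-10 scale).
    threshold: minimum defuzzified value for a criterion to be retained.
    Each criterion's scores become the triangular fuzzy number
    (min, geometric mean, max), defuzzified by the centroid (l+m+u)/3.
    """
    kept = {}
    for criterion, scores in ratings.items():
        low = min(scores)
        geo = 1.0
        for s in scores:
            geo *= s
        mid = geo ** (1.0 / len(scores))
        high = max(scores)
        value = (low + mid + high) / 3.0
        if value >= threshold:
            kept[criterion] = round(value, 3)
    return kept
```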

  3. Cloud-based systems for monitoring earthquakes and other environmental quantities

    Science.gov (United States)

    Clayton, R. W.; Olson, M.; Liu, A.; Chandy, M.; Bunn, J.; Guy, R.

    2013-12-01

    There are many advantages to using a cloud-based system to record and analyze environmental quantities such as earthquakes, radiation, various gases, dust and meteorological parameters. These advantages include robustness and dynamic scalability, and also reduced costs. In this paper, we present our experiences over the last three years in developing a cloud-based earthquake monitoring system (the Community Seismic Network). This network consists of over 600 sensors (accelerometers) in the S. California region that send data directly to the Google App Engine where they are analyzed. The system is capable of handing many other types of sensor data and generating a situation-awareness analysis as a product. Other advantages to the cloud-based system are integration with other peer networks, and being able to deploy anywhere in the world without have to build addition computing infrastructure.

  4. Design of Technical Support System for Retail Company Based on Cloud

    Directory of Open Access Journals (Sweden)

    Shao Ping

    2017-01-01

    Full Text Available With the retail side of the market in China, the sale of electricity companies as a new source of power retail, they participate in the electricity market business. National and local governments subsequently introduced the corresponding policies and rules, the technical support system becomes one of the necessary conditions for the access of the retail company. Retail electricity companies have started the system construction, but has not yet formed a standardized, complete architecture. This paper analyzes the business and data interaction requirements of retail electricity companies, and then designs the functional architecture based on basic application, advanced application and value-added application, and the technical architecture based on “cloud”. On this basis, the paper discusses the selection of private cloud, public cloud and mixed cloud model, and the rationalization suggestion of system construction. Which can provide reference for the construction of the technical support system of the domestic retail enterprises.

  5. Cloud-assisted mutual authentication and privacy preservation protocol for telecare medical information systems.

    Science.gov (United States)

    Li, Chun-Ta; Shih, Dong-Her; Wang, Chun-Cheng

    2018-04-01

    With the rapid development of wireless communication technologies and the growing prevalence of smart devices, the telecare medical information system (TMIS) allows patients to receive medical treatment from doctors via Internet technology without visiting hospitals in person. By adopting mobile devices, a cloud-assisted platform, and a wireless body area network, patients can collect their physiological readings and upload them to the medical cloud via their mobile devices, enabling caregivers or doctors to provide appropriate treatment at any time and anywhere. In order to protect the medical privacy of the patient and guarantee the reliability of the system, all system participants must be authenticated before accessing the TMIS. Mohit et al. recently suggested a lightweight authentication protocol for a cloud-based health care system. They claimed their protocol resists all well-known security attacks and has several important features such as mutual authentication and patient anonymity. In this paper, we demonstrate that Mohit et al.'s authentication protocol has various security flaws, and we further introduce an enhanced version of their protocol for cloud-assisted TMIS, which ensures patient anonymity and patient unlinkability and prevents the security threats of report revelation and report forgery attacks. The security analysis proves that our enhanced protocol is secure against the various known attacks found in Mohit et al.'s protocol. Compared with existing related protocols, our enhanced protocol keeps the merits of all desirable security requirements and also maintains efficiency in terms of computation costs for cloud-assisted TMIS. We propose a more secure mutual authentication and privacy preservation protocol for cloud-assisted TMIS, which fixes the security weaknesses found in Mohit et al.'s protocol. According to our analysis, our authentication protocol satisfies most functionality features.
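To make the mutual-authentication idea concrete, the sketch below shows a generic nonce-based challenge-response over a pre-shared key: each side proves knowledge of the key by MACing the other's fresh nonce. This is an illustrative textbook pattern, not the protocol of Mohit et al. or the enhanced protocol of this paper, and all function names are hypothetical:

```python
import hashlib
import hmac
import os

def _tag(key, *parts):
    """HMAC-SHA256 over the concatenated, delimited protocol fields."""
    return hmac.new(key, b"|".join(parts), hashlib.sha256).digest()

def patient_start():
    return os.urandom(16)                        # patient nonce Np

def cloud_respond(key, Np):
    Nc = os.urandom(16)                          # cloud nonce Nc
    return Nc, _tag(key, b"cloud", Np, Nc)       # proves the cloud knows key

def patient_finish(key, Np, Nc, tag_cloud):
    if not hmac.compare_digest(tag_cloud, _tag(key, b"cloud", Np, Nc)):
        raise ValueError("cloud authentication failed")
    return _tag(key, b"patient", Nc, Np)         # proves the patient knows key

def cloud_verify(key, Np, Nc, tag_patient):
    return hmac.compare_digest(tag_patient, _tag(key, b"patient", Nc, Np))
```

Note the role labels inside the MAC inputs, which prevent a reflection attack where one side's response is replayed as the other's; real TMIS protocols add identity protection and key agreement on top of this skeleton.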

  6. myBlackBox: Blackbox Mobile Cloud Systems for Personalized Unusual Event Detection

    Directory of Open Access Journals (Sweden)

    Junho Ahn

    2016-05-01

    Full Text Available We demonstrate the feasibility of constructing a novel and practical real-world mobile cloud system, called myBlackBox, that efficiently fuses multimodal smartphone sensor data to identify and log unusual personal events in mobile users’ daily lives. The system incorporates a hybrid architectural design that combines unsupervised classification of audio, accelerometer and location data with supervised joint fusion classification to achieve high accuracy, customization, convenience and scalability. We show the feasibility of myBlackBox by implementing and evaluating this end-to-end system that combines Android smartphones with cloud servers, deployed for 15 users over a one-month period.
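The joint fusion step described above can be sketched as a weighted late fusion: each modality's classifier emits per-class scores, and the system combines them into one decision. A minimal sketch with illustrative modality names and weights (not myBlackBox's actual trained fusion model):

```python
def joint_fusion(unimodal_scores, weights):
    """Weighted late fusion of per-modality class scores.

    unimodal_scores: modality -> {event label: score} from that
                     modality's classifier (audio, accel, location, ...).
    weights:         modality -> trust weight (default 1.0).
    Returns a normalised fused score per event label.
    """
    fused = {}
    for modality, scores in unimodal_scores.items():
        w = weights.get(modality, 1.0)
        for label, p in scores.items():
            fused[label] = fused.get(label, 0.0) + w * p
    total = sum(fused.values())
    return {label: v / total for label, v in fused.items()}
```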

  7. 3D Viewer Platform of Cloud Clustering Management System: Google Map 3D

    Science.gov (United States)

    Choi, Sung-Ja; Lee, Gang-Soo

    A new management framework for cloud environments is needed as computing platforms converge. Independent software vendors (ISVs) and small businesses find it hard to adopt the platform management systems offered by large enterprises. This article proposes a clustering management system for cloud computing environments aimed at ISVs and small-business operators. It provides a 3D viewer platform adapted from Google Map 3D and Google Earth, called 3DV_CCMS, which extends the CCMS [1].

  8. Management of Virtual Machine as an Energy Conservation in Private Cloud Computing System

    Directory of Open Access Journals (Sweden)

    Fauzi Akhmad

    2016-01-01

    Full Text Available Cloud computing is a service model in which pooled computing resources are accessed on demand over the Internet and hosted in data centers. Data center architectures in cloud computing environments are heterogeneous and distributed, composed of clusters of network servers whose physical hosts differ in computing capacity. Fluctuating demand for and availability of cloud services can be addressed by abstracting the data center through virtualization technology. A virtual machine (VM) represents computing resources that can be dynamically allocated and reallocated on demand. This study treats VM consolidation as a means of energy conservation in a private cloud computing system, targeting the optimization of the VM selection and migration policies used in the consolidation procedure. In a cloud data center, VMs hosting particular application services require different levels of computing resources. Imbalanced resource usage across physical servers can be reduced by live VM migration to achieve workload balancing. A practical approach is used to develop an OpenStack-based cloud computing environment, integrating VM selection and VM placement procedures with OpenStack Neat for VM consolidation. CPU time samples are used to derive the average CPU utilization in MHz over a given period: a VM's average CPU utilization is obtained by multiplying the difference between the current and previous CPU time readings by the maximum CPU frequency, and dividing the result by the elapsed time between the two readings.
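
    The CPU-utilization rule described above can be sketched as follows; the function name and the sample values are illustrative, not taken from the study:

```python
def avg_cpu_utilization_mhz(cpu_time_now, cpu_time_prev,
                            t_now, t_prev, max_freq_mhz):
    """Average CPU utilization of a VM in MHz over a sampling interval.

    cpu_time_now / cpu_time_prev: cumulative CPU time (seconds) at the
    current and previous sampling instants; t_now / t_prev: wall-clock
    timestamps (seconds) of those samples; max_freq_mhz: maximum CPU
    frequency in MHz.
    """
    busy = cpu_time_now - cpu_time_prev          # CPU-seconds consumed
    elapsed = t_now - t_prev                     # wall-clock seconds
    if elapsed <= 0:
        raise ValueError("samples must be taken at increasing times")
    return busy * max_freq_mhz / elapsed         # MHz actually used

# A VM that consumed 3 CPU-seconds in a 10 s window on a 2600 MHz core
# is using 0.3 * 2600 = 780 MHz on average.
print(avg_cpu_utilization_mhz(103.0, 100.0, 60.0, 50.0, 2600.0))  # → 780.0
```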

  9. CIMS: A Context-Based Intelligent Multimedia System for Ubiquitous Cloud Computing

    Directory of Open Access Journals (Sweden)

    Abhilash Sreeramaneni

    2015-06-01

    Full Text Available Mobile users spend a tremendous amount of time surfing multimedia contents over the Internet to pursue their interests. A resource-constrained smart device demands more intensive computing tasks and shortens battery life. To address the resource limitations (i.e., memory and computing capacity) of mobile devices while providing lower maintenance cost and easier access, mobile cloud computing is needed. Several approaches have been proposed to confront the challenges of mobile cloud computing, but difficulties still remain. In the coming years, collecting context, processing it, and interchanging the results over a heavy network will cause vast computations and reduce the battery life of mobile devices. In this paper, we propose a "context-based intelligent multimedia system" (CIMS) for ubiquitous cloud computing. The main goal of this research is to lessen the computing load, storage complexity, and battery drain for mobile users by using pervasive cloud computing. Moreover, to reduce the computing and storage burdens on mobiles, the cloud server collects several groups of user profiles with similarities by executing K-means clustering on users' data (context and multimedia contents). The distribution process conveys real-time notifications to smartphone users according to what is stated in their profiles. We also considered a mobile cloud offloading system, which decides the offloading actions to/from cloud servers. Context-aware decision-making (CAD) customizes mobile device performance with different specifications such as short response time and lower energy consumption. The analysis shows that our CIMS takes advantage of cost-effective features to produce high-quality information for mobile (or smart) device users in real time. Moreover, our CIMS lessens the computation and storage complexities for mobile users as well as cloud servers. Simulation analysis suggests that our approach is more efficient than existing domains.
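
    The profile-grouping step can be illustrated with a minimal K-means implementation; the feature vectors are hypothetical user-context features, not data from the paper:

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Plain k-means over equal-length numeric tuples.

    A minimal sketch of grouping user profiles by similarity; returns
    (centroids, labels).
    """
    rng = random.Random(seed)
    centroids = rng.sample(points, k)

    def nearest(pt):
        # Index of the centroid closest to pt (squared Euclidean distance).
        return min(range(k),
                   key=lambda c: sum((p - q) ** 2
                                     for p, q in zip(pt, centroids[c])))

    labels = [0] * len(points)
    for _ in range(iters):
        labels = [nearest(pt) for pt in points]
        for c in range(k):
            members = [pt for pt, lab in zip(points, labels) if lab == c]
            if members:  # keep the old centroid if a cluster empties out
                centroids[c] = tuple(sum(d) / len(members)
                                     for d in zip(*members))
    return centroids, labels

# Two clear groups: light vs. heavy multimedia users (made-up features).
profiles = [(1.0, 2.0), (1.2, 1.8), (9.0, 8.5), (8.8, 9.1)]
centroids, labels = kmeans(profiles, k=2)
print(labels)
```

    The two low-usage profiles end up sharing one label and the two high-usage profiles the other, which is the grouping the cloud server would use to target notifications.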

  10. 3D reconstruction of tropospheric cirrus clouds by stereovision system

    Science.gov (United States)

    Nadjib Kouahla, Mohamed; Moreels, Guy; Seridi, Hamid

    2016-07-01

    A stereo imaging method is applied to measure the altitude of cirrus clouds and provide a 3D map of the altitude of the layer centroid. These clouds are located in the high troposphere and sometimes in the lower stratosphere, between 6 and 10 km high. Two simultaneous images of the same scene are taken with Canon (400D) cameras at two sites 37 km apart. Each image is processed to invert the perspective effect and provide a satellite-type view of the layer. Pairs of matched points that correspond to a physical emissive point in the common area are identified by calculating a correlation coefficient (ZNCC: Zero-mean Normalized Cross-Correlation, or ZSSD: Zero-mean Sum of Squared Differences). This method is suitable for obtaining 3D representations in the case of low-contrast objects. An observational campaign was conducted in June 2014 in France. The images were taken simultaneously at Marnay (47°17'31.5" N, 5°44'58.8" E; altitude 275 m), 25 km northwest of Besancon, and at Mont Poupet (46°58'31.5" N, 5°52'22.7" E; altitude 600 m), 43 km southwest of Besancon. 3D maps of natural cirrus clouds and artificial ones such as aircraft trails are retrieved. They are compared with pseudo-relief intensity maps of the same region. The mean altitude of the cirrus barycenter is located at 8.5 ± 1 km on June 11.
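
    The ZNCC matching score mentioned above can be sketched as follows (the patch values are illustrative):

```python
import numpy as np

def zncc(patch_a, patch_b):
    """Zero-mean Normalized Cross-Correlation between two same-size patches.

    Returns a value in [-1, 1]; 1 means the patches match up to an affine
    change of brightness, which makes the score robust for low-contrast
    cloud imagery.
    """
    a = np.asarray(patch_a, dtype=float)
    b = np.asarray(patch_b, dtype=float)
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a ** 2).sum() * (b ** 2).sum())
    if denom == 0:                      # flat patch: correlation undefined
        return 0.0
    return float((a * b).sum() / denom)

# A patch compared with a brightened, contrast-stretched copy of itself
# still scores 1.0, which is why ZNCC suits variable cloud illumination.
base = np.array([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
print(zncc(base, 2 * base + 10))  # → 1.0
```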

  11. Improving aerosol interaction with clouds and precipitation in a regional chemical weather modeling system

    Directory of Open Access Journals (Sweden)

    C. Zhou

    2016-01-01

    Full Text Available A comprehensive aerosol–cloud–precipitation interaction (ACI) scheme has been developed under a China Meteorological Administration (CMA) chemical weather modeling system, GRAPES/CUACE (Global/Regional Assimilation and PrEdiction System, CMA Unified Atmospheric Chemistry Environment). Calculated by a sectional aerosol activation scheme based on the size and mass information from CUACE and the thermal-dynamic and humidity states from the weather model GRAPES at each time step, the cloud condensation nuclei (CCN) are fed interactively online into a two-moment cloud scheme (WRF Double-Moment 6-class scheme, WDM6) and a convective parameterization to drive cloud physics and precipitation formation processes. The modeling system has been applied to study the ACI for January 2013, when several persistent haze-fog events and eight precipitation events occurred. The results show that aerosols interacting with the WDM6 in GRAPES/CUACE obviously increase the total cloud water, liquid water content, and cloud droplet number concentrations, while decreasing the mean diameters of cloud droplets, with varying magnitudes of the changes in each case and region. These interactive microphysical properties of clouds improve the calculation of their collection growth rates in some regions and hence the precipitation rate and distributions in the model, showing 24 to 48 % enhancements of the threat score for 6 h precipitation in almost all regions. The aerosols interacting with the WDM6 also reduce the regional mean bias of temperature by 3 °C during certain precipitation events, but the monthly mean bias is only reduced by about 0.3 °C.

  12. Preliminary physician and pharmacist survey of the National Health Insurance PharmaCloud system in Taiwan.

    Science.gov (United States)

    Tseng, Yu-Ting; Chang, Elizabeth H; Kuo, Li-Na; Shen, Wan-Chen; Bai, Kuan-Jen; Wang, Chih-Chi; Chen, Hsiang-Yin

    2017-10-01

    The PharmaCloud system, a cloud-based medication system, was launched by the Taiwan National Health Insurance Administration (NHIA) in 2013 to integrate patients' medication lists among different medical institutions. The aim of the preliminary study was to evaluate satisfaction with this system among physicians and pharmacists at the early stage of system implementation. A questionnaire was developed through a review of the literature and discussion in 6 focus groups to understand the level of satisfaction, attitudes, and intentions of physicians and pharmacists using the PharmaCloud system. It was then administered nationally in Taiwan in July to September 2015. Descriptive statistics and multiple regression were performed to identify variables influencing satisfaction and intention to use the system. In total, 895 pharmacist and 105 physician questionnaires were valid for analysis. The results showed that satisfaction with system quality warranted improvement. Positive attitudes toward medication reconciliation among physicians and pharmacists, which were significant predictors of the intention to use the system (β= 0.223, p Taiwan PharmaCloud system a convenient platform for medication reconciliation. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. Microphysical processing of aerosol particles in orographic clouds

    Science.gov (United States)

    Pousse-Nottelmann, S.; Zubler, E. M.; Lohmann, U.

    2015-08-01

    An explicit and detailed treatment of cloud-borne particles allowing for the consideration of aerosol cycling in clouds has been implemented into COSMO-Model, the regional weather forecast and climate model of the Consortium for Small-scale Modeling (COSMO). The effects of aerosol scavenging, cloud microphysical processing and regeneration upon cloud evaporation on the aerosol population and on subsequent cloud formation are investigated. For this, two-dimensional idealized simulations of moist flow over two bell-shaped mountains were carried out, varying the treatment of aerosol scavenging and regeneration processes for a warm-phase and a mixed-phase orographic cloud. The results allowed us to identify different aerosol cycling mechanisms. In the simulated non-precipitating warm-phase cloud, aerosol mass is incorporated into cloud droplets by activation scavenging and released back to the atmosphere upon cloud droplet evaporation. In the mixed-phase cloud, a first cycle comprises cloud droplet activation and evaporation via the Wegener-Bergeron-Findeisen (WBF) process. A second cycle includes below-cloud scavenging by precipitating snow particles and snow sublimation, and is connected to the first cycle via the riming process, which transfers aerosol mass from cloud droplets to snowflakes. In the simulated mixed-phase cloud, only a negligible part of the total aerosol mass is incorporated into ice crystals. Sedimenting snowflakes reaching the surface remove aerosol mass from the atmosphere. The results show that aerosol processing and regeneration lead to a vertical redistribution of aerosol mass and number. Thereby, the processes impact the total aerosol number and mass and additionally alter the shape of the aerosol size distributions by enhancing the internally mixed/soluble Aitken and accumulation mode and generating coarse-mode particles. Concerning subsequent cloud formation at the second mountain, accounting for aerosol processing and regeneration increases the cloud droplet number concentration, with possible implications for the ice crystal number concentration.

  14. CloudScan - A Configuration-Free Invoice Analysis System Using Recurrent Neural Networks

    DEFF Research Database (Denmark)

    Palm, Rasmus Berg; Winther, Ole; Laws, Florian

    2017-01-01

    We present CloudScan; an invoice analysis system that requires zero configuration or upfront annotation. In contrast to previous work, CloudScan does not rely on templates of invoice layout; instead it learns a single global model of invoices that naturally generalizes to unseen invoice layouts. The model is trained using data automatically extracted from end-user provided feedback. This automatic training data extraction removes the requirement for users to annotate the data precisely. We describe a recurrent neural network model that can capture long range context and compare it to a baseline logistic regression model corresponding to the current CloudScan production system. We train and evaluate the system on 8 important fields using a dataset of 326,471 invoices. The recurrent neural network and baseline model achieve 0.891 and 0.887 average F1 scores respectively on seen invoice layouts...

  15. Assessment of In-Cloud Enterprise Resource Planning System Performed in a Virtual Cluster

    Directory of Open Access Journals (Sweden)

    Bao Rong Chang

    2015-01-01

    Full Text Available This paper introduces a high-performance, high-availability in-cloud enterprise resource planning system (in-cloud ERP) deployed in a virtual machine cluster. The proposed approach resolves the crucial problems of ERP failure due to unexpected downtime and of failover between physical hosts in enterprises, which cause operation termination and hence data loss. Besides, the proposed approach, together with access control authentication and network security, is capable of preventing intrusion and/or malicious attacks via the Internet. Regarding system assessment, the cost-performance (C-P) ratio, a remarkable cost-effectiveness evaluation, has been applied to several well-known ERP systems. The C-P ratios evaluated from the experiments show that the proposed approach outperforms two well-known benchmark ERP systems, namely the in-house ECC 6.0 and the in-cloud ByDesign.

  16. Analysis of the Health Information and Communication System and Cloud Computing

    Directory of Open Access Journals (Sweden)

    Matija Varga

    2015-05-01

    Full Text Available This paper presents an analysis of strengths, weaknesses, opportunities and threats (risks) within the health care system, and further shows the strengths, weaknesses, opportunities and threats of using cloud computing in the health care system. Cloud computing in medicine is an integral part of telemedicine. Based on the information presented in this paper, employees may identify the advantages and disadvantages of using cloud computing. When introducing new information technologies into the health care business, implementers will encounter numerous problems, such as: the complexity of the existing and the new information systems, the costs of maintaining and updating the software, the cost of implementing new modules, and ways of protecting both the existing data in the database and the data that will be collected during diagnosis. Using the SWOT analysis, this paper evaluates the feasibility and possibility of adopting cloud computing in the health sector to improve health services, based on examples from abroad. The intent of cloud computing in medicine is to send the patient's data to the doctor directly instead of the patient delivering it himself/herself.

  17. Modeling and Security Threat Assessments of Data Processed in Cloud Based Information Systems

    Directory of Open Access Journals (Sweden)

    Darya Sergeevna Simonenkova

    2016-03-01

    Full Text Available The subject of this research is the modeling and security threat assessment of data processed in cloud-based information systems (CBIS). The method allows one to determine the current security threats to a CBIS, the states of the system in which vulnerabilities exist, the level of possible violators, and the security properties, and to generate recommendations for neutralizing the security threats to the CBIS.

  18. DIFFUSION OF A CLOUD-BASED SAP ENTERPRISE SYSTEM TO DANISH MUNICIPALITITES

    DEFF Research Database (Denmark)

    Frisenvang, Jakob Fogh; Pedersen, Christoffer Ejerskov; Svejvig, Per

    as a Service (ESaaS) (Svejvig, Storgaard, and Møller, 2013). ESaaS encompasses Enterprise systems such as ERP, CRM, and BI systems. From a purely cloud-computing perspective, the delivery of such services does not significantly distinguish itself from the SaaS model, but from a practice perspective...

  19. Diffusion of a Cloud-based SAP Enterprise System to Danish Municipalities

    DEFF Research Database (Denmark)

    Fogh Frisenvang, Jakob; Ejerskov Pedersen, Christoffer; Svejvig, Per

    as a Service (ESaaS) (Svejvig, Storgaard, and Møller, 2013). ESaaS encompasses Enterprise systems such as ERP, CRM, and BI systems. From a purely cloud-computing perspective, the delivery of such services does not significantly distinguish itself from the SaaS model, but from a practice perspective...

  20. Distributed measurement system for long term monitoring of clouding effects on large PV plants

    DEFF Research Database (Denmark)

    Paasch, K. M.; Nymand, M.; Haase, F.

    2013-01-01

    A recording system for the generation of current-voltage characteristics of solar panels is presented. The system is intended for large area PV power plants. The recorded curves are used to optimize the energy output of PV power plants, which are likely to be influenced by passing clouds...

  1. Test-driven modeling and development of cloud-enabled cyber-physical smart systems

    DEFF Research Database (Denmark)

    Munck, Allan; Madsen, Jan

    2017-01-01

    Embedded products currently tend to evolve into large and complex smart systems where products are enriched with services through clouds and other web technologies. The complex characteristics of smart systems make it very difficult to guarantee functionality, safety, security and performance...

  2. EDUCATIONAL USE OF CLOUD COMPUTING AND AT-MEGA MICROCONTROLLER - A CASE STUDY OF AN ALARM SYSTEM

    Directory of Open Access Journals (Sweden)

    Tomasz Cieplak

    2016-06-01

    Full Text Available The article presents a case study of a Cloud Computing model combined with AT-Mega microcontrollers for educational purposes. The presented system takes advantage of many aspects of the Internet of Things model, conjoining a cloud management system with a measurement-execution module based on the Arduino platform. One benefit of this solution is a cost-effective way of showcasing the integration of machines and devices with distinct cloud services. The article is based on practical experience with students' projects, and a home alarm system using Cloud Computing services is described.

  3. Information Technology Service Management with Cloud Computing Approach to Improve Administration System and Online Learning Performance

    Directory of Open Access Journals (Sweden)

    Wilianto Wilianto

    2015-10-01

    Full Text Available This work discusses the development of information technology service management using a cloud computing approach to improve the performance of the administration system and online learning at STMIK IBBI Medan, Indonesia. The network topology is modeled and simulated for system administration and online learning. The same network topology is developed in cloud computing using the Amazon AWS architecture. The model is designed and simulated using Riverbed Academic Edition Modeler to obtain values of the parameters: delay, load, CPU utilization, and throughput. The simulation results are the following. For network topology 1, without cloud computing, the average delay is 54 ms, load 110 000 bits/s, CPU utilization 1.1%, and throughput 440 bits/s. With cloud computing, the average delay is 45 ms, load 2 800 bits/s, CPU utilization 0.03%, and throughput 540 bits/s. For network topology 2, without cloud computing, the average delay is 39 ms, load 3 500 bits/s, CPU utilization 0.02%, and database server throughput 1 400 bits/s. With cloud computing, the average delay is 26 ms, load 5 400 bits/s, CPU utilization 0.0001% (email server), 0.001% (FTP server), and 0.0002% (HTTP server), with throughput of 85 bits/s (email server), 100 bits/s (FTP server), and 95 bits/s (HTTP server). Thus, the delay, the load, and the CPU utilization decrease, while the throughput increases. Information technology service management with a cloud computing approach has better performance.

  4. Evaluating Maximum Photovoltaic Integration in District Distribution Systems Considering Optimal Inverter Dispatch and Cloud Shading Conditions

    DEFF Research Database (Denmark)

    Ding, Tao; Kou, Yu; Yang, Yongheng

    2017-01-01

    As photovoltaic (PV) integration increases in distribution systems, investigating the maximum allowable PV integration capacity for a district distribution system becomes necessary in the planning phase; an optimization model is thus proposed to evaluate the maximum PV integration capacity while ... However, the intermittency of solar PV energy (e.g., due to passing clouds) may affect the PV generation in the district distribution network. To address this issue, the voltage magnitude constraints under the cloud shading conditions should be taken into account in the optimization model, which can ...

  5. Determination of ice water path in ice-over-water cloud systems using combined MODIS and AMSR-E measurements

    Science.gov (United States)

    Huang, Jianping; Minnis, Patrick; Lin, Bing; Yi, Yuhong; Fan, T.-F.; Sun-Mack, Sunny; Ayers, J. K.

    2006-11-01

    To provide more accurate ice cloud microphysical properties, the multi-layered cloud retrieval system (MCRS) is used to retrieve the ice water path (IWP) in ice-over-water cloud systems globally over oceans using combined instrument data from Aqua. The liquid water path (LWP) of lower-layer water clouds is estimated from Advanced Microwave Scanning Radiometer for EOS (AMSR-E) measurements. The properties of the upper-level ice clouds are then derived from Moderate Resolution Imaging Spectroradiometer (MODIS) measurements by matching simulated radiances from a two-cloud-layer radiative transfer model. The results show that the MCRS can significantly improve the accuracy and reduce the over-estimation of optical depth and IWP retrievals for ice-over-water cloud systems. The mean daytime ice cloud optical depth and IWP for overlapped ice-over-water clouds over oceans from Aqua are 7.6 and 146.4 g m-2, respectively, down from the initial single-layer retrievals of 17.3 and 322.3 g m-2. The mean IWP for actual single-layer clouds is 128.2 g m-2.

  6. A Systematic Mapping Study of Software Architectures for Cloud Based Systems

    DEFF Research Database (Denmark)

    Chauhan, Muhammad Aufeef; Babar, Muhammad Ali

    2014-01-01

    Context: Cloud computing has gained significant attention from researchers and practitioners. This emerging paradigm is being used to provide solutions in multiple domains without huge upfront investment because of its on-demand resource-provisioning model. However, the information about how software ... The aim of this study is to systematically identify and analyze the currently published research on topics related to software architectures for cloud-based systems in order to identify architectural solutions for achieving quality requirements. Method: We decided to carry out a systematic mapping study to find as much peer-reviewed literature on topics related to software architectures for cloud-based systems as possible. This study has been carried out by following the guidelines for conducting systematic literature reviews and systematic mapping studies as reported in the literature. Based on our paper ...

  7. Microphysical processing of aerosol particles in orographic clouds

    Directory of Open Access Journals (Sweden)

    S. Pousse-Nottelmann

    2015-08-01

    An explicit and detailed treatment of cloud-borne particles allowing for the consideration of aerosol cycling in clouds has been implemented into COSMO-Model, the regional weather forecast and climate model of the Consortium for Small-scale Modeling (COSMO). The effects of aerosol scavenging, cloud microphysical processing and regeneration upon cloud evaporation on the aerosol population and on subsequent cloud formation are investigated. For this, two-dimensional idealized simulations of moist flow over two bell-shaped mountains were carried out, varying the treatment of aerosol scavenging and regeneration processes for a warm-phase and a mixed-phase orographic cloud. The results allowed us to identify different aerosol cycling mechanisms. In the simulated non-precipitating warm-phase cloud, aerosol mass is incorporated into cloud droplets by activation scavenging and released back to the atmosphere upon cloud droplet evaporation. In the mixed-phase cloud, a first cycle comprises cloud droplet activation and evaporation via the Wegener–Bergeron–Findeisen (WBF) process. A second cycle includes below-cloud scavenging by precipitating snow particles and snow sublimation, and is connected to the first cycle via the riming process, which transfers aerosol mass from cloud droplets to snowflakes. In the simulated mixed-phase cloud, only a negligible part of the total aerosol mass is incorporated into ice crystals. Sedimenting snowflakes reaching the surface remove aerosol mass from the atmosphere. The results show that aerosol processing and regeneration lead to a vertical redistribution of aerosol mass and number. Thereby, the processes impact the total aerosol number and mass and additionally alter the shape of the aerosol size distributions by enhancing the internally mixed/soluble Aitken and accumulation mode and generating coarse-mode particles. Concerning subsequent cloud formation at the second mountain, accounting for aerosol processing and regeneration increases the cloud droplet number concentration, with possible implications for the ice crystal number concentration.

  8. Lightweight Data Systems in the Cloud: Costs, Benefits and Best Practices

    Science.gov (United States)

    Fatland, R.; Arendt, A. A.; Howe, B.; Hess, N. J.; Futrelle, J.

    2015-12-01

    We present here a simple analysis of both the cost and the benefit of using the cloud in environmental science circa 2016. We present this set of ideas to enable the potential 'cloud adopter' research scientist to explore and understand the tradeoffs in moving some aspect of their compute work to the cloud. We present examples, design patterns and best practices as an evolving body of knowledge that help optimize benefit to the research team. Thematically this generally means not starting from a blank page but rather learning how to find 90% of the solution to a problem pre-built. We will touch on four topics of interest. (1) Existing cloud data resources (NASA, WHOI BCO DMO, etc) and how they can be discovered, used and improved. (2) How to explore, compare and evaluate cost and compute power from many cloud options, particularly in relation to data scale (size/complexity). (3) What are simple / fast 'Lightweight Data System' procedures that take from 20 minutes to one day to implement and that have a clear immediate payoff in environmental data-driven research. Examples include publishing a SQL Share URL at (EarthCube's) CINERGI as a registered data resource and creating executable papers on a cloud-hosted Jupyter instance, particularly iPython notebooks. (4) Translating the computational terminology landscape ('cloud', 'HPC cluster', 'hadoop', 'spark', 'machine learning') into examples from the community of practice to help the geoscientist build or expand their mental map. In the course of this discussion -- which is about resource discovery, adoption and mastery -- we provide direction to online resources in support of these themes.

  9. Cloud-point measurement for (sulphate salts + polyethylene glycol 15000 + water) systems by the particle counting method

    International Nuclear Information System (INIS)

    Imani, A.; Modarress, H.; Eliassi, A.; Abdous, M.

    2009-01-01

    The phase separation of (water + salt + polyethylene glycol 15000) systems was studied by cloud-point measurements using the particle counting method. The effects of the concentration of three sulphate salts (Na2SO4, K2SO4, (NH4)2SO4), the polyethylene glycol 15000 concentration, and the mass ratio of polymer to salt on the cloud-point temperature of these systems have been investigated. The results obtained indicate that the cloud-point temperatures decrease linearly with increasing polyethylene glycol concentration for the different salts. Also, the cloud points decrease with an increase in the mass ratio of salt to polymer.
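
    The reported linear decrease of cloud-point temperature with polymer concentration can be quantified with an ordinary least-squares fit; the data points below are invented for illustration, not measurements from the study:

```python
def linear_fit(x, y):
    """Ordinary least-squares fit y ≈ a*x + b; returns (slope, intercept)."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx
    return a, my - a * mx

# Hypothetical cloud points (°C) at increasing PEG mass fractions (%):
conc = [2.0, 4.0, 6.0, 8.0]
cp = [68.0, 64.0, 60.0, 56.0]
slope, intercept = linear_fit(conc, cp)
print(slope, intercept)  # → -2.0 72.0
```

    A negative slope quantifies the decrease of the cloud point per unit of polymer concentration; repeating the fit for each salt would let the three sulphates be compared directly.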

  10. Data Privacy in Cloud-assisted Healthcare Systems: State of the Art and Future Challenges.

    Science.gov (United States)

    Sajid, Anam; Abbas, Haider

    2016-06-01

    The widespread deployment and utility of Wireless Body Area Networks (WBANs) in healthcare systems required new technologies like the Internet of Things (IoT) and cloud computing, which are able to deal with the storage and processing limitations of WBANs. This amalgamation of WBAN-based healthcare systems into cloud-based healthcare systems gave rise to serious privacy concerns for sensitive healthcare data. Hence, there is a need for proactive identification of, and effective mitigation mechanisms for, the patient data privacy concerns that pose continuous threats to the integrity and stability of the healthcare environment. For this purpose, a systematic literature review has been conducted that presents a clear picture of the privacy concerns of patient data in cloud-assisted healthcare systems and analyzes the mechanisms recently proposed by the research community. The methodology used for conducting the review was based on the Kitchenham guidelines. Results from the review show that most of the patient data privacy techniques do not fully address the privacy concerns and therefore require more effort. The summary presented in this paper would help in setting research directions for the techniques and mechanisms needed to address patient data privacy concerns in a balanced and lightweight manner, considering all the aspects and limitations of cloud-assisted healthcare systems.

  11. A Model for the Acceptance of Cloud Computing Technology Using DEMATEL Technique and System Dynamics Approach

    Directory of Open Access Journals (Sweden)

    seyyed mohammad zargar

    2018-03-01

    Full Text Available Cloud computing is a new method to provide computing resources and increase computing power in organizations. Despite the many benefits this method offers, it has not been universally adopted because of obstacles including security issues, which have become a concern for IT managers in organizations. In this paper, a general definition of cloud computing is presented. In addition, having reviewed previous studies, the researchers identified the variables affecting technology acceptance and, especially, acceptance of cloud computing technology. Then, using the DEMATEL technique, the influence exerted and received by each variable was determined. The researchers also designed a model to show the dynamics of cloud computing technology adoption using a system dynamics approach. The validity of the model was confirmed through evaluation methods for dynamic models using the VENSIM software. Finally, based on different conditions of the proposed model, a variety of scenarios were designed, and the implementation of these scenarios was simulated within the proposed model. The results showed that any increase in data security, government support and user training can lead to an increase in the adoption and use of cloud computing technology.
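
    The DEMATEL step can be sketched as follows: an expert-scored direct-influence matrix is normalized and converted into a total-relation matrix T = X(I - X)^-1, from which each factor's prominence (D+R) and net cause/effect role (D-R) are read off. The three factors and their scores below are illustrative assumptions, not the study's data:

```python
import numpy as np

def dematel(direct):
    """DEMATEL total-relation analysis.

    `direct` is the direct-influence matrix (factor i's influence on
    factor j, e.g. on a 0-4 scale). Returns (T, D + R, D - R);
    D - R > 0 marks a factor as a net cause.
    """
    A = np.asarray(direct, dtype=float)
    X = A / A.sum(axis=1).max()              # normalize by largest row sum
    n = A.shape[0]
    T = X @ np.linalg.inv(np.eye(n) - X)     # total relation matrix
    D = T.sum(axis=1)                        # influence dispatched
    R = T.sum(axis=0)                        # influence received
    return T, D + R, D - R

# Hypothetical factors: data security, government support, user training.
direct = [[0, 3, 2],
          [1, 0, 3],
          [1, 2, 0]]
T, prominence, relation = dematel(direct)
print(np.round(relation, 3))
```

    With these made-up scores, data security comes out as a net cause (positive D - R), i.e. a lever that drives the other acceptance factors rather than being driven by them.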

  12. Dynamic Performance Optimization for Cloud Computing Using M/M/m Queueing System

    Directory of Open Access Journals (Sweden)

    Lizheng Guo

    2014-01-01

    Full Text Available The successful development of cloud computing has attracted more and more people and enterprises to use it. On one hand, using cloud computing reduces cost; on the other hand, it improves efficiency. As users are largely concerned about the Quality of Service (QoS), performance optimization of cloud computing has become critical to its successful application. In order to optimize the performance of multiple requesters and services in cloud computing, we use queueing theory to derive the equations for the parameters of the services in the data center. Then, through analyzing the performance parameters of the queueing system, we propose a synthesis optimization model, objective function, and strategy. Lastly, we set up a simulation based on the synthesis optimization model, and we compare and analyze the simulation results against classical optimization methods (shortest-service-time-first and first-in-first-out), which shows that the proposed model can optimize the average wait time, average queue length, and the number of customers.
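
    The M/M/m steady-state quantities underlying such an analysis can be sketched with the Erlang C formula; the arrival and service rates below are illustrative:

```python
import math

def mmm_metrics(lam, mu, m):
    """Steady-state metrics of an M/M/m queue.

    lam: arrival rate, mu: per-server service rate, m: number of servers
    (requires lam < m*mu). Returns (P_wait, Lq, Wq): the Erlang C waiting
    probability, mean queue length, and mean wait in queue.
    """
    a = lam / mu                      # offered load in Erlangs
    rho = a / m                       # per-server utilization
    if rho >= 1:
        raise ValueError("queue is unstable: need lam < m*mu")
    # Erlang C: probability that an arriving request must wait.
    p0_inv = sum(a ** k / math.factorial(k) for k in range(m))
    p0_inv += a ** m / (math.factorial(m) * (1 - rho))
    p_wait = (a ** m / (math.factorial(m) * (1 - rho))) / p0_inv
    lq = p_wait * rho / (1 - rho)     # mean number waiting
    wq = lq / lam                     # mean wait, by Little's law
    return p_wait, lq, wq

# Example: 10 requests/s served by 4 servers at 3 requests/s each.
p_wait, lq, wq = mmm_metrics(lam=10.0, mu=3.0, m=4)
print(round(p_wait, 3), round(wq, 4))
```

    For m = 1 this reduces to the familiar M/M/1 results (P_wait = rho, Lq = rho²/(1 - rho)), which is a quick sanity check on the formula.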

  13. HammerCloud: A Stress Testing System for Distributed Analysis

    CERN Document Server

    van der Ster, Daniel C; Ubeda Garcia, Mario; Paladin, Massimo

    2011-01-01

    Distributed analysis of LHC data is an I/O-intensive activity which places large demands on the internal network, storage, and local disks at remote computing facilities. Commissioning and maintaining a site to provide an efficient distributed analysis service is therefore a challenge which can be aided by tools to help evaluate a variety of infrastructure designs and configurations. HammerCloud (HC) is one such tool; it is a stress testing service which is used by central operations teams, regional coordinators, and local site admins to (a) submit an arbitrary number of analysis jobs to a number of sites, (b) maintain at a steady state a predefined number of jobs running at the sites under test, (c) produce web-based reports summarizing the efficiency and performance of the sites under test, and (d) present a web interface for historical test results to both evaluate progress and compare sites. HC was built around the distributed analysis framework Ganga, exploiting its API for grid job management. HC has been ...

  14. Dark Matter and Extragalactic Gas Clouds in the NGC 4532/DDO 137 System

    Science.gov (United States)

    Hoffman, G. L.; Lu, N. Y.; Salpeter, E. E.; Connell, B. M.

    1998-01-01

    H I synthesis mapping of NGC 4532 and DDO 137, a pair of Sm galaxies on the edge of the Virgo cluster, is used to determine rotation curves for each of the galaxies and to resolve the structure and kinematics of three extragalactic H I clouds embedded in an extended envelope of diffuse H I discovered in earlier Arecibo studies of the system.

  15. Adaptive Resource Utilization Prediction System for Infrastructure as a Service Cloud

    Directory of Open Access Journals (Sweden)

    Qazi Zia Ullah

    2017-01-01

    Full Text Available Infrastructure as a Service (IaaS) cloud provides resources as a service from a pool of compute, network, and storage resources. Cloud providers can manage their resource usage by inferring future usage demand from current and past usage patterns of resources. Resource usage prediction is of great importance for the dynamic scaling of cloud resources to achieve efficiency in terms of cost and energy consumption while keeping quality of service. The purpose of this paper is to present a real-time resource usage prediction system. The system takes real-time utilization of resources and feeds the utilization values into several buffers based on the type of resource and the time span size. The buffers are read by an R-language-based statistical system, which checks whether the buffered data follow a Gaussian distribution. If they do, an Autoregressive Integrated Moving Average (ARIMA) model is applied; otherwise an Autoregressive Neural Network (AR-NN) is applied. In the ARIMA process, a model is selected based on the minimum Akaike Information Criterion (AIC) value. Similarly, in the AR-NN process, a network with the lowest Network Information Criterion (NIC) value is selected. We have evaluated our system with real traces of CPU utilization of an IaaS cloud of one hundred and twenty servers.
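The routing step described above (Gaussian buffers go to ARIMA, non-Gaussian ones to AR-NN) can be sketched with a simple normality test. The paper does not specify which test it uses; the Jarque-Bera statistic below is an assumption chosen because it needs only sample moments, and the model-fitting itself is left out:

```python
# Sketch of the Gaussianity gate that routes a utilization buffer to ARIMA
# or AR-NN. The Jarque-Bera test is an assumed choice, not the paper's.

def jarque_bera(xs):
    """Jarque-Bera statistic: large values indicate non-Gaussian data."""
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n
    m3 = sum((x - mean) ** 3 for x in xs) / n
    m4 = sum((x - mean) ** 4 for x in xs) / n
    skew = m3 / m2 ** 1.5
    kurt = m4 / m2 ** 2
    return n / 6.0 * (skew ** 2 + (kurt - 3.0) ** 2 / 4.0)

def choose_model(buffer, critical=5.99):   # chi-square(2), 5% level
    """Route Gaussian-looking buffers to ARIMA, everything else to AR-NN."""
    return "ARIMA" if jarque_bera(buffer) < critical else "AR-NN"

# A uniform ramp is clearly non-Gaussian (flat tails, kurtosis ~1.8):
ramp = [i / 1000.0 for i in range(1000)]
print(choose_model(ramp))  # AR-NN
```

In the real system each buffer would then be fitted with the winning model family, selecting the candidate with minimum AIC (ARIMA) or NIC (AR-NN).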

  16. Integrating Learning Services in the Cloud: An Approach That Benefits Both Systems and Learning

    Science.gov (United States)

    Gutiérrez-Carreón, Gustavo; Daradoumis, Thanasis; Jorba, Josep

    2015-01-01

    Currently there is an increasing trend to implement functionalities that allow for the development of applications based on Cloud computing. In education there are high expectations for Learning Management Systems since they can be powerful tools to foster more effective collaboration within a virtual classroom. Tools can also be integrated with…

  17. Reference Models of Information Systems Constructed with the use of Technologies of Cloud Calculations

    Directory of Open Access Journals (Sweden)

    Darya Sergeevna Simonenkova

    2013-09-01

    Full Text Available The subject of the research is the analysis of various models of information systems constructed with the use of cloud calculation technologies. Analysis of the models is required for constructing a new reference model, which will be used to develop a security threat model.

  18. Cloud computing models and their application in LTE based cellular systems

    NARCIS (Netherlands)

    Staring, A.J.; Karagiannis, Georgios

    2013-01-01

    As cloud computing emerges as the next novel concept in computer science, it becomes clear that the model applied in large data storage systems, used to resolve issues arising from increasing demand, could also be used to resolve the very high bandwidth requirements on access network, core

  19. Selection and provisioning of services in a cloud using recommender systems approach for SMME

    CSIR Research Space (South Africa)

    Manqele, S

    2013-06-01

    Full Text Available Peninsula University of Technology, 10 September 2013 Selection and provisioning of services in a cloud using recommender systems approach for SMME S. Manqele1, N.Dlodlo2, P.Mvelase3, M. Dlodlo4 , S.S. Xulu5, M. Adigun6 1, 2, 3 CSIR – Meraka...

  20. Cost Savings Associated with the Adoption of a Cloud Computing Data Transfer System for Trauma Patients.

    Science.gov (United States)

    Feeney, James M; Montgomery, Stephanie C; Wolf, Laura; Jayaraman, Vijay; Twohig, Michael

    2016-09-01

    Among transferred trauma patients, challenges with the transfer of radiographic studies include problems loading or viewing the studies at the receiving hospitals, and problems manipulating, reconstructing, or evaluating the transferred images. Cloud-based image transfer systems may address some of these problems. We reviewed the charts of patients transferred during one year surrounding the adoption of a cloud computing data transfer system. We compared the rates of repeat imaging before (precloud) and after (postcloud) the adoption of the cloud-based data transfer system. During the precloud period, 28 out of 100 patients required 90 repeat studies. With the cloud computing transfer system in place, three out of 134 patients required seven repeat films. There was a statistically significant decrease in the proportion of patients requiring repeat films (28% to 2.2%, P < .0001). Based on an annualized volume of 200 trauma patient transfers, the cost savings estimated using three methods of cost analysis are between $30,272 and $192,453.
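The significance claim can be checked directly from the counts given in the abstract (28/100 patients pre-cloud vs. 3/134 post-cloud) with a standard two-proportion z-test; the abstract does not state which test was used, so this is an independent check rather than a reproduction of the authors' analysis:

```python
from math import sqrt, erfc

def two_proportion_p(x1, n1, x2, n2):
    """Two-sided two-proportion z-test p-value (pooled standard error)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = abs(p1 - p2) / se
    return erfc(z / sqrt(2))  # two-sided normal tail probability

# 28/100 patients needed repeat imaging pre-cloud vs 3/134 post-cloud
p = two_proportion_p(28, 100, 3, 134)
print(p < 0.0001)  # True, consistent with the reported P < .0001
```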

  1. Web-based Tsunami Early Warning System with instant Tsunami Propagation Calculations in the GPU Cloud

    Science.gov (United States)

    Hammitzsch, M.; Spazier, J.; Reißland, S.

    2014-12-01

    Usually, tsunami early warning and mitigation systems (TWS or TEWS) are based on several software components deployed in a client-server based infrastructure. The vast majority of systems notably include desktop-based clients with a graphical user interface (GUI) for the operators in early warning centers. However, in times of cloud computing and ubiquitous computing the use of concepts and paradigms, introduced by continuously evolving approaches in information and communications technology (ICT), has to be considered even for early warning systems (EWS). Based on the experiences and the knowledge gained in three research projects - 'German Indonesian Tsunami Early Warning System' (GITEWS), 'Distant Early Warning System' (DEWS), and 'Collaborative, Complex, and Critical Decision-Support in Evolving Crises' (TRIDEC) - new technologies are exploited to implement a cloud-based and web-based prototype to open up new prospects for EWS. This prototype, named 'TRIDEC Cloud', merges several complementary external and in-house cloud-based services into one platform for automated background computation with graphics processing units (GPU), for web-mapping of hazard specific geospatial data, and for serving relevant functionality to handle, share, and communicate threat specific information in a collaborative and distributed environment. The prototype in its current version addresses tsunami early warning and mitigation. The integration of GPU accelerated tsunami simulation computations has been an integral part of this prototype to foster early warning with on-demand tsunami predictions based on actual source parameters. However, the platform is meant for researchers around the world to make use of the cloud-based GPU computation to analyze other types of geohazards and natural hazards and react upon the computed situation picture with a web-based GUI in a web browser at remote sites.
The current website is an early alpha version for demonstration purposes to give the

  2. Efficient Redundancy Techniques in Cloud and Desktop Grid Systems using MAP/G/c-type Queues

    Science.gov (United States)

    Chakravarthy, Srinivas R.; Rumyantsev, Alexander

    2018-03-01

    Cloud computing is continuing to prove its flexibility and versatility in helping industries and businesses as well as academia as a way of providing needed computing capacity. As an important alternative to cloud computing, desktop grids make it possible to utilize the idle computer resources of an enterprise/community by means of a distributed computing system, providing a more secure and controllable environment with lower operational expenses. Further, both cloud computing and desktop grids are meant to optimize limited resources and at the same time to decrease the expected latency for users. The crucial parameter for optimization both in cloud computing and in desktop grids is the level of redundancy (replication) for service requests/workunits. In this paper we study optimal replication policies by considering three variations of Fork-Join systems in the context of a multi-server queueing system with a versatile point process for the arrivals. For services we consider phase-type distributions as well as shifted exponential and Weibull distributions. We use both analytical and simulation approaches in our analysis and report some interesting qualitative results.

  3. Efficient Redundancy Techniques in Cloud and Desktop Grid Systems using MAP/G/c-type Queues

    Directory of Open Access Journals (Sweden)

    Chakravarthy Srinivas R.

    2018-03-01

    Full Text Available Cloud computing is continuing to prove its flexibility and versatility in helping industries and businesses as well as academia as a way of providing needed computing capacity. As an important alternative to cloud computing, desktop grids make it possible to utilize the idle computer resources of an enterprise/community by means of a distributed computing system, providing a more secure and controllable environment with lower operational expenses. Further, both cloud computing and desktop grids are meant to optimize limited resources and at the same time to decrease the expected latency for users. The crucial parameter for optimization both in cloud computing and in desktop grids is the level of redundancy (replication) for service requests/workunits. In this paper we study optimal replication policies by considering three variations of Fork-Join systems in the context of a multi-server queueing system with a versatile point process for the arrivals. For services we consider phase-type distributions as well as shifted exponential and Weibull distributions. We use both analytical and simulation approaches in our analysis and report some interesting qualitative results.
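The core latency benefit of replication is easy to demonstrate with a Monte Carlo sketch. This is a simplification of the paper's setting: i.i.d. exponential service times instead of phase-type or Weibull distributions, no arrival process, and the fastest replica always wins (so the minimum of k exponentials with rate mu is itself exponential with rate k*mu):

```python
import random

def mean_latency(replicas, mu=1.0, trials=20000, seed=42):
    """Monte Carlo mean latency when a request runs on `replicas` servers
    with i.i.d. exponential service times and the fastest copy wins."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        total += min(rng.expovariate(mu) for _ in range(replicas))
    return total / trials

single = mean_latency(1)   # analytic mean: 1/mu = 1.0
triple = mean_latency(3)   # analytic mean: 1/(3*mu) ~ 0.333
print(single, triple)
```

In a loaded system, of course, replicas also consume capacity and lengthen queues, which is exactly why the optimal replication level studied in the paper is nontrivial.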

  4. A Secure Cloud-Assisted Wireless Body Area Network in Mobile Emergency Medical Care System.

    Science.gov (United States)

    Li, Chun-Ta; Lee, Cheng-Chi; Weng, Chi-Yao

    2016-05-01

    Given recent advances in medical treatment and emergency applications, the need to integrate wireless body area networks (WBANs) with cloud computing is motivated by providing doctors and emergency staff with useful, real-time information about patients' health state. A WBAN is a set of body sensors carried by the patient to collect and transmit numerous health items to medical clouds via wireless and public communication channels. A cloud-assisted WBAN therefore facilitates response in case of emergency, which can save patients' lives. Since the patient's data is sensitive and private, it is important to provide strong security and protection of the patient's medical data over public and insecure communication channels. In this paper, we address the challenge of participant authentication in mobile emergency medical care systems for patient supervision and propose a secure cloud-assisted architecture for accessing and monitoring health items collected by WBANs. To ensure a high level of security and provide a mutual authentication property, chaotic-map-based authentication and key agreement mechanisms are designed according to the concept of Diffie-Hellman key exchange, which depends on the CMBDLP and CMBDHP problems. Security and performance analyses show how the proposed system guarantees patient privacy and the confidentiality of sensitive medical data while preserving the low computation property in medical treatment and remote medical monitoring.
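The key agreement in the scheme follows the Diffie-Hellman concept. The paper's construction uses chaotic maps (the CMBDLP/CMBDHP problems); the sketch below instead shows the classic modular-exponentiation form of the same idea, with a toy modulus, purely to illustrate how both sides derive an identical shared secret over an insecure channel:

```python
import secrets

# Classic Diffie-Hellman sketch (NOT the paper's chaotic-map variant, and
# toy parameters only; real deployments use vetted groups of >= 2048 bits).
p = 2**127 - 1   # a Mersenne prime, used here only for illustration
g = 5

a = secrets.randbelow(p - 2) + 1        # patient-side private key
b = secrets.randbelow(p - 2) + 1        # cloud-side private key
A = pow(g, a, p)                        # public values exchanged over the
B = pow(g, b, p)                        # insecure channel
shared_patient = pow(B, a, p)           # both sides compute g^(a*b) mod p
shared_cloud = pow(A, b, p)
print(shared_patient == shared_cloud)   # True
```

An eavesdropper sees only p, g, A, and B; recovering the shared secret from those is the discrete logarithm problem, of which CMBDLP is the chaotic-map analogue.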

  5. A Hierarchical Auction-Based Mechanism for Real-Time Resource Allocation in Cloud Robotic Systems.

    Science.gov (United States)

    Wang, Lujia; Liu, Ming; Meng, Max Q-H

    2017-02-01

    Cloud computing enables users to share computing resources on-demand. The cloud computing framework cannot be directly mapped to cloud robotic systems with ad hoc networks since cloud robotic systems have additional constraints such as limited bandwidth and dynamic structure. However, most multirobotic applications with cooperative control adopt this decentralized approach to avoid a single point of failure. Robots need to continuously update intensive data to execute tasks in a coordinated manner, which implies real-time requirements. Thus, a resource allocation strategy is required, especially in such resource-constrained environments. This paper proposes a hierarchical auction-based mechanism, namely link quality matrix (LQM) auction, which is suitable for ad hoc networks by introducing a link quality indicator. The proposed algorithm produces a fast and robust method that is accurate and scalable. It reduces both global communication and unnecessary repeated computation. The proposed method is designed for firm real-time resource retrieval for physical multirobot systems. A joint surveillance scenario empirically validates the proposed mechanism by assessing several practical metrics. The results show that the proposed LQM auction outperforms state-of-the-art algorithms for resource allocation.
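The central idea of weighting an auction by link quality can be sketched in a few lines. This is an illustrative toy, not the paper's LQM algorithm: all names, bids, and quality values are hypothetical, and a real round would also handle multiple resources and repeated bidding:

```python
# Toy auction round: each robot bids for one cloud resource, and bids are
# discounted by the ad hoc link quality between robot and resource, so a
# high bid over a poor link can lose to a lower bid over a reliable one.
# (Illustrative sketch only; not the paper's exact LQM auction.)

def lqm_auction(bids, link_quality):
    """Return the winning robot for a single resource."""
    scores = {robot: bid * link_quality[robot] for robot, bid in bids.items()}
    return max(scores, key=scores.get)

bids = {"robot_a": 3.0, "robot_b": 5.0}
link_quality = {"robot_a": 0.9, "robot_b": 0.4}   # indicator in [0, 1]
print(lqm_auction(bids, link_quality))  # robot_a (score 2.7 beats 2.0)
```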

  6. Development of a cloud-based system for remote monitoring of a PVT panel

    Science.gov (United States)

    Saraiva, Luis; Alcaso, Adérito; Vieira, Paulo; Ramos, Carlos Figueiredo; Cardoso, Antonio Marques

    2016-10-01

    The paper presents a monitoring system developed for a solar energy conversion system known as a photovoltaic-thermal (PVT) panel. The project was implemented using two embedded microcontroller platforms (Arduino Leonardo and Arduino Yún), wireless transmission systems (Wi-Fi and XBee), and cloud computing (Google Cloud). The main objective of the project is to provide remote access and real-time monitoring of data such as electrical current, electrical voltage, input fluid temperature, output fluid temperature, backward fluid temperature, upper PV glass temperature, lower PV glass temperature, ambient temperature, solar radiation, wind speed, wind direction, and fluid mass flow. The project demonstrates the feasibility of using inexpensive microcontroller platforms and free internet services on the Web to support the remote study of renewable energy systems, eliminating the acquisition of dedicated systems that are typically more expensive and limited in the kind of processing proposed.

  7. Proposed Network Intrusion Detection System ‎Based on Fuzzy c Mean Algorithm in Cloud ‎Computing Environment

    Directory of Open Access Journals (Sweden)

    Shawq Malik Mehibs

    2017-12-01

    Full Text Available Nowadays cloud computing has become an integral part of the IT industry. Cloud computing provides a working environment that allows users to share data and resources over the internet. Because cloud computing is a virtual grouping of resources offered over the internet, it raises various matters related to security and privacy. Intrusion detection is therefore very important for detecting outsider and insider intruders of cloud computing, with a high detection rate and a low false positive alarm rate, in the cloud environment. This work proposes a network intrusion detection module using the fuzzy c-means algorithm. The KDD99 dataset was used for the experiments. The proposed system is characterized by a high detection rate with a low false positive alarm rate.
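The fuzzy c-means updates at the heart of such a module alternate between soft membership assignment and membership-weighted center recomputation. A minimal one-dimensional sketch (the paper applies the algorithm to the multi-dimensional KDD99 features, so this only illustrates the update equations):

```python
# Minimal fuzzy c-means sketch (1-D, pure Python, naive initialization).

def fcm(data, c=2, m=2.0, iters=50):
    """Return sorted cluster centers after alternating FCM updates."""
    centers = data[:c]                        # naive initialization
    for _ in range(iters):
        # membership u[j][i] of point j in cluster i: inverse-distance weighting
        u = []
        for x in data:
            dists = [abs(x - v) + 1e-12 for v in centers]
            u.append([1.0 / sum((d / dk) ** (2.0 / (m - 1.0))
                                for dk in dists)
                      for d in dists])
        # centers: membership-weighted means
        centers = [
            sum(u[j][i] ** m * data[j] for j in range(len(data)))
            / sum(u[j][i] ** m for j in range(len(data)))
            for i in range(c)
        ]
    return sorted(centers)

data = [0.0, 0.1, 0.2, 10.0, 10.1, 10.2]
print(fcm(data))  # centers converge near 0.1 and 10.1
```

In an intrusion detection setting the resulting memberships, rather than hard labels, let borderline connection records be flagged with a confidence level, which helps keep the false positive rate low.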

  8. Opportunity for Realizing Ideal Computing System using Cloud Computing Model

    OpenAIRE

    Sreeramana Aithal; Vaikunth Pai T

    2017-01-01

    An ideal computing system is a computing system with ideal characteristics. The major components and their performance characteristics of such hypothetical system can be studied as a model with predicted input, output, system and environmental characteristics using the identified objectives of computing which can be used in any platform, any type of computing system, and for application automation, without making modifications in the form of structure, hardware, and software coding by an exte...

  9. Detailed Information Security in Cloud Computing

    OpenAIRE

    Pavel Valerievich Ivonin

    2013-01-01

    The object of research in this article is the technology of public clouds, together with the structure and security systems of clouds. Problems of information security in clouds are considered, and the elements of security systems in public clouds are described.

  10. The design of an m-Health monitoring system based on a cloud computing platform

    Science.gov (United States)

    Xu, Boyi; Xu, Lida; Cai, Hongming; Jiang, Lihong; Luo, Yang; Gu, Yizhi

    2017-01-01

    Compared to traditional medical services provided within hospitals, m-Health monitoring systems (MHMSs) face more challenges in personalised health data processing. To achieve personalised and high-quality health monitoring by means of new technologies, such as mobile network and cloud computing, in this paper, a framework of an m-Health monitoring system based on a cloud computing platform (Cloud-MHMS) is designed to implement pervasive health monitoring. Furthermore, the modules of the framework, which are Cloud Storage and Multiple Tenants Access Control Layer, Healthcare Data Annotation Layer, and Healthcare Data Analysis Layer, are discussed. In the data storage layer, a multiple tenant access method is designed to protect patient privacy. In the data annotation layer, linked open data are adopted to augment health data interoperability semantically. In the data analysis layer, the process mining algorithm and similarity calculating method are implemented to support personalised treatment plan selection. These three modules cooperate to implement the core functions in the process of health monitoring, which are data storage, data processing, and data analysis. Finally, we study the application of our architecture in the monitoring of antimicrobial drug usage to demonstrate the usability of our method in personal healthcare analysis.

  11. CubeSat Constellation Cloud Winds(C3Winds) A New Wind Observing System to Study Mesoscale Cloud Dynamics and Processes

    Science.gov (United States)

    Wu, D. L.; Kelly, M.A.; Yee, J.-H.; Boldt, J.; Demajistre, R.; Reynolds, E. L.; Tripoli, G. J.; Oman, L. D.; Prive, N.; Heidinger, A. K.

    2016-01-01

    The CubeSat Constellation Cloud Winds (C3Winds) is a NASA Earth Venture Instrument (EV-I) concept with the primary objective to better understand mesoscale dynamics and their structures in severe weather systems. With potential catastrophic damage and loss of life, strong extratropical and tropical cyclones (ETCs and TCs) have profound three-dimensional impacts on the atmospheric dynamic and thermodynamic structures, producing complex cloud precipitation patterns, strong low-level winds, extensive tropopause folds, and intense stratosphere-troposphere exchange. Employing a compact, stereo IR-visible imaging technique from two formation-flying CubeSats, C3Winds seeks to measure and map high-resolution (2 km) cloud motion vectors (CMVs) and cloud geometric height (CGH) accurately by tracking cloud features within 5-15 min. Complementary to lidar wind observations from space, the high-resolution wind fields from C3Winds will allow detailed investigations on strong low-level wind formation in an occluded ETC development, structural variations of TC inner-core rotation, and impacts of tropopause folding events on tropospheric ozone and air quality. Together with scatterometer ocean surface winds, C3Winds will provide a more comprehensive depiction of atmosphere-boundary-layer dynamics and interactive processes. Built upon mature imaging technologies and long history of stereoscopic remote sensing, C3Winds provides an innovative, cost-effective solution to global wind observations with potential of increased diurnal sampling via CubeSat constellation.

  12. New Stereo Vision Digital Camera System for Simultaneous Measurement of Cloud Base Height and Atmospheric Visibility

    Science.gov (United States)

    Janeiro, F. M.; Carretas, F.; Palma, N.; Ramos, P. M.; Wagner, F.

    2013-12-01

    Clouds play an important role in many aspects of everyday life. They affect both the local weather as well as the global climate and are an important parameter in climate change studies. Cloud parameters are also important for weather prediction models which make use of actual measurements. It is thus important to have low-cost instrumentation that can be deployed in the field to measure those parameters. This kind of instrument should also be automated and robust since it may be deployed in remote places and be subject to adverse weather conditions. Although clouds are very important in environmental systems, they are also an essential component of airplane safety when visual flight rules (VFR) are enforced, such as in most small aerodromes where it is not economically viable to install instruments for assisted flying. Under VFR there are strict limits on the height of the cloud base, cloud cover and atmospheric visibility that ensure the safety of the pilots and planes. Although there are instruments available in the market to measure those parameters, their relatively high cost makes them unavailable in many local aerodromes. In this work we present a new prototype which has been recently developed and deployed in a local aerodrome as proof of concept. It is composed of two digital cameras that capture photographs of the sky and allow the measurement of the cloud height from the parallax effect. The new developments consist of a new geometry which allows the simultaneous measurement of cloud base height, wind speed at cloud base height and atmospheric visibility, which was not previously possible with only two cameras. The new orientation of the cameras comes at the cost of a more complex geometry to measure the cloud base height. The atmospheric visibility is calculated from the Lambert-Beer law after the measurement of the contrast between a set of dark objects and the background sky. The prototype includes the latest hardware developments that
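Both measurement principles mentioned above reduce to short formulas. The sketch below uses the simplest parallax geometry (two parallel, upward-pointing cameras) and the conventional Koschmieder 2% contrast threshold for visibility; the deployed prototype uses a more complex camera geometry, and all numbers here are assumed for illustration:

```python
from math import log

def cloud_base_height(baseline_m, focal_px, disparity_px):
    """Height of a cloud feature seen by two upward-pointing cameras a
    distance `baseline_m` apart, when matched features in the two images
    are `disparity_px` pixels apart (simple parallel-camera parallax)."""
    return baseline_m * focal_px / disparity_px

def visibility_m(contrast_ratio, distance_m):
    """Meteorological visibility from Lambert-Beer attenuation of the
    contrast between a dark object and the sky, using the conventional
    2% threshold (Koschmieder constant 3.912 = -ln 0.02)."""
    beta = -log(contrast_ratio) / distance_m   # extinction coefficient
    return 3.912 / beta

# e.g. cameras 10 m apart, focal length 1000 px, 5 px feature shift:
print(cloud_base_height(10.0, 1000.0, 5.0))   # 2000.0 m
# a dark object at 5 km whose contrast has decayed to the 2% threshold:
print(visibility_m(0.02, 5000.0))             # ~5000 m
```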

  13. Searchable Encryption in Cloud Storage

    OpenAIRE

    Ren-Junn Hwang; Chung-Chien Lu; Jain-Shing Wu

    2014-01-01

    Cloud outsourced storage is one of the important services in cloud computing. Cloud users upload data to cloud servers to reduce the cost of managing data and maintaining hardware and software. To ensure data confidentiality, users can encrypt their files before uploading them to a cloud system. However, exactly retrieving the target file from among the encrypted files is difficult for the cloud server. This study proposes a protocol for performing multikeyword searches over encrypted cloud data by applying ...

  14. 77 FR 8895 - Public Land Order No. 7788; Withdrawal of National Forest System Land for the Red Cloud...

    Science.gov (United States)

    2012-02-15

    ... Land Order No. 7788; Withdrawal of National Forest System Land for the Red Cloud Campground; New Mexico... Cloud Campground within the Cibola National Forest, and to protect a capital investment in the... (FIRS) at 1-800-877-8339 to contact either of the above individuals during normal business hours. The...

  15. Development of Student Information Management System based on Cloud Computing Platform

    Directory of Open Access Journals (Sweden)

    Ibrahim A. ALAMERI

    2017-10-01

    Full Text Available The management and provision of information about the educational process is an essential part of effective management of the educational process in institutes of higher education. In this paper, the requirements of a reliable student management system are analyzed, a use-case model of the student information management system is formed, and the architecture of the application is designed and implemented. Regarding the implementation process, modern approaches were used to develop and deploy a reliable online application specifically in cloud computing environments.

  16. Synoptic Traveling Weather Systems on Mars: Effects of Radiatively-Active Water Ice Clouds

    Science.gov (United States)

    Hollingsworth, Jeffery; Kahre, Melinda; Haberle, Robert; Urata, Richard

    2017-01-01

    Atmospheric aerosols on Mars are critical in determining the nature of its thermal structure, its large-scale circulation, and hence the overall climate of the planet. We conduct multi-annual simulations with the latest version of the NASA Ames Mars global climate model (GCM), gcm2.3+, that includes a modernized radiative-transfer package and complex water-ice cloud microphysics package which permit radiative effects and interactions of suspended atmospheric aerosols (e.g., water ice clouds, water vapor, dust, and mutual interactions) to influence the net diabatic heating. Results indicate that radiatively active water ice clouds profoundly affect the seasonal and annual mean climate. The mean thermal structure and balanced circulation patterns are strongly modified near the surface and aloft. Warming of the subtropical atmosphere at altitude and cooling of the high latitude atmosphere at low levels takes place, which increases the mean pole-to-equator temperature contrast (i.e., "baroclinicity"). With radiatively active water ice clouds (RAC) compared to radiatively inert water ice clouds (nonRAC), significant changes in the intensity of the mean state and forced stationary Rossby modes occur, both of which affect the vigor and intensity of traveling, synoptic period weather systems. Such weather systems not only act as key agents in the transport of heat and momentum beyond the extent of the Hadley circulation, but also in the transport of trace species such as water vapor, water ice clouds, dust and others. The northern hemisphere (NH) forced Rossby waves and resultant wave train are augmented in the RAC case: the modes are more intense and the wave train is shifted equatorward. Significant changes also occur within the subtropics and tropics. The Rossby wave train sets up, combined with the traveling synoptic period weather systems (i.e., cyclones and anticyclones), the geographic extent of storm zones (or storm tracks) within the NH. A variety of circulation

  17. Fault Detection Variants of the CloudBus Protocol for IoT Distributed Embedded Systems

    Directory of Open Access Journals (Sweden)

    BARKALOV, A.

    2017-05-01

    Full Text Available Distributed embedded systems have become larger, more complex, and more complicated. Increasingly often, such systems operate according to the IoT or Industry 4.0 concept. However, the large number of end modules operating in the system leads to a significant load and, consequently, to an overload of the communication interfaces. The CloudBus protocol is one of the methods used for data exchange and concurrent process synchronization in distributed systems. It allows significant savings in the amount of data transmitted between end modules, especially when compared with other protocols used in industry. Nevertheless, the basic version of the protocol does not protect the system against failure of one of the nodes. This paper proposes four novel variants of the CloudBus protocol which allow fault detection. A comparison and performance analysis were carried out for all proposed CloudBus variants. The verification and behavior analysis of the distributed systems were performed on an SoC hardware research platform. Furthermore, a simple test application was proposed.

  18. A Correlated Model for Evaluating Performance and Energy of Cloud System Given System Reliability

    Directory of Open Access Journals (Sweden)

    Hongli Zhang

    2015-01-01

    Full Text Available The serious issue of energy consumption for high performance computing systems has attracted much attention. Performance and energy saving have become important measures of a computing system. In the cloud computing environment, systems usually allocate various resources (such as CPU, memory, and storage) to multiple virtual machines (VMs) for executing tasks. Therefore, the problem of resource allocation for running VMs has a significant influence on both system performance and energy consumption. For different processor utilizations assigned to a VM, there exists a tradeoff between energy consumption and task completion time when a given task is executed by the VMs. Moreover, hardware failure, software failure, and restoration characteristics also have obvious influences on overall performance and energy. In this paper, a correlated model is built to analyze both performance and energy in the VM execution environment given a reliability restriction, and an optimization model is presented to derive the most effective solution of processor utilization for the VM. Then, the tradeoff between energy saving and task completion time is studied and balanced when the VMs execute given tasks. Numerical examples are illustrated to build the performance-energy correlated model and evaluate the expected values of task completion time and consumed energy.
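The time-energy tradeoff over processor utilization can be illustrated with a toy cost model. The model below is hypothetical (not the paper's correlated model, and it ignores the reliability restriction): completion time falls as 1/u while power grows with u cubed, so a weighted cost has an interior minimum that a simple grid search finds:

```python
# Illustrative tradeoff between completion time and energy for one VM
# (hypothetical cost model and parameters, not the paper's equations).

def completion_time(u, work=100.0, speed=10.0):
    return work / (u * speed)            # more utilization -> faster finish

def energy(u, p_idle=50.0, k=150.0):
    return (p_idle + k * u ** 3) * completion_time(u)   # power * time

def best_utilization(weight_time=1.0, weight_energy=0.05, steps=1000):
    """Grid-search the processor utilization minimizing a weighted cost."""
    candidates = [i / steps for i in range(1, steps + 1)]
    return min(candidates,
               key=lambda u: weight_time * completion_time(u)
                             + weight_energy * energy(u))

u_opt = best_utilization()
print(round(u_opt, 3))
```

For these weights the cost reduces to 35/u + 75*u**2, whose analytic minimum is at u = (35/150)**(1/3), about 0.616, which the grid search recovers; shifting the weights moves the optimum, which is the balancing act the paper formalizes.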

  19. MICROSOFT CLOUD SERVICES IN DISTANCE LEARNING SYSTEM “KHERSON VIRTUAL UNIVERSITY”

    Directory of Open Access Journals (Sweden)

    H. Kravtsov

    2014-07-01

    Full Text Available E-learning with spreadsheets requires the integration of Excel documents into the distance learning system. A simple and convenient solution to the problem of Excel document integration is the use of cloud services. Using cloud services, you can access information resources of any level and any type, with division of the access rights of various groups of users to resources, using only an internet connection and a web browser. The subject of the research is Microsoft cloud services. The purpose of the research is the development and implementation of the software module «ExcelReader» to use Excel spreadsheets on the web pages of distance learning systems. In this paper we solve the following tasks: 1) analyze the known software solutions for displaying Excel documents in web-based applications; 2) select an efficient software technology for processing Excel documents; 3) design the access system and the use of web services for processing Excel documents in a distance learning system; 4) develop the software module «ExcelReader» for correctly displaying and editing Excel documents on web pages in distance learning; 5) implement the software module «ExcelReader» in the distance learning system "Kherson Virtual University". The processes of creating, editing, and implementing MS Office documents in the electronic resources of distance learning systems were modeled. In particular, the software module «ExcelReader» for using Excel spreadsheets on web pages of the distance learning system "Kherson Virtual University", based on the cloud service Excel Web App from Microsoft, was developed and implemented in the educational process.

  20. Biotoxicity and bioavailability of hydrophobic organic compounds solubilized in nonionic surfactant micelle phase and cloud point system.

    Science.gov (United States)

    Pan, Tao; Liu, Chunyan; Zeng, Xinying; Xin, Qiao; Xu, Meiying; Deng, Yangwu; Dong, Wei

    2017-06-01

    Recent work has shown that hydrophobic organic compounds solubilized in the micelle phase of some nonionic surfactants present substrate toxicity to microorganisms with increasing bioavailability. In cloud point systems, however, biotoxicity is prevented because the compounds are solubilized into a coacervate phase, thereby leaving only a fraction of the compounds with the cells in a dilute phase. This study extends the understanding of the relationship between substrate toxicity and bioavailability of hydrophobic organic compounds solubilized in a nonionic surfactant micelle phase and a cloud point system. Biotoxicity experiments were conducted with naphthalene and phenanthrene in the presence of the mixed nonionic surfactants Brij30 and TMN-3, which formed a micelle phase or a cloud point system at different concentrations. Saccharomyces cerevisiae, unable to degrade these compounds, was used for the biotoxicity experiments. Glucose in the cloud point system was consumed faster than in the nonionic surfactant micelle phase, indicating that the solubilized compounds were more toxic to cells in the micelle phase. The results were verified by subsequent biodegradation experiments: the compounds were degraded faster by a PAH-degrading bacterium in the cloud point system than in the micelle phase. All these results show that the biotoxicity of hydrophobic organic compounds increases with bioavailability in the surfactant micelle phase but remains at a low level in the cloud point system. These results provide a guideline for the application of cloud point systems as novel media for microbial transformation or biodegradation.

  1. CIMIDx: Prototype for a Cloud-Based System to Support Intelligent Medical Image Diagnosis With Efficiency.

    Science.gov (United States)

    Bhavani, Selvaraj Rani; Senthilkumar, Jagatheesan; Chilambuchelvan, Arul Gnanaprakasam; Manjula, Dhanabalachandran; Krishnamoorthy, Ramasamy; Kannan, Arputharaj

    2015-03-27

    The Internet has greatly enhanced health care, helping patients stay up-to-date on medical issues and general knowledge. Many cancer patients use the Internet for cancer diagnosis and related information. Recently, cloud computing has emerged as a new way of delivering health services, but currently there is no generic, fully automated cloud-based self-management intervention for breast cancer patients, as practical guidelines are lacking. We investigated the prevalence and predictors of cloud use for medical diagnosis among women with breast cancer to gain insight into meaningful usage parameters for evaluating a generic, fully automated cloud-based self-intervention, by assessing how breast cancer survivors use a generic self-management model. This goal was implemented and evaluated with a new prototype called "CIMIDx", based on representative association rules that support the diagnosis of medical images (mammograms). The proposed Cloud-Based System to Support Intelligent Medical Image Diagnosis (CIMIDx) prototype includes two modules. The first is the design and development of the CIMIDx training and test cloud services. Deployed in the cloud, the prototype can be used for diagnosis and screening mammography by assessing the cancers detected, tumor sizes, histology, and stage classification accuracy. To analyze the prototype's classification accuracy, we conducted an experiment with data provided by clients. Second, by monitoring cloud server requests, CIMIDx usage statistics were recorded for the cloud-based self-intervention groups. We conducted an evaluation of CIMIDx cloud service usage, in which browsing functionalities were evaluated from the end-user's perspective. We performed several experiments to validate the CIMIDx prototype for breast health issues. The first set of experiments evaluated the diagnostic performance of the CIMIDx framework. 
We collected medical information from 150 breast cancer survivors from hospitals

  2. Online Monitoring and Controlling Water Plant System Based on IoT Cloud Computing and Arduino

    Directory of Open Access Journals (Sweden)

    Ali Najim Abdullah

    2017-07-01

    Full Text Available Water is the basis of life on Earth and invaluable as an essential requirement for all human beings, but present water preparation and processing systems suffer from several problems, such as real-time operation, loss of large amounts of water during treatment and distribution, and diminishing water sources. Water problems (distribution, consumption, interrupted sources, and water quality) increase along with population numbers and residential areas. Therefore, to eliminate these problems and make water systems more efficient, effective, and reliable, an accurate monitoring and proper control system is necessary. In this paper, we focus on the design of a real-time water system with continuous monitoring of water based on IoT cloud computing and an Arduino microcontroller. A water system with a proper control algorithm and continuous monitoring from any place at any time enables stable distribution, so that we can keep a record of the height of water in the tanks and change the status of devices in the plant. The Internet of Things is a network of connected physical objects equipped with software, electronic circuits, sensors, and network connectivity, which allows monitoring and controlling from anywhere in the world. Using cloud computing provided by free servers, the water system's data are continuously uploaded to the cloud, allowing real-time monitoring through the sensors and the microcontroller (Arduino), acting as a minicomputer, to control and monitor the system's operation from the cloud with an efficient client-to-server connection.
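
    The control side of such a system reduces to simple threshold logic over the tank-level readings before they are pushed to the cloud. The sketch below is illustrative only; the thresholds, tank name, and report format are assumptions, not details from the paper:

```python
# Illustrative sketch (assumed values): hysteresis control of a fill
# pump from a tank water-level reading, plus the record a node might
# upload to the cloud for remote monitoring.

LOW_CM, HIGH_CM = 30.0, 90.0   # assumed low/high water-level thresholds

def pump_command(level_cm, pump_on):
    """Start the pump when the tank runs low, stop it once the high
    mark is reached, and otherwise keep the current state (hysteresis
    avoids rapid on/off cycling near a single threshold)."""
    if level_cm < LOW_CM:
        return True
    if level_cm > HIGH_CM:
        return False
    return pump_on

def make_report(tank_id, level_cm, pump_on):
    """Record that would be uploaded to the cloud for monitoring."""
    return {"tank": tank_id, "level_cm": level_cm,
            "pump": "ON" if pump_on else "OFF"}

state = False
for reading in [25.0, 50.0, 95.0, 60.0]:   # simulated sensor readings
    state = pump_command(reading, state)
    print(make_report("tank-1", reading, state))
```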

  3. The Cloud Systems Used in Education: Properties and Overview

    OpenAIRE

    Agah Tuğrul Korucu; Handan Atun

    2017-01-01

    The diversity and usefulness of the information used in education have increased with the development of technology. Web technologies in particular have made enormous contributions to distance learning. Mobile systems, among the most widely used technologies in distance education, have made it much easier to access web technologies. No longer bound by space and time, individuals have the opportunity to access information on the web. In addition to this, the storage of educational information a...

  4. Climate Model Evaluation using New Datasets from the Clouds and the Earth's Radiant Energy System (CERES)

    Science.gov (United States)

    Loeb, Norman G.; Wielicki, Bruce A.; Doelling, David R.

    2008-01-01

    There are some in the science community who believe that the response of the climate system to anthropogenic radiative forcing is unpredictable and that we should therefore call off the quest. The key limitation in climate predictability is associated with cloud feedback. Narrowing the uncertainty in cloud feedback (and therefore climate sensitivity) requires optimal use of the best available observations to evaluate and improve climate model processes and constrain climate model simulations over longer time scales. The Clouds and the Earth's Radiant Energy System (CERES) is a satellite-based program that provides global cloud, aerosol, and radiative flux observations for improving our understanding of cloud-aerosol-radiation feedbacks in the Earth's climate system. CERES is the successor to the Earth Radiation Budget Experiment (ERBE), which has widely been used to evaluate climate models both at short time scales (e.g., process studies) and at decadal time scales. A CERES instrument flew on the TRMM satellite and captured the dramatic 1998 El Niño, and four other CERES instruments are currently flying aboard the Terra and Aqua platforms. Plans are underway to fly the remaining copy of CERES on the upcoming NPP spacecraft (mid-2010 launch date). Every aspect of CERES represents a significant improvement over ERBE. While both CERES and ERBE measure broadband radiation, CERES calibration is a factor of 2 better than ERBE's. In order to improve the characterization of clouds and aerosols within a CERES footprint, we use coincident higher-resolution imager observations (VIRS, MODIS, or VIIRS) to provide a consistent cloud-aerosol-radiation dataset at climate accuracy. Improved radiative fluxes are obtained by using new CERES-derived Angular Distribution Models (ADMs) for converting measured radiances to fluxes. CERES radiative fluxes are a factor of 2 more accurate than ERBE's overall, and the improvement by cloud type and at high latitudes can be as high as a factor of 5.

  5. Lessons Learned while Exploring Cloud-Native Architectures for NASA EOSDIS Applications and Systems

    Science.gov (United States)

    Pilone, D.

    2016-12-01

    As new, high-data-rate missions begin collecting data, NASA's Earth Observing System Data and Information System (EOSDIS) archive is projected to grow roughly 20x to over 300 PB by 2025. To prepare for the dramatic increase in data and enable broad scientific inquiry into larger time series and datasets, NASA has been exploring the impact of applying cloud technologies throughout EOSDIS. In this talk we will provide an overview of NASA's prototyping and lessons learned in applying cloud architectures to: highly scalable and extensible ingest and archive of EOSDIS data; going "all-in" on cloud-based application architectures, including "serverless" data processing pipelines, and evaluating approaches to vendor lock-in; rethinking data distribution and approaches to analysis in a cloud environment; and incorporating and enforcing security controls while minimizing the barrier for research efforts to deploy to NASA-compliant, operational environments. NASA's Earth Observing System (EOS) is a coordinated series of satellites for long-term global observations. EOSDIS is a multi-petabyte-scale archive of environmental data that supports global climate change research by providing end-to-end services, from EOS instrument data collection to science data processing to full access to EOS and other Earth science data. On a daily basis, EOSDIS ingests, processes, archives, and distributes over 3 terabytes of data from NASA's Earth Science missions, representing over 6000 data products from various science disciplines. EOSDIS has continually evolved to improve the discoverability, accessibility, and usability of high-impact NASA data spanning its multi-petabyte-scale archive of Earth science data products.

  6. Cloud-Top Entrainment in Stratocumulus Clouds

    Science.gov (United States)

    Mellado, Juan Pedro

    2017-01-01

    Cloud entrainment, the mixing between cloudy and clear air at the boundary of clouds, constitutes one paradigm for the relevance of small scales in the Earth system: By regulating cloud lifetimes, meter- and submeter-scale processes at cloud boundaries can influence planetary-scale properties. Understanding cloud entrainment is difficult given the complexity and diversity of the associated phenomena, which include turbulence entrainment within a stratified medium, convective instabilities driven by radiative and evaporative cooling, shear instabilities, and cloud microphysics. Obtaining accurate data at the required small scales is also challenging, for both simulations and measurements. During the past few decades, however, high-resolution simulations and measurements have greatly advanced our understanding of the main mechanisms controlling cloud entrainment. This article reviews some of these advances, focusing on stratocumulus clouds, and indicates remaining challenges.

  7. An Implementation Model of Teaching Evaluation Questionnaire System Based on Cloud Computing

    Directory of Open Access Journals (Sweden)

    Chih-Yung Chen

    2014-01-01

    Full Text Available The teaching evaluation questionnaire has become an essential routine in universities. The administrative staff of academic affairs must observe rules, fairness, and validity principles while completing the teaching evaluation questionnaire within the allotted period. Ensuring the accuracy and timeliness of a large number of questionnaires puts the administrative staff under considerable pressure when there are many students. Poor program performance often stems from developers following poor development practices, and badly designed system programs account for an even higher proportion of the problem: similar application programs can be written in quite different ways, and a developer may find a method that completes the same task with far less consumption of system resources. We develop a software service on a cloud platform, on which the educational administration personnel operate the process. The cloud platform reduces the system wait and computation time of the questionnaire, improves and simplifies the system operation flow, and improves program efficiency tenfold, as well as solving the overload problem at the performance bottleneck.

  8. Cloud-based shaft torque estimation for electric vehicle equipped with integrated motor-transmission system

    Science.gov (United States)

    Zhu, Xiaoyuan; Zhang, Hui; Yang, Bo; Zhang, Guichen

    2018-01-01

    In order to improve the oscillation damping control performance as well as the gear shift quality of an electric vehicle equipped with an integrated motor-transmission system, a cloud-based shaft torque estimation scheme is proposed in this paper, using measurable motor and wheel speed signals transmitted over a wireless network. It can help reduce the computational burden of onboard controllers and also relieve the network bandwidth requirements of the individual vehicle. Considering possible delays during wireless signal transmission, a delay-dependent full-order observer design is proposed to estimate the shaft torque in the cloud server. With these random delays modeled by a homogeneous Markov chain, robust H∞ performance is adopted to minimize the effect of wireless network-induced delays, signal measurement noise, and system modeling uncertainties on the shaft torque estimation error. Observer parameters are derived by solving linear matrix inequalities, and simulation results using an acceleration test and tip-in, tip-out tests demonstrate the effectiveness of the proposed shaft torque observer design.
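
    The paper's estimator is a delay-dependent H∞ observer synthesized from linear matrix inequalities; as a much simpler illustration of the underlying idea (reconstructing an unmeasured shaft torque from a measured speed), a basic discrete-time Luenberger observer can be sketched. The inertia, gains, and torques below are assumed values, and the torque is modeled as a slowly varying state, none of which comes from the paper:

```python
# Minimal sketch (assumed model, not the paper's H-infinity design):
# a Luenberger observer that augments the state with the shaft torque
# and estimates it from the measured motor speed.
# Motor dynamics: J_m * d(omega)/dt = t_motor - t_shaft.

DT, J_M = 0.001, 0.1        # sample time [s], motor-side inertia [kg m^2]
L1, L2 = 0.2, -5.0          # observer gains (assumed, chosen for stability)

def plant_step(omega, t_shaft, t_motor):
    """True motor speed update under the motor and shaft torques."""
    return omega + DT / J_M * (t_motor - t_shaft)

def observer_step(omega_hat, t_shaft_hat, t_motor, omega_meas):
    """Predict with the same model, then correct with the speed error."""
    err = omega_meas - omega_hat
    omega_next = omega_hat + DT / J_M * (t_motor - t_shaft_hat) + L1 * err
    t_shaft_next = t_shaft_hat + L2 * err     # torque treated as ~constant
    return omega_next, t_shaft_next

def estimate_shaft_torque(true_torque=50.0, t_motor=80.0, steps=2000):
    omega, omega_hat, torque_hat = 0.0, 0.0, 0.0
    for _ in range(steps):
        # observer sees only the measured speed, never the true torque
        omega_hat, torque_hat = observer_step(omega_hat, torque_hat,
                                              t_motor, omega)
        omega = plant_step(omega, true_torque, t_motor)
    return torque_hat

print(f"estimated shaft torque: {estimate_shaft_torque():.1f} N m")
```

    The cloud-based scheme additionally has to account for random network delays in the measurements, which is what motivates the Markov-chain delay model and H∞ synthesis in the paper.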

  9. Mobile and cloud based systems proposal for a centralized management of educational institutions

    Directory of Open Access Journals (Sweden)

    Leandro Medeiros de Almeida Machado

    2017-06-01

    Full Text Available Computational technology efforts toward centralized data storage and processing have contributed more sustainable solutions from the environmental, administrative, and business perspectives. However, centralization is not yet widely adopted by public institutions, especially in Brazilian educational systems, where these technological models are still under constant discussion and development. In this sense, this work presents a brief survey of integrated cloud and mobile technologies and their possible contributions to centralized data management in educational systems, relating improvements in governance, data security, mobility, economic viability, and environmental impact. This work also presents a list of existing free and proprietary technologies and their advantages and disadvantages in the Brazilian scenario. The technological aspects herein consider the integration of cloud and mobile technologies as an essential alternative for mitigating the always-online requirement, which is a limitation for a large number of public institutions that have problems staying effectively connected to the Internet.

  10. Developing Applications in the Era of Cloud-based SaaS Library Systems

    Directory of Open Access Journals (Sweden)

    Josh Weisman

    2014-10-01

    Full Text Available As the move to cloud-based SaaS library systems accelerates, we must consider what it means to develop applications when the core of the system isn't under the library's control. The entire application lifecycle is changing, from development to testing to production. Developing applications for cloud solutions raises new concerns, such as security, multi-tenancy, latency, and analytics. In this article, we review the landscape and suggest a view of how to be successful for the benefit of library staff and end-users in this new reality. We discuss what kinds of APIs and protocols vendors should be supporting, and suggest how best to take advantage of the innovations being introduced.

  11. Aerosol and Cloud Microphysical Properties in the Asir region of Saudi Arabia

    Science.gov (United States)

    Axisa, Duncan; Kucera, Paul; Burger, Roelof; Li, Runjun; Collins, Don; Freney, Evelyn; Posada, Rafael; Buseck, Peter

    2010-05-01

    In recent advertent and inadvertent weather modification studies, a considerable effort has been made to understand the impact of varying aerosol properties and concentrations on cloud properties. Significant uncertainties exist in aerosol-cloud interactions, for which complex microphysical processes link the aerosol and cloud properties. Under almost all environmental conditions, increased aerosol concentrations within polluted air masses will enhance cloud droplet concentration relative to that in unperturbed regions. The interaction between dust particles and clouds is significant, yet the conditions under which dust particles become cloud condensation nuclei (CCN) are uncertain. In order to quantify this aerosol effect on clouds and precipitation, a field campaign was launched in the Asir region of Saudi Arabia as part of a Precipitation Enhancement Feasibility Study. Ground measurements of aerosol size distributions, hygroscopic growth factor, and CCN concentrations, as well as aircraft measurements of cloud hydrometeor size distributions, were made in the Asir region of Saudi Arabia in August 2009. Research aircraft operations focused primarily on conducting measurements in clouds targeted for cloud-top seeding and on their microphysical characterization, especially the preconditions necessary for precipitation; understanding the evolution of droplet coalescence, supercooled liquid water, cloud ice, and precipitation hydrometeors is necessary if advances are to be made in the study of cloud modification by cloud seeding. Non-precipitating mixed-phase clouds less than 3 km in diameter that developed on top of the stable inversion were characterized by flying at the convective cloud top just above the inversion. Aerosol measurements were also made during the climb to cloud base height. The presentation will include a summary of the analysis and results with a focus on the unique features of the Asir region in producing convective clouds, characterization of the

  12. A home healthcare system in the cloud - Addressing security and privacy challenges

    OpenAIRE

    Deng M.; Petkovic M.; Nalin M.; Baroni I.

    2011-01-01

    Cloud computing is an emerging technology that is expected to support Internet-scale critical applications, which could be essential to the healthcare sector. Its scalability, resilience, adaptability, connectivity, cost reduction, and high-performance features have high potential to lift the efficiency and quality of healthcare. However, it is also important to understand the specific risks related to security and privacy that this technology brings. This paper focuses on a home healthcare system ...

  13. High-Resolution Global Modeling of the Effects of Subgrid-Scale Clouds and Turbulence on Precipitating Cloud Systems

    Energy Technology Data Exchange (ETDEWEB)

    Bogenschutz, Peter [National Center for Atmospheric Research, Boulder, CO (United States); Moeng, Chin-Hoh [National Center for Atmospheric Research, Boulder, CO (United States)

    2015-10-13

    The PIs at the National Center for Atmospheric Research (NCAR), Chin-Hoh Moeng and Peter Bogenschutz, have primarily focused their time on the implementation of the Simplified Higher-Order Turbulence Closure (SHOC; Bogenschutz and Krueger 2013) in the Multi-scale Modeling Framework (MMF) global model and on testing SHOC on deep convective cloud regimes.

  14. A Coupled fvGCM-GCE Modeling System: A 3D Cloud Resolving Model and a Regional Scale Model

    Science.gov (United States)

    Tao, Wei-Kuo

    2005-01-01

    Recent GEWEX Cloud System Study (GCSS) model comparison projects have indicated that cloud-resolving models (CRMs) agree with observations better than traditional single-column models in simulating various types of clouds and cloud systems from different geographic locations. Current and future NASA satellite programs can provide cloud, precipitation, aerosol, and other data at very fine spatial and temporal scales. Using these satellite data to improve the understanding of the physical processes responsible for the variation in global and regional climate and hydrological systems requires a coupled global circulation model (GCM) and cloud-scale model (termed a super-parameterization or multi-scale modeling framework, MMF). The use of a GCM enables global coverage, and the use of a CRM allows for better and more sophisticated physical parameterization. NASA satellite and field campaign cloud-related datasets can provide initial conditions as well as validation for both the MMF and CRMs. The Goddard MMF is based on the 2D Goddard Cumulus Ensemble (GCE) model and the Goddard finite-volume general circulation model (fvGCM), and it has started production runs with two years of results (1998 and 1999). Also at Goddard, we have implemented several Goddard microphysical schemes (2ICE, several 3ICE), Goddard radiation (including explicitly calculated cloud optical properties), and the Goddard Land Information System (LIS, which includes the CLM and NOAH land surface models) into a next-generation regional-scale model, WRF. In this talk, I will present: (1) a brief review of the GCE model and its applications to precipitation processes (microphysical and land processes); (2) the Goddard MMF, the major differences between the two existing MMFs (CSU MMF and Goddard MMF), and preliminary results (the comparison with traditional GCMs); (3) a discussion of the Goddard WRF version (its developments and applications); and (4) the characteristics of the four-dimensional cloud data

  15. Proposed Network Intrusion Detection System ‎In Cloud Environment Based on Back ‎Propagation Neural Network

    Directory of Open Access Journals (Sweden)

    Shawq Malik Mehibs

    2017-12-01

    Full Text Available Cloud computing is a distributed architecture that provides computing facilities and storage resources as a service over the internet. This low-cost service fulfills the basic requirements of users. Because of the open nature of cloud computing and the services it introduces, intruders impersonate legitimate users and misuse cloud resources and services. To detect intruders and suspicious activities in and around the cloud computing environment, an intrusion detection system is used to discover illegitimate users and suspicious actions by monitoring different user activities on the network. This work proposes a back-propagation artificial neural network to construct a network intrusion detection system in the cloud environment. The proposed module was evaluated with the KDD99 dataset; the experimental results show a promising approach to detecting attacks with a high detection rate and a low false alarm rate.
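
    The core of such a detector is a small feed-forward network trained by back-propagation to map connection features to a normal/attack label. The sketch below is a toy illustration only: the two features and the labels are invented, and it is not the paper's KDD99 model or architecture:

```python
import math
import random

# Toy back-propagation sketch (invented data, not the paper's model):
# a single-hidden-layer network flagging "suspicious" connection
# records. Features: [connection rate, error rate] -> 1 = attack.

DATA = [([0.9, 0.8], 1), ([0.8, 0.9], 1), ([0.7, 0.9], 1),
        ([0.1, 0.2], 0), ([0.2, 0.1], 0), ([0.3, 0.2], 0)]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

random.seed(0)
HIDDEN = 3
w1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(HIDDEN)]
b1 = [0.0] * HIDDEN
w2 = [random.uniform(-1, 1) for _ in range(HIDDEN)]
b2 = 0.0

def forward(x):
    """Return hidden activations and the attack probability."""
    h = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b)
         for row, b in zip(w1, b1)]
    return h, sigmoid(sum(w * hi for w, hi in zip(w2, h)) + b2)

def train(epochs=3000, lr=1.0):
    """Stochastic gradient descent on squared error via back-propagation."""
    global b2
    for _ in range(epochs):
        for x, y in DATA:
            h, out = forward(x)
            d_out = (out - y) * out * (1 - out)      # output-layer delta
            for j in range(HIDDEN):
                d_h = d_out * w2[j] * h[j] * (1 - h[j])  # hidden delta
                w2[j] -= lr * d_out * h[j]
                for i in range(2):
                    w1[j][i] -= lr * d_h * x[i]
                b1[j] -= lr * d_h
            b2 -= lr * d_out

train()
for x, y in DATA:
    _, p = forward(x)
    print(x, "attack" if p > 0.5 else "normal", f"(p={p:.2f})")
```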

  16. Cloud Monitoring for Solar Plants with Support Vector Machine Based Fault Detection System

    Directory of Open Access Journals (Sweden)

    Hong-Chan Chang

    2014-01-01

    Full Text Available This study endeavors to develop a cloud monitoring system for solar plants. This system incorporates numerous subsystems, such as a geographic information system, an instantaneous power-consumption information system, a reporting system, and a failure diagnosis system. Visual C# was integrated with ASP.NET and SQL technologies for the proposed monitoring system. A user interface for the database management system was developed to enable users to access solar power information and management systems. In addition, by using peer-to-peer (P2P) streaming technology and audio/video encoding/decoding technology, real-time video data can be transmitted to the client end, providing instantaneous and direct information. Regarding smart failure diagnosis, the proposed system employs support vector machine (SVM) theory to train failure mathematical models. The solar power data are provided to the SVM for analysis in order to determine the failure types and subsequently eliminate failures at an early stage. The cloud energy-management platform developed in this study not only enhances the management and maintenance efficiency of solar power plants but also increases the market competitiveness of solar power generation and renewable energy.
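
    The SVM-based diagnosis step can be illustrated with a minimal linear SVM trained by stochastic sub-gradient descent on the hinge loss (Pegasos-style). This is a hedged sketch only: the two features (normalized power output, panel temperature) and the fault labels are invented, and the paper's actual model and kernel are not described in the abstract:

```python
import random

# Hedged sketch of the fault-detection idea: a linear SVM trained by
# Pegasos-style stochastic sub-gradient descent on the hinge loss.
# Features and labels are invented, not plant data.

DATA = [([0.90, 0.40], -1), ([0.80, 0.50], -1), ([0.85, 0.45], -1),  # healthy
        ([0.20, 0.80], +1), ([0.30, 0.90], +1), ([0.25, 0.85], +1)]  # faulty

def train_svm(data, lam=0.1, epochs=500):
    """Step size 1/(lam*t); shrink w, then correct on hinge violations."""
    w, b, t = [0.0, 0.0], 0.0, 0
    random.seed(1)
    for _ in range(epochs):
        for x, y in random.sample(data, len(data)):
            t += 1
            lr = 1.0 / (lam * t)
            margin = y * (w[0] * x[0] + w[1] * x[1] + b)
            w = [wi * (1.0 - lr * lam) for wi in w]   # regularization shrink
            if margin < 1.0:                          # hinge-loss violation
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
    return w, b

w, b = train_svm(DATA)

def is_faulty(x):
    """Classify a new reading with the learned hyperplane."""
    return w[0] * x[0] + w[1] * x[1] + b > 0

print("fault" if is_faulty([0.25, 0.90]) else "healthy")
```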

  17. Lagrangian evolution of the marine boundary layer from the Cloud System Evolution in the Trades (CSET) campaign

    Science.gov (United States)

    Mohrmann, J.; Ghate, V. P.; McCoy, I. L.; Bretherton, C. S.; Wood, R.; Minnis, P.; Palikonda, R.

    2017-12-01

    The Cloud System Evolution in the Trades (CSET) field campaign took place in July/August 2015 to study the evolution of clouds, precipitation, and aerosols in the stratocumulus-to-cumulus (Sc-Cu) transition region of the northeast Pacific marine boundary layer (MBL). Aircraft observations sampled across a wide range of cloud and aerosol conditions. The sampling strategy, in which MBL airmasses were sampled with the NSF/NCAR Gulfstream-V (HIAPER) and then resampled at their advected location two days later, resulted in a dataset of 14 paired flights suitable for Lagrangian analysis. This analysis shows that the Lagrangian coherence of long-lived species (namely CO and O3) across 48 hours is high, but that of subcloud aerosol, MBL depth, and cloud properties is limited. Geostationary satellite retrievals are compared against aircraft observations; these are combined with reanalysis data and HYSPLIT trajectories to document the Lagrangian evolution of cloud fraction, cloud droplet number concentration, liquid water path, estimated inversion strength (EIS), and MBL depth, which are used to expand upon and validate the aircraft-based analysis. Many of the trajectories sampled by the aircraft show a clear Sc-Cu transition. Although satellite cloud fraction and EIS were found to be strongly spatiotemporally correlated, changes in MBL cloud fraction along trajectories did not correlate with any measure of EIS forcing.

  18. Near Real Time Vertical Profiles of Clouds and Aerosols from the Cloud-Aerosol Transport System (CATS) on the International Space Station

    Science.gov (United States)

    Yorks, J. E.; McGill, M. J.; Nowottnick, E. P.

    2015-12-01

    Plumes from hazardous events, such as ash from volcanic eruptions and smoke from wildfires, can have a profound impact on the climate system, human health and the economy. Global aerosol transport models are very useful for tracking hazardous plumes and predicting the transport of these plumes. However aerosol vertical distributions and optical properties are a major weakness of global aerosol transport models, yet a key component of tracking and forecasting smoke and ash. The Cloud-Aerosol Transport System (CATS) is an elastic backscatter lidar designed to provide vertical profiles of clouds and aerosols while also demonstrating new in-space technologies for future Earth Science missions. CATS has been operating on the Japanese Experiment Module - Exposed Facility (JEM-EF) of the International Space Station (ISS) since early February 2015. The ISS orbit provides more comprehensive coverage of the tropics and mid-latitudes than sun-synchronous orbiting sensors, with nearly a three-day repeat cycle. The ISS orbit also provides CATS with excellent coverage over the primary aerosol transport tracks, mid-latitude storm tracks, and tropical convection. Data from CATS is used to derive properties of clouds and aerosols including: layer height, layer thickness, backscatter, optical depth, extinction, and depolarization-based discrimination of particle type. The measurements of atmospheric clouds and aerosols provided by the CATS payload have demonstrated several science benefits. CATS provides near-real-time observations of cloud and aerosol vertical distributions that can be used as inputs to global models. The infrastructure of the ISS allows CATS data to be captured, transmitted, and received at the CATS ground station within several minutes of data collection. The CATS backscatter and vertical feature mask are part of a customized near real time (NRT) product that the CATS processing team produces within 6 hours of collection. 
The continuous near real time CATS data

  19. Effects of explosively venting aerosol-sized particles through earth-containment systems on the cloud-stabilization height

    International Nuclear Information System (INIS)

    Dyckes, G.W.

    1980-07-01

    A method of approximating the cloud stabilization height for aerosol-sized particles vented explosively through earth containment systems is presented. The calculated values for stabilization heights are in fair agreement with those obtained experimentally

  20. Design and Development of Smart Aquaculture System Based on IFTTT Model and Cloud Integration

    Directory of Open Access Journals (Sweden)

    Dzulqornain Muhammad Iskandar

    2018-01-01

    Full Text Available Internet of Things (IoT) technology is growing very rapidly and has been implemented in several sectors, one of which is aquaculture. Traditional farmers face problems in monitoring water quality and in improving the quality of the water quickly and efficiently. This paper presents a real-time monitoring and controlling system for aquaculture based on the If This Then That (IFTTT) model and cloud integration. The system is composed of a smart sensor module that supports modularity, a smart aeration system for control, a local network system, a cloud computing system, and client data visualization. To monitor the water condition, data are collected from the smart sensor module, which consists of dissolved oxygen, pH (potential of hydrogen), water temperature, and water level sensors. The components of the smart aeration system are a NodeMCU v3 microcontroller, a relay, a power supply, and a propeller that can produce oxygen. The system can set IFTTT rules for the ideal water condition of the pond for any kind of aquaculture, based on its needs, through web and Android applications. The experimental results show that using the IFTTT model makes the aquaculture monitoring system more customizable, expandable, and dynamic.
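
    The IFTTT model described above reduces to rules of the form "if this (a sensor predicate) then that (an actuator command)". The sketch below illustrates the idea only; the thresholds and device names are assumptions, not the paper's configuration:

```python
# IFTTT-style rule sketch (assumed thresholds and device names):
# each rule pairs a sensor predicate ("this") with an actuator
# command ("that") fired when the predicate holds.

RULES = [
    {"if": lambda s: s["dissolved_oxygen"] < 4.0,   # mg/L, assumed limit
     "then": ("aerator", "ON")},
    {"if": lambda s: s["dissolved_oxygen"] >= 6.0,
     "then": ("aerator", "OFF")},
    {"if": lambda s: s["water_temp_c"] > 32.0,
     "then": ("alarm", "ON")},
]

def evaluate(rules, sensors):
    """Return the actuator commands fired by the current readings."""
    return [rule["then"] for rule in rules if rule["if"](sensors)]

reading = {"dissolved_oxygen": 3.2, "water_temp_c": 33.5, "ph": 7.1}
for device, action in evaluate(RULES, reading):
    print(f"{device} -> {action}")
```

    Keeping the rules as data (rather than hard-coded conditionals) is what lets users edit them from the web or Android application, as the abstract describes.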

  1. A cloud-based production system for information and service integration: an internet of things case study on waste electronics

    Science.gov (United States)

    Wang, Xi Vincent; Wang, Lihui

    2017-08-01

    Cloud computing is the new enabling technology that offers centralised computing, flexible data storage and scalable services. In the manufacturing context, it is possible to utilise the Cloud technology to integrate and provide industrial resources and capabilities in terms of Cloud services. In this paper, a function block-based integration mechanism is developed to connect various types of production resources. A Cloud-based architecture is also deployed to offer a service pool which maintains these resources as production services. The proposed system provides a flexible and integrated information environment for the Cloud-based production system. As a specific type of manufacturing, Waste Electrical and Electronic Equipment (WEEE) remanufacturing experiences difficulties in system integration, information exchange and resource management. In this research, WEEE is selected as the example of Internet of Things to demonstrate how the obstacles and bottlenecks are overcome with the help of Cloud-based informatics approach. In the case studies, the WEEE recycle/recovery capabilities are also integrated and deployed as flexible Cloud services. Supporting mechanisms and technologies are presented and evaluated towards the end of the paper.

  2. Replicas Strategy and Cache Optimization of Video Surveillance Systems Based on Cloud Storage

    Directory of Open Access Journals (Sweden)

    Rongheng Li

    2018-04-01

    Full Text Available With the rapid development of video surveillance technology, especially the popularity of cloud-based video surveillance applications, video data has begun to grow explosively. However, in cloud-based video surveillance systems, replicas occupy a large amount of storage space, and the slow response to video playback constrains the performance of the system. In this paper, considering the characteristics of video data comprehensively, we propose a dynamic redundant replicas mechanism based on security levels that can dynamically adjust the number of replicas. Based on the location correlation between cameras, this paper also proposes a data cache strategy to improve the response speed of data reading. Experiments illustrate that: (1) our dynamic redundant replicas mechanism can save storage space while ensuring data security; (2) the cache mechanism can predict the playback behaviors of the users in advance and improve the response speed of data reading according to the location and time correlation of the front-end cameras; and (3) in terms of cloud-based video surveillance, our proposed approaches significantly outperform existing methods.
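A minimal sketch of the security-level-driven replica idea: higher-security footage gets more copies, and the count can decay as footage ages. The level-to-replica mapping and the 30-day threshold are invented for illustration; the paper's actual mechanism is more elaborate:

```python
# Hedged sketch of a dynamic replica policy keyed on a security level.
# The mapping and the age decay are assumptions, not the authors' formula.
def replica_count(security_level: int, age_days: int) -> int:
    """More replicas for higher-security footage; one fewer once it ages."""
    base = {1: 1, 2: 2, 3: 3}[security_level]  # assumed level -> base replicas
    if age_days > 30:                          # assumed retention threshold
        base = max(1, base - 1)                # old footage keeps >= 1 replica
    return base

print(replica_count(3, 5), replica_count(3, 60), replica_count(1, 60))
```

The point of making the count a function of level and age, rather than a fixed constant, is exactly the paper's storage-saving claim: low-risk or stale footage no longer pays for full redundancy.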

  3. Learning in the cloud: a new challenge for a global teaching system in optics and photonics

    Science.gov (United States)

    Sultana, Razia; Christ, Andreas; Feisst, Markus; Curticapean, Dan

    2014-07-01

    Nowadays, it is assumed of many applications, companies and parts of the society to be always available online. However, according to [Times, Oct, 31 2011], 73% of the world population do not use the internet and thus aren't "online" at all. The most common reasons for not being "online" are expensive personal computer equipment and high costs for data connections, especially in developing countries that comprise most of the world's population (e.g. parts of Africa, Asia, Central and South America). However it seems that these countries are leap-frogging the "PC and landline" age and moving directly to the "mobile" age. Decreasing prices for smart phones with internet connectivity and PC-like operating systems make it more affordable for these parts of the world population to join the "always-online" community. Storing learning content in a way accessible to everyone, including mobile and smart phones, seems therefore to be beneficial. This way, learning content can be accessed by personal computers as well as by mobile and smart phones and thus be accessible for a big range of devices and users. A new trend in the Internet technologies is to go to "the cloud". This paper discusses the changes, challenges and risks of storing learning content in the "cloud". The experiences were gathered during the evaluation of the necessary changes in order to make our solutions and systems "cloud-ready".

  4. Cloud-based hospital information system as a service for grassroots healthcare institutions.

    Science.gov (United States)

    Yao, Qin; Han, Xiong; Ma, Xi-Kun; Xue, Yi-Feng; Chen, Yi-Jun; Li, Jing-Song

    2014-09-01

    Grassroots healthcare institutions (GHIs) are the smallest administrative levels of medical institutions, where most patients access health services. The latest report from the National Bureau of Statistics of China showed that 96.04% of 950,297 medical institutions in China were at the grassroots level in 2012, including county-level hospitals, township central hospitals, community health service centers, and rural clinics. In developing countries, these institutions face challenges involving a shortage of funds and talent, inconsistent medical standards, inefficient information sharing, and difficulties in management during the adoption of health information technologies (HIT). Given these needs, our aim is to provide hospital information services for GHIs using Cloud computing technologies and service modes. In this medical scenario, the computing resources are pooled by means of a Cloud-based Virtual Desktop Infrastructure (VDI) to serve multiple GHIs, with different hospital information systems dynamically assigned and reassigned according to demand. This paper is concerned with establishing a Cloud-based Hospital Information Service Center to provide hospital information software as a service (HI-SaaS) with the aim of providing GHIs with an attractive and high-performance medical information service. Compared with individually establishing all hospital information systems, this approach is more cost-effective and affordable for GHIs and does not compromise HIT performance.

  5. Modeling Exoplanetary Haze and Cloud Effects for Transmission Spectroscopy in the TRAPPIST-1 System

    Science.gov (United States)

    Moran, Sarah E.; Horst, Sarah M.; Lewis, Nikole K.; Batalha, Natasha E.; de Wit, Julien

    2018-01-01

    We present theoretical transmission spectra of the planets TRAPPIST-1d, e, f, and g using a version of the CaltecH Inverse ModEling and Retrieval Algorithms (CHIMERA) atmospheric modeling code. We use particle size, aerosol production rates, and aerosol composition inputs from recent laboratory experiments relevant for the TRAPPIST-1 system to constrain cloud and haze behavior and their effects on transmission spectra. We explore these cloud and haze cases for a variety of theoretical atmospheric compositions, including hydrogen-, nitrogen-, and carbon dioxide-dominated atmospheres. We then demonstrate the ability of physically motivated, laboratory-supported clouds and hazes to obscure spectral features at wavelengths and resolutions relevant to instruments on the Hubble Space Telescope and the upcoming James Webb Space Telescope. Lastly, with laboratory-based constraints on haze production rates for terrestrial exoplanets, we constrain possible bulk atmospheric compositions of the TRAPPIST-1 planets based on current observations. We show that continued collection of optical data, beyond the supported wavelength range of the James Webb Space Telescope, is necessary to explore the full effect of hazes on transmission spectra of exoplanetary atmospheres like those in the TRAPPIST-1 system.

  6. An integrated system for land resources supervision based on the IoT and cloud computing

    Science.gov (United States)

    Fang, Shifeng; Zhu, Yunqiang; Xu, Lida; Zhang, Jinqu; Zhou, Peiji; Luo, Kan; Yang, Jie

    2017-01-01

    Integrated information systems are important safeguards for the utilisation and development of land resources. Information technologies, including the Internet of Things (IoT) and cloud computing, are inevitable requirements for the quality and efficiency of land resources supervision tasks. In this study, an economical and highly efficient supervision system for land resources has been established based on IoT and cloud computing technologies; a novel online and offline integrated system with synchronised internal and field data that includes the entire process of 'discovering breaches, analysing problems, verifying fieldwork and investigating cases' was constructed. The system integrates key technologies, such as the automatic extraction of high-precision information based on remote sensing, semantic ontology-based technology to mine and identify Internet public sentiment related to illegal incidents, high-performance parallel computing based on MapReduce, uniform storing and compressing (bitwise) technology, global positioning system data communication and data synchronisation mode, intelligent recognition and four-level ('device, transfer, system and data') safety control technology. The integrated system, based on a 'One Map' platform, has been officially implemented by the Department of Land and Resources of Guizhou Province, China, and was found to significantly increase the efficiency and level of land resources supervision. The system promoted the overall development of informatisation in fields related to land resource management.

  7. Design and Implementation of a Set-Top Box-Based Homecare System Using Hybrid Cloud.

    Science.gov (United States)

    Lin, Bor-Shing; Hsiao, Pei-Chi; Cheng, Po-Hsun; Lee, I-Jung; Jan, Gene Eu

    2015-11-01

    Telemedicine has become a prevalent topic in recent years, and several telemedicine systems have been proposed; however, such systems are an unsuitable fit for the daily requirements of users. The system proposed in this study was developed as a set-top box integrated with the Android™ (Google, Mountain View, CA) operating system to provide a convenient and user-friendly interface. The proposed system can assist with family healthcare management, telemedicine service delivery, and information exchange among hospitals. To manage the system, a novel type of hybrid cloud architecture was also developed. Updated information is stored on a public cloud, enabling medical staff members to rapidly access information when diagnosing patients. In the long term, the stored data can be reduced to improve the efficiency of the database. The proposed design offers a robust architecture for storing data in a homecare system and can thus resolve network overload and congestion resulting from accumulating data, which are inherent problems in centralized architectures, thereby improving system efficiency.

  8. The Energy Savings and Environmental Benefits for Small and Medium Enterprises by Cloud Energy Management System

    Directory of Open Access Journals (Sweden)

    Yen-Chieh Tseng

    2016-06-01

    Full Text Available Small and medium enterprises (SMEs) play an important role in Taiwan’s economy, and the reduction of energy costs and carbon dioxide (CO2) emissions is critical to preserving the environment. This paper uses the experimental results from 65 sites, gathered over two years since 2012, to determine how the integration of Internet communication, cloud computing technologies and a cloud energy management service (cloud EMS) can reduce energy consumption by cost-effective means. The EMS has three levels: infrastructure as a service (IaaS), platform as a service (PaaS) and software as a service (SaaS). Working jointly with ChungHwa Telecom, Taiwan’s leading telecom service provider, data from detection devices, control devices, air-conditioning and lighting systems are all uploaded to a cloud EMS platform, to give a so-called intelligent energy management network application service platform (IEN-ASP). Various energy saving management functions are developed using this platform: (1) air conditioning optimization; (2) lighting system optimization; (3) scheduling control; (4) power billing control and (5) occupancy detection and timing control. Using the international performance measurement and verification protocol (IPMVP), the energy used at the test sites before and after the use of the IEN-ASP is compared to calculate the energy saved. The experimental results show an average energy saving of 5724 kWh per year, which represents a saving ratio of 5.84% and translates to a total reduction in CO2 emissions of 9,926,829 kg per year. Using the data collected, a regression model is used to demonstrate the correlation between the power that is consumed, the energy that is saved and the area of the sites. Another interesting result is that, if the experimental sites are maintained by experienced electricians or other personnel and EMS protocols are followed, the energy saving can be as great as 6.59%.
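As a reading aid (not the authors' computation), the two headline figures can be cross-checked: a 5724 kWh annual saving at a 5.84% saving ratio implies a pre-EMS baseline of roughly 98,000 kWh per site-year.

```python
# Back-of-envelope check of the reported EMS figures: baseline consumption
# implied by the average saving and the saving ratio.
saved_kwh = 5724.0   # reported average annual saving per site
ratio = 0.0584       # reported saving ratio (5.84%)
baseline = saved_kwh / ratio
print(round(baseline))  # implied pre-EMS consumption, kWh per site-year
```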

  9. The Regional Water Cycle and Water Ice Clouds in the Tharsis - Valles Marineris System

    Science.gov (United States)

    Leung, C. W. S.; Rafkin, S. C.

    2017-12-01

    The regional atmospheric circulation on Mars is highly influenced by local topographic gradients. Terrain-following air parcels forced along the slopes of the major Tharsis volcanoes and the steep canyon walls of Valles Marineris significantly impact the local water vapor concentration and the associated conditions for cloud formation. Using a non-hydrostatic mesoscale atmospheric model with aerosol and cloud microphysics, we investigate the meteorological conditions for water ice cloud formation in the coupled Tharsis - Valles Marineris system near the aphelion season. The use of a limited-area regional model ensures that topographic slopes are well resolved compared to the typical resolutions of a global-coverage general circulation model. The effects of shadowing and slope angle geometries on the energy budget are also taken into account. Diurnal slope winds in complex terrain are typically characterized by a reversal of wind direction twice per sol: upslope during the day, and downslope at night. However, our simulation results for the regional circulation and diurnal water cycle indicate substantial asymmetries in the day-night circulation: moist air masses converge and enter Valles Marineris via easterly flows, whereas dry air sweeps across the plateau of the canyon system from the south towards the north. We emphasize the non-uniform vertical distribution of water vapor in our model results; water vapor mixing ratios in the lower planetary boundary layer may exceed the mixing ratio aloft by substantial factors. Water ice clouds are important contributors to climatic forcing on Mars, and their effects on the mesoscale circulations in the Tharsis - Valles Marineris region contribute significantly to regional perturbations in the large-scale global atmospheric circulation.

  10. Addressing security, collaboration, and usability with tactical edge mobile devices and strategic cloud-based systems

    Science.gov (United States)

    Graham, Christopher J.

    2012-05-01

    Success in the future battle space is increasingly dependent on rapid access to the right information. Faced with a shrinking budget, the Government has a mandate to improve intelligence productivity, quality, and reliability. To achieve increased ISR effectiveness, leverage of tactical edge mobile devices via integration with strategic cloud-based infrastructure is the single most likely candidate area for dramatic near-term impact. This paper discusses security, collaboration, and usability components of this evolving space. These three paramount tenets, outlined below, embody how mission information is exchanged securely, efficiently, and with social media cooperativeness. Tenet 1: Complete security, privacy, and data integrity must be ensured within the net-centric battle space. This paper discusses data security on a mobile device, data at rest on a cloud-based system, authorization and access control, and securing data transport between entities. Tenet 2: Lack of collaborative information sharing and content reliability jeopardizes mission objectives and limits the end user capability. This paper discusses cooperative pairing of mobile devices and cloud systems, enabling social media style interaction via tagging, meta-data refinement, and sharing of pertinent data. Tenet 3: Fielded mobile solutions must address usability and complexity. Simplicity is a powerful paradigm on mobile platforms, where complex applications are not utilized, and simple, yet powerful, applications flourish. This paper discusses strategies for ensuring mobile applications are streamlined and usable at the tactical edge through focused feature sets, leveraging the power of the back-end cloud, minimization of differing HMI concepts, and directed end-user feedback.

  11. SU-E-T-11: A Cloud Based CT and LINAC QA Data Management System

    Energy Technology Data Exchange (ETDEWEB)

    Wiersma, R; Grelewicz, Z; Belcher, A; Liu, X [The University of Chicago, Chicago, IL (United States)

    2015-06-15

    Purpose: The current status quo of QA data management consists of a mixture of paper-based forms and spreadsheets for recording the results of daily, monthly, and yearly QA tests for both CT scanners and LINACs. Unfortunately, such systems suffer from a host of problems: (1) records can be easily lost or destroyed, (2) data is difficult to access — one must physically hunt down records, (3) poor or no means of historical data analysis, and (4) no remote monitoring of machine performance off-site. To address these issues, a cloud based QA data management system was developed and implemented. Methods: A responsive tablet interface that optimizes clinic workflow with an easy-to-navigate interface accessible from any web browser was implemented in HTML/javascript/CSS to allow user mobility when entering QA data. Automated image QA was performed using a phantom QA kit developed in Python that is applicable to any phantom and is currently being used with the Gammex ACR, Las Vegas, Leeds, and Catphan phantoms for performing automated CT, MV, kV, and CBCT QAs, respectively. A Python based resource management system was used to distribute and manage intensive CPU tasks such as QA phantom image analysis or LaTeX-to-PDF QA report generation to independent process threads or different servers such that website performance is not affected. Results: To date the cloud QA system has performed approximately 185 QA procedures. Approximately 200 QA parameters are being actively tracked by the system on a monthly basis. Electronic access to historical QA parameter information was successful in proactively identifying a Linac CBCT scanner’s performance degradation. Conclusion: A fully comprehensive cloud based QA data management system was successfully implemented for the first time. Potential machine performance issues were proactively identified that would have been otherwise missed by a paper or spreadsheet based QA system.

  12. SU-E-T-11: A Cloud Based CT and LINAC QA Data Management System

    International Nuclear Information System (INIS)

    Wiersma, R; Grelewicz, Z; Belcher, A; Liu, X

    2015-01-01

    Purpose: The current status quo of QA data management consists of a mixture of paper-based forms and spreadsheets for recording the results of daily, monthly, and yearly QA tests for both CT scanners and LINACs. Unfortunately, such systems suffer from a host of problems: (1) records can be easily lost or destroyed, (2) data is difficult to access — one must physically hunt down records, (3) poor or no means of historical data analysis, and (4) no remote monitoring of machine performance off-site. To address these issues, a cloud based QA data management system was developed and implemented. Methods: A responsive tablet interface that optimizes clinic workflow with an easy-to-navigate interface accessible from any web browser was implemented in HTML/javascript/CSS to allow user mobility when entering QA data. Automated image QA was performed using a phantom QA kit developed in Python that is applicable to any phantom and is currently being used with the Gammex ACR, Las Vegas, Leeds, and Catphan phantoms for performing automated CT, MV, kV, and CBCT QAs, respectively. A Python based resource management system was used to distribute and manage intensive CPU tasks such as QA phantom image analysis or LaTeX-to-PDF QA report generation to independent process threads or different servers such that website performance is not affected. Results: To date the cloud QA system has performed approximately 185 QA procedures. Approximately 200 QA parameters are being actively tracked by the system on a monthly basis. Electronic access to historical QA parameter information was successful in proactively identifying a Linac CBCT scanner’s performance degradation. Conclusion: A fully comprehensive cloud based QA data management system was successfully implemented for the first time. Potential machine performance issues were proactively identified that would have been otherwise missed by a paper or spreadsheet based QA system.

  13. Using cloud computing technologies in IP-video surveillance systems with the function of 3d-object modelling

    Directory of Open Access Journals (Sweden)

    Zhigalov Kirill

    2018-01-01

    Full Text Available This article is devoted to the integration of cloud technology functions into 3D IP video surveillance systems in order to conduct further video analytics, both on incoming real-time data and on video materials stored on the server in the «cloud». The main attention is devoted to the use of «cloud technologies» to optimize the process of recognizing a desired object by increasing the flexibility and scalability of the system, transferring the image-processing load from the client to the cloud server, the virtual part of the system. Further development of the issues considered in this article, particularly in data analysis, will significantly improve the effectiveness of the special tasks facing special units.

  14. Multiseasonal Tree Crown Structure Mapping with Point Clouds from OTS Quadrocopter Systems

    Science.gov (United States)

    Hese, S.; Behrendt, F.

    2017-08-01

    OTS (Off The Shelf) quadrocopter systems provide a cost-effective (below 2000 Euro), flexible and mobile platform for high resolution point cloud mapping. Various studies have shown the full potential of these small and flexible platforms. Especially in very tight and complex 3D environments, the automatic obstacle avoidance, low copter weight, long flight times and precise maneuvering are important advantages of these small OTS systems in comparison with larger octocopter systems. This study examines the potential of the DJI Phantom 4 Pro series and the Phantom 3A series for within-stand and forest tree crown 3D point cloud mapping, using both within-stand oblique imaging at different altitude levels and data captured from a nadir perspective. On a test site in Brandenburg/Germany a beech tree crown was selected and measured at 3 different altitude levels in Point Of Interest (POI) mode with oblique data capturing, and one nadir mosaic was derived, created with 85/85% overlap using the Drone Deploy automatic mapping software. Three different flight campaigns were performed, one in September 2016 (leaf-on), one in March 2017 (leaf-off) and one in May 2017 (leaf-on), to derive point clouds from different crown structure and phenological situations - covering the leaf-on and leaf-off status of the tree crown. After height correction, the point clouds were used with GPS georeferencing to calculate voxel-based densities on 50 × 10 × 10 cm voxel definitions, using a topological network of chessboard image objects in 0.5 m height steps in an object-based image processing environment. Comparison between leaf-off and leaf-on status was done on volume pixel definitions, comparing the attributed point densities per volume and plotting the resulting values as a function of distance to the crown center. In the leaf-off status, SFM (structure from motion) algorithms clearly identified the central stem and also secondary branch systems. While the penetration into the crown
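The voxel-density step described in this record, counting points per fixed-size 3D cell, can be sketched as follows. The function name and the sample coordinates are illustrative; the 50 × 10 × 10 cm edges mirror the record's voxel definition:

```python
# Hedged sketch of per-voxel point-density computation for a 3D point
# cloud: each point is binned into an anisotropic voxel grid and counted.
from collections import Counter

def voxel_densities(points, edges=(0.5, 0.1, 0.1)):
    """Count points per voxel; edges are (x, y, z) cell sizes in metres,
    here matching the record's 50 x 10 x 10 cm definition."""
    counts = Counter()
    for p in points:
        # Integer cell index per axis via floor division by the edge length.
        counts[tuple(int(c // e) for c, e in zip(p, edges))] += 1
    return counts

pts = [(0.1, 0.05, 0.05), (0.2, 0.05, 0.05), (0.7, 0.05, 0.05)]
d = voxel_densities(pts)
print(d[(0, 0, 0)], d[(1, 0, 0)])
```

Plotting these counts against each voxel's distance to the crown center gives the leaf-on versus leaf-off comparison the record describes.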

  15. MULTISEASONAL TREE CROWN STRUCTURE MAPPING WITH POINT CLOUDS FROM OTS QUADROCOPTER SYSTEMS

    Directory of Open Access Journals (Sweden)

    S. Hese

    2017-08-01

    Full Text Available OTS (Off The Shelf) quadrocopter systems provide a cost-effective (below 2000 Euro), flexible and mobile platform for high resolution point cloud mapping. Various studies have shown the full potential of these small and flexible platforms. Especially in very tight and complex 3D environments, the automatic obstacle avoidance, low copter weight, long flight times and precise maneuvering are important advantages of these small OTS systems in comparison with larger octocopter systems. This study examines the potential of the DJI Phantom 4 Pro series and the Phantom 3A series for within-stand and forest tree crown 3D point cloud mapping, using both within-stand oblique imaging at different altitude levels and data captured from a nadir perspective. On a test site in Brandenburg/Germany a beech tree crown was selected and measured at 3 different altitude levels in Point Of Interest (POI) mode with oblique data capturing, and one nadir mosaic was derived, created with 85/85% overlap using the Drone Deploy automatic mapping software. Three different flight campaigns were performed, one in September 2016 (leaf-on), one in March 2017 (leaf-off) and one in May 2017 (leaf-on), to derive point clouds from different crown structure and phenological situations – covering the leaf-on and leaf-off status of the tree crown. After height correction, the point clouds were used with GPS georeferencing to calculate voxel-based densities on 50 × 10 × 10 cm voxel definitions, using a topological network of chessboard image objects in 0.5 m height steps in an object-based image processing environment. Comparison between leaf-off and leaf-on status was done on volume pixel definitions, comparing the attributed point densities per volume and plotting the resulting values as a function of distance to the crown center. In the leaf-off status, SFM (structure from motion) algorithms clearly identified the central stem and also secondary branch systems. While the penetration into the

  16. Perturbation of Fractional Multi-Agent Systems in Cloud Entropy Computing

    Directory of Open Access Journals (Sweden)

    Rabha W. Ibrahim

    2016-01-01

    Full Text Available A perturbed multi-agent system is a scheme composed of multiple networking agents within a location. Such a scheme can be used to address problems that are impossible or difficult for a single agent to solve. Intelligent cloud entropy management systems involve functions, methods, procedural approaches, and algorithms. In this study, we introduce a new perturbed algorithm based on the fractional Poisson process. The discrete dynamics are suggested by using fractional entropy and fractional-type Tsallis entropy. Moreover, we study the algorithm's stability.

  17. High-pressure cloud point data for the system glycerol + olive oil + n-butane + AOT

    OpenAIRE

    Bender,J. P.; Junges,A.; Franceschi,E.; Corazza,F. C.; Dariva,C.; Oliveira,J. Vladimir; Corazza,M. L.

    2008-01-01

    This work reports high-pressure cloud point data for the quaternary system glycerol + olive oil + n-butane + AOT surfactant. The static synthetic method, using a variable-volume view cell, was employed for obtaining the experimental data at pressures up to 27 MPa. The effects of glycerol/olive oil concentration and surfactant addition on the pressure transition values were evaluated in the temperature range from 303 K to 343 K. For the system investigated, vapor-liquid (VLE), liquid-liquid (L...

  18. The method of a joint intraday security check system based on cloud computing

    Science.gov (United States)

    Dong, Wei; Feng, Changyou; Zhou, Caiqi; Cai, Zhi; Dan, Xu; Dai, Sai; Zhang, Chuancheng

    2017-01-01

    The intraday security check is the core application in the dispatching control system. The existing security check calculation uses only the dispatch center’s local model and data as the functional margin. This paper introduces the design of an all-grid intraday joint security check system based on cloud computing, and its implementation. To reduce the effect of subarea bad data on the all-grid security check, a new power flow algorithm based on comparison and adjustment with the inter-provincial tie-line plan is presented. A numerical example illustrates the effectiveness and feasibility of the proposed method.

  19. A Dynamic Pricing Reverse Auction-Based Resource Allocation Mechanism in Cloud Workflow Systems

    Directory of Open Access Journals (Sweden)

    Xuejun Li

    2016-01-01

    Full Text Available Market-oriented reverse auction is an efficient and cost-effective method for resource allocation in cloud workflow systems, since it can dynamically allocate resources depending on the supply-demand relationship of the cloud market. However, during the auction the price of a cloud resource is usually fixed, and current resource allocation mechanisms cannot adapt properly to the changeable market, which results in low efficiency of resource utilization. To address this problem, a dynamic pricing reverse auction-based resource allocation mechanism is proposed. During the auction, resource providers can change prices according to the trading situation, so our novel mechanism can increase the chances of making a deal and improve the efficiency of resource utilization. In addition, resource providers can improve their competitiveness in the market by lowering prices, and thus users can obtain cheaper resources in a shorter time, which decreases the monetary cost and completion time of workflow execution. Experiments with different situations and problem sizes are conducted for the dynamic pricing-based allocation mechanism (DPAM) on resource utilization and the measurement of Time*Cost (TC). The results show that our DPAM can outperform representative existing mechanisms in resource utilization, monetary cost, and completion time, and can also obtain the optimal price reduction rates.
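The core idea, a reverse auction in which losing providers may cut their asks between rounds, can be sketched as a toy loop. The uniform 10% discount and the round structure are invented for illustration and are not the DPAM paper's actual pricing rule:

```python
# Toy sketch of repeated dynamic-pricing reverse-auction rounds: in each
# round the user takes the cheapest ask, and losing providers discount
# their next ask to stay competitive. Discount logic is an assumption.
def run_auction(asks, rounds=3, discount=0.9):
    """asks: {provider: price}. Returns the (winner, price) per round."""
    history = []
    for _ in range(rounds):
        winner = min(asks, key=asks.get)          # cheapest ask wins
        history.append((winner, round(asks[winner], 2)))
        # Losers lower their price for the next round; the winner holds.
        asks = {p: (price if p == winner else price * discount)
                for p, price in asks.items()}
    return history

print(run_auction({"A": 10.0, "B": 12.0, "C": 11.0}))
```

Even this toy version shows the claimed effect: competition drives the winning price down over rounds (10.0, then 9.9, then 9.0), lowering the user's monetary cost.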

  20. CLOUD-BASED VS DESKTOP-BASED PROPERTY MANAGEMENT SYSTEMS IN HOTEL

    Directory of Open Access Journals (Sweden)

    Mustafa GULMEZ

    2015-06-01

    Full Text Available Even though keeping up with modern developments in the IT sector is crucial for the success and competitiveness of a hotel, it is usually very hard for new technologies to be accepted and implemented. This is the case with cloud technology, on which opinions among hoteliers are divided between those who think that it is just another fashion trend, unnecessary to be taken into consideration, and those who believe that it helps in performing daily operations more easily, leaving space for more interaction with guests in both the virtual and the real world. Usage of cloud technology in hotels is still in its beginning phase, and hoteliers still have to learn more about its advantages and adequate usage for the benefit of overall hotel operations. Using the example of the hotel property management system (PMS) and a comparison between the features of its older desktop version and new web-based programs, this research aims at finding out at which stage and how effective the usage of cloud technology in hotels is. For this, qualitative research with semi-structured interviews with hotel managers that use one of these programs was conducted. Reasons for usage and advantages of each version are discussed.

  1. VMware vCloud security

    CERN Document Server

    Sarkar, Prasenjit

    2013-01-01

    VMware vCloud Security provides the reader with in depth knowledge and practical exercises sufficient to implement a secured private cloud using VMware vCloud Director and vCloud Networking and Security.This book is primarily for technical professionals with system administration and security administration skills with significant VMware vCloud experience who want to learn about advanced concepts of vCloud security and compliance.

  2. Sensitivity of tropical climate to low-level clouds in the NCEP climate forecast system

    Energy Technology Data Exchange (ETDEWEB)

    Hu, Zeng-Zhen [Center for Ocean-Land-Atmosphere Studies, Calverton, MD (United States); NCEP/NWS/NOAA, Climate Prediction Center, Camp Springs, MD (United States); Huang, Bohua; Schneider, Edwin K. [Center for Ocean-Land-Atmosphere Studies, Calverton, MD (United States); George Mason University, Department of Atmospheric, Oceanic, and Earth Sciences, College of Science, Fairfax, VA (United States); Hou, Yu-Tai; Yang, Fanglin [NCEP/NWS/NOAA, Environmental Modeling Center, Camp Springs, MD (United States); Wang, Wanqiu [NCEP/NWS/NOAA, Climate Prediction Center, Camp Springs, MD (United States); Stan, Cristiana [Center for Ocean-Land-Atmosphere Studies, Calverton, MD (United States)

    2011-05-15

    In this work, we examine the sensitivity of tropical mean climate and seasonal cycle to low clouds and cloud liquid water path (CLWP) by prescribing them in the NCEP climate forecast system (CFS). It is found that the change of low cloud cover alone has a minor influence on the amount of net shortwave radiation reaching the surface and on the warm biases in the southeastern Atlantic. In experiments where CLWP is prescribed using observations, the mean climate in the tropics is improved significantly, implying that shortwave radiation absorption by CLWP is mainly responsible for reducing the excessive surface net shortwave radiation over the southern oceans in the CFS. Corresponding to large CLWP values in the southeastern oceans, the model generates large low cloud amounts. That results in a reduction of net shortwave radiation at the ocean surface and the warm biases in the sea surface temperature in the southeastern oceans. Meanwhile, the cold tongue and associated surface wind stress in the eastern oceans become stronger and more realistic. As a consequence of the overall improvement of the tropical mean climate, the seasonal cycle in the tropical Atlantic is also improved. Based on the results from these sensitivity experiments, we propose a model bias correction approach, in which CLWP is prescribed only in the southeastern Atlantic by using observed annual mean climatology of CLWP. It is shown that the warm biases in the southeastern Atlantic are largely eliminated, and the seasonal cycle in the tropical Atlantic Ocean is significantly improved. Prescribing CLWP in the CFS is then an effective interim technique to reduce model biases and to improve the simulation of seasonal cycle in the tropics. (orig.)

  3. A cloud medication safety support system using QR code and Web services for elderly outpatients.

    Science.gov (United States)

    Tseng, Ming-Hseng; Wu, Hui-Ching

    2014-01-01

    Drugs are an important part of disease treatment, but medication errors happen frequently and have significant clinical and financial consequences. The prevalence of prescription medication use among the ambulatory adult population increases with advancing age. Because of the global aging of society, improving medication safety is an even greater concern for outpatients than for inpatients. The elderly with multiple chronic conditions face the complex task of medication management. To reduce medication errors among elderly outpatients with chronic diseases, a cloud medication safety support system is designed, demonstrated and evaluated. The proposed system is composed of a three-tier architecture: the front-end tier, the mobile tier and the cloud tier. The mobile tier hosts the personalized medication safety support application on Android platforms, providing primary functions including medication reminders, assistance with pill-dispensing, recording of medications, locating of medications, and notices of forgotten medications for elderly outpatients. Finally, the hybrid technology acceptance model is employed to understand the intention and satisfaction level of potential users of this mobile medication safety support application system. The result of the system acceptance testing indicates that this developed system, implementing patient-centered services, is highly accepted by the elderly. This proposed m-health system could assist elderly outpatients' homecare in preventing medication errors and improving their medication safety.
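    The abstract does not give the QR-code payload format, so as an illustration only, a minimal sketch of how a medication record might be packed into, and recovered from, a QR-encoded JSON string (all field names here are hypothetical, not the paper's schema):

    ```python
    import json

    def build_medication_payload(patient_id, prescriptions):
        # Pack the prescription list into compact JSON suitable for QR encoding.
        payload = {
            "pid": patient_id,
            "rx": [
                {"name": p["name"], "dose": p["dose"], "times": p["times"]}
                for p in prescriptions
            ],
        }
        return json.dumps(payload, separators=(",", ":"))

    def parse_medication_payload(text):
        # Decode the string scanned back from the QR code.
        return json.loads(text)

    encoded = build_medication_payload(
        "P001",
        [{"name": "metformin", "dose": "500 mg", "times": ["08:00", "20:00"]}],
    )
    decoded = parse_medication_payload(encoded)
    ```

    In such a design the QR code carries only a compact identifier-plus-schedule record, while the cloud tier (reached via Web services) holds the authoritative prescription data.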

  4. Optimizing the Use of Storage Systems Provided by Cloud Computing Environments

    Science.gov (United States)

    Gallagher, J. H.; Potter, N.; Byrne, D. A.; Ogata, J.; Relph, J.

    2013-12-01

    Cloud computing systems present a set of features that include familiar computing resources (albeit augmented to support dynamic scaling of processing power) bundled with a mix of conventional and unconventional storage systems. The Linux base on which many cloud environments (e.g., Amazon) are built makes it tempting to assume that any Unix software will run efficiently in this environment without change. OPeNDAP and NODC collaborated on a short project to explore how the S3 and Glacier storage systems provided by the Amazon cloud computing infrastructure could be used with a data server developed primarily to access data stored in a traditional Unix file system. Our work used the Amazon cloud system, but we strove for designs that could be adapted easily to other systems like OpenStack. Lastly, we evaluated different architectures from a computer security perspective. We found that there are considerable issues associated with treating S3 as if it were a traditional file system, even though doing so is conceptually simple. These issues include performance penalties: a software tool that emulates a traditional file system to store data in S3 performs poorly compared to storing data directly in S3. We also found there are important benefits beyond performance to ensuring that data written to S3 can be accessed directly without relying on a specific software tool. To provide a hierarchical organization to the data stored in S3, we wrote 'catalog' files, using XML. These catalog files map discrete files to S3 access keys. Like a traditional file system's directories, the catalogs can also contain references to other catalogs, providing a simple but effective hierarchy overlaid on top of S3's flat storage space. An added benefit of these catalogs is that they can be viewed in a web browser; our storage scheme provides both efficient access for the data server and access via a web browser. We also looked at the Glacier storage system and
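    The catalog idea can be sketched concretely. The element and attribute names below are invented for illustration (the abstract does not specify the XML schema), but they show the mechanism: `file` entries map logical names to S3 access keys, and `catalogRef` entries point at child catalogs, overlaying a directory-like hierarchy on S3's flat key space:

    ```python
    import xml.etree.ElementTree as ET

    # A hypothetical catalog file; element names are illustrative only.
    catalog_xml = """\
    <catalog name="sst">
      <file name="sst_2013_01.nc" key="bucket/sst/2013/01.nc"/>
      <file name="sst_2013_02.nc" key="bucket/sst/2013/02.nc"/>
      <catalogRef name="2012" href="bucket/sst/2012/catalog.xml"/>
    </catalog>
    """

    def resolve(catalog_text, name):
        """Return the S3 access key recorded for a logical file name."""
        root = ET.fromstring(catalog_text)
        for f in root.iter("file"):
            if f.get("name") == name:
                return f.get("key")
        return None  # not in this catalog; a real client would follow catalogRefs
    ```

    Because the catalog is plain XML, a browser can render it directly, while the data server resolves names to keys without any file-system emulation layer in front of S3.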

  5. Interactions between aerosol absorption, thermodynamics, dynamics, and microphysics and their impacts on a multiple-cloud system

    Science.gov (United States)

    Lee, Seoung Soo; Li, Zhanqing; Mok, Jungbin; Ahn, Myoung-Hwan; Kim, Byung-Gon; Choi, Yong-Sang; Jung, Chang-Hoon; Yoo, Hye Lim

    2017-12-01

    This study investigates how an increasing concentration of black carbon aerosols, which act as radiation absorbers as well as agents for cloud-particle nucleation, affects stability, dynamics and microphysics in a multiple-cloud system using simulations. Simulations show that despite increases in stability due to increasing concentrations of black carbon aerosols, there are increases in the averaged updraft mass fluxes (over the whole simulation domain and period). This is because aerosol-enhanced evaporative cooling intensifies convergence near the surface. This stronger convergence increases the frequency of updrafts in the low range of speeds, leading to the increase in the averaged updraft mass fluxes. The increased frequency of updrafts in turn increases the number of condensation entities, and this leads to more condensation and cloud liquid, which acts as a source for the accretion of cloud liquid by precipitation. Hence, eventually, there is more accretion, which offsets suppressed autoconversion and results in negligible changes in cumulative precipitation as aerosol concentrations increase. The increase in the frequency of updrafts in the low range of speeds alters the cloud-system organization (represented by cloud-depth spatiotemporal distributions and cloud-cell population) by supporting more low-depth clouds. The altered organization in turn alters precipitation spatiotemporal distributions by generating more weak precipitation events. The aerosol-induced reduction in solar radiation reaching the surface leads to more occurrences of small surface heat fluxes, which in turn support the more numerous low-depth clouds and weak precipitation events, together with the greater occurrence of low-speed updrafts.
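    The autoconversion-versus-accretion offset described above can be illustrated with a textbook Kessler-type warm-rain parameterization. This is not the microphysics scheme used in the paper, and the coefficients below are the approximate textbook values (mixing ratios in kg/kg, rates in kg/kg/s); it is included only to show why more cloud liquid can compensate for a suppressed autoconversion rate:

    ```python
    def kessler_autoconversion(qc, qc0=5e-4, k1=1e-3):
        # Cloud water converts to rain only above a threshold qc0.
        return max(qc - qc0, 0.0) * k1

    def kessler_accretion(qc, qr, k2=2.2):
        # Collection of cloud water by falling rain grows with both qc and qr.
        return k2 * qc * qr ** 0.875

    # Illustrative values: in the "polluted" case autoconversion is weaker
    # (smaller effective qc excess), but extra condensate feeds accretion.
    clean = kessler_autoconversion(1.0e-3) + kessler_accretion(1.0e-3, 1.0e-3)
    polluted = kessler_autoconversion(0.7e-3) + kessler_accretion(1.3e-3, 1.0e-3)
    ```

    With numbers in this range, the gain in the accretion term largely offsets the loss in the autoconversion term, mirroring the near-constant cumulative precipitation reported in the abstract.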

  6. Rebuilding and the private cloud of the hospital information system by the virtualization technology.

    Science.gov (United States)

    Yamashita, Yoshinori; Ogaito, Tatoku

    2013-01-01

    In our hospital, we managed an electronic health record system and many departmental subsystems as a hospital information system. As these information systems expanded, the overall system became complicated, and maintenance and operating costs increased. Furthermore, the spread of computerization created demand for an environment in which medical information is available anywhere, at any time. At the same time, this expanded use of information required stronger security measures, such as the protection of personal information. To solve these problems, we rebuilt the hospital information system as a private cloud using virtualization technology. As a result, we were able to reduce the number of servers constituting the system, decrease network traffic, and reduce operating costs.

  7. A Cloud-Based System for Automatic Hazard Monitoring from Sentinel-1 SAR Data

    Science.gov (United States)

    Meyer, F. J.; Arko, S. A.; Hogenson, K.; McAlpin, D. B.; Whitley, M. A.

    2017-12-01

    Despite the all-weather capabilities of Synthetic Aperture Radar (SAR), and its high performance in change detection, the application of SAR for operational hazard monitoring was limited in the past. This has largely been due to high data costs, slow product delivery, and limited temporal sampling associated with legacy SAR systems. Only since the launch of ESA's Sentinel-1 sensors have routinely acquired and free-of-charge SAR data become available, allowing—for the first time—for a meaningful contribution of SAR to disaster monitoring. In this paper, we present recent technical advances of the Sentinel-1-based SAR processing system SARVIEWS, which was originally built to generate hazard products for volcano monitoring centers. We outline the main functionalities of SARVIEWS including its automatic database interface to Sentinel-1 holdings of the Alaska Satellite Facility (ASF), and its set of automatic processing techniques. Subsequently, we present recent system improvements that were added to SARVIEWS and allowed for a vast expansion of its hazard services; specifically: (1) In early 2017, the SARVIEWS system was migrated into the Amazon Cloud, providing access to cloud capabilities such as elastic scaling of compute resources and cloud-based storage; (2) we co-located SARVIEWS with ASF's cloud-based Sentinel-1 archive, enabling the efficient and cost effective processing of large data volumes; (3) we integrated SARVIEWS with ASF's HyP3 system (http://hyp3.asf.alaska.edu/), providing functionality such as subscription creation via API or map interface as well as automatic email notification; (4) we automated the production chains for seismic and volcanic hazards by integrating SARVIEWS with the USGS earthquake notification service (ENS) and the USGS eruption alert system. Email notifications from both services are parsed and subscriptions are automatically created when certain event criteria are met; (5) finally, SARVIEWS-generated hazard products are now

  8. Convective Systems Over the Japan Sea: Cloud-Resolving Model Simulations

    Science.gov (United States)

    Tao, Wei-Kuo; Yoshizaki, Masanori; Shie, Chung-Lin; Kato, Teryuki

    2002-01-01

    Wintertime observations of MCSs (Mesoscale Convective Systems) over the Sea of Japan - 2001 (WMO-01) were collected from January 12 to February 1, 2001. One of the major objectives is to better understand and forecast snow systems and accompanying disturbances and the associated key physical processes involved in the formation and development of these disturbances. Multiple observation platforms (e.g., upper-air soundings, Doppler radar, wind profilers, radiometers, etc.) during WMO-01 provided a first attempt at investigating the detailed characteristics of convective storms and air pattern changes associated with winter storms over the Sea of Japan region. WMO-01 also provided estimates of the apparent heat source (Q1) and apparent moisture sink (Q2). The vertical integrals of Q1 and Q2 are equal to the surface precipitation rates. The horizontal and vertical advective components of Q1 and Q2 can be used as large-scale forcing for Cloud Resolving Models (CRMs). The Goddard Cumulus Ensemble (GCE) model is a CRM (typically run with a 1-km grid size). The GCE model has sophisticated microphysics and allows explicit interactions between clouds, radiation, and surface processes. It will be used to understand and quantify precipitation processes associated with wintertime convective systems over the Sea of Japan (using data collected during the WMO-01). This is the first cloud-resolving model used to simulate precipitation processes in this particular region. The GCE model-simulated WMO-01 results will also be compared to other GCE model-simulated weather systems that developed during other field campaigns (i.e., South China Sea, west Pacific warm pool region, eastern Atlantic region and central USA).
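    The statement that the vertical integrals of Q1 and Q2 equal the surface precipitation rates refers to the standard apparent heat source/moisture sink budgets (in the usual Yanai-type notation; a sketch of the general relations, not this paper's exact formulation):

    ```latex
    \langle Q_1 \rangle - \langle Q_R \rangle = L\,P + S, \qquad
    \langle Q_2 \rangle = L\,(P - E),
    \quad\text{where}\quad
    \langle \cdot \rangle = \frac{1}{g}\int_{p_t}^{p_s} (\cdot)\, dp .
    ```

    Here $Q_R$ is the radiative heating rate, $L$ the latent heat of vaporization, $P$ the surface precipitation rate, $S$ the surface sensible heat flux, and $E$ the surface evaporation; over a precipitating system the $L\,P$ term dominates both integrals, which is the sense in which they "equal" the precipitation rate.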

  9. A continuous surface reconstruction method on point cloud captured from a 3D surface photogrammetry system.

    Science.gov (United States)

    Liu, Wenyang; Cheung, Yam; Sabouri, Pouya; Arai, Tatsuya J; Sawant, Amit; Ruan, Dan

    2015-11-01

    To accurately and efficiently reconstruct a continuous surface from noisy point clouds captured by a surface photogrammetry system (VisionRT). The authors have developed a level-set based surface reconstruction method on point clouds captured by a surface photogrammetry system (VisionRT). The proposed method reconstructs an implicit and continuous representation of the underlying patient surface by optimizing a regularized fitting energy, offering extra robustness to noise and missing measurements. In contrast to explicit/discrete meshing-type schemes, their continuous representation is particularly advantageous for subsequent surface registration and motion tracking by eliminating the need for maintaining explicit point correspondences as in discrete models. The authors solve the proposed method with an efficient narrowband evolving scheme. The authors evaluated the proposed method on both phantom and human subject data with two sets of complementary experiments. In the first set of experiments, the authors generated a series of surfaces, each with different black patches placed on one chest phantom. The resulting VisionRT measurements from the patched areas had different degrees of noise and missing data, since VisionRT has difficulty detecting dark surfaces. The authors applied the proposed method to point clouds acquired under these different configurations, and quantitatively evaluated the reconstructed surfaces by comparing them against a high-quality reference surface with respect to root mean squared error (RMSE). In the second set of experiments, the authors applied their method to 100 clinical point clouds acquired from one human subject. In the absence of ground truth, the authors qualitatively validated the reconstructed surfaces by comparing the local geometry, specifically mean curvature distributions, against that of the surface extracted from a high-quality CT obtained from the same patient. On phantom point clouds, their method achieved submillimeter
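    The RMSE figure of merit used in the phantom experiments is straightforward; a minimal sketch (assuming point correspondences between the reconstructed and reference surfaces are already established, which the paper's implicit representation is designed to avoid needing during tracking):

    ```python
    import math

    def rmse(reconstructed, reference):
        """Root mean squared Euclidean error between corresponding 3D points.

        Each argument is a list of (x, y, z) tuples of equal length.
        """
        assert len(reconstructed) == len(reference)
        sq = [
            (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2 + (a[2] - b[2]) ** 2
            for a, b in zip(reconstructed, reference)
        ]
        return math.sqrt(sum(sq) / len(sq))

    ref = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
    rec = [(0.0, 0.0, 0.1), (1.0, 0.0, -0.1)]
    err = rmse(rec, ref)  # submillimeter accuracy would mean err < 1e-3 in meters
    ```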

  10. A continuous surface reconstruction method on point cloud captured from a 3D surface photogrammetry system

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Wenyang [Department of Bioengineering, University of California, Los Angeles, California 90095 (United States); Cheung, Yam; Sabouri, Pouya; Arai, Tatsuya J.; Sawant, Amit [Department of Radiation Oncology, University of Texas Southwestern, Dallas, Texas 75390 (United States); Ruan, Dan, E-mail: druan@mednet.ucla.edu [Department of Bioengineering, University of California, Los Angeles, California 90095 and Department of Radiation Oncology, University of California, Los Angeles, California 90095 (United States)

    2015-11-15

    Purpose: To accurately and efficiently reconstruct a continuous surface from noisy point clouds captured by a surface photogrammetry system (VisionRT). Methods: The authors have developed a level-set based surface reconstruction method on point clouds captured by a surface photogrammetry system (VisionRT). The proposed method reconstructs an implicit and continuous representation of the underlying patient surface by optimizing a regularized fitting energy, offering extra robustness to noise and missing measurements. In contrast to explicit/discrete meshing-type schemes, their continuous representation is particularly advantageous for subsequent surface registration and motion tracking by eliminating the need for maintaining explicit point correspondences as in discrete models. The authors solve the proposed method with an efficient narrowband evolving scheme. The authors evaluated the proposed method on both phantom and human subject data with two sets of complementary experiments. In the first set of experiments, the authors generated a series of surfaces, each with different black patches placed on one chest phantom. The resulting VisionRT measurements from the patched areas had different degrees of noise and missing data, since VisionRT has difficulty detecting dark surfaces. The authors applied the proposed method to point clouds acquired under these different configurations, and quantitatively evaluated the reconstructed surfaces by comparing them against a high-quality reference surface with respect to root mean squared error (RMSE). In the second set of experiments, the authors applied their method to 100 clinical point clouds acquired from one human subject. In the absence of ground truth, the authors qualitatively validated the reconstructed surfaces by comparing the local geometry, specifically mean curvature distributions, against that of the surface extracted from a high-quality CT obtained from the same patient. Results: On phantom point clouds, their method

  11. Orchestrating Your Cloud Orchestra

    OpenAIRE

    Hindle, Abram

    2015-01-01

    Cloud computing potentially ushers in a new era of computer music performance with exceptionally large computer music instruments consisting of 10s to 100s of virtual machines which we propose to call a `cloud-orchestra'. Cloud computing allows for the rapid provisioning of resources, but to deploy such a complicated and interconnected network of software synthesizers in the cloud requires a lot of manual work, system administration knowledge, and developer/operator skills. This is a barrier ...

  12. Cloud computing for radiologists

    OpenAIRE

    Amit T Kharat; Amjad Safvi; S S Thind; Amarjit Singh

    2012-01-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software, hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as...

  13. SIMULATION OF E-CLOUD DRIVEN INSTABILITY AND ITS ATTENUATION USING A FEEDBACK SYSTEM IN THE CERN SPS

    International Nuclear Information System (INIS)

    Vay, J.-L.; Byrd, J.M.; Furman, M.A.; Secondo, R.; Venturini, M.; Fox, J.D.; Rivetta, C.H.; Hofle, W.

    2010-01-01

    Electron clouds have been shown to trigger fast growing instabilities on proton beams circulating in the SPS (1), and a feedback system to control the instabilities is under active development (2). We present the latest improvements to the Warp-Posinst simulation framework and feedback model, and its application to the self-consistent simulations of two consecutive bunches interacting with an electron cloud in the SPS.

  14. Case study: IBM Watson Analytics cloud platform as Analytics-as-a-Service system for heart failure early detection

    OpenAIRE

    Guidi, Gabriele; Miniati, Roberto; Mazzola, Matteo; Iadanza, Ernesto

    2016-01-01

    In recent years, progress in technology and the increasing availability of fast connections have produced a migration of functionality in information technology services from static servers to distributed technologies. This article describes the main tools available on the market to perform Analytics as a Service (AaaS) using a cloud platform. It also describes a use case of IBM Watson Analytics, a cloud system for data analytics, applied to the following research scope: detect...

  15. Simulation of E-Cloud Driven Instability And Its Attenuation Using a Feedback System in the CERN SPS

    International Nuclear Information System (INIS)

    Vay, Jean-Luc

    2012-01-01

    Electron clouds have been shown to trigger fast growing instabilities on proton beams circulating in the SPS, and a feedback system to control the instabilities is under active development. We present the latest improvements to the Warp-Posinst simulation framework and feedback model, and its application to the self-consistent simulations of two consecutive bunches interacting with an electron cloud in the SPS.

  16. A Power Efficient Exaflop Computer Design for Global Cloud System Resolving Climate Models.

    Science.gov (United States)

    Wehner, M. F.; Oliker, L.; Shalf, J.

    2008-12-01

    Exascale computers would allow routine ensemble modeling of the global climate system at the cloud system resolving scale. Power and cost requirements of traditional architecture systems are likely to delay such capability for many years. We present an alternative route to the exascale using embedded processor technology to design a system optimized for ultra high resolution climate modeling. These power efficient processors, used in consumer electronic devices such as mobile phones, portable music players, cameras, etc., can be tailored to the specific needs of scientific computing. We project that a system capable of integrating a kilometer scale climate model a thousand times faster than real time could be designed and built in a five year time scale for US$75M with a power consumption of 3MW. This is cheaper, more power efficient and sooner than any other existing technology.
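    The stated design point, a model running a thousand times faster than real time, translates into concrete throughput figures by simple arithmetic (the numbers below follow only from the speedup quoted in the abstract):

    ```python
    # Wall-clock implications of a 1000x real-time climate model.
    SPEEDUP = 1000  # simulated time / wall-clock time

    # One wall-clock day advances the simulation 1000 days, ~2.74 years.
    sim_years_per_wall_day = SPEEDUP / 365.25

    # A century-long integration completes in ~36.5 wall-clock days,
    # which is what makes routine ensembles at cloud-resolving scale feasible.
    wall_days_per_century = 100 * 365.25 / SPEEDUP
    ```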

  17. An IoT-cloud Based Wearable ECG Monitoring System for Smart Healthcare.

    Science.gov (United States)

    Yang, Zhe; Zhou, Qihao; Lei, Lei; Zheng, Kan; Xiang, Wei

    2016-12-01

    Public healthcare has received increasing attention given the exponential growth of the human population and of medical expenses. It is well known that an effective health monitoring system can detect abnormalities in health conditions in time and make diagnoses according to the gleaned data. As a vital approach to diagnosing heart disease, ECG monitoring is widely studied and applied. However, nearly all existing portable ECG monitoring systems cannot work without a mobile application, which is responsible for data collection and display. In this paper, we propose a new method for ECG monitoring based on Internet-of-Things (IoT) techniques. ECG data are gathered using a wearable monitoring node and are transmitted directly to the IoT cloud using Wi-Fi. Both the HTTP and MQTT protocols are employed in the IoT cloud in order to provide visual and timely ECG data to users. Nearly all smart terminals with a web browser can acquire ECG data conveniently, which greatly alleviates the cross-platform issue. Experiments are carried out on healthy volunteers in order to verify the reliability of the entire system. Experimental results reveal that the proposed system is reliable in collecting and displaying real-time ECG data, which can aid in the primary diagnosis of certain heart diseases.
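    The abstract does not specify the node-to-cloud message format, so as a sketch only, here is what a wearable node might publish over MQTT; the topic layout and JSON field names are hypothetical, and the actual network publish (e.g., via an MQTT client library) is omitted:

    ```python
    import json
    import time

    def ecg_message(device_id, samples_mv, fs_hz=250):
        """Build the (topic, payload) pair a wearable ECG node could publish.

        Topic layout and field names are illustrative assumptions.
        """
        topic = f"ecg/{device_id}/waveform"
        payload = json.dumps({
            "ts": int(time.time()),  # epoch seconds at start of the window
            "fs": fs_hz,             # sampling rate in Hz
            "mv": samples_mv,        # waveform samples in millivolts
        })
        return topic, payload

    topic, payload = ecg_message("node-01", [0.12, 0.34, 1.02, 0.25])
    ```

    Keeping the payload as plain JSON is what lets both the MQTT path (for live subscribers) and the HTTP path (for browsers) serve the same data without transcoding.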

  18. Cloud-enabled large-scale land surface model simulations with the NASA Land Information System

    Science.gov (United States)

    Duffy, D.; Vaughan, G.; Clark, M. P.; Peters-Lidard, C. D.; Nijssen, B.; Nearing, G. S.; Rheingrover, S.; Kumar, S.; Geiger, J. V.

    2017-12-01

    Developed by the Hydrological Sciences Laboratory at NASA Goddard Space Flight Center (GSFC), the Land Information System (LIS) is a high-performance software framework for terrestrial hydrology modeling and data assimilation. LIS provides the ability to integrate satellite and ground-based observational products and advanced modeling algorithms to extract land surface states and fluxes. Through a partnership with the National Center for Atmospheric Research (NCAR) and the University of Washington, the LIS model is currently being extended to include the Structure for Unifying Multiple Modeling Alternatives (SUMMA). With the addition of SUMMA in LIS, meaningful simulations containing a large multi-model ensemble will be enabled and can provide advanced probabilistic continental-domain modeling capabilities at spatial scales relevant for water managers. The resulting LIS/SUMMA application framework is difficult for non-experts to install due to its large number of dependencies on specific versions of operating systems, libraries, and compilers. This has created a significant barrier to entry for domain scientists interested in using the software on their own systems or in the cloud. In addition, the requirement to support multiple runtime environments across the LIS community has created a significant burden on the NASA team. To overcome these challenges, LIS/SUMMA has been deployed using Linux containers, which allow an entire software package along with all dependencies to be installed within a working runtime environment, and Kubernetes, which orchestrates the deployment of a cluster of containers. Within a cloud environment, users can now easily create a cluster of virtual machines and run large-scale LIS/SUMMA simulations. Installations that had taken weeks and months can now be performed in minutes. This presentation will discuss the steps required to create a cloud-enabled large-scale simulation, present examples of its use, and

  19. Clouds and Earth Radiant Energy System (CERES), a Review: Past, Present and Future

    Science.gov (United States)

    Smith, G. L.; Priestley, K. J.; Loeb, N. G.; Wielicki, B. A.; Charlock, T. P.; Minnis, P.; Doelling, D. R.; Rutan, D. A.

    2011-01-01

    The Clouds and Earth Radiant Energy System (CERES) project's objectives are to measure the reflected solar (shortwave) and Earth-emitted (longwave) radiances and from these measurements to compute the shortwave and longwave radiation fluxes at the top of the atmosphere (TOA) and the surface, as well as the radiation divergence within the atmosphere. The fluxes at TOA are to be retrieved to an accuracy of 2%. Improved bidirectional reflectance distribution functions (BRDFs) have been developed to compute the fluxes at TOA from the measured radiances, with errors reduced from ERBE by a factor of two or more. Instruments aboard the Terra and Aqua spacecraft provide sampling at four local times. To further reduce temporal sampling errors, data from the geostationary meteorological satellites are used to account for changes of scenes between observations by the CERES radiometers. A validation protocol including in-flight calibrations and comparisons of measurements has reduced the instrument errors to less than 1%. The data are processed through three editions: the first edition provides a timely flow of data to investigators, and the third edition provides data products as accurate as possible with the resources available. A suite of cloud properties retrieved from the MODerate-resolution Imaging Spectroradiometer (MODIS) by the CERES team is used to identify the cloud properties for each pixel in order to select the BRDF for each pixel and thereby compute radiation fluxes from radiances. The cloud information is also used to compute radiation at the surface and through the atmosphere and to facilitate study of the relationship between clouds and the radiation budget. The data products from CERES include, in addition to the reflected solar and Earth-emitted radiation fluxes at TOA, the upward and downward shortwave and longwave radiation fluxes at the surface and at various levels in the atmosphere. Also at the surface the photosynthetically active radiation
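    The radiance-to-flux inversion at the heart of this retrieval has, in its standard ERBE/CERES form, a simple structure (a sketch of the general relation, not the project's exact angular distribution models):

    ```latex
    F(\theta_0) \;=\; \frac{\pi\, I(\theta_0, \theta, \phi)}{R(\theta_0, \theta, \phi)}
    ```

    Here $I$ is the measured radiance, $\theta_0$ the solar zenith angle, $\theta$ the viewing zenith angle, $\phi$ the relative azimuth, and $R$ the scene-type-dependent anisotropic factor derived from the BRDF. This is why the MODIS-derived cloud properties matter for flux accuracy: they determine the scene type, and hence which $R$ is applied to each pixel.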

  20. A Coupled GCM-Cloud Resolving Modeling System, and a Regional Scale Model to Study Precipitation Processes

    Science.gov (United States)

    Tao, Wei-Kuo

    2007-01-01

    Recent GEWEX Cloud System Study (GCSS) model comparison projects have indicated that cloud-resolving models (CRMs) agree with observations better than traditional single-column models in simulating various types of clouds and cloud systems from different geographic locations. Current and future NASA satellite programs can provide cloud, precipitation, aerosol and other data at very fine spatial and temporal scales. Using these satellite data to improve the understanding of the physical processes responsible for the variation in global and regional climate and hydrological systems requires a coupled global circulation model (GCM) and cloud-scale model (termed a superparameterization or multi-scale modeling framework, MMF). The use of a GCM enables global coverage, and the use of a CRM allows for better and more sophisticated physical parameterization. NASA satellite and field campaign cloud-related datasets can provide initial conditions as well as validation for both the MMF and CRMs. The Goddard MMF is based on the 2D Goddard Cumulus Ensemble (GCE) model and the Goddard finite-volume general circulation model (fvGCM), and it has started production runs with two years of results (1998 and 1999). Also, at Goddard, we have implemented several Goddard microphysical schemes (2ICE, several 3ICE), Goddard radiation (including explicitly calculated cloud optical properties), and the Goddard Land Information System (LIS, which includes the CLM and NOAH land surface models) into a next-generation regional scale model, WRF. In this talk, I will present: (1) a brief review of the GCE model and its applications to precipitation processes (microphysical and land processes), (2) the Goddard MMF, the major differences between the two existing MMFs (CSU MMF and Goddard MMF), and preliminary results (the comparison with traditional GCMs), and (3) a discussion of the Goddard WRF version (its developments and applications).

  1. Towards Scalability for Federated Identity Systems for Cloud-Based Environments

    Directory of Open Access Journals (Sweden)

    André Albino Pereira

    2015-12-01

    Full Text Available As multi-tenant authorization and federated identity management systems for cloud computing mature, the provisioning of services using this paradigm allows maximum efficiency for businesses that require access control. However, regarding scalability support, mainly horizontal, some characteristics of those approaches based on central authentication protocols are problematic. The objective of this work is to address these issues by providing an adapted sticky-session mechanism for a Shibboleth architecture using JASIG CAS. Compared with the recommended distributed-memory approach, this alternative showed improved efficiency and less overall infrastructure complexity, demanding 58% less computational resources and improving throughput (requests per second) by 11%.
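    The core idea of session affinity can be sketched in a few lines. This is an illustration of the general technique (hash-based affinity), not the paper's JASIG CAS/Shibboleth implementation, and the backend names are invented:

    ```python
    import hashlib

    BACKENDS = ["idp-1", "idp-2", "idp-3"]  # hypothetical identity-provider nodes

    def route(session_id, backends=BACKENDS):
        """Sticky routing: the same session token always maps to the same
        backend, so per-session SSO state need not be replicated across nodes.
        """
        digest = hashlib.sha256(session_id.encode()).hexdigest()
        return backends[int(digest, 16) % len(backends)]
    ```

    The trade-off the paper quantifies follows directly: with affinity, no distributed session memory is needed (less infrastructure complexity and fewer resources), at the cost of sessions being pinned to a node if it fails.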

  2. CLOUD COMPUTING SOLUTION FOR ABORTION TRACKING SYSTEM USING FINGERPRINT & RFID AUTHENTICATION

    OpenAIRE

    Jaykar Priyanka; More Sharada; Thube Swapnil; Prof. Supriya Dinesh

    2016-01-01

    According to the 2011 census, the child sex ratio declined in both rural and urban areas of the country, and the decline in rural India was three times the drop in urban areas, a matter of great concern. In 2011 there were 914 girls for every 1000 boys under the age of seven. To help stop sex-selective abortion, we have developed a 'Cloud Computing Solution for an Abortion Tracking System'. When a pregnant woman comes to the hospital, she has to register using fingerprint and RFID, which is connected to...

  3. NASA Wrangler: Automated Cloud-Based Data Assembly in the RECOVER Wildfire Decision Support System

    Science.gov (United States)

    Schnase, John; Carroll, Mark; Gill, Roger; Wooten, Margaret; Weber, Keith; Blair, Kindra; May, Jeffrey; Toombs, William

    2017-01-01

    NASA Wrangler is a loosely-coupled, event-driven, highly parallel data aggregation service designed to take advantage of the elastic resource capabilities of cloud computing. Wrangler automatically collects Earth observational data, climate model outputs, derived remote sensing data products, and historic biophysical data for pre-, active-, and post-wildfire decision making. It is a core service of the RECOVER decision support system, which is providing rapid-response GIS analytic capabilities to state and local government agencies. Wrangler reduces to minutes the time needed to assemble and deliver crucial wildfire-related data.

  4. Moving towards Cloud Security

    Directory of Open Access Journals (Sweden)

    Edit Szilvia Rubóczki

    2015-01-01

    Full Text Available Cloud computing hosts and delivers many different services via the Internet. There are many reasons why people opt for using cloud resources. Cloud development is advancing fast while many related services lag behind, for example mass awareness of cloud security. The new generation uploads videos and pictures to cloud storage without a second thought, but only a few know about data privacy, data management and the ownership of data stored in the cloud. In an enterprise environment, users have to know the rules of cloud usage, yet they have little knowledge of traditional IT security. It is important to measure the level of their knowledge and to evolve the training system to develop security awareness. The article argues for the importance of new metrics and algorithms for measuring the security awareness of corporate users and employees, so as to include the requirements of emerging cloud security.

  5. Cloud Computing for radiologists.

    Science.gov (United States)

    Kharat, Amit T; Safvi, Amjad; Thind, Ss; Singh, Amarjit

    2012-07-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software and hardware on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets radiology free from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future.

  6. Cloud Computing for radiologists

    International Nuclear Information System (INIS)

    Kharat, Amit T; Safvi, Amjad; Thind, SS; Singh, Amarjit

    2012-01-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software and hardware on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets radiology free from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future.

  7. Cloud computing for radiologists

    Directory of Open Access Journals (Sweden)

    Amit T Kharat

    2012-01-01

    Full Text Available Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software and hardware on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets radiology free from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future.

  8. A robust real-time surface reconstruction method on point clouds captured from a 3D surface photogrammetry system.

    Science.gov (United States)

    Liu, Wenyang; Cheung, Yam; Sawant, Amit; Ruan, Dan

    2016-05-01

    To develop a robust and real-time surface reconstruction method on point clouds captured from a 3D surface photogrammetry system. The authors have developed a robust and fast surface reconstruction method on point clouds acquired by the photogrammetry system, without explicitly solving the partial differential equation required by a typical variational approach. Taking advantage of the overcomplete nature of the acquired point clouds, their method solves and propagates a sparse linear relationship from the point cloud manifold to the surface manifold, assuming both manifolds share similar local geometry. With relatively consistent point cloud acquisitions, the authors propose a sparse regression (SR) model to directly approximate the target point cloud as a sparse linear combination from the training set, assuming that the point correspondences built by the iterative closest point (ICP) algorithm are reasonably accurate and have residual errors following a Gaussian distribution. To accommodate changing noise levels and/or the presence of inconsistent occlusions during acquisition, the authors further propose a modified sparse regression (MSR) model to model the potentially large and sparse error built by ICP with a Laplacian prior. The authors evaluated the proposed method on both clinical point clouds acquired under consistent acquisition conditions and on point clouds with inconsistent occlusions. The authors quantitatively evaluated the reconstruction performance with respect to root-mean-squared error, by comparing its reconstruction results against those from the variational method. On clinical point clouds, both the SR and MSR models achieved sub-millimeter reconstruction accuracy and reduced the reconstruction time by two orders of magnitude, to a subsecond reconstruction time. On point clouds with inconsistent occlusions, the MSR model demonstrated its advantage in achieving consistent and robust performance despite the introduced occlusions.
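    The SR model's core idea, approximating the target point cloud as a sparse linear combination of training clouds, can be illustrated with a greedy orthogonal matching pursuit over vectorized training clouds. This is a minimal sketch of the general technique, not the authors' solver; the function name `sr_reconstruct` and the sparsity parameter `k` are assumed for illustration:

```python
import numpy as np

def sr_reconstruct(D, y, k=3):
    """Approximate target cloud y (vectorized) as a sparse linear
    combination of training clouds (columns of D) via greedy
    orthogonal matching pursuit."""
    residual = y.copy()
    support = []
    coef = np.zeros(0)
    for _ in range(k):
        # pick the training cloud most correlated with the residual
        j = int(np.argmax(np.abs(D.T @ residual)))
        if j not in support:
            support.append(j)
        # least-squares refit of the coefficients on the current support
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    x = np.zeros(D.shape[1])
    x[support] = coef
    return x
```

A target that truly is a sparse combination of the training set is recovered almost exactly; real acquisitions would add the Gaussian (SR) or Laplacian (MSR) error terms on top of this.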

  9. A robust real-time surface reconstruction method on point clouds captured from a 3D surface photogrammetry system

    International Nuclear Information System (INIS)

    Liu, Wenyang; Cheung, Yam; Sawant, Amit; Ruan, Dan

    2016-01-01

    Purpose: To develop a robust and real-time surface reconstruction method on point clouds captured from a 3D surface photogrammetry system. Methods: The authors have developed a robust and fast surface reconstruction method on point clouds acquired by the photogrammetry system, without explicitly solving the partial differential equation required by a typical variational approach. Taking advantage of the overcomplete nature of the acquired point clouds, their method solves and propagates a sparse linear relationship from the point cloud manifold to the surface manifold, assuming both manifolds share similar local geometry. With relatively consistent point cloud acquisitions, the authors propose a sparse regression (SR) model to directly approximate the target point cloud as a sparse linear combination from the training set, assuming that the point correspondences built by the iterative closest point (ICP) algorithm are reasonably accurate and have residual errors following a Gaussian distribution. To accommodate changing noise levels and/or the presence of inconsistent occlusions during acquisition, the authors further propose a modified sparse regression (MSR) model to model the potentially large and sparse error built by ICP with a Laplacian prior. The authors evaluated the proposed method on both clinical point clouds acquired under consistent acquisition conditions and on point clouds with inconsistent occlusions. The authors quantitatively evaluated the reconstruction performance with respect to root-mean-squared error, by comparing its reconstruction results against those from the variational method. Results: On clinical point clouds, both the SR and MSR models achieved sub-millimeter reconstruction accuracy and reduced the reconstruction time by two orders of magnitude, to a subsecond reconstruction time. On point clouds with inconsistent occlusions, the MSR model demonstrated its advantage in achieving consistent and robust performance despite the introduced occlusions.

  10. A robust real-time surface reconstruction method on point clouds captured from a 3D surface photogrammetry system

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Wenyang [Department of Bioengineering, University of California, Los Angeles, Los Angeles, California 90095 (United States); Cheung, Yam [Department of Radiation Oncology, University of Texas Southwestern, Dallas, Texas 75390 (United States); Sawant, Amit [Department of Radiation Oncology, University of Texas Southwestern, Dallas, Texas, 75390 and Department of Radiation Oncology, University of Maryland, College Park, Maryland 20742 (United States); Ruan, Dan, E-mail: druan@mednet.ucla.edu [Department of Bioengineering, University of California, Los Angeles, Los Angeles, California 90095 and Department of Radiation Oncology, University of California, Los Angeles, Los Angeles, California 90095 (United States)

    2016-05-15

    Purpose: To develop a robust and real-time surface reconstruction method on point clouds captured from a 3D surface photogrammetry system. Methods: The authors have developed a robust and fast surface reconstruction method on point clouds acquired by the photogrammetry system, without explicitly solving the partial differential equation required by a typical variational approach. Taking advantage of the overcomplete nature of the acquired point clouds, their method solves and propagates a sparse linear relationship from the point cloud manifold to the surface manifold, assuming both manifolds share similar local geometry. With relatively consistent point cloud acquisitions, the authors propose a sparse regression (SR) model to directly approximate the target point cloud as a sparse linear combination from the training set, assuming that the point correspondences built by the iterative closest point (ICP) algorithm are reasonably accurate and have residual errors following a Gaussian distribution. To accommodate changing noise levels and/or the presence of inconsistent occlusions during acquisition, the authors further propose a modified sparse regression (MSR) model to model the potentially large and sparse error built by ICP with a Laplacian prior. The authors evaluated the proposed method on both clinical point clouds acquired under consistent acquisition conditions and on point clouds with inconsistent occlusions. The authors quantitatively evaluated the reconstruction performance with respect to root-mean-squared error, by comparing its reconstruction results against those from the variational method. Results: On clinical point clouds, both the SR and MSR models achieved sub-millimeter reconstruction accuracy and reduced the reconstruction time by two orders of magnitude, to a subsecond reconstruction time. On point clouds with inconsistent occlusions, the MSR model demonstrated its advantage in achieving consistent and robust performance despite the introduced occlusions.

  11. Evaluating Cloud Computing in the Proposed NASA DESDynI Ground Data System

    Science.gov (United States)

    Tran, John J.; Cinquini, Luca; Mattmann, Chris A.; Zimdars, Paul A.; Cuddy, David T.; Leung, Kon S.; Kwoun, Oh-Ig; Crichton, Dan; Freeborn, Dana

    2011-01-01

    The proposed NASA Deformation, Ecosystem Structure and Dynamics of Ice (DESDynI) mission would be a first-of-breed endeavor that would fundamentally change the paradigm by which Earth Science data systems at NASA are built. DESDynI is evaluating a distributed architecture where expert science nodes around the country all engage in some form of mission processing and data archiving. This is compared to the traditional NASA Earth Science missions where the science processing is typically centralized. What's more, DESDynI is poised to profoundly increase the amount of data collection and processing, well into the 5 terabyte/day and tens-of-thousands-of-jobs range, both of which pose a tremendous challenge to DESDynI's proposed distributed data system architecture. In this paper, we report on a set of architectural trade studies and benchmarks meant to inform the DESDynI mission and the broader community of the impacts of these unprecedented requirements. In particular, we evaluate the benefits of cloud computing and its integration with our existing NASA ground data system software called Apache Object Oriented Data Technology (OODT). The preliminary conclusions of our study suggest that the use of the cloud and OODT together synergistically form an effective, efficient and extensible combination that could meet the challenges of NASA science missions requiring DESDynI-like data collection and processing volumes at reduced costs.

  12. EVALUATION MODEL FOR PAVEMENT SURFACE DISTRESS ON 3D POINT CLOUDS FROM MOBILE MAPPING SYSTEM

    Directory of Open Access Journals (Sweden)

    K. Aoki

    2012-07-01

    Full Text Available This paper proposes a methodology to evaluate pavement surface distress for road-pavement maintenance planning using 3D point clouds from a Mobile Mapping System (MMS). Maintenance planning of road pavement requires scheduled rehabilitation of damaged pavement sections to keep a high level of service, and the importance of performance-based infrastructure asset management grounded in actual inspection data is globally recognized. As an inspection methodology for the road pavement surface, semi-automatic measurement systems using inspection vehicles to measure surface deterioration indexes, such as cracking, rutting and IRI, have already been introduced and are capable of continuously archiving pavement performance data. However, scheduled inspection with an automatic measurement vehicle is costly, depending on the instruments' specifications and the inspection interval, so implementing road maintenance work, especially for local governments, is difficult from a cost-effectiveness standpoint. Against this background, this research proposes methodologies for a simplified evaluation of the pavement surface and an assessment of damaged pavement sections using 3D point cloud data built for urban 3D modelling. The simplified evaluation results of the road surface were able to provide useful information for road administrators to identify pavement sections requiring a detailed examination or an immediate repair. In particular, the regularity of the enumerated 3D point clouds was evaluated using Chow-test and F-test models, extracting the sections where a structural change in the coordinate values was marked. Finally, the validity of the methodology was investigated by conducting a case study dealing with actual inspection data from local roads.
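    The Chow-test idea used above, comparing the fit of one pooled regression against separate fits on the two sub-sections, can be sketched for a simple linear model. This is a generic illustration of the statistic, not the paper's implementation; `chow_statistic` and its arguments are assumed names:

```python
import numpy as np

def ssr(x, y):
    # sum of squared residuals of an ordinary least-squares line fit
    A = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    r = y - A @ beta
    return float(r @ r)

def chow_statistic(x, y, split):
    """Chow test statistic for a structural break at index `split`
    in the simple linear model y = a + b*x (k = 2 parameters).
    Large values indicate that two separate fits explain the data
    much better than one pooled fit."""
    k = 2
    s_pooled = ssr(x, y)
    s1, s2 = ssr(x[:split], y[:split]), ssr(x[split:], y[split:])
    n = len(x)
    return ((s_pooled - (s1 + s2)) / k) / ((s1 + s2) / (n - 2 * k))
```

A section of road with a sudden jump in coordinate values would yield a large statistic at the break point, flagging it for detailed inspection.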

  13. CANFAR+Skytree: A Cloud Computing and Data Mining System for Astronomy

    Science.gov (United States)

    Ball, N. M.

    2013-10-01

    To date, computing systems have allowed either sophisticated analysis of small datasets, as exemplified by most astronomy software, or simple analysis of large datasets, such as database queries. At the Canadian Astronomy Data Centre, we have combined our cloud computing system, the Canadian Advanced Network for Astronomical Research (CANFAR), with the world's most advanced machine learning software, Skytree, to create the world's first cloud computing system for data mining in astronomy. CANFAR provides a generic environment for the storage and processing of large datasets, removing the requirement for an individual or project to set up and maintain a computing system when implementing an extensive undertaking such as a survey pipeline. 500 processor cores and several hundred terabytes of persistent storage are currently available to users, and both the storage and processing infrastructure are expandable. The storage is implemented via the International Virtual Observatory Alliance's VOSpace protocol, and is available as a mounted filesystem accessible both interactively, and to all processing jobs. The user interacts with CANFAR by utilizing virtual machines, which appear to them as equivalent to a desktop. Each machine is replicated as desired to perform large-scale parallel processing. Such an arrangement enables the user to immediately install and run the same astronomy code that they already utilize, in the same way as on a desktop. In addition, unlike many cloud systems, batch job scheduling is handled for the user on multiple virtual machines by the Condor job queueing system. Skytree is installed and run just as any other software on the system, and thus acts as a library of command line data mining functions that can be integrated into one's wider analysis. Thus we have created a generic environment for large-scale analysis by data mining, in the same way that CANFAR itself has done for storage and processing. Because Skytree scales to large data in

  14. Cloud Cover Assessment for Operational Crop Monitoring Systems in Tropical Areas

    Directory of Open Access Journals (Sweden)

    Isaque Daniel Rocha Eberhardt

    2016-03-01

    Full Text Available The potential of optical remote sensing data to identify, map and monitor croplands is well recognized. However, clouds strongly limit the usefulness of optical imagery for these applications. This paper aims at assessing cloud cover conditions over four states in the tropical and sub-tropical Center-South region of Brazil to guide the development of an appropriate agricultural monitoring system based on Landsat-like imagery. Cloudiness was assessed during overlapping four-month periods to match the typical length of crop cycles in the study area. The percentage of clear sky occurrence was computed from the 1 km resolution MODIS Cloud Mask product (MOD35) considering 14 years of data between July 2000 and June 2014. Results showed high seasonality of cloud occurrence within the crop year with strong variations across the study area. The maximum seasonality was observed for the two states in the northern part of the study area (i.e., the ones closer to the Equator line), which also presented the lowest averaged values (15% of clear sky occurrence) during the main (summer) cropping period (November to February). In these locations, optical data faces severe constraints for mapping summer crops. On the other hand, relatively favorable conditions were found in the southern part of the study region. In the South, clear sky values of around 45% were found and no significant clear sky seasonality was observed. Results underpin the challenges to implement an operational crop monitoring system based solely on optical remote sensing imagery in tropical and sub-tropical regions, in particular if short-cycle crops have to be monitored during the cloudy summer months. To cope with cloudiness issues, we recommend the use of new systems with higher repetition rates such as Sentinel-2. For local studies, Unmanned Aircraft Vehicles (UAVs) might be used to augment the observing capability. Multi-sensor approaches combining optical and microwave data can be another
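    The clear-sky statistic described above, the per-pixel percentage of clear observations within a fixed window of calendar months, reduces to a masked mean over the time axis. A minimal sketch, with the function and argument names assumed for illustration (this is not the MOD35 processing chain):

```python
import numpy as np

def clear_sky_fraction(clear_mask, months, window):
    """Per-pixel fraction of clear-sky observations falling in a set
    of calendar months (e.g. the November-February cropping window).
    clear_mask: (time, y, x) boolean array, True = clear sky
    months:     (time,) array of the calendar month of each observation
    window:     iterable of calendar months to include"""
    sel = np.isin(months, list(window))
    return clear_mask[sel].mean(axis=0)
```

Applied to 14 years of daily MOD35-style masks, the same reduction yields the seasonal clear-sky percentages discussed in the abstract.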

  15. Usage of Cloud Computing Simulators and Future Systems For Computational Research

    OpenAIRE

    Lakshminarayanan, Ramkumar; Ramalingam, Rajasekar

    2016-01-01

    Cloud Computing is Internet-based computing, whereby shared resources, software, and information are provided to computers and devices on demand, like the electricity grid. Currently, IaaS (Infrastructure as a Service), PaaS (Platform as a Service), and SaaS (Software as a Service) are used as business models for Cloud Computing. Nowadays, the adoption and deployment of Cloud Computing is increasing in various domains, forcing researchers to conduct research in the area of Cloud Computing ...

  16. Survey for Sensor-Cloud System from Business Process Outsourcing Perspective

    OpenAIRE

    Kim, JeongYeon

    2015-01-01

    Cloud computing is a new IT trend to meet the new business requirements such as business agility and operational efficiency with business process outsourcing (BPO). Sensor-Cloud infrastructure is the extended form of cloud computing to manage the sensors which are scattered throughout the network. Several benefits of adopting cloud computing including cost saving, high scalability, and business risk reductions also can be applied to sensor data collection. As a first investment for new techno...

  17. Cloud Based Educational Systems and Its Challenges and Opportunities and Issues

    OpenAIRE

    PAUL, Prantosh Kr.; DANGWAL, Kiran LATA

    2014-01-01

    Cloud Computing (CC) is actually a set of hardware, software, networks, storage, and services that an interface combines to deliver aspects of computing as a service. Cloud Computing (CC) actually uses central remote servers to maintain data and applications. Practically, Cloud Computing (CC) is an extension of Grid computing with independency and smarter tools and technological gradients. Healthy Cloud Computing helps in the sharing of software, hardware, applications and other packages with the help o...

  18. Analytical Approach for Analyzing Trusted Security System for Data Sharing in Cloud Environment

    OpenAIRE

    Anand Srivastava; Surendra Mishra; Pankaj Kawadkar

    2011-01-01

    Cheap, seemingly unlimited computing resources that can be allocated almost instantaneously and pay-as-you-go pricing schemes are some of the reasons for the success of Cloud computing. In this paper we discuss a few aspects of cloud computing and their application areas. Cloud computing has been acknowledged as one of the prevailing models for providing IT capacities. The computing paradigm that comes with cloud computing has incurred great concerns on the security of data, especially the integrity an...

  19. Evaluating Microphysics in Cloud-Resolving Models using TRMM and Ground-based Precipitation Radar Observations

    Science.gov (United States)

    Krueger, S. K.; Zulauf, M. A.; Li, Y.; Zipser, E. J.

    2005-05-01

    Global satellite datasets such as those produced by ISCCP, ERBE, and CERES provide strong observational constraints on cloud radiative properties. Such observations have been widely used for model evaluation, tuning, and improvement. Cloud radiative properties depend primarily on small, non-precipitating cloud droplets and ice crystals, yet the dynamical, microphysical and radiative processes which produce these small particles often involve large, precipitating hydrometeors. There now exists a global dataset of tropical cloud system precipitation feature (PF) properties, collected by TRMM and produced by Steve Nesbitt, that provides additional observational constraints on cloud system properties. We are using the TRMM PF dataset to evaluate the precipitation microphysics of two simulations of deep, precipitating, convective cloud systems: one is a 29-day summertime, continental case (ARM Summer 1997 SCM IOP, at the Southern Great Plains site); the second is a tropical maritime case: the Kwajalein MCS of 11-12 August 1999 (part of a 52-day simulation). Both simulations employed the same bulk, three-ice category microphysical parameterization (Krueger et al. 1995). The ARM simulation was executed using the UCLA/Utah 2D CRM, while the KWAJEX simulation was produced using the 3D CSU CRM (SAM). The KWAJEX simulation described above is compared with both the actual radar data and the TRMM statistics. For the Kwajalein MCS of 11 to 12 August 1999, there are research radar data available for the lifetime of the system. This particular MCS was large in size and rained heavily, but it was weak to average in measures of convective intensity, against the 5-year TRMM sample of 108. For the Kwajalein MCS simulation, the 20 dBZ contour is at 15.7 km and the 40 dBZ contour at 14.5 km! Of all 108 MCSs observed by TRMM, the highest value for the 40 dBZ contour is 8 km. Clearly, the high reflectivity cores are off scale compared with observed cloud systems in this area. A similar

  20. Cloud Based Educational Systems and Its Challenges and Opportunities and Issues

    Science.gov (United States)

    Paul, Prantosh Kr.; Lata Dangwal, Kiran

    2014-01-01

    Cloud Computing (CC) is actually a set of hardware, software, networks, storage, and services that an interface combines to deliver aspects of computing as a service. Cloud Computing (CC) actually uses central remote servers to maintain data and applications. Practically, Cloud Computing (CC) is an extension of Grid computing with independency and…

  1. Relationships among cloud occurrence frequency, overlap, and effective thickness derived from CALIPSO and CloudSat merged cloud vertical profiles

    Science.gov (United States)

    Kato, Seiji; Sun-Mack, Sunny; Miller, Walter F.; Rose, Fred G.; Chen, Yan; Minnis, Patrick; Wielicki, Bruce A.

    2010-01-01

    A cloud frequency of occurrence matrix is generated using merged cloud vertical profiles derived from the satellite-borne Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) and cloud profiling radar. The matrix contains vertical profiles of cloud occurrence frequency as a function of the uppermost cloud top. It is shown that the cloud fraction and uppermost cloud top vertical profiles can be related by a cloud overlap matrix when the correlation length of cloud occurrence, which is interpreted as an effective cloud thickness, is introduced. The underlying assumption in establishing the above relation is that cloud overlap approaches random overlap with increasing distance separating cloud layers and that the probability of deviating from random overlap decreases exponentially with distance. One month of Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) and CloudSat data (July 2006) support these assumptions, although the correlation length sometimes increases with separation distance when the cloud top height is large. The data also show that the correlation length depends on cloud top height, and the maximum occurs when the cloud top height is 8 to 10 km. The cloud correlation length is equivalent to the decorrelation distance introduced by Hogan and Illingworth (2000) when the cloud fractions of both layers in a two-cloud layer system are the same. The simple relationships derived in this study can be used to estimate the top-of-atmosphere irradiance difference caused by cloud fraction, uppermost cloud top, and cloud thickness vertical profile differences.
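    The assumed exponential decay toward random overlap is commonly written, following Hogan and Illingworth (2000), as C = αC_max + (1 − α)C_rand with α = exp(−Δz/z₀), where z₀ is the decorrelation (correlation) length. A minimal sketch of that blend, with the function and argument names assumed:

```python
import math

def combined_cover(c1, c2, dz, z0):
    """Total cloud cover of two layers with fractions c1 and c2,
    separated by distance dz, blending maximum and random overlap
    with decorrelation length z0 (Hogan & Illingworth, 2000)."""
    alpha = math.exp(-dz / z0)       # overlap correlation: 1 at dz=0
    c_max = max(c1, c2)              # maximum (fully correlated) overlap
    c_rand = c1 + c2 - c1 * c2       # random (uncorrelated) overlap
    return alpha * c_max + (1 - alpha) * c_rand
```

At zero separation the layers overlap maximally; as dz grows past z₀ the combined cover approaches the random-overlap limit, which is the behavior the CALIPSO/CloudSat profiles are used to test.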

  2. Stereovision-based integrated system for point cloud reconstruction and simulated brain shift validation.

    Science.gov (United States)

    Yang, Xiaochen; Clements, Logan W; Luo, Ma; Narasimhan, Saramati; Thompson, Reid C; Dawant, Benoit M; Miga, Michael I

    2017-07-01

    Intraoperative soft tissue deformation, referred to as brain shift, compromises the application of current image-guided surgery navigation systems in neurosurgery. A computational model driven by sparse data has been proposed as a cost-effective method to compensate for cortical surface and volumetric displacements. We present a mock environment developed to acquire stereoimages from a tracked operating microscope and to reconstruct three-dimensional point clouds from these images. A reconstruction error of 1 mm is estimated by using a phantom with a known geometry and independently measured deformation extent. The microscope is tracked via an attached tracking rigid body that facilitates the recording of the position of the microscope via a commercial optical tracking system as it moves during the procedure. Point clouds, reconstructed under different microscope positions, are registered into the same space to compute the feature displacements. Using our mock craniotomy device, realistic cortical deformations are generated. When comparing our tracked microscope stereo-pair measure of mock vessel displacements to that of the measurement determined by the independent optically tracked stylus marking, the displacement error was [Formula: see text] on average. These results demonstrate the practicality of using tracked stereoscopic microscope as an alternative to laser range scanners to collect sufficient intraoperative information for brain shift correction.
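    Registering point clouds reconstructed under different microscope positions into a common space is, once correspondences are known, the classic least-squares rigid alignment problem. A minimal Kabsch-style sketch of that step (an illustration under the known-correspondence assumption, not the authors' tracked-registration pipeline; `rigid_register` is an assumed name):

```python
import numpy as np

def rigid_register(P, Q):
    """Least-squares rigid transform (R, t) mapping point set P onto Q,
    assuming known one-to-one correspondences (Kabsch algorithm).
    P, Q: (n, 3) arrays of corresponding points."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)            # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])           # guard against reflections
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t
```

Feature displacements such as the mock vessel shifts can then be measured as residuals between clouds after aligning them with the recovered transform.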

  3. High-pressure cloud point data for the system glycerol + olive oil + n-butane + AOT

    Directory of Open Access Journals (Sweden)

    J. P. Bender

    2008-09-01

    Full Text Available This work reports high-pressure cloud point data for the quaternary system glycerol + olive oil + n-butane + AOT surfactant. The static synthetic method, using a variable-volume view cell, was employed to obtain the experimental data at pressures up to 27 MPa. The effects of glycerol/olive oil concentration and surfactant addition on the pressure transition values were evaluated in the temperature range from 303 K to 343 K. For the system investigated, vapor-liquid (VLE), liquid-liquid (LLE), and vapor-liquid-liquid (VLLE) equilibria were recorded. It was experimentally observed that, at a given temperature and surfactant content, an increase in the glycerol/oil ratio led to a pronounced increase in the slope of the liquid-liquid coexistence curve. A comparison with results reported for the same system using propane as solvent showed that much lower pressure transition values are obtained when using n-butane.

  4. Design of the Hospital Integrated Information Management System Based on Cloud Platform.

    Science.gov (United States)

    Aijing, L; Jin, Y

    2015-12-01

    At present, the outdated information management style cannot meet the needs of hospital management and has become a bottleneck to hospitals' management and development. In order to improve integrated information management, hospitals have increased their investment in integrated information management systems. Owing to a lack of reasonable, scientific design, some hospital integrated information management systems share common problems, such as unfriendly interfaces, poor portability and maintainability, low security and efficiency, and a lack of interactivity and information sharing. To solve these problems, this paper carries out the research and design of a hospital information management system based on a cloud platform, which can realize the optimized integration of hospital information resources and save money.

  5. Towards large-scale data analysis: challenges in the design of portable systems and use of Cloud computing.

    Science.gov (United States)

    Diaz, Javier; Arrizabalaga, Saioa; Bustamante, Paul; Mesa, Iker; Añorga, Javier; Goya, Jon

    2013-01-01

    Portable systems and global communications open a broad spectrum for new health applications. In the framework of electrophysiological applications, several challenges are faced when developing portable systems embedded in Cloud computing services. In order to facilitate new developers in this area based on our experience, five areas of interest are presented in this paper where strategies can be applied for improving the performance of portable systems: transducer and conditioning, processing, wireless communications, battery and power management. Likewise, for Cloud services, scalability, portability, privacy and security guidelines have been highlighted.

  6. The Cloud-Aerosol Transport System (CATS): A New Earth Science Capability for ISS (Invited)

    Science.gov (United States)

    McGill, M. J.; Yorks, J. E.; Scott, S.; Kupchock, A.; Selmer, P.

    2013-12-01

    The Cloud-Aerosol Transport System (CATS) is a lidar remote sensing instrument developed for deployment to the International Space Station (ISS). The CATS lidar will provide range-resolved profile measurements of atmospheric aerosol and cloud distributions and properties. The CATS instrument uses a high repetition rate laser operating at three wavelengths (1064, 532, and 355 nm) to derive properties of cloud/aerosol layers including: layer height, layer thickness, backscatter, optical depth, extinction, and depolarization-based discrimination of particle type. The CATS mission was designed to capitalize on the Space Station's unique orbit and facilities to continue existing Earth Science data records, to provide observational data for use in forecast models, and to demonstrate new technologies for use in future missions. The CATS payload will be installed on the Japanese Experiment Module - Exposed Facility (JEM-EF). The payload is designed to operate on-orbit for at least six months, and up to three years. The payload is completed and currently scheduled for a mid-2014 launch. The ISS and, in particular, the JEM-EF, is an exciting new platform for spaceborne Earth observations. The ability to leverage existing aircraft instrument designs coupled with the lower cost possible for ISS external attached payloads permits rapid and cost effective development of spaceborne sensors. The CATS payload is based on existing instrumentation built and operated on the high-altitude NASA ER-2 aircraft. The payload is housed in a 1.5 m x 1 m x 0.8 m volume that attaches to the JEM-EF. The allowed volume limits the maximum size for the collecting telescope to 60 cm diameter. Figure 1 shows a schematic layout of the CATS payload, with the primary instrument components identified. Figure 2 is a photo of the completed payload.

  7. Cloud Storage Services

    OpenAIRE

    Yan, Cheng

    2017-01-01

    Cloud computing is a hot topic in recent research and applications because it is widely used in various fields. To date, Google, Microsoft, IBM, Amazon and other major companies have proposed their own cloud computing applications and regard cloud computing as one of the most important strategies for the future. Cloud storage is the lower layer of a cloud computing system, supporting the services of the layers above it. At the same time, it is an effective way to store and manage heavy...

  8. Cloud Computing Quality

    Directory of Open Access Journals (Sweden)

    Anamaria Şiclovan

    2013-02-01

    Full Text Available Cloud computing is, and will remain, a new way of providing Internet and computing services. This approach builds on many existing services, such as the Internet, grid computing and Web services. As a system, cloud computing aims to provide on-demand services at a more acceptable price and infrastructure. It is precisely the transition from the computer to a service offered to consumers as a product delivered online. This paper describes the quality of cloud computing services, analyzing the advantages and characteristics they offer. It is a theoretical paper. Keywords: Cloud computing, QoS, quality of cloud computing

  9. Formation of giant molecular clouds in global spiral structures: the role of orbital dynamics and cloud-cloud collisions

    International Nuclear Information System (INIS)

    Roberts, W.W. Jr.; Stewart, G.R.

    1987-01-01

    The different roles played by orbital dynamics and dissipative cloud-cloud collisions in the formation of giant molecular clouds (GMCs) in a global spiral structure are investigated. The interstellar medium (ISM) is simulated by a system of particles, representing clouds, which orbit in a spiral-perturbed, galactic gravitational field. The overall magnitude and width of the global cloud density distribution in spiral arms are very similar in the collisional and collisionless simulations. The results suggest that the assumed number density and size distribution of clouds and the details of individual cloud-cloud collisions have relatively little effect on these features. Dissipative cloud-cloud collisions play an important steadying role for the cloud system's global spiral structure. Dissipative cloud-cloud collisions also damp the relative velocity dispersion of clouds in massive associations and thereby aid in the effective assembling of GMC-like complexes.

  10. Retrievals of Ice Cloud Microphysical Properties of Deep Convective Systems using Radar Measurements

    Science.gov (United States)

    Tian, J.; Dong, X.; Xi, B.; Wang, J.; Homeyer, C. R.

    2015-12-01

    This study presents innovative algorithms for retrieving ice cloud microphysical properties of Deep Convective Systems (DCSs) using Next-Generation Radar (NEXRAD) reflectivity and newly derived empirical relationships from aircraft in situ measurements in Wang et al. (2015) during the Midlatitude Continental Convective Clouds Experiment (MC3E). With composite gridded NEXRAD radar reflectivity, four-dimensional (space-time) ice cloud microphysical properties of DCSs are retrieved, which is not possible from either in situ sampling at a single altitude or from vertically pointing radar measurements. For this study, aircraft in situ measurements provide the best-estimated ice cloud microphysical properties for validating the radar retrievals. Two statistical comparisons between retrieved and aircraft in situ measured ice microphysical properties are conducted for six selected cases during MC3E. For the temporal-averaged method, the averaged ice water content (IWC) and median mass diameter (Dm) from aircraft in situ measurements are 0.50 g m-3 and 1.51 mm, while the retrievals from radar reflectivity have negative biases of 0.12 g m-3 (24%) and 0.02 mm (1.3%) with correlations of 0.71 and 0.48, respectively. For the spatial-averaged method, the IWC retrievals are closer to the aircraft results (0.51 vs. 0.47 g m-3) with a positive bias of 8.5%, whereas the Dm retrievals are larger than the aircraft results (1.65 mm vs. 1.51 mm) with a positive bias of 9.3%. The retrieved IWCs decrease from ~0.6 g m-3 at 5 km to ~0.15 g m-3 at 13 km, and Dm values decrease from ~2 mm to ~0.7 mm at the same levels. In general, the aircraft in situ measured IWC and Dm values at each level are within one standard deviation of the retrieved properties. Good agreement between microphysical properties measured from aircraft and retrieved from radar reflectivity measurements indicates the reasonable accuracy of our retrievals.
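
Retrievals of this kind typically invert an empirical power-law relationship between radar reflectivity and ice water content. The sketch below illustrates the form of such a retrieval in Python; the coefficients `A_IWC` and `B_IWC` are illustrative placeholders, not the MC3E-derived values of Wang et al. (2015).

```python
import numpy as np

# Illustrative power-law coefficients for IWC = A * Z**B, where Z is the
# linear reflectivity (mm^6 m^-3). These are NOT the Wang et al. (2015)
# coefficients, only placeholders showing the shape of the retrieval.
A_IWC, B_IWC = 0.06, 0.57

def retrieve_iwc(dbz):
    """Retrieve ice water content (g m^-3) from reflectivity in dBZ."""
    z_linear = 10.0 ** (np.asarray(dbz, dtype=float) / 10.0)  # dBZ -> Z
    return A_IWC * z_linear ** B_IWC

# A hypothetical profile with reflectivity decreasing from 5 km to 13 km,
# which yields IWC decreasing with height, as in the study's retrievals.
profile_dbz = np.array([30.0, 25.0, 20.0, 15.0])
iwc_profile = retrieve_iwc(profile_dbz)
```

Because the exponent is positive, retrieved IWC decreases monotonically wherever reflectivity does, consistent with the vertical gradients reported in the abstract.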

  11. Modelling operations and security of cloud systems using Z-notation and Chinese Wall security policy

    Science.gov (United States)

    Basu, Srijita; Sengupta, Anirban; Mazumdar, Chandan

    2016-11-01

    Enterprises are increasingly using cloud computing for hosting their applications. Availability of fast Internet and cheap bandwidth are causing greater number of people to use cloud-based services. This has the advantage of lower cost and minimum maintenance. However, ensuring security of user data and proper management of cloud infrastructure remain major areas of concern. Existing techniques are either too complex, or fail to properly represent the actual cloud scenario. This article presents a formal cloud model using the constructs of Z-notation. Principles of the Chinese Wall security policy have been applied to design secure cloud-specific operations. The proposed methodology will enable users to safely host their services, as well as process sensitive data, on cloud.

  12. Next Generation Cloud-based Science Data Systems and Their Implications on Data and Software Stewardship, Preservation, and Provenance

    Science.gov (United States)

    Hua, H.; Manipon, G.; Starch, M.

    2017-12-01

    NASA's upcoming missions are expected to generate data volumes at least an order of magnitude larger than current missions. A significant increase in data processing, data rates, data volumes, and long-term data archive capabilities is needed. Consequently, new challenges are emerging that impact traditional data and software management approaches. At large scales, next generation science data systems are exploring the move onto cloud computing paradigms to support these increased needs. New implications such as costs, data movement, collocation of data systems & archives, and moving processing closer to the data may result in changes to the stewardship, preservation, and provenance of science data and software. With more science data systems being on-boarded onto cloud computing facilities, we can expect more Earth science data records to be both generated and kept in the cloud. But at large scales, the cost of processing and storing global data may impact architectural and system designs. Data systems will trade the cost of keeping data in the cloud against data life-cycle approaches of moving "colder" data back to traditional on-premise facilities. How will this impact data citation and processing software stewardship? What are the impacts of cloud-based on-demand processing, and what is its effect on reproducibility and provenance? Similarly, with more science processing software being moved onto cloud, virtual machine, and container based approaches, more opportunities arise for improved stewardship and preservation. But will the science community trust data reprocessed years or decades later? We will also explore emerging questions of the stewardship of the science data system software that generates the science data records, both during and after the life of a mission.

  13. Resource Storage Management Model For Ensuring Quality Of Service In The Cloud Archive Systems

    Directory of Open Access Journals (Sweden)

    Mariusz Kapanowski

    2014-01-01

    Full Text Available Nowadays, service providers offer a lot of IT services in the public or private cloud. The client can buy various kinds of services, like SaaS, PaaS, etc. Recently, Backup as a Service (BaaS) was introduced as a variety of SaaS. At the moment there are several different BaaS offerings available for archiving data in the cloud, but they provide only a basic level of service quality. In the paper we propose a model which ensures QoS for BaaS, together with methods for managing storage resources aimed at achieving the required SLA. This model introduces a set of parameters responsible for the SLA level, which can be offered at a basic or higher level of quality. The storage systems (typically HSM), which are distributed between several Data Centres, are built based on disk arrays, VTLs, and tape libraries. The RSMM (Resource Storage Management Model) does not assume bandwidth reservation or control, but is rather focused on the management of storage resources.

  14. A Scheduling Algorithm for Cloud Computing System Based on the Driver of Dynamic Essential Path.

    Science.gov (United States)

    Xie, Zhiqiang; Shao, Xia; Xin, Yu

    2016-01-01

    To solve the problem of task scheduling in the cloud computing system, this paper proposes a scheduling algorithm for cloud computing based on the driver of dynamic essential path (DDEP). This algorithm applies a predecessor-task layer priority strategy to solve the problem of constraint relations among task nodes. The strategy assigns different priority values to every task node based on the scheduling order of the task nodes as affected by the constraint relations among them, and the task node list is ordered by these priority values. To address the scheduling-order problem in which task nodes have the same priority value, the dynamic essential long path strategy is proposed. This strategy computes the dynamic essential path of the pre-scheduling task nodes based on the actual computation cost and communication cost of each task node in the scheduling process. The task node that has the longest dynamic essential path is scheduled first, as the completion time of the task graph is indirectly influenced by the finishing time of task nodes in the longest dynamic essential path. Finally, we demonstrate the proposed algorithm via simulation experiments using Matlab tools. The experimental results indicate that the proposed algorithm can effectively reduce the task Makespan in most cases and meet a high quality performance objective.
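
The two strategies can be illustrated with a toy example: each task gets a layer priority from its predecessors, and ties within a layer are broken by the longest (essential) computation path to the exit task. This is a simplified single-list sketch in Python, not the full DDEP algorithm (which also tracks dynamic communication costs during scheduling); the task graph and costs are hypothetical.

```python
# Hypothetical task graph: task -> (computation cost, successor list).
tasks = {
    'A': (2, ['B', 'C']),
    'B': (3, ['D']),
    'C': (1, ['D']),
    'D': (2, []),
}

def layer(t):
    """Predecessor-layer priority: depth of the longest predecessor chain."""
    preds = [p for p, (_, succ) in tasks.items() if t in succ]
    return 0 if not preds else 1 + max(layer(p) for p in preds)

def essential_path(t):
    """Length of the longest computation path from t to an exit task."""
    cost, succ = tasks[t]
    return cost if not succ else cost + max(essential_path(s) for s in succ)

# Schedule earlier layers first; within a layer, the task with the longest
# essential path goes first, since it constrains the overall makespan.
schedule = sorted(tasks, key=lambda t: (layer(t), -essential_path(t)))
```

Here `B` (essential path 3 + 2 = 5) is scheduled before `C` (1 + 2 = 3) even though both sit in the same layer, which is exactly the tie-breaking behavior the paper motivates.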

  15. Towards Service-Oriented Middleware for Fog and Cloud Integrated Cyber Physical Systems

    DEFF Research Database (Denmark)

    Mohamed, Nader; Lazarova-Molnar, Sanja; Jawhar, Imad

    2017-01-01

    An appropriate middleware is needed to provide infrastructural support and assist the development and operations of diverse CPS applications. This paper studies utilizing the service-oriented middleware (SOM) approach for CPS and discusses the advantages and requirements for such utilization. In addition, it proposes an SOM for CPS, called CPSWare. This middleware views all CPS components as a set of services and provides a service-based infrastructure to develop and operate CPS applications. This approach provides systemic solutions for solving many computing and networking issues in CPS. It also enables the integration of CPS with other systems such as Cloud and Fog Computing. Furthermore, as CPS can be developed for various applications at different scales, this paper provides a classification for CPS applications and discusses how CPSWare can effectively deal with the different issues in each of the applications.

  16. Industrial cloud-based cyber-physical systems the IMC-AESOP approach

    CERN Document Server

    Bangemann, Thomas; Karnouskos, Stamatis; Delsing, Jerker; Stluka, Petr; Harrison, Robert; Jammes, Francois; Lastra, Jose

    2014-01-01

    This book presents cutting-edge emerging technologies and approaches in the areas of service-oriented architectures, intelligent devices, and cloud-based cyber-physical systems. It provides a clear view on their applicability to the management and automation of manufacturing and process industries. It offers a holistic view of future industrial cyber-physical systems and their industrial usage, and also depicts technologies and architectures as well as a migration approach and engineering tools based on these. By providing a careful balance between the theory and the practical aspects, this book has been authored by several experts from academia and industry, thereby offering a valuable understanding of the vision, the domain, the processes and the results of the research. It has several illustrations and tables to clearly exemplify the concepts and results examined in the text, and these are supported by four real-life case-studies. We are witnessing rapid advances in industrial automation, mainly driven...

  17. Environmental Catastrophes in the Earth's History Due to Solar Systems Encounters with Giant Molecular Clouds

    Science.gov (United States)

    Pavlov, Alexander A.

    2011-01-01

    In its motion through the Milky Way galaxy, the solar system encounters an average-density (>=330 H atoms/cubic cm) giant molecular cloud (GMC) approximately every 10^8 years, a dense (approx. 2 x 10^3 H atoms/cubic cm) GMC approximately every 10^9 years, and will inevitably encounter them in the future. However, there have been no studies linking such events with severe (snowball) glaciations in Earth history. Here we show that dramatic climate change can be caused by interstellar dust accumulating in Earth's atmosphere during the solar system's immersion into a dense (approx. 2 x 10^3 H atoms/cubic cm) GMC. The stratospheric dust layer from such interstellar particles could provide enough radiative forcing to trigger the runaway ice-albedo feedback that results in global snowball glaciations. We also demonstrate that more frequent collisions with less dense GMCs could cause moderate ice ages.

  18. Clouds and the Earth's Radiant Energy System (CERES) Data Products for Climate Research

    Science.gov (United States)

    Kato, Seiji; Loeb, Norman G.; Rutan, David A.; Rose, Fred G.

    2015-01-01

    NASA's Clouds and the Earth's Radiant Energy System (CERES) project integrates CERES, Moderate Resolution Imaging Spectroradiometer (MODIS), and geostationary satellite observations to provide top-of-atmosphere (TOA) irradiances derived from broadband radiance observations by CERES instruments. It also uses snow cover and sea ice extent retrieved from microwave instruments as well as thermodynamic variables from reanalysis. In addition, these variables are used for surface and atmospheric irradiance computations. The CERES project provides TOA, surface, and atmospheric irradiances in various spatial and temporal resolutions. These data sets are for climate research and evaluation of climate models. Long-term observations are required to understand how the Earth system responds to radiative forcing. A simple model is used to estimate the time to detect trends in TOA reflected shortwave and emitted longwave irradiances.
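
The abstract's closing point, estimating how long an observational record must be before a trend emerges from natural variability, is often made with a formula of the kind introduced by Weatherhead et al. (1998), in which detection time grows with the variability and lag-1 autocorrelation of the monthly anomalies. A hedged sketch with illustrative (non-CERES) numbers:

```python
import math

def years_to_detect(trend, sigma, phi):
    """Approximate years of monthly data needed to detect a linear trend
    (Weatherhead-style estimate, 95% confidence).

    trend : trend magnitude per year (same units as sigma)
    sigma : std. dev. of monthly anomalies (natural variability)
    phi   : lag-1 autocorrelation of the monthly anomalies
    """
    return (3.3 * sigma / abs(trend)
            * math.sqrt((1 + phi) / (1 - phi))) ** (2.0 / 3.0)

# Illustrative numbers only (not derived from CERES data): a trend of
# 0.3 W m^-2 per decade, monthly noise of 0.6 W m^-2, and phi = 0.2.
n_years = years_to_detect(trend=0.03, sigma=0.6, phi=0.2)
```

The key qualitative behavior, and the reason long records matter, is that either larger noise or stronger autocorrelation lengthens the required record.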

  19. Examining the controlling factors on Southern Ocean clouds and their radiative effects in the context of midlatitude weather systems

    Science.gov (United States)

    Kelleher, M. K.; Grise, K. M.

    2017-12-01

    Clouds and their associated radiative effects are one of the largest sources of uncertainty in the present generation of global climate models. One region where model biases are especially large is over the Southern Ocean, where many models systematically underestimate the climatological shortwave cloud radiative effects (CRE) and/or misrepresent the relationship between shortwave CRE and atmospheric dynamics. Previous research has shown that two "cloud controlling factors", estimated inversion strength (EIS) and mid-tropospheric vertical velocity, are helpful in explaining the relationship between CRE and atmospheric dynamics on monthly timescales. For example, when the Southern Hemisphere midlatitude jet shifts poleward on monthly timescales, the high clouds and their associated longwave CRE shift poleward with the jet, consistent with a poleward shift of the storm track and the attendant vertical velocity anomalies. However, the observed changes in shortwave CRE with a poleward jet shift are small due to a trade-off between the competing effects of opposing EIS and vertical velocity anomalies. This study extends these previous findings to examine the relationship between Southern Ocean cloud controlling factors and CRE on daily timescales. On a daily timescale, the relationship of EIS and vertical velocity with CRE is more complex, due in part to the presence of transient weather systems. Composites of EIS, vertical velocity, longwave CRE, and shortwave CRE around extratropical cyclones and anticyclones are constructed to examine how the CRE anomalies vary in different sectors of midlatitude weather systems and the role that EIS and vertical velocity play in determining those anomalies. The relationships between the cloud controlling factors and CRE on daily timescales provide key insight into the underlying physical processes responsible for the relationships between midlatitude cloud controlling factors and CRE previously documented on monthly timescales.

  20. Measurement of sulfur dioxide oxidation rates in wintertime orographic clouds

    International Nuclear Information System (INIS)

    Snider, J.R.

    1990-01-01

    SO2-reaction studies in clouds are examined and summarized to experimentally confirm model predictions and previous field studies regarding dominant SO2-reaction pathways. Controlled amounts of SO2 were released into nonprecipitating orographic clouds, and sulfate yields are compared to oxidant depletions. The sulfate yields were taken from cloud-water samples and liquid-water-concentration measurements, and oxidant-depletion data were generated from continuous gas-phase measurements. Comparisons of the sulfate yield (Y_SO4) and the hydrogen peroxide depletion (D_H2O2) suggest that H2O2 is the dominant oxidant, and the in-cloud reaction between H2O2 and the bisulfite ion can be expressed by a simple rate law that agrees with predictions and laboratory results. The rate measurements are found to be inconsistent with the rate law proposed by Hegg and Hobbs (1982) and with some observational data. The present conclusions are of interest for evaluating the effects of sulfur dioxide emissions on sulfuric acid deposition. 30 refs

  1. IBM SmartCloud essentials

    CERN Document Server

    Schouten, Edwin

    2013-01-01

    A practical, user-friendly guide that provides an introduction to cloud computing using IBM SmartCloud, along with a thorough understanding of resource management in a cloud environment.This book is great for anyone who wants to get a grasp of what cloud computing is and what IBM SmartCloud has to offer. If you are an IT specialist, IT architect, system administrator, or a developer who wants to thoroughly understand the cloud computing resource model, this book is ideal for you. No prior knowledge of cloud computing is expected.

  2. Tests of Cloud Computing and Storage System features for use in H1 Collaboration Data Preservation model

    International Nuclear Information System (INIS)

    Łobodziński, Bogdan

    2011-01-01

    Based on the currently developing strategy for data preservation and long-term analysis in HEP, tests of a possible future Cloud Computing system based on the Eucalyptus Private Cloud platform and the petabyte-scale open source storage system CEPH were performed for the H1 Collaboration. Improvements in computing power and the strong development of storage systems suggest that a single Cloud Computing resource supported at a given site will be sufficient for analysis requirements beyond the end-date of experiments. This work describes our test-bed architecture, which could be applied to fulfill the requirements of the physics program of H1 after the end date of the Collaboration. We discuss the reasons why we chose the Eucalyptus platform and the CEPH storage infrastructure, as well as our experience with installation and support of these infrastructures. Using our first test results we examine performance characteristics, observed failure states, deficiencies, bottlenecks and scaling boundaries.

  3. Combining Cloud-based Workflow Management System with SOA and CEP to Create Agility in Collaborative Environment

    Directory of Open Access Journals (Sweden)

    Marian STOICA

    2017-01-01

    Full Text Available In the current economy, technological solutions like cloud computing, service-oriented architecture (SOA) and complex event processing (CEP) are recognized as modern approaches used for increasing business agility and achieving innovation. The complexity of the collaborative business environment raises more and more the need for performant workflow management systems (WfMS) that meet current requirements. Each approach has advantages, but also faces challenges. In this paper we propose a solution for the integration of cloud computing with WfMS, SOA and CEP that allows these technologies to complement each other and build on their benefits to increase agility and reduce the challenges/problems. The paper presents a short introduction to the subject, followed by an analysis of the combination of cloud computing and WfMS and the benefits of a cloud based workflow management system. The paper ends with a solution for combining cloud WfMS with SOA and CEP in order to gain business agility and real-time collaboration, followed by conclusions and research directions.

  4. IBM Cloud Computing Powering a Smarter Planet

    Science.gov (United States)

    Zhu, Jinzy; Fang, Xing; Guo, Zhe; Niu, Meng Hua; Cao, Fan; Yue, Shuang; Liu, Qin Yu

    With the increasing need for intelligent systems supporting the world's businesses, Cloud Computing has emerged as a dominant trend providing a dynamic infrastructure to make such intelligence possible. This article introduces how to build a smarter planet with cloud computing technology. First, it explains why the cloud is needed and traces the evolution of cloud technology. Second, it analyzes the value of cloud computing and how to apply cloud technology. Finally, it predicts the future of the cloud in the smarter planet.

  5. Cloud GIS Based Watershed Management

    Science.gov (United States)

    Bediroğlu, G.; Colak, H. E.

    2017-11-01

    In this study, we generated a Cloud GIS based watershed management system using a Cloud Computing architecture. Cloud GIS is used as SAAS (Software as a Service) and DAAS (Data as a Service). We applied GIS analysis on the cloud to test SAAS and deployed GIS datasets on the cloud to test DAAS. We used a Hybrid cloud computing model, making use of ready web-based mapping services hosted on the cloud (World Topology, Satellite Imageries). We uploaded to the system geodatabases including Hydrology (Rivers, Lakes), Soil Maps, Climate Maps, Rain Maps, Geology and Land Use. The watershed of the study area was determined on the cloud using ready-hosted topology maps. After uploading all the datasets to the system, we applied various GIS analyses and queries. Results show that Cloud GIS technology brings speed and efficiency to watershed management studies. Besides this, the system can be easily implemented for similar land analysis and management studies.

  6. Preliminary design of ECCO: Experimental control system which is cloud oriented

    International Nuclear Information System (INIS)

    Zheng, Wei; Hu, Feiran; Zhang, Ming; Zhang, Jing; Wan, Kuanhong; Liu, Qiang; Pan, Yuan; Zhuang, Ge

    2016-01-01

    Highlights: • ECCO is a self-organized and de-centralized control system software. • ECCO integrates ECCO-SDD and ECCO-REST. • The ECCO network protocol is based on the HTTP protocol and RESTful design practice, and implements Hypermedia, automatic discovery, and events. • ECCO is flexible, plug-and-play, and provides a series of unified toolkits. - Abstract: As Tokamaks develop, the scale of the facilities is getting bigger and bigger. It is a great challenge to design, manage and operate a control system of such a large scale. So we developed a new control system software: Experimental Control System which is Cloud Oriented (ECCO). ECCO consists of two parts, ECCO-SDD and ECCO-REST. ECCO-SDD is used to design, manage and describe the whole control system and configure every subsystem statically. There is an SDD editor, a human-machine interface that lets the control system designer work by simple drag and drop, and it can be easily extended using plug-ins. The ECCO-SDD translator is used to generate different outputs. All the system design and configuration is stored in a MongoDB database using an object-relational mapping designed specifically for ECCO-SDD. ECCO-REST mainly defines a control network protocol based on an HTTP RESTful service; it also implements automatic discovery using the Zero-configuration (Zeroconf) networking standard. Since this protocol is based on industry standards and is transparent, it is open enough that it can be easily implemented by others. The ECCO-REST application is the core of ECCO-REST; it is a cross-platform control software running on distributed control units, just like the EPICS IOC. It can be extended by user-created models. It is configured by a human-readable JSON file which can be generated by the ECCO-SDD translator. ECCO is a self-organized and de-centralized control system software. Based on the same protocol, every part of the system can discover each other, thus the controllers on which the ECCO-REST application is running can

  7. Preliminary design of ECCO: Experimental control system which is cloud oriented

    Energy Technology Data Exchange (ETDEWEB)

    Zheng, Wei, E-mail: zhengwei@hust.edu.cn [State Key Laboratory of Advanced Electromagnetic Engineering and Technology in Huazhong University of Science and Technology, Wuhan 430074 (China); School of Electrical and Electronic Engineering in Huazhong University of Science and Technology, Wuhan 430074 (China); Hu, Feiran; Zhang, Ming; Zhang, Jing; Wan, Kuanhong; Liu, Qiang; Pan, Yuan; Zhuang, Ge [State Key Laboratory of Advanced Electromagnetic Engineering and Technology in Huazhong University of Science and Technology, Wuhan 430074 (China); School of Electrical and Electronic Engineering in Huazhong University of Science and Technology, Wuhan 430074 (China)

    2016-11-15

    Highlights: • ECCO is a self-organized and de-centralized control system software. • ECCO integrates ECCO-SDD and ECCO-REST. • The ECCO network protocol is based on the HTTP protocol and RESTful design practice, and implements Hypermedia, automatic discovery, and events. • ECCO is flexible, plug-and-play, and provides a series of unified toolkits. - Abstract: As Tokamaks develop, the scale of the facilities is getting bigger and bigger. It is a great challenge to design, manage and operate a control system of such a large scale. So we developed a new control system software: Experimental Control System which is Cloud Oriented (ECCO). ECCO consists of two parts, ECCO-SDD and ECCO-REST. ECCO-SDD is used to design, manage and describe the whole control system and configure every subsystem statically. There is an SDD editor, a human-machine interface that lets the control system designer work by simple drag and drop, and it can be easily extended using plug-ins. The ECCO-SDD translator is used to generate different outputs. All the system design and configuration is stored in a MongoDB database using an object-relational mapping designed specifically for ECCO-SDD. ECCO-REST mainly defines a control network protocol based on an HTTP RESTful service; it also implements automatic discovery using the Zero-configuration (Zeroconf) networking standard. Since this protocol is based on industry standards and is transparent, it is open enough that it can be easily implemented by others. The ECCO-REST application is the core of ECCO-REST; it is a cross-platform control software running on distributed control units, just like the EPICS IOC. It can be extended by user-created models. It is configured by a human-readable JSON file which can be generated by the ECCO-SDD translator. ECCO is a self-organized and de-centralized control system software. Based on the same protocol, every part of the system can discover each other, thus the controllers on which the ECCO-REST application is running can

  8. A Smartphone App and Cloud-Based Consultation System for Burn Injury Emergency Care.

    Directory of Open Access Journals (Sweden)

    Lee A Wallis

    Full Text Available Each year more than 10 million people worldwide are burned severely enough to require medical attention, with clinical outcomes noticeably worse in resource poor settings. Expert clinical advice on acute injuries can play a determinant role and there is a need for novel approaches that allow for timely access to advice. We developed an interactive mobile phone application that enables transfer of both patient data and pictures of a wound from the point-of-care to a remote burns expert who, in turn, provides advice back. The application is an integrated clinical decision support system that includes a mobile phone application and server software running in a cloud environment. The client application is installed on a smartphone and structured patient data and photographs can be captured in a protocol driven manner. The user can indicate the specific injured body surface(s) through a touchscreen interface and an integrated calculator estimates the total body surface area that the burn injury affects. Predefined standardised care advice including total fluid requirement is provided immediately by the software and the case data are relayed to a cloud server. A text message is automatically sent to a burn expert on call who then can access the cloud server with the smartphone app or a web browser, review the case and pictures, and respond with both structured and personalized advice to the health care professional at the point-of-care. In this article, we present the design of the smartphone and the server application alongside the type of structured patient data collected together with the pictures taken at point-of-care. We report on how the application will be introduced at point-of-care and how its clinical impact will be evaluated prior to roll out. Challenges, strengths and limitations of the system are identified that may help materialise or hinder the expected outcome to provide a solution for remote consultation on burns that can be

  9. A Smartphone App and Cloud-Based Consultation System for Burn Injury Emergency Care.

    Science.gov (United States)

    Wallis, Lee A; Fleming, Julian; Hasselberg, Marie; Laflamme, Lucie; Lundin, Johan

    2016-01-01

    Each year more than 10 million people worldwide are burned severely enough to require medical attention, with clinical outcomes noticeably worse in resource poor settings. Expert clinical advice on acute injuries can play a determinant role and there is a need for novel approaches that allow for timely access to advice. We developed an interactive mobile phone application that enables transfer of both patient data and pictures of a wound from the point-of-care to a remote burns expert who, in turn, provides advice back. The application is an integrated clinical decision support system that includes a mobile phone application and server software running in a cloud environment. The client application is installed on a smartphone and structured patient data and photographs can be captured in a protocol driven manner. The user can indicate the specific injured body surface(s) through a touchscreen interface and an integrated calculator estimates the total body surface area that the burn injury affects. Predefined standardised care advice including total fluid requirement is provided immediately by the software and the case data are relayed to a cloud server. A text message is automatically sent to a burn expert on call who then can access the cloud server with the smartphone app or a web browser, review the case and pictures, and respond with both structured and personalized advice to the health care professional at the point-of-care. In this article, we present the design of the smartphone and the server application alongside the type of structured patient data collected together with the pictures taken at point-of-care. We report on how the application will be introduced at point-of-care and how its clinical impact will be evaluated prior to roll out. Challenges, strengths and limitations of the system are identified that may help materialise or hinder the expected outcome to provide a solution for remote consultation on burns that can be integrated into routine
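
The TBSA estimate and total-fluid-requirement advice described above can be approximated with the standard adult rule of nines and the Parkland formula. The sketch below is illustrative only: the app uses a touchscreen body map rather than this coarse region list, and the constants here are textbook adult values, not the app's internals.

```python
# Adult "rule of nines": approximate % of total body surface area (TBSA)
# per body region. Illustrative region names, not the app's body map.
RULE_OF_NINES = {
    'head': 9.0, 'anterior_torso': 18.0, 'posterior_torso': 18.0,
    'arm': 9.0, 'leg': 18.0, 'perineum': 1.0,
}

def tbsa_percent(burned_regions):
    """Sum the TBSA contributions of the burned regions."""
    return sum(RULE_OF_NINES[r] for r in burned_regions)

def parkland_fluid_ml(weight_kg, tbsa):
    """Parkland formula: 4 mL x weight (kg) x %TBSA over the first 24 h."""
    return 4.0 * weight_kg * tbsa

tbsa = tbsa_percent(['head', 'arm'])    # head (9%) + one arm (9%) = 18%
fluid = parkland_fluid_ml(70, tbsa)     # 4 * 70 * 18 = 5040 mL over 24 h
```

A touchscreen body map refines the region granularity, but the arithmetic behind the "integrated calculator" and the fluid advice is of this simple form.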

  10. Provide a model to improve the performance of intrusion detection systems in the cloud

    OpenAIRE

    Foroogh Sedighi

    2016-01-01

    High availability of tools and service providers in cloud computing, and the fact that cloud computing services are delivered over the Internet and deal with the public, have created important challenges for this new computing model. Cloud computing faces problems and challenges such as user privacy, data security, data ownership, availability of services, recovery after failure, performance, scalability, and programmability. So far, many different methods have been presented for detection of intrusion in clou...

  11. A Standard Mutual Authentication Protocol for Cloud Computing Based Health Care System.

    Science.gov (United States)

    Mohit, Prerna; Amin, Ruhul; Karati, Arijit; Biswas, G P; Khan, Muhammad Khurram

    2017-04-01

    Telecare Medical Information System (TMIS) provides a standard platform through which patients can obtain necessary medical treatment from doctor(s) via Internet communication. Security protection is important for the medical records (data) of patients because they contain very sensitive information. Besides, patient anonymity is another important property, which must be protected. Most recently, Chiou et al. suggested an authentication protocol for TMIS utilizing the concept of a cloud environment. They claimed that their protocol preserves patient anonymity and is well protected. We reviewed their protocol and found that it is completely insecure with respect to patient anonymity. Further, the same protocol is not protected against mobile device stolen attack. In order to improve the security level and complexity, we design a lightweight authentication protocol for the same environment. Our security analysis ensures resilience against all possible security attacks. The performance of our protocol is relatively standard in comparison with the related previous research.

  12. Salt Effect on the Cloud Point Phenomenon of Amphiphilic Drug-Hydroxypropylmethyl Cellulose System

    Directory of Open Access Journals (Sweden)

    Mohd. Sajid Ali

    2014-01-01

    Full Text Available The effect of two amphiphilic drugs (the tricyclic antidepressant nortriptyline hydrochloride (NORT) and the nonsteroidal anti-inflammatory drug sodium ibuprofen (IBF)) on the cloud point (CP) of the biopolymer hydroxypropylmethyl cellulose (HPMC) was studied. The effect of NaCl on the CP of the HPMC-drug system was also examined. The CP of HPMC increases uniformly with increasing [drug]. Both drugs, though one is anionic (IBF) and the other cationic (NORT), affect the CP in almost the same manner but to a different extent, implying the role of hydrophobicity in the interaction between drug and polymer. Salt affects the CP in a dramatic way: a low concentration of salt only increased the value of the CP without affecting the pattern, whereas in the presence of a high concentration of salt a minimum was observed in the CP versus [drug] plots. Various thermodynamic parameters were evaluated and discussed on the basis of the observed results.

  13. Convergence Of Cloud Computing Internet Of Things And Machine Learning The Future Of Decision Support Systems

    Directory of Open Access Journals (Sweden)

    Gilberto Crespo-Perez

    2017-07-01

    Full Text Available The objective of this research was to develop a framework for understanding the convergence of Cloud Computing, Machine Learning, and the Internet of Things as the future of Decision Support Systems. To develop this framework, the researchers analyzed and synthesized 35 research articles from 2006 to 2017. The results indicated that when the data is massive, it is necessary to use computational algorithms and complex analytical techniques. The Internet of Things, in combination with the large accumulation of data and data mining, improves automated machine intelligence for business. This is because the technology has the intelligence to infer and provide solutions based on past experiences and past events.

  14. In situ measurements of tropical cloud properties in the West African Monsoon: upper tropospheric ice clouds, Mesoscale Convective System outflow, and subvisual cirrus

    Directory of Open Access Journals (Sweden)

    W. Frey

    2011-06-01

    Full Text Available In situ measurements of ice crystal size distributions in tropical upper troposphere/lower stratosphere (UT/LS) clouds were performed during the SCOUT-AMMA campaign over West Africa in August 2006. The cloud properties were measured with a Forward Scattering Spectrometer Probe (FSSP-100) and a Cloud Imaging Probe (CIP) operated aboard the Russian high altitude research aircraft M-55 Geophysica with the mission base in Ouagadougou, Burkina Faso. A total of 117 ice particle size distributions were obtained from the measurements in the vicinity of Mesoscale Convective Systems (MCS). Two to four modal lognormal size distributions were fitted to the average size distributions for different potential temperature bins. The measurements showed proportionately more large ice particles compared to former measurements above maritime regions. With the help of trace gas measurements of NO, NOy, CO2, CO, and O3 and satellite images, clouds in young and aged MCS outflow were identified. These events were observed at altitudes of 11.0 km to 14.2 km corresponding to potential temperature levels of 346 K to 356 K. In a young outflow from a developing MCS, ice crystal number concentrations of up to (8.3 ± 1.6) cm−3 and rimed ice particles with maximum dimensions exceeding 1.5 mm were found. A maximum ice water content of 0.05 g m−3 and an effective radius of about 90 μm were observed. In contrast, the aged outflow events were more diluted and showed a maximum number concentration of 0.03 cm−3, an ice water content of 2.3 × 10−4 g m−3, and an effective radius of about 18 μm, while the largest particles had a maximum dimension of 61 μm.

    Close to the tropopause subvisual cirrus were encountered four times at altitudes of 15 km to 16.4 km. The mean ice particle number concentration of these encounters was 0.01 cm−3 with maximum particle sizes of 130

  15. Final Technical Report for "High-resolution global modeling of the effects of subgrid-scale clouds and turbulence on precipitating cloud systems"

    Energy Technology Data Exchange (ETDEWEB)

    Larson, Vincent [Univ. of Wisconsin, Milwaukee, WI (United States)

    2016-11-25

    The Multiscale Modeling Framework (MMF) embeds a cloud-resolving model in each grid column of a General Circulation Model (GCM). A MMF model does not need to use a deep convective parameterization, and thereby dispenses with the uncertainties in such parameterizations. However, MMF models grossly under-resolve shallow boundary-layer clouds, and hence those clouds may still benefit from parameterization. In this grant, we successfully created a climate model that embeds a cloud parameterization (“CLUBB”) within a MMF model. This involved interfacing CLUBB’s clouds with microphysics and reducing computational cost. We have evaluated the resulting simulated clouds and precipitation with satellite observations. The chief benefit of the project is to provide a MMF model that has an improved representation of clouds and that provides improved simulations of precipitation.

  16. A WSN based Environment and Parameter Monitoring System for Human Health Comfort: A Cloud Enabled Approach

    Directory of Open Access Journals (Sweden)

    Manohara Pai

    2014-05-01

    Full Text Available The number and type of sensors measuring physical and physiological parameters have increased dramatically due to progress in MEMS and nanotechnology. Wireless Sensor Networks (WSNs), in turn, are bringing new applications in environment monitoring and healthcare in order to improve the quality of service, especially in hospitals. The adequacy of WSNs for gathering critical information has provided solutions, but with limited storage, computation, and scalability. This limitation is addressed by integrating WSNs with cloud services. However, once the data enters the cloud, the owner has no control over it. Hence the confidentiality and integrity of the data stored in the cloud are compromised. In this proposed work, a secure sensor-cloud architecture for healthcare applications is implemented by integrating two different clouds. The sink node of the WSN outsources data into the cloud after performing operations to secure the data. Since the SaaS and IaaS environments of cloud computing are provided by two different cloud service providers (CSPs), neither CSP has complete information about the architecture. This provides inherent security, as data storage and data processing are done on different clouds.

  17. Cost-optimized configuration of computing instances for large sized cloud systems

    Directory of Open Access Journals (Sweden)

    Woo-Chan Kim

    2017-09-01

    Full Text Available Cloud computing services are becoming more popular for various reasons which include ‘having no need for capital expenditure’ and ‘the ability to quickly meet business demands’. However, what seems to be an attractive option may become a substantial expenditure as more projects are moved into the cloud. Cloud service companies provide different pricing options to their customers that can potentially lower the customers’ spending on the cloud. Choosing the right combination of pricing options can be formulated as a linear mixed integer programming problem, which can be solved using optimization.
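The pricing-option choice described above can be illustrated with a toy model. Because the decision variable here (the number of reserved instances) is a small integer, a brute-force search stands in for a mixed-integer programming solver; all prices and the demand profile below are invented for illustration.

```python
# Toy version of the pricing-option selection problem: decide how many
# reserved instances to commit to (upfront cost, lower hourly rate) and
# cover the remaining demand on-demand.  Prices and demand are invented.
HOURS = 730                  # hours in one month
ON_DEMAND_RATE = 0.10        # $/instance-hour
RESERVED_UPFRONT = 30.0      # $/instance, monthly equivalent
RESERVED_RATE = 0.04         # $/instance-hour
demand = [4, 4, 10, 4]       # instances needed in each quarter of the month

def monthly_cost(n_reserved):
    """Total monthly bill if we commit to n_reserved instances."""
    cost = n_reserved * RESERVED_UPFRONT
    hours = HOURS / len(demand)
    for d in demand:
        reserved_used = min(d, n_reserved)
        on_demand = d - reserved_used
        cost += reserved_used * RESERVED_RATE * hours
        cost += on_demand * ON_DEMAND_RATE * hours
    return cost

# exhaustive search over the integer decision variable
best = min(range(0, 11), key=monthly_cost)
```

With these numbers the optimum covers the base load with reserved capacity and buys the demand spike on-demand; a real deployment would express the same structure as a linear mixed-integer program over many instance types and pricing options.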

  18. An Efficient Method to Create Digital Terrain Models from Point Clouds Collected by Mobile LiDAR Systems

    Science.gov (United States)

    Gézero, L.; Antunes, C.

    2017-05-01

    Digital terrain models (DTM) play an essential role in all types of road maintenance, water supply, and sanitation projects. The demand for such information is greater in developing countries, where the lack of infrastructure is higher. In recent years, the use of Mobile LiDAR Systems (MLS) has proved to be a very efficient technique for the acquisition of precise and dense point clouds. These point clouds can be a solution for obtaining the data needed to produce DTM in remote areas, mainly due to the safety, precision, and speed of acquisition and the detail of the information gathered. However, filtering point clouds and devising algorithms that separate "terrain points" from "non-terrain points" quickly and consistently remain a challenge that has caught the interest of researchers. This work presents a method to create DTM from point clouds collected by MLS. The method is based on two sequential steps. The first step reduces the point cloud to a set of points that represent the terrain's shape, with the distance between points inversely proportional to the terrain variation. The second step is based on the Delaunay triangulation of the points resulting from the first step. The achieved results encourage a wider use of this technology as a solution for large-scale DTM production in remote areas.
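The first step of the described method, keeping points more densely where the terrain varies more, can be sketched in one dimension. The chord-prediction tolerance rule below is an assumption for illustration; the paper does not publish its exact decimation criterion.

```python
# Sketch of the first step: thin a 1-D terrain profile so that kept
# points are denser where elevation varies more.  Points are (x, z)
# pairs with x strictly increasing; the tolerance rule is an assumption.
def decimate_profile(points, tol=0.1):
    """Keep a point whenever skipping it would mis-predict the terrain
    by more than `tol` metres (linear chord from the last kept point
    to the next raw point)."""
    kept = [points[0]]
    for i in range(1, len(points) - 1):
        x0, z0 = kept[-1]
        x1, z1 = points[i + 1]
        x, z = points[i]
        # height predicted by the chord spanning the candidate point
        z_pred = z0 + (z1 - z0) * (x - x0) / (x1 - x0)
        if abs(z - z_pred) > tol:
            kept.append(points[i])
    kept.append(points[-1])
    return kept

flat = [(float(x), 0.0) for x in range(10)]
bumpy = [(0.0, 0.0), (1.0, 1.0), (2.0, 0.0), (3.0, 1.0), (4.0, 0.0)]
```

Flat ground collapses to its two endpoints, while every point of the bumpy profile survives; the second step would then build a Delaunay triangulation (e.g. `scipy.spatial.Delaunay` in 2-D) over the retained points.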

  19. A scalable infrastructure for CMS data analysis based on OpenStack Cloud and Gluster file system

    International Nuclear Information System (INIS)

    Toor, S; Eerola, P; Kraemer, O; Lindén, T; Osmani, L; Tarkoma, S; White, J

    2014-01-01

    The challenge of providing a resilient and scalable computational and data management solution for massive scale research environments requires continuous exploration of new technologies and techniques. In this project the aim has been to design a scalable and resilient infrastructure for CERN HEP data analysis. The infrastructure is based on OpenStack components for structuring a private Cloud with the Gluster File System. We integrate the state-of-the-art Cloud technologies with the traditional Grid middleware infrastructure. Our test results show that the adopted approach provides a scalable and resilient solution for managing resources without compromising on performance and high availability.

  20. Visualization system: animation of the dynamic evolution of the molecular hydrogen cloud during its gravitational collapse in 3D

    International Nuclear Information System (INIS)

    Duarte P, R.; Klapp E, J.; Arreaga D, G.

    2006-01-01

    We present the results of a set of numerical simulations of a region of interest within a molecular hydrogen cloud that collapses under the action of its own gravity. Two models are considered for the initial density profile of the cloud, a constant one and a Gaussian one, together with a barotropic equation of state that allows the transition from isothermal to adiabatic behaviour. For each model, two values of the critical density and a spectrum of density perturbations are used, yielding a binary, tertiary, or even quaternary system. The programs needed to generate the visualizations of the models, described in the methodology, were developed. (Author)

  1. A scalable infrastructure for CMS data analysis based on OpenStack Cloud and Gluster file system

    Science.gov (United States)

    Toor, S.; Osmani, L.; Eerola, P.; Kraemer, O.; Lindén, T.; Tarkoma, S.; White, J.

    2014-06-01

    The challenge of providing a resilient and scalable computational and data management solution for massive scale research environments requires continuous exploration of new technologies and techniques. In this project the aim has been to design a scalable and resilient infrastructure for CERN HEP data analysis. The infrastructure is based on OpenStack components for structuring a private Cloud with the Gluster File System. We integrate the state-of-the-art Cloud technologies with the traditional Grid middleware infrastructure. Our test results show that the adopted approach provides a scalable and resilient solution for managing resources without compromising on performance and high availability.

  2. Case Study: IBM Watson Analytics Cloud Platform as Analytics-as-a-Service System for Heart Failure Early Detection

    Directory of Open Access Journals (Sweden)

    Gabriele Guidi

    2016-07-01

    Full Text Available In recent years, progress in technology and the increasing availability of fast connections have produced a migration of functionalities in Information Technology services, from static servers to distributed technologies. This article describes the main tools available on the market to perform Analytics as a Service (AaaS) using a cloud platform. It also describes a use case of IBM Watson Analytics, a cloud system for data analytics, applied to the following research scope: detecting the presence or absence of heart failure disease using nothing more than the electrocardiographic signal, in particular through the analysis of heart rate variability. The obtained results are comparable with those from the literature in terms of accuracy and predictive power. Advantages and drawbacks of cloud versus static approaches are discussed in the last sections.

  3. An OAIS-Based Hospital Information System on the Cloud: Analysis of a NoSQL Column-Oriented Approach.

    Science.gov (United States)

    Celesti, Antonio; Fazio, Maria; Romano, Agata; Bramanti, Alessia; Bramanti, Placido; Villari, Massimo

    2018-05-01

    The Open Archival Information System (OAIS) is a reference model for organizing people and resources in a system, and it has already been adopted in care centers and medical systems to efficiently manage clinical data, medical personnel, and patients. Archival storage systems are typically implemented using traditional relational database systems, but relation-oriented technology strongly limits the efficiency of managing huge amounts of patients' clinical data, especially in emerging cloud-based systems, which are distributed. In this paper, we present an OAIS healthcare architecture for managing a huge number of HL7 clinical documents in a scalable way. Specifically, it is based on a NoSQL column-oriented Database Management System deployed in the cloud, thus benefiting from big tables and wide rows available over a virtual distributed infrastructure. We developed a prototype of the proposed architecture at the IRCCS, and we evaluated its efficiency in a real case study.

  4. Fractional activation of accumulation-mode particles in warm continental stratiform clouds

    International Nuclear Information System (INIS)

    Gillani, N.V.; Daum, P.H.; Schwartz, S.E.; Leaitch, W.R.; Strapp, J.W.; Isaac, G.A.

    1991-07-01

    The degree of activation of accumulation-mode particles (AMP) in clouds has been studied using continuous (1-second average) aircraft measurements of the number concentrations of cloud droplets (N_cd, 2 to 35 μm diameter) and of unactivated AMP (N_amp, 0.17 to 2.07 μm diameter) in cloud interstitial air. The magnitude and spatial variation of the activated fraction (F) of all measured particles (defined as F ≡ N_cd/N_tot, where N_tot = N_cd + N_amp) are investigated, based on measurements made during ten aircraft flights in non-precipitating warm continental stratiform clouds near Syracuse NY in the fall of 1984. Based on instantaneous observations throughout the clouds, the spatial distribution of F was found to be quite nonuniform. In general, F was low in cloud edges and where total particle loading was high and/or cloud convective activity was low. In the interior of clouds, the value of F exceeded 0.9 for 36% of the data, but was below 0.6 for 28%. Factors influencing F the most were the total particle loading (N_tot) and the thermal stability of the cloud layer. The dependence of F on N_tot in the cloud interior was characterized by two distinct regimes. For N_tot < 800 cm−3, F was generally close to unity and relatively insensitive to N_tot. For N_tot > 800 cm−3, F tended to decrease with increasing N_tot. This decrease was greatest in a stable stratus deck embedded in a warm moist airmass. The results suggest that, in warm continental stratiform clouds, the process of particle activation becomes nonlinear and self-limiting at high particle loading. The degree of this nonlinearity depends on cloud convective activity (thermal instability).
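The activated fraction used throughout the abstract, F ≡ N_cd/N_tot with N_tot = N_cd + N_amp, is a direct computation; the sample concentrations below are illustrative values, not the flight data.

```python
# Activated fraction from droplet (N_cd) and unactivated interstitial
# particle (N_amp) number concentrations, both in cm^-3.
def activated_fraction(n_cd, n_amp):
    """F = N_cd / N_tot, where N_tot = N_cd + N_amp."""
    n_tot = n_cd + n_amp
    return n_cd / n_tot

# lightly loaded cloud interior: activation near-complete
f_clean = activated_fraction(540.0, 60.0)    # N_tot = 600 cm^-3
# heavily loaded interior (N_tot > 800 cm^-3): activation suppressed
f_polluted = activated_fraction(500.0, 500.0)
```

The two regimes reported in the abstract correspond to F staying near unity below the ~800 cm−3 loading threshold and decreasing above it.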

  5. A Cloud-based Infrastructure and Architecture for Environmental System Research

    Science.gov (United States)

    Wang, D.; Wei, Y.; Shankar, M.; Quigley, J.; Wilson, B. E.

    2016-12-01

    The present availability of high-capacity networks, low-cost computers and storage devices, and the widespread adoption of hardware virtualization and service-oriented architecture provide a great opportunity to enable data and computing infrastructure sharing between closely related research activities. By taking advantage of these approaches, along with the world-class high-performance computing and data infrastructure located at Oak Ridge National Laboratory, a cloud-based infrastructure and architecture has been developed to efficiently deliver essential data and informatics services and utilities to the environmental system research community, and to provide unique capabilities that allow terrestrial ecosystem research projects to share their software utilities (tools), data, and even data submission workflows in a straightforward fashion. The infrastructure minimizes large disruptions to current project-based data submission workflows, for better acceptance by existing projects, since many ecosystem research projects already have their own requirements or preferences for data submission and collection. It eliminates scalability problems with current project silos by providing unified data services and infrastructure. The infrastructure consists of two key components: (1) a collection of configurable virtual computing environments and user management systems that expedite data submission and collection from the environmental system research community, and (2) scalable data management services and systems, originated and developed by ORNL data centers.

  6. The Integrated Cloud-based Environmental Data Management System at Los Alamos National Laboratory - 13391

    International Nuclear Information System (INIS)

    Schultz Paige, Karen; Gomez, Penny; Patel, Nita P.; EchoHawk, Chris; Dorries, Alison M.

    2013-01-01

    In today's world, instant access to information is taken for granted. The national labs are no exception; our data users expect immediate access to their data. Los Alamos National Laboratory (LANL) has collected over ten million records, and the data needs to be accessible to scientists as well as the public. The data span a wide range of media, analytes, time periods, formats, and quality and have traditionally existed in scattered databases, making comprehensive work with the data impossible. Recently, LANL has successfully integrated all their environmental data into a single, cloud-based, web-accessible data management system. The system combines data transparency to the public with the immediate access required by the technical staff. The use of automatic electronic data validation has been critical to immediate data access while saving millions of dollars and increasing data consistency and quality. The system includes a Google Maps based GIS tool that is simple enough for people to locate potentially contaminated sites near their home or workplace, and complex enough to allow scientists to plot and trend their data at the surface and at depth as well as over time. A variety of formatted reports can be run at any desired frequency to report the most current data available in the database. The advanced user can also run free-form queries of the database. This data management system has saved LANL time and money, an increasingly important accomplishment during periods of budget cuts with increasing demand for immediate electronic services. (authors)

  7. The Integrated Cloud-based Environmental Data Management System at Los Alamos National Laboratory - 13391

    Energy Technology Data Exchange (ETDEWEB)

    Schultz Paige, Karen; Gomez, Penny; Patel, Nita P.; EchoHawk, Chris; Dorries, Alison M. [Los Alamos National Laboratory, MS M996, Los Alamos, NM, 87544 (United States)

    2013-07-01

    In today's world, instant access to information is taken for granted. The national labs are no exception; our data users expect immediate access to their data. Los Alamos National Laboratory (LANL) has collected over ten million records, and the data needs to be accessible to scientists as well as the public. The data span a wide range of media, analytes, time periods, formats, and quality and have traditionally existed in scattered databases, making comprehensive work with the data impossible. Recently, LANL has successfully integrated all their environmental data into a single, cloud-based, web-accessible data management system. The system combines data transparency to the public with the immediate access required by the technical staff. The use of automatic electronic data validation has been critical to immediate data access while saving millions of dollars and increasing data consistency and quality. The system includes a Google Maps based GIS tool that is simple enough for people to locate potentially contaminated sites near their home or workplace, and complex enough to allow scientists to plot and trend their data at the surface and at depth as well as over time. A variety of formatted reports can be run at any desired frequency to report the most current data available in the database. The advanced user can also run free-form queries of the database. This data management system has saved LANL time and money, an increasingly important accomplishment during periods of budget cuts with increasing demand for immediate electronic services. (authors)

  8. A mobile cloud-based Parkinson's disease assessment system for home-based monitoring.

    Science.gov (United States)

    Pan, Di; Dhall, Rohit; Lieberman, Abraham; Petitti, Diana B

    2015-03-26

    Parkinson's disease (PD) is the most prevalent movement disorder of the central nervous system, and affects more than 6.3 million people in the world. The characteristic motor features include tremor, bradykinesia, rigidity, and impaired postural stability. Current therapy based on augmentation or replacement of dopamine is designed to improve patients' motor performance but often leads to levodopa-induced adverse effects, such as dyskinesia and motor fluctuation. Clinicians must regularly monitor patients in order to identify these effects and other declines in motor function as soon as possible. Current clinical assessment for Parkinson's is subjective and mostly conducted by brief observations made during patient visits. Changes in patients' motor function between visits are hard to track and clinicians are not able to make the most informed decisions about the course of therapy without frequent visits. Frequent clinic visits increase the physical and economic burden on patients and their families. In this project, we sought to design, develop, and evaluate a prototype mobile cloud-based mHealth app, "PD Dr", which collects quantitative and objective information about PD and would enable home-based assessment and monitoring of major PD symptoms. We designed and developed a mobile app on the Android platform to collect PD-related motion data using the smartphone 3D accelerometer and to send the data to a cloud service for storage, data processing, and PD symptoms severity estimation. To evaluate this system, data from the system were collected from 40 patients with PD and compared with experts' rating on standardized rating scales. The evaluation showed that PD Dr could effectively capture important motion features that differentiate PD severity and identify critical symptoms. For hand resting tremor detection, the sensitivity was 0.77 and accuracy was 0.82. For gait difficulty detection, the sensitivity was 0.89 and accuracy was 0.81.
In PD severity estimation, the

  9. Evaluating Maximum Photovoltaic Integration in District Distribution Systems Considering Optimal Inverter Dispatch and Cloud Shading Conditions

    DEFF Research Database (Denmark)

    Ding, Tao; Kou, Yu; Yang, Yongheng

    2017-01-01

    . However, the intermittency of solar PV energy (e.g., due to passing clouds) may affect the PV generation in the district distribution network. To address this issue, the voltage magnitude constraints under the cloud shading conditions should be taken into account in the optimization model, which can...

  10. NRS : a system for automated network virtualization in IaaS cloud infrastructures

    NARCIS (Netherlands)

    Theodorou, D.; Mak, R.H.; Keijser, J.J.; Suerink, T.

    2013-01-01

    Applications running in multi-tenant IaaS clouds increasingly require networked compute resources, which may belong to several clouds hosted in multiple data-centers. To accommodate these applications network virtualization is necessary, not only for isolation between tenants, but also for

  11. Security in cloud computing

    OpenAIRE

    Moreno Martín, Oriol

    2016-01-01

    Security in cloud computing is becoming a challenge for next-generation data centers. This project focuses on investigating new security strategies for cloud computing systems. Cloud computing is a recent paradigm for delivering services over the Internet. Businesses grow drastically because of it. Researchers focus their work on it. The rapid access to flexible and low-cost IT resources in an on-demand fashion allows users to avoid planning ahead for provisioning, and enterprises to save money ...

  12. A Chaotic Particle Swarm Optimization-Based Heuristic for Market-Oriented Task-Level Scheduling in Cloud Workflow Systems.

    Science.gov (United States)

    Li, Xuejun; Xu, Jia; Yang, Yun

    2015-01-01

    Cloud workflow systems are a kind of platform service based on cloud computing. They facilitate the automation of workflow applications. Among the factors distinguishing cloud workflow systems from their counterparts, the market-oriented business model is one of the most prominent. The optimization of task-level scheduling in cloud workflow systems is a hot topic. As the scheduling is an NP problem, Ant Colony Optimization (ACO) and Particle Swarm Optimization (PSO) have been proposed to optimize the cost. However, they exhibit premature convergence during optimization and therefore cannot effectively reduce the cost. To solve these problems, a Chaotic Particle Swarm Optimization (CPSO) algorithm with a chaotic sequence and an adaptive inertia weight factor is applied to task-level scheduling. The chaotic sequence, with its high randomness, improves the diversity of solutions, and its regularity assures good global convergence. The adaptive inertia weight factor depends on the estimated value of the cost. It lets the scheduling avoid premature convergence by properly balancing global and local exploration. Experimental simulation shows that the cost obtained by our scheduling is always lower than that of the two representative counterparts.

  13. A Chaotic Particle Swarm Optimization-Based Heuristic for Market-Oriented Task-Level Scheduling in Cloud Workflow Systems

    Directory of Open Access Journals (Sweden)

    Xuejun Li

    2015-01-01

    Full Text Available Cloud workflow systems are a kind of platform service based on cloud computing. They facilitate the automation of workflow applications. Among the factors distinguishing cloud workflow systems from their counterparts, the market-oriented business model is one of the most prominent. The optimization of task-level scheduling in cloud workflow systems is a hot topic. As the scheduling is an NP problem, Ant Colony Optimization (ACO) and Particle Swarm Optimization (PSO) have been proposed to optimize the cost. However, they exhibit premature convergence during optimization and therefore cannot effectively reduce the cost. To solve these problems, a Chaotic Particle Swarm Optimization (CPSO) algorithm with a chaotic sequence and an adaptive inertia weight factor is applied to task-level scheduling. The chaotic sequence, with its high randomness, improves the diversity of solutions, and its regularity assures good global convergence. The adaptive inertia weight factor depends on the estimated value of the cost. It lets the scheduling avoid premature convergence by properly balancing global and local exploration. Experimental simulation shows that the cost obtained by our scheduling is always lower than that of the two representative counterparts.
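A minimal sketch of the idea the abstract describes: a PSO whose random coefficients come from a logistic chaotic map and whose inertia weight adapts to each particle's cost, here applied to a toy sphere function rather than workflow scheduling. All constants (c1 = c2 = 2, the inertia bounds, the velocity clamp) and the test function are assumptions, not the authors' settings.

```python
import random

# Chaotic PSO sketch: logistic-map chaos replaces uniform randoms,
# and the inertia weight adapts to the particle's cost vs. the swarm average.
def logistic(x, r=4.0):
    """Logistic map, the chaotic sequence generator (chaotic for r = 4)."""
    return r * x * (1.0 - x)

def cpso(cost, dim=2, n_particles=20, iters=200, seed=1):
    rng = random.Random(seed)
    chaos = rng.uniform(0.01, 0.99)  # state of the chaotic sequence
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=cost)[:]
    for _ in range(iters):
        costs = [cost(p) for p in pos]
        avg = sum(costs) / len(costs)
        for i, p in enumerate(pos):
            # adaptive inertia: explore widely when worse than the swarm
            # average, refine locally when better
            w = 0.9 if costs[i] > avg else 0.4 + 0.5 * costs[i] / (avg + 1e-12)
            for d in range(dim):
                chaos = logistic(chaos); r1 = chaos
                chaos = logistic(chaos); r2 = chaos
                v = (w * vel[i][d]
                     + 2.0 * r1 * (pbest[i][d] - p[d])
                     + 2.0 * r2 * (gbest[d] - p[d]))
                vel[i][d] = max(-2.0, min(2.0, v))  # velocity clamp
                p[d] += vel[i][d]
            if cost(p) < cost(pbest[i]):
                pbest[i] = p[:]
                if cost(p) < cost(gbest):
                    gbest = p[:]
    return gbest, cost(gbest)

sphere = lambda p: sum(x * x for x in p)
best, best_cost = cpso(sphere)
```

For scheduling, `cost` would instead map a candidate task-to-instance assignment to its monetary cost, with positions decoded to discrete assignments.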

  14. Determining ice water content from 2D crystal images in convective cloud systems

    Science.gov (United States)

    Leroy, Delphine; Coutris, Pierre; Fontaine, Emmanuel; Schwarzenboeck, Alfons; Strapp, J. Walter

    2016-04-01

    Cloud microphysical in-situ instrumentation measures bulk parameters like total water content (TWC) and/or derives particle size distributions (PSD) utilizing optical spectrometers and optical array probes (OAP). The goal of this work is to introduce a comprehensive methodology to compute TWC from OAP measurements, based on the dataset collected during the recent HAIC (High Altitude Ice Crystals)/HIWC (High Ice Water Content) field campaigns. Indeed, the HAIC/HIWC field campaigns in Darwin (2014) and Cayenne (2015) provide a unique opportunity to explore the complex relationship between cloud particle mass and size in ice crystal environments. Numerous mesoscale convective systems (MCSs) were sampled with the French Falcon 20 research aircraft at different temperature levels from −10 °C up to −50 °C. The aircraft instrumentation included an IKP-2 (isokinetic probe) to obtain reliable measurements of TWC and the optical array probes 2D-S and PIP, recording images over the entire ice crystal size range. Based on the known principle relating crystal mass and size through a power law (m = α·D^β), Fontaine et al. (2014) performed extended 3D crystal simulations and thereby demonstrated that it is possible to estimate the value of the exponent β from OAP data, by analyzing the surface-size relationship of the 2D images as a function of time. Leroy et al. (2015) proposed an extended version of this method that produces estimates of β from the analysis of both the surface-size and perimeter-size relationships. Knowing the value of β, α is then deduced from the simultaneous IKP-2 TWC measurements for the entire HAIC/HIWC dataset. The statistical analysis of α and β values for the HAIC/HIWC dataset firstly shows that α is closely linked to β and that this link changes with temperature. From these trends, a generalized parameterization for α is proposed. Finally, the comparison with the initial IKP-2 measurements demonstrates that the method is able to predict TWC values
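Once α and β are known, integrating the mass-size power law m = α·D^β over the measured PSD gives the TWC. The sketch below shows that integration as a discrete sum over size bins; the unit choices and the sample α, β, and bin values are illustrative assumptions, not the campaign's calibrated coefficients.

```python
# TWC from a binned particle size distribution via m = alpha * D**beta.
# Illustrative unit choices: D in cm, number concentration in L^-1,
# single-particle mass in g, TWC returned in g m^-3.
def twc_from_psd(diameters_cm, concentrations_per_liter, alpha, beta):
    """Sum of (number density * single-particle mass) over all size bins."""
    twc = 0.0
    for d, n in zip(diameters_cm, concentrations_per_liter):
        mass_g = alpha * d ** beta     # mass of one crystal of size d
        twc += n * mass_g * 1000.0     # convert L^-1 to m^-3
    return twc

# one 1-mm bin with 1 particle per litre, toy coefficients alpha=1, beta=2
twc = twc_from_psd([0.1], [1.0], alpha=1.0, beta=2.0)
```

In the described method this sum, evaluated with β from the image analysis, is matched against the simultaneous IKP-2 TWC to solve for α.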

  15. Vertical profiles of droplet effective radius in shallow convective clouds

    Directory of Open Access Journals (Sweden)

    S. Zhang

    2011-05-01

    Full Text Available Conventional satellite retrievals can only provide information on cloud-top droplet effective radius (re). Given the fact that cloud ensembles in a satellite snapshot have different cloud-top heights, Rosenfeld and Lensky (1998) used the cloud-top height and the corresponding cloud-top re from the cloud ensembles in the snapshot to construct a profile of re representative of that in the individual clouds. This study investigates the robustness of this approach in shallow convective clouds based on results from large-eddy simulations (LES) for clean (aerosol mixing ratio Na = 25 mg−1), intermediate (Na = 100 mg−1), and polluted (Na = 2000 mg−1) conditions. The cloud-top height and the cloud-top re from the modeled cloud ensembles are used to form a constructed re profile, which is then compared to the in-cloud re profiles. For the polluted and intermediate cases, where precipitation is negligible, the constructed re profiles represent the in-cloud re profiles fairly well, with a low bias (about 10 %). The method used in Rosenfeld and Lensky (1998) is therefore validated for nonprecipitating shallow cumulus clouds. For the clean, drizzling case, the in-cloud re can be very large and highly variable, and quantitative profiling based on cloud-top re is less useful. The differences in re profiles between clean and polluted conditions derived in this manner are, however, distinct. This study also investigates the subadiabatic characteristics of the simulated cumulus clouds to reveal the effect of mixing on re and its evolution. Results indicate that as polluted and moderately polluted clouds develop into their decaying stage, the subadiabatic fraction
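
    The effective radius discussed above is the ratio of the third to the second moment of the droplet size distribution; a small sketch with a hypothetical gamma-like distribution (LES output would supply the real n(r) at each level):

```python
import numpy as np

# r_e = integral(r^3 n(r) dr) / integral(r^2 n(r) dr).
# On a uniform grid the bin width cancels, so plain sums suffice.
r = np.linspace(0.5, 30.0, 600)       # droplet radius (micrometres)
n = r**2 * np.exp(-r / 4.0)           # hypothetical n(r), arbitrary units

re = np.sum(r**3 * n) / np.sum(r**2 * n)   # effective radius (micrometres)
```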

  16. Cloud-Based Service Information System for Evaluating Quality of Life after Breast Cancer Surgery.

    Directory of Open Access Journals (Sweden)

    Hao-Yun Kao

    Full Text Available Although recent studies have improved understanding of quality of life (QOL) outcomes of breast conserving surgery, few have used longitudinal data for more than two time points, and few have examined predictors of QOL over two years. Additionally, the longitudinal data analyses in such studies rarely apply the appropriate statistical methodology to control for censoring and inter-correlations arising from repeated measures obtained from the same patient pool. This study evaluated an internet-based system for measuring longitudinal changes in QOL and developed a cloud-based system for managing patients after breast conserving surgery. This prospective study analyzed 657 breast cancer patients treated at three tertiary academic hospitals. Related hospital personnel such as surgeons and other healthcare professionals were also interviewed to determine the requirements for an effective cloud-based system for surveying QOL in breast cancer patients. All patients completed the SF-36, the Quality of Life Questionnaire (QLQ-C30), and its supplementary breast cancer measure (QLQ-BR23) at baseline, 6 months, 1 year, and 2 years postoperatively. The 95% confidence intervals for differences in responsiveness estimates were derived by bootstrap estimation. Scores derived by these instruments were interpreted by generalized estimating equations before and after surgery. All breast cancer surgery patients had significantly improved QLQ-C30 and QLQ-BR23 subscale scores throughout the 2-year follow-up period (p<0.05). During the study period, QOL generally had a negative association with advanced age, high Charlson comorbidity index score, tumor stage III or IV, previous chemotherapy, and long post-operative length of stay (LOS). Conversely, QOL was positively associated with previous radiotherapy and hormone therapy. Additionally, patients with high scores for preoperative QOL tended to have high scores for the QLQ-C30, QLQ-BR23 and SF-36 subscales. 
Based on the results of usability testing

  17. A synchronous distributed cloud-based virtual reality meeting system for architectural and urban design

    Directory of Open Access Journals (Sweden)

    Lei Sun

    2014-12-01

    Full Text Available In spatial design fields such as architectural design and urban design, a consensus-building process among a variety of stakeholders, including project executors, architects, residents, users, and general citizens, is required. New technological developments such as cloud computing and Virtual Design Studios (VDS) enable the creation of virtual meeting systems. This paper proposes an approach towards a synchronous distributed design meeting system. In addition to sharing a 3D virtual space for a synchronous distributed design meeting, we developed a prototype system that enables participants to sketch, make annotations, hold discussions, and add viewpoints. We applied these functions to the evaluation of an architectural design and an urban landscape examination. In conclusion, the proposed method was evaluated as being effective and feasible. Yet, it has a few shortcomings, including the fact that simultaneous operation is limited to one client; more arbitrary shapes should also be supported in future versions of the application.

  18. Prospective Architectures for Onboard vs Cloud-Based Decision Making for Unmanned Aerial Systems

    Science.gov (United States)

    Sankararaman, Shankar; Teubert, Christopher

    2017-01-01

    This paper investigates prospective architectures for decision-making in unmanned aerial systems. When these unmanned vehicles operate in urban environments, there are several sources of uncertainty that affect their behavior, and decision-making algorithms need to be robust against them. It is important to account for the several risk factors that affect the flight of these unmanned systems and to facilitate decision-making that takes these risk factors into consideration. In addition, there are several technical challenges related to autonomous flight of unmanned aerial systems; these challenges include sensing, obstacle detection, path planning and navigation, trajectory generation and selection, etc. Many of these activities require significant computational power, and in many situations all of them need to be performed in real-time. In order to integrate these activities efficiently, it is important to develop a systematic architecture that can facilitate real-time decision-making. Four prospective architectures are discussed in this paper: at one end of the spectrum, the first architecture performs all activities/computations onboard the vehicle, whereas at the other end, the fourth and final architecture performs all activities/computations in the cloud, using a new service known as Prognostics as a Service that is being developed at NASA Ames Research Center. The four architectures are compared, their advantages and disadvantages are explained, and conclusions are presented.

  19. Hardware/Software Co-design of Global Cloud System Resolving Models

    Directory of Open Access Journals (Sweden)

    Michael Wehner

    2011-10-01

    Full Text Available We present an analysis of the performance aspects of an atmospheric general circulation model at the ultra-high resolution required to resolve individual cloud systems, and describe alternative technological paths to realize the integration of such a model in the relatively near future. Because of a superlinear scaling of the computational burden dictated by the Courant stability criterion, the solution of the equations of motion dominates the calculation at these ultra-high resolutions. From this extrapolation, it is estimated that a credible kilometer-scale atmospheric model would require a sustained computational rate of at least 28 Petaflop/s to provide scientifically useful climate simulations. Our design study suggests an alternative strategy for practical, power-efficient implementations of next-generation ultra-scale systems. We demonstrate that hardware/software co-design of low-power embedded processor technology could be exploited to design a custom machine tailored to ultra-high resolution climate model specifications at relatively affordable cost and power. A strawman machine design is presented, consisting of in excess of 20 million processing elements, that effectively exploits forthcoming many-core chips. The system pushes the limits of domain decomposition to increase explicit parallelism, and suggests that functional partitioning of sub-components of the climate code (much like the coarse-grained partitioning of computation between the atmospheric, ocean, land, and ice components of current coupled models) may be necessary for future performance scaling.
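
    The superlinear scaling argument can be made concrete with a back-of-envelope sketch; the baseline grid spacing below is a hypothetical placeholder, not a figure from the paper.

```python
# Halving the horizontal grid spacing quadruples the number of grid columns,
# and the Courant criterion (dt proportional to dx) also forces a
# proportionally smaller time step, so the cost of the dynamics grows
# roughly as the cube of the horizontal refinement factor.

BASE_DX_KM = 25.0      # hypothetical baseline horizontal grid spacing

def relative_cost(dx_km):
    refinement = BASE_DX_KM / dx_km
    return refinement**2 * refinement   # columns scale ^2, time steps ^1

# Refining from 25 km to 1 km multiplies the cost by 25**3 = 15625.
cost_at_1km = relative_cost(1.0)
```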

  20. Cloud Computing Integrated Multi-Factor Authentication Framework Application in Logistics Information Systems

    Directory of Open Access Journals (Sweden)

    Zeynel Erdi Karabulut

    2017-12-01

    Full Text Available As new technology enables firms to perform many daily processes more easily, authentication and authorization are becoming an integral part of many businesses. Mobile applications, which are very popular nowadays, also play an important role in our lives. Such demands are not limited to Logistics Information Systems (LIS) but extend to many other fields of information systems as well. In this study, a multi-factor authentication scheme is presented that gathers together online biometric face recognition, integrated as cloud-computing Software as a Service (SaaS), Near Field Communication (NFC) card authentication, location confirmation, and temporal data confirmation to fulfill different business authentication scenarios. The Microsoft Face API (Application Program Interface), a SaaS offering, has been used in the face recognition module of the developed mobile application. The face recognition module was tested with the Yale Face Database: its images were run through the module and confusion matrices were created. The accuracy of the system in this face recognition test was found to be 100%. NFC card, location, and temporal data are collected and confirmed by the mobile application for authentication and authorization; these factors not only further increase the security level but also fulfill many business authentication scenarios successfully. To the best of our knowledge, there is no other authentication model with a 4-factor confirmation comprising biometric face identification, NFC card authentication, location confirmation, and temporal data confirmation.
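
    A toy sketch of how the four confirmations might combine into one decision. The threshold, card registry, geofence centre, and working hours are all invented placeholders, and the face score stands in for the output of the cloud face-recognition service.

```python
import math

REGISTERED_NFC = {"04:A3:2B:9F"}     # hypothetical known card UIDs
DEPOT = (41.0082, 28.9784)           # hypothetical geofence centre (lat, lon)

def authorize(face_score, nfc_uid, lat, lon, hour):
    """All four factors must pass for access to be granted."""
    face_ok = face_score >= 0.8                     # biometric factor
    nfc_ok = nfc_uid in REGISTERED_NFC              # possession factor
    # Small-area equirectangular distance approximation, in km.
    dist_km = 111.0 * math.hypot(
        lat - DEPOT[0],
        (lon - DEPOT[1]) * math.cos(math.radians(DEPOT[0])))
    loc_ok = dist_km <= 0.5                         # location factor
    time_ok = 6 <= hour < 22                        # temporal factor
    return face_ok and nfc_ok and loc_ok and time_ok
```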

  1. TOWARDS A LOW-COST, REAL-TIME PHOTOGRAMMETRIC LANDSLIDE MONITORING SYSTEM UTILISING MOBILE AND CLOUD COMPUTING TECHNOLOGY

    Directory of Open Access Journals (Sweden)

    P. Chidburee

    2016-06-01

    Full Text Available Close-range photogrammetric techniques offer a potentially low-cost approach in terms of implementation and operation for initial assessment and monitoring of landslide processes over small areas. In particular, the Structure-from-Motion (SfM) pipeline is now extensively used to help overcome many constraints of traditional digital photogrammetry, offering increased user-friendliness to nonexperts, as well as lower costs. However, a landslide monitoring approach based on the SfM technique also presents some potential drawbacks due to the difficulty in managing and processing a large volume of data in real-time. This research addresses the aforementioned issues by attempting to combine a mobile device with cloud computing technology to develop a photogrammetric measurement solution as part of a monitoring system for landslide hazard analysis. The research presented here focusses on (i) the development of an Android mobile application; (ii) the implementation of SfM-based open-source software in the Amazon cloud computing web service; and (iii) performance assessment through a simulated environment using data collected at a recognized landslide test site in North Yorkshire, UK. Whilst the landslide monitoring mobile application is under development, this paper describes experiments carried out to ensure effective performance of the system in the future. Investigations presented here describe the initial assessment of a cloud-implemented approach, which is developed around the well-known VisualSFM algorithm. Results are compared to point clouds obtained from alternative SfM 3D reconstruction approaches considering a commercial software solution (Agisoft PhotoScan) and a web-based system (Autodesk 123D Catch). 
Investigations demonstrate that the cloud-based photogrammetric measurement system is capable of providing results of centimeter-level accuracy, evidencing its potential to provide an effective approach for quantifying and analyzing landslide hazard at a local-scale.

  2. Towards a Low-Cost Real-Time Photogrammetric Landslide Monitoring System Utilising Mobile and Cloud Computing Technology

    Science.gov (United States)

    Chidburee, P.; Mills, J. P.; Miller, P. E.; Fieber, K. D.

    2016-06-01

    Close-range photogrammetric techniques offer a potentially low-cost approach in terms of implementation and operation for initial assessment and monitoring of landslide processes over small areas. In particular, the Structure-from-Motion (SfM) pipeline is now extensively used to help overcome many constraints of traditional digital photogrammetry, offering increased user-friendliness to nonexperts, as well as lower costs. However, a landslide monitoring approach based on the SfM technique also presents some potential drawbacks due to the difficulty in managing and processing a large volume of data in real-time. This research addresses the aforementioned issues by attempting to combine a mobile device with cloud computing technology to develop a photogrammetric measurement solution as part of a monitoring system for landslide hazard analysis. The research presented here focusses on (i) the development of an Android mobile application; (ii) the implementation of SfM-based open-source software in the Amazon cloud computing web service, and (iii) performance assessment through a simulated environment using data collected at a recognized landslide test site in North Yorkshire, UK. Whilst the landslide monitoring mobile application is under development, this paper describes experiments carried out to ensure effective performance of the system in the future. Investigations presented here describe the initial assessment of a cloud-implemented approach, which is developed around the well-known VisualSFM algorithm. Results are compared to point clouds obtained from alternative SfM 3D reconstruction approaches considering a commercial software solution (Agisoft PhotoScan) and a web-based system (Autodesk 123D Catch). 
Investigations demonstrate that the cloud-based photogrammetric measurement system is capable of providing results of centimeter-level accuracy, evidencing its potential to provide an effective approach for quantifying and analyzing landslide hazard at a local-scale.

  3. Tropical cloud and precipitation regimes as seen from near-simultaneous TRMM, CloudSat, and CALIPSO observations and comparison with ISCCP

    Science.gov (United States)

    Luo, Zhengzhao Johnny; Anderson, Ricardo C.; Rossow, William B.; Takahashi, Hanii

    2017-06-01

    Although Tropical Rainfall Measuring Mission (TRMM) and CloudSat/CALIPSO fly in different orbits, they frequently cross each other so that for the period between 2006 and 2010, a total of 15,986 intersect lines occurred within 20 min of each other from 30°S to 30°N, providing a rare opportunity to study tropical cloud and precipitation regimes and their internal vertical structure from near-simultaneous measurements by these active sensors. A k-means cluster analysis of TRMM and CloudSat matchups identifies three tropical cloud and precipitation regimes: the first two regimes correspond to, respectively, organized deep convection with heavy rain and cirrus anvils with moderate rain; the third regime is a convectively suppressed regime that can be further divided into three subregimes, which correspond to, respectively, stratocumulus clouds with drizzle, cirrus overlying low clouds, and nonprecipitating cumulus. Inclusion of CALIPSO data adds to the dynamic range of cloud properties and identifies one more cluster; subcluster analysis further identifies a thin, midlevel cloud regime associated with tropical mountain ranges. The radar-lidar cloud regimes are compared with the International Satellite Cloud Climatology Project (ISCCP) weather states (WSs) for the extended tropics. Focus is placed on the four convectively active WSs, namely, WS1-WS4. ISCCP WS1 and WS2 are found to be counterparts of Regime 1 and Regime 2 in radar-lidar observations, respectively. ISCCP WS3 and WS4, which are mainly isolated convection and broken, detached cirrus, do not have a strong association with any individual radar and lidar regimes, a likely effect of the different sampling strategies between ISCCP and active sensors and patchy cloudiness of these WSs.
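
    The regime identification step can be illustrated with plain Lloyd's-algorithm k-means on synthetic two-feature "profiles"; the feature names and cluster centres below are invented stand-ins for the real statistics derived from matched radar/lidar profiles.

```python
import numpy as np

# Three synthetic regimes in a 2-D feature space (e.g. echo-top height in km
# and rain rate in mm/h - hypothetical stand-ins for the matchup features).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal([12.0, 8.0], 1.0, (50, 2)),   # deep-convection-like
               rng.normal([6.0, 2.0], 1.0, (50, 2)),    # anvil-like
               rng.normal([2.0, 0.1], 0.5, (50, 2))])   # suppressed-like

def kmeans(X, k, iters=50, seed=1):
    """Plain Lloyd's algorithm: assign to nearest centre, recompute means."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        labels = ((X[:, None, :] - centers)**2).sum(-1).argmin(1)
        centers = np.array([X[labels == j].mean(0) if (labels == j).any()
                            else centers[j] for j in range(k)])
    return labels, centers

labels, centers = kmeans(X, k=3)
```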

  4. Flexible CP-ABE Based Access Control on Encrypted Data for Mobile Users in Hybrid Cloud System

    Institute of Scientific and Technical Information of China (English)

    Wen-Min Li; Xue-Lei Li; Qiao-Yan Wen; Shuo Zhang; Hua Zhang

    2017-01-01

    In hybrid cloud computing, encrypted data access control can provide a fine-grained access method for organizations to enact policies closer to organizational policies. This paper presents an improved CP-ABE (ciphertext-policy attribute-based encryption) scheme to construct an encrypted data access control solution that is suitable for mobile users in a hybrid cloud system. In our improvement, we split the original decryption keys into a control key, a secret key and a set of transformation keys. The private cloud, managed by the organization administrator, takes charge of updating the transformation keys using the control key. This helps to handle flexible access management and attribute alteration. Meanwhile, the mobile user's single secret key remains unchanged, as does the ciphertext, even if the data user's attribute has been revoked. In addition, we modify the access control list by adding the attributes with the corresponding control key and transformation keys so as to manage user privileges depending upon the system version. Finally, the analysis shows that our scheme is secure, flexible and efficient for application in mobile hybrid cloud computing.

  5. [Construction and analysis of a monitoring system with remote real-time multiple physiological parameters based on cloud computing].

    Science.gov (United States)

    Zhu, Lingyun; Li, Lianjie; Meng, Chunyan

    2014-12-01

    There have been problems in existing multiple-physiological-parameter real-time monitoring systems, such as insufficient server capacity for physiological data storage and analysis (so that data consistency cannot be guaranteed), poor real-time performance, and other issues caused by the growing scale of data. We therefore proposed a new solution for monitoring multiple physiological parameters, with clustered background data storage and processing based on cloud computing. Through our studies, a batch process for longitudinal analysis of patients' historical data was introduced. The work covered the resource virtualization of the IaaS layer for the cloud platform, the construction of the real-time computing platform of the PaaS layer, the reception and analysis of data streams in the SaaS layer, the bottleneck problem of multi-parameter data transmission, etc. The result was to achieve real-time transmission, storage and analysis of a large amount of physiological data. The simulation test results showed that the remote multiple-physiological-parameter monitoring system based on the cloud platform had obvious advantages in processing time and load balancing over the traditional server model. This architecture solved the problems of long turnaround time, poor real-time analysis performance, lack of extensibility and other issues that exist in traditional remote medical services. Technical support was thus provided for a "wearable wireless sensor plus mobile wireless transmission plus cloud computing service" mode moving towards home health monitoring with multiple-physiological-parameter wireless monitoring.

  6. Cloud networking understanding cloud-based data center networks

    CERN Document Server

    Lee, Gary

    2014-01-01

    Cloud Networking: Understanding Cloud-Based Data Center Networks explains the evolution of established networking technologies into distributed, cloud-based networks. Starting with an overview of cloud technologies, the book explains how cloud data center networks leverage distributed systems for network virtualization, storage networking, and software-defined networking. The author offers insider perspective to key components that make a cloud network possible such as switch fabric technology and data center networking standards. The final chapters look ahead to developments in architectures

  7. Cloud Computing: An Overview

    Science.gov (United States)

    Qian, Ling; Luo, Zhiguo; Du, Yujian; Guo, Leitao

    In order to support the maximum number of users and elastic services with minimum resources, Internet service providers invented cloud computing. Within a few years, emerging cloud computing has become the hottest technology. From the publication of core papers by Google since 2003, to the commercialization of Amazon EC2 in 2006, and to the service offering of AT&T Synaptic Hosting, cloud computing has evolved from internal IT systems to a public service, from a cost-saving tool to a revenue generator, and from ISPs to telecoms. This paper introduces the concept, history, pros and cons of cloud computing, as well as the value chain and standardization efforts.

  8. Measured electric field intensities near electric cloud discharges detected by the Kennedy Space Center's Lightning Detection and Ranging System, LDAR

    Science.gov (United States)

    Poehler, H. A.

    1977-01-01

    For a summer thunderstorm for which simultaneous airborne electric field measurements and Lightning Detection and Ranging (LDAR) System data were available, the measurements were coordinated to present a picture of the electric field intensity near cloud electrical discharges detected by the LDAR System. Radar precipitation echoes from NOAA's 10 cm weather radar and measured airborne electric field intensities were superimposed on LDAR PPI plots to present a coordinated data picture of thunderstorm activity.

  9. Network Intrusion Detection System (NIDS) in Cloud Environment based on Hidden Naïve Bayes Multiclass Classifier

    Directory of Open Access Journals (Sweden)

    Hafza A. Mahmood

    2018-04-01

    Full Text Available A cloud environment is a next-generation, internet-based computing system that supplies customizable services to end users for working with or accessing various cloud applications. In order to provide security and decrease damage to information systems, networks and computer systems, it is important to provide an intrusion detection system (IDS). Cloud environments are now under threat from network intrusions, one of the most prevalent and offensive being Denial of Service (DoS) attacks, which have a dangerous impact on cloud computing systems. This paper proposes a Hidden Naïve Bayes (HNB) classifier to handle DoS attacks; HNB is a data mining (DM) model that relaxes the conditional independence assumption of the Naïve Bayes (NB) classifier. The proposed system uses the HNB classifier supported with discretization and feature selection, where selecting the best features enhances the performance of the system and reduces processing time. To evaluate the performance of the proposed system, the KDD CUP 99 and NSL-KDD datasets have been used. The experimental results show that the HNB classifier improves the performance of the NIDS in terms of accuracy and DoS detection: the accuracy of detecting DoS is 100% on three KDD CUP 99 test sets using only 12 features selected by gain ratio, while on the NSL-KDD dataset the accuracy of detecting DoS attacks is 90% on three experimental NSL-KDD sets using only 10 selected features.
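
    The classification step can be illustrated with a plain (non-hidden) Naive Bayes on synthetic traffic-like features; HNB additionally weights dependencies between attributes, which this short sketch omits, and the two feature columns are invented stand-ins for selected KDD attributes.

```python
import numpy as np

rng = np.random.default_rng(42)
# Synthetic 2-feature samples: class 0 ~ normal traffic, class 1 ~ DoS-like.
X = np.vstack([rng.normal([1.0, 2.0], 0.5, (100, 2)),
               rng.normal([5.0, 0.5], 0.5, (100, 2))])
y = np.array([0] * 100 + [1] * 100)

def fit(X, y):
    """Per-class Gaussian parameters (mean, variance) and class priors."""
    return {c: (X[y == c].mean(0), X[y == c].var(0) + 1e-9, (y == c).mean())
            for c in np.unique(y)}

def predict(params, x):
    """Pick the class with the highest log posterior under the NB model."""
    def log_posterior(c):
        mu, var, prior = params[c]
        return np.log(prior) - 0.5 * np.sum(np.log(2 * np.pi * var)
                                            + (x - mu)**2 / var)
    return max(params, key=log_posterior)

model = fit(X, y)
```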

  10. Research approach and first results on agglomerate compaction in protoplanetary dust simulation in the Cloud Manipulation System

    Science.gov (United States)

    Vedernikov, Andrei; Blum, Jurgen; Ingo Von Borstel, Olaf; Schraepler, Rainer; Balapanov, Daniyar; Cecere, Anselmo

    2016-07-01

    Nanometre- and micrometre-sized solid particles are ubiquitous in space and on Earth - from galaxies, interstellar space, protoplanetary and debris disks to planetary rings and atmospheres, planetary surfaces, comets, interplanetary space, and Earth's atmosphere. Perhaps the most intriguing problem in the picture of the formation of planets is the transition from individual microscopic dust grains to kilometre-sized planetesimals. Revealing the mechanisms of this transition is one of the main tasks of the European Space Agency's project Interaction in Cosmic and Atmospheric Particle Systems (ICAPS). It was found that Brownian-motion-driven agglomeration could not provide the transition within a reasonable time scale. As a result, at this stage the top scientific goals shifted towards forced agglomeration and concentration of particles, targeting the onset of compaction, the experimental study of the evolution of fractal dimensions, size and mass distribution, and the occurrence of bouncing. The main tasks comprise 1) development of the rapid agglomeration model; 2) development of experimental facilities creating large fractal-type agglomerates, from 10 to 1000 μm, from a cloud of micrometre-sized grains; 3) experimental realization of rapid agglomeration in microgravity and ground conditions; and 4) in-situ investigation of the morphology, mobility, mechanical and optical properties of the free-floating agglomerates, including investigation of thermophoresis and photophoresis of the agglomerates and of two-phase flow phenomena. To solve the experimental part of the tasks we developed a Cloud Manipulation System, realized as a breadboard (CMS BB) for long-duration microgravity platforms and as a simplified laboratory version (CMS LV) mostly oriented towards short-duration microgravity and ground tests. The new system is based on the use of thermophoresis, most favourable for cloud manipulation without creating additional particle-particle forces in the cloud, with a possibility

  11. Proactive vs reactive failure recovery assessment in combined fog-to-cloud (F2C) systems

    OpenAIRE

    Souza, Vitor Barbosa Carlos de; Masip Bruin, Xavier; Marín Tordera, Eva; Ramirez Almonte, Wilson; Sánchez López, Sergio

    2017-01-01

    The increasing number of end user devices at the edge of the network, along with their ever increasing computing capacity, as well as the advances in Data Center technologies, paved the way for the generation of Internet of Things (IoT). Several IoT services have been deployed leveraging Cloud Computing and, more recently, Fog Computing. In order to enable efficient control of cloud and fog premises, Fog-to-Cloud (F2C) has been recently proposed as a distributed architecture for coordinated m...

  12. Evaluation of the MiKlip decadal prediction system using satellite based cloud products

    Directory of Open Access Journals (Sweden)

    Thomas Spangehl

    2016-12-01

    Full Text Available The decadal hindcast simulations performed for the Mittelfristige Klimaprognosen (MiKlip) project are evaluated using satellite-retrieved cloud parameters from the CM SAF cLoud, Albedo and RAdiation dataset from AVHRR data (CLARA-A1), provided by the EUMETSAT Satellite Application Facility on Climate Monitoring (CM SAF), and from the International Satellite Cloud Climatology Project (ISCCP). The forecast quality of two sets of hindcasts, Baseline-1-LR and Baseline-0, which use differing initialisations, is assessed. Basic evaluation focuses on multi-year ensemble mean fields and cloud-type histograms utilizing satellite simulator output. Additionally, ensemble evaluation employing analysis of variance (ANOVA), analysis rank histograms (ARH) and a deterministic correlation score is performed. Satellite simulator output is available for a subset of the full hindcast ensembles only; the raw model cloud cover is therefore used as a complement. The new Baseline-1-LR hindcasts are closer to satellite data with respect to the simulated tropical/subtropical mean cloud cover pattern than the reference hindcasts (Baseline-0), emphasizing improvements of the new MiKlip initialisation procedure. A slightly overestimated occurrence rate of optically thick cloud types is analysed for different experiments, including hindcasts and simulations using realistic sea surface boundaries according to the Atmospheric Model Intercomparison Project (AMIP). By contrast, the evaluation of cirrus and cirrostratus clouds is complicated by observation-based uncertainties. Time series of the 3-year mean total cloud cover averaged over the tropical warm pool (TWP) region show some correlation with the CLARA-A1 cloud fractional cover. Moreover, ensemble evaluation of the Baseline-1-LR hindcasts reveals potential predictability of the 2–5 lead-year averaged total cloud cover for a large part of this region when regarding the full observational period. However, the hindcasts show only

  13. Cloud-based Predictive Modeling System and its Application to Asthma Readmission Prediction

    Science.gov (United States)

    Chen, Robert; Su, Hang; Khalilia, Mohammed; Lin, Sizhe; Peng, Yue; Davis, Tod; Hirsh, Daniel A; Searles, Elizabeth; Tejedor-Sojo, Javier; Thompson, Michael; Sun, Jimeng

    2015-01-01

    The predictive modeling process is time consuming and requires clinical researchers to handle complex electronic health record (EHR) data in restricted computational environments. To address this problem, we implemented a cloud-based predictive modeling system via a hybrid setup combining a secure private server with the Amazon Web Services (AWS) Elastic MapReduce platform. EHR data is preprocessed on a private server and the resulting de-identified event sequences are hosted on AWS. Based on user-specified modeling configurations, an on-demand web service launches a cluster of Elastic Compute Cloud (EC2) instances on AWS to perform feature selection and classification algorithms in a distributed fashion. Afterwards, the secure private server aggregates results and displays them via interactive visualization. We tested the system on a pediatric asthma readmission task on a de-identified EHR dataset of 2,967 patients. We also conducted a larger scale experiment on the CMS Linkable 2008–2010 Medicare Data Entrepreneurs’ Synthetic Public Use File dataset of 2 million patients, which achieved over 25-fold speedup compared to sequential execution. PMID:26958172

  14. Trust management in cloud services

    CERN Document Server

    Noor, Talal H; Bouguettaya, Athman

    2014-01-01

    This book describes the design and implementation of Cloud Armor, a novel approach for credibility-based trust management and automatic discovery of cloud services in distributed and highly dynamic environments. This book also helps cloud users to understand the difficulties of establishing trust in cloud computing and the best criteria for selecting a service cloud. The techniques have been validated by a prototype system implementation and experimental studies using a collection of real world trust feedbacks on cloud services.The authors present the design and implementation of a novel pro

  15. Cloud MicroAtlas

    Indian Academy of Sciences (India)

    We begin by outlining the life cycle of a tall cloud, and then briefly discuss cloud systems. We choose one aspect of this life cycle, namely, the rapid growth of water droplets in ice-free clouds, to then discuss in greater detail. Taking a single vortex to be a building block of turbulence, we demonstrate one mechanism by which ...

  16. Kernel structures for Clouds

    Science.gov (United States)

    Spafford, Eugene H.; Mckendry, Martin S.

    1986-01-01

    An overview of the internal structure of the Clouds kernel was presented. An indication of how these structures will interact in the prototype Clouds implementation is given. Many specific details have yet to be determined and await experimentation with an actual working system.

  17. A home healthcare system in the cloud-addressing security and privacy challenges

    NARCIS (Netherlands)

    Deng, M.; Petkovic, M.; Nalin, M.; Baroni, I.

    2011-01-01

    Cloud computing is an emerging technology that is expected to support Internet scale critical applications which could be essential to the healthcare sector. Its scalability, resilience, adaptability, connectivity, cost reduction, and high performance features have high potential to lift the

  18. Data Security Risk Estimation for Information-Telecommunication Systems on the basis of Cloud Computing

    Directory of Open Access Journals (Sweden)

    Anatoly Valeryevich Tsaregorodtsev

    2014-02-01

    Full Text Available Cloud computing will be one of the most common IT technologies for deploying applications, due to its key features: on-demand network access to a shared pool of configurable computing resources, flexibility, and a good quality/price ratio. Migrating to a cloud architecture enables organizations to reduce the overall cost of implementing and maintaining infrastructure and to reduce development time for new business applications. There are many factors that influence the information security environment of the cloud, as its multitenant architecture brings new and more complex problems and vulnerabilities. An approach to risk estimation for use in making decisions about migrating an organization's critical data to cloud infrastructure is proposed in the paper.

  19. Simulations of the observation of clouds and aerosols with the Experimental Lidar in Space Equipment system.

    Science.gov (United States)

    Liu, Z; Voelger, P; Sugimoto, N

    2000-06-20

    We carried out a simulation study for the observation of clouds and aerosols with the Japanese Experimental Lidar in Space Equipment (ELISE), which is a two-wavelength backscatter lidar with three detection channels. The National Space Development Agency of Japan plans to launch the ELISE on the Mission Demonstrate Satellite 2 (MDS-2). In the simulations, the lidar return signals for the ELISE are calculated for an artificial, two-dimensional atmospheric model including different types of clouds and aerosols. The signal detection processes are simulated realistically by inclusion of various sources of noise. The lidar signals that are generated are then used as input for simulations of data analysis with inversion algorithms to investigate retrieval of the optical properties of clouds and aerosols. The results demonstrate that the ELISE can provide global data on the structures and optical properties of clouds and aerosols. We also conducted an analysis of the effects of cloud inhomogeneity on retrievals from averaged lidar profiles. We show that the effects are significant for space lidar observations of optically thick broken clouds.

  20. A Robust Multi-Scale Modeling System for the Study of Cloud and Precipitation Processes

    Science.gov (United States)

    Tao, Wei-Kuo

    2012-01-01

    During the past decade, numerical weather and global non-hydrostatic models have started using more complex microphysical schemes originally developed for high-resolution cloud-resolving models (CRMs) with 1-2 km or less horizontal resolutions. These microphysical schemes affect the dynamics through the release of latent heat (buoyancy loading and pressure gradient), the radiation through the cloud coverage (vertical distribution of cloud species), and surface processes through rainfall (both amount and intensity). Recently, several major improvements of ice microphysical processes (or schemes) have been developed for a cloud-resolving model (the Goddard Cumulus Ensemble, GCE, model) and a regional-scale model (the Weather Research and Forecasting, WRF, model). These improvements include an improved 3-ICE (cloud ice, snow and graupel) scheme (Lang et al. 2010); a 4-ICE (cloud ice, snow, graupel and hail) scheme; a spectral bin microphysics scheme; and two different two-moment microphysics schemes. The performance of these schemes has been evaluated using observational data from TRMM and other major field campaigns. In this talk, we will present the high-resolution (1 km) GCE and WRF model simulations and compare the simulated model results with observations from recent field campaigns [i.e., midlatitude continental spring season (MC3E, 2010), high-latitude cold season (C3VP, 2007; GCPEx, 2012), and tropical oceanic (TWP-ICE, 2006)].

  1. DIaaS: Resource Management System for the Intra-Cloud with On-Premise Desktops

    Directory of Open Access Journals (Sweden)

    Hyun-Woo Kim

    2017-01-01

    Full Text Available Infrastructure as a service with desktops (DIaaS) based on the extensible mark-up language (XML) is herein proposed to utilize surplus resources. DIaaS is a traditional surplus-resource integrated management technology. It is designed to provide fast work distribution and computing services based on user service requests, as well as storage services, through desktop-based distributed computing and storage resource integration. DIaaS includes a nondisruptive resource service and an auto-scalable scheme to enhance the availability and scalability of intra-cloud computing resources. A performance evaluation of the proposed scheme measured the clustering performance time for surplus resource utilization. The results showed improvement in computing and storage services in a connection of at least two computers compared to the traditional method for high-availability measurement of nondisruptive services. Furthermore, an artificial server error environment was used to create a clustering delay for computing and storage services and for nondisruptive services. It was compared to the Hadoop distributed file system (HDFS).

  2. Dynamic Pricing in Cloud Manufacturing Systems under Combined Effects of Consumer Structure, Negotiation, and Demand

    Directory of Open Access Journals (Sweden)

    Wei Peng

    2017-01-01

    Full Text Available In this study, we proposed a game-theory-based framework to model the dynamic pricing process in the cloud manufacturing (CMfg) system. We considered a service provider (SP), a broker agent (BA), and a dynamic service demander (SD) population composed of price takers and bargainers. The pricing processes under linear demand and constant elasticity demand were modeled respectively. The combined effects of SD population structure, negotiation, and demand form on the SP’s and the BA’s equilibrium prices and expected revenues were examined. We found that the SP’s optimal wholesale price, the BA’s optimal reservation price, and posted price all increase with the proportion of price takers under linear demand but decrease with it under constant elasticity demand. We also found that the BA’s optimal reservation price increases with bargainers’ power under either form of demand. Through analyzing the participants’ revenues, we showed that a dynamic SD population with a high ratio of price takers would benefit both the SP and the BA.
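
    The linear-demand case above has a textbook special case worth spelling out: with demand q(p) = a − b·p, revenue p·q(p) is maximized at p* = a/(2b). The sketch below shows only this plain monopoly-pricing calculation, not the paper's SP/BA game; the demand parameters a and b are made-up illustrative values.

```python
# Revenue maximization under linear demand q = a - b*p.
# Revenue R(p) = p*(a - b*p) peaks where dR/dp = a - 2*b*p = 0,
# i.e. at p* = a / (2*b).  Parameters a, b are illustrative only.

def optimal_price(a, b):
    """Revenue-maximizing price for linear demand q = a - b*p."""
    return a / (2.0 * b)

def revenue(p, a, b):
    """Revenue at price p (demand floored at zero)."""
    return p * max(a - b * p, 0.0)

a, b = 100.0, 2.0
p_star = optimal_price(a, b)          # -> 25.0
print(p_star, revenue(p_star, a, b))  # -> 25.0 1250.0
```

    Under constant elasticity demand the analogous first-order condition gives a markup rule instead, which is why the two demand forms pull the equilibrium prices in opposite directions in the study.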

  3. Geographically distributed Batch System as a Service: the INDIGO-DataCloud approach exploiting HTCondor

    Science.gov (United States)

    Aiftimiei, D. C.; Antonacci, M.; Bagnasco, S.; Boccali, T.; Bucchi, R.; Caballer, M.; Costantini, A.; Donvito, G.; Gaido, L.; Italiano, A.; Michelotto, D.; Panella, M.; Salomoni, D.; Vallero, S.

    2017-10-01

    One of the challenges a scientific computing center has to face is to keep delivering well consolidated computational frameworks (i.e. the batch computing farm), while conforming to modern computing paradigms. The aim is to ease system administration at all levels (from hardware to applications) and to provide a smooth end-user experience. Within the INDIGO-DataCloud project, we adopt two different approaches to implement a PaaS-level, on-demand Batch Farm Service based on HTCondor and Mesos. In the first approach, described in this paper, the various HTCondor daemons are packaged inside pre-configured Docker images and deployed as Long Running Services through Marathon, profiting from its health checks and failover capabilities. In the second approach, we are going to implement an ad-hoc HTCondor framework for Mesos. Container-to-container communication and isolation have been addressed by exploring a solution based on overlay networks (based on the Calico Project). Finally, we have studied the possibility of deploying an HTCondor cluster that spans different sites, exploiting the Condor Connection Broker component, which allows communication across a private network boundary or firewall, as in the case of multi-site deployments. In this paper, we describe and motivate our implementation choices and show the results of the first tests performed.

  4. A Two-Tier Energy-Aware Resource Management for Virtualized Cloud Computing System

    Directory of Open Access Journals (Sweden)

    Wei Huang

    2016-01-01

    Full Text Available The economic costs caused by electric power take the most significant part in the total cost of a data center; thus energy conservation is an important issue in cloud computing systems. One well-known technique to reduce the energy consumption is the consolidation of Virtual Machines (VMs). However, it may lose some performance points on energy saving and the Quality of Service (QoS) for dynamic workloads. Fortunately, Dynamic Voltage and Frequency Scaling (DVFS) is an efficient technique to save energy in a dynamic environment. In this paper, combined with the DVFS technology, we propose a cooperative two-tier energy-aware management method including local DVFS control and global VM deployment. The DVFS controller adjusts the frequencies of homogeneous processors in each server at run-time based on practical energy prediction. On the other hand, the Global Scheduler assigns VMs onto the designated servers based on cooperation with the local DVFS controller. The final evaluation results demonstrate the effectiveness of our two-tier method in energy saving.
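
    The local DVFS decision — run each processor at the lowest frequency that still covers its predicted load, because dynamic power grows roughly cubically with frequency once voltage scales with it — can be sketched as follows. The frequency table and the cubic power model are illustrative assumptions, not values from the paper.

```python
# Sketch of a local DVFS control decision.  Dynamic CPU power scales
# roughly as f * V^2, and V tracks f, so power ~ f^3: the lowest
# frequency that still meets the predicted load saves the most energy.
# The P-state table and cubic model below are illustrative assumptions.

FREQS_GHZ = [1.2, 1.6, 2.0, 2.4, 2.8]   # hypothetical P-states

def pick_frequency(predicted_load_ghz, freqs=FREQS_GHZ):
    """Lowest available frequency that covers the load (else the max)."""
    for f in sorted(freqs):
        if f >= predicted_load_ghz:
            return f
    return max(freqs)

def relative_power(f, f_max=max(FREQS_GHZ)):
    """Cubic power model, normalized to the top frequency."""
    return (f / f_max) ** 3

f = pick_frequency(1.5)       # -> 1.6
print(f, round(relative_power(f), 3))
```

    In the paper's two-tier design, a global scheduler would additionally place VMs so that each server's predicted load lands in an efficient region of this curve.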

  5. A New WRF-Chem Treatment for Studying Regional Scale Impacts of Cloud-Aerosol Interactions in Parameterized Cumuli

    Energy Technology Data Exchange (ETDEWEB)

    Berg, Larry K.; Shrivastava, ManishKumar B.; Easter, Richard C.; Fast, Jerome D.; Chapman, Elaine G.; Liu, Ying

    2015-01-01

    A new treatment of cloud-aerosol interactions within parameterized shallow and deep convection has been implemented in WRF-Chem that can be used to better understand the aerosol lifecycle over regional to synoptic scales. The modifications to the model to represent cloud-aerosol interactions include treatment of the cloud droplet number mixing ratio; key cloud microphysical and macrophysical parameters (including the updraft fractional area, updraft and downdraft mass fluxes, and entrainment) averaged over the population of shallow clouds, or a single deep convective cloud; and vertical transport, activation/resuspension, aqueous chemistry, and wet removal of aerosol and trace gases in warm clouds. These changes have been implemented in both the WRF-Chem chemistry packages and the Kain-Fritsch cumulus parameterization, which has been modified to better represent shallow convective clouds. Preliminary testing of the modified WRF-Chem has been completed using observations from the Cumulus Humilis Aerosol Processing Study (CHAPS) as well as a high-resolution simulation that does not include parameterized convection. The simulation results are used to investigate the impact of cloud-aerosol interactions on the regional-scale transport of black carbon (BC), organic aerosol (OA), and sulfate aerosol. Based on the simulations presented here, changes in the column-integrated BC can be as large as -50% when cloud-aerosol interactions are considered (due largely to wet removal), or as large as +35% for sulfate in non-precipitating conditions due to sulfate production in the parameterized clouds. The modifications to WRF-Chem version 3.2.1 are found to account for changes in the cloud drop number concentration (CDNC) and changes in the chemical composition of cloud-drop residuals in a way that is consistent with observations collected during CHAPS. Efforts are currently underway to port the changes described here to WRF-Chem version 3.5, and it is anticipated that they

  6. Cloud Governance

    DEFF Research Database (Denmark)

    Berthing, Hans Henrik

    This presentation describes the benefits and value of adopting Cloud Computing, drawing on results from a series of international ISACA analyses of Cloud Computing.

  7. The Magellanic Stream System. I. Ram-Pressure Tails and the Relics of the Collision Between the Magellanic Clouds

    Science.gov (United States)

    Hammer, F.; Yang, Y. B.; Flores, H.; Puech, M.; Fouquet, S.

    2015-11-01

    We have analyzed the Magellanic Stream (MS) using the deepest and the most resolved H i survey of the Southern Hemisphere (the Galactic All-Sky Survey). The overall Stream is structured into two filaments, suggesting two ram-pressure tails lagging behind the Magellanic Clouds (MCs), and resembling two close, transonic, von Karman vortex streets. The past motions of the Clouds appear imprinted in them, implying almost parallel initial orbits, and then a radical change after their passage near the N(H i) peak of the MS. This is consistent with a recent collision between the MCs, 200-300 Myr ago, which has stripped their gas further into small clouds, spreading them out along a gigantic bow shock, perpendicular to the MS. The Stream is formed by the interplay between stellar feedback and the ram pressure exerted by hot gas in the Milky Way (MW) halo with n_h = 10^-4 cm^-3 at 50-70 kpc, a value necessary to explain the MS multiphase high-velocity clouds. The corresponding hydrodynamic modeling provides the currently most accurate reproduction of the whole H i Stream morphology, of its velocity, and column density profiles along L_MS. The “ram pressure plus collision” scenario requires tidal dwarf galaxies, which are assumed to be the Cloud and dSph progenitors, to have left imprints in the MS and the Leading Arm, respectively. The simulated LMC and SMC have baryonic mass, kinematics, and proper motions consistent with observations. This supports a novel paradigm for the MS System, which could have its origin in material expelled toward the MW by the ancient gas-rich merger that formed M31.

  8. THE MAGELLANIC STREAM SYSTEM. I. RAM-PRESSURE TAILS AND THE RELICS OF THE COLLISION BETWEEN THE MAGELLANIC CLOUDS

    Energy Technology Data Exchange (ETDEWEB)

    Hammer, F.; Yang, Y. B.; Flores, H.; Puech, M.; Fouquet, S., E-mail: francois.hammer@obspm.fr [GEPI, Observatoire de Paris, CNRS, 5 Place Jules Janssen, Meudon F-92195 (France)

    2015-11-10

    We have analyzed the Magellanic Stream (MS) using the deepest and the most resolved H i survey of the Southern Hemisphere (the Galactic All-Sky Survey). The overall Stream is structured into two filaments, suggesting two ram-pressure tails lagging behind the Magellanic Clouds (MCs), and resembling two close, transonic, von Karman vortex streets. The past motions of the Clouds appear imprinted in them, implying almost parallel initial orbits, and then a radical change after their passage near the N(H i) peak of the MS. This is consistent with a recent collision between the MCs, 200–300 Myr ago, which has stripped their gas further into small clouds, spreading them out along a gigantic bow shock, perpendicular to the MS. The Stream is formed by the interplay between stellar feedback and the ram pressure exerted by hot gas in the Milky Way (MW) halo with n_h = 10^−4 cm^−3 at 50–70 kpc, a value necessary to explain the MS multiphase high-velocity clouds. The corresponding hydrodynamic modeling provides the currently most accurate reproduction of the whole H i Stream morphology, of its velocity, and column density profiles along L_MS. The “ram pressure plus collision” scenario requires tidal dwarf galaxies, which are assumed to be the Cloud and dSph progenitors, to have left imprints in the MS and the Leading Arm, respectively. The simulated LMC and SMC have baryonic mass, kinematics, and proper motions consistent with observations. This supports a novel paradigm for the MS System, which could have its origin in material expelled toward the MW by the ancient gas-rich merger that formed M31.

  9. Exploiting geo-distributed clouds for a e-health monitoring system with minimum service delay and privacy preservation.

    Science.gov (United States)

    Shen, Qinghua; Liang, Xiaohui; Shen, Xuemin; Lin, Xiaodong; Luo, Henry Y

    2014-03-01

    In this paper, we propose an e-health monitoring system with minimum service delay and privacy preservation by exploiting geo-distributed clouds. In the system, the resource allocation scheme enables the distributed cloud servers to cooperatively assign the servers to the requested users under the load-balance condition. Thus, the service delay for users is minimized. In addition, a traffic-shaping algorithm is proposed. The traffic-shaping algorithm converts the user health data traffic into non-health data traffic such that the capability of traffic-analysis attacks is largely reduced. Through numerical analysis, we show the efficiency of the proposed traffic-shaping algorithm in terms of service delay and privacy preservation. Furthermore, through simulations, we demonstrate that the proposed resource allocation scheme significantly reduces the service delay compared to two other alternatives, using jointly the shortest-queue and distributed control laws.
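
    A minimal stand-in for the cooperative assignment idea — route each request to the currently least-loaded server so queues stay balanced and service delay stays low — might look like the greedy sketch below. The region names are hypothetical, and this simplification is not the paper's allocation scheme.

```python
# Greedy "join the least-loaded server" assignment: a minimal stand-in
# for load-balanced resource allocation across geo-distributed clouds.
# Server names and request counts are hypothetical.

def assign(requests, servers):
    """Map each request to the currently least-loaded server."""
    load = {s: 0 for s in servers}
    placement = {}
    for req in requests:
        target = min(load, key=load.get)   # shortest queue wins
        placement[req] = target
        load[target] += 1
    return placement, load

placement, load = assign(range(7), ["us-east", "eu-west", "ap-south"])
print(load)   # loads differ by at most one request
```

    With identical servers this greedy rule keeps the maximum and minimum queue lengths within one of each other, which is the load-balance condition the delay minimization relies on.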

  10. Cloud/Fog Computing System Architecture and Key Technologies for South-North Water Transfer Project Safety

    Directory of Open Access Journals (Sweden)

    Yaoling Fan

    2018-01-01

    Full Text Available In view of the real-time and distributed features of Internet of Things (IoT) safety systems in water conservancy engineering, this study proposed a new safety system architecture for water conservancy engineering based on cloud/fog computing and put forward a method of data reliability detection for false alarms caused by false abnormal data from the bottom sensors. Designed for the South-North Water Transfer Project (SNWTP), the architecture integrated project safety, water quality safety, and human safety. Using IoT devices, a fog computing layer was constructed between the cloud server and safety detection devices in water conservancy projects. Technologies such as real-time sensing, intelligent processing, and information interconnection were developed. Therefore, accurate forecasting, accurate positioning, and efficient management were implemented as required by safety prevention of the SNWTP, safety protection of water conservancy projects was effectively improved, and intelligent water conservancy engineering was developed.

  11. Simulation of e-cloud driven instability and its attenuation using a simulated feedback system in the CERN SPS

    International Nuclear Information System (INIS)

    Vay, J.-L.; Furman, M.A.

    2010-01-01

    Electron clouds have been shown to trigger fast-growing instabilities on proton beams circulating in the SPS, and a feedback system to control the single-bunch instabilities is under active development. We present the latest improvements to the WARP-POSINST simulation framework and feedback model, and its application to the self-consistent simulations of two consecutive bunches interacting with an electron cloud in the SPS. Simulations using an idealized feedback system exhibit adequate mitigation of the instability provided that the cutoff of the feedback bandwidth is at or above 450 MHz. Artifacts from numerical noise of the injected distribution of electrons in the modeling of portions of bunch trains are discussed, and benchmarking of WARP against POSINST and HEADTAIL is presented.
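
    The competition the abstract describes — an e-cloud instability growing the bunch motion while a feedback system kicks it back — reduces, in the crudest one-mode picture, to multiplying the centroid amplitude each turn by (1 + r) for the instability and (1 − g) for the damper, so the motion damps whenever the gain g exceeds the per-turn growth r. The numbers below are illustrative, not SPS or WARP-POSINST values.

```python
# Crudest one-mode picture of instability vs. feedback: each turn the
# bunch-centroid amplitude is multiplied by (1 + r) from the e-cloud
# driven growth and by (1 - g) from the damper kick.  The growth rate
# r and feedback gain g below are made-up illustrative numbers.

def amplitude_after(turns, r=0.01, g=0.02, a0=1.0):
    """Centroid amplitude after the given number of turns."""
    a = a0
    for _ in range(turns):
        a *= (1.0 + r) * (1.0 - g)
    return a

print(amplitude_after(1000) < 1.0)   # gain beats growth: damped
```

    The bandwidth requirement in the abstract enters because a real damper can only apply this corrective kick to motion it can resolve along the bunch, which is what the 450 MHz cutoff controls.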

  12. Cloud Computing Security: A Survey

    OpenAIRE

    Khalil, Issa; Khreishah, Abdallah; Azeem, Muhammad

    2014-01-01

    Cloud computing is an emerging technology paradigm that migrates current technological and computing concepts into utility-like solutions similar to electricity and water systems. Clouds bring out a wide range of benefits including configurable computing resources, economic savings, and service flexibility. However, security and privacy concerns are shown to be the primary obstacles to a wide adoption of clouds. The new concepts that clouds introduce, such as multi-tenancy, resource sharing a...

  13. QUALITY ASSURANCE FOR CLOUD COMPUTING

    OpenAIRE

    Sumaira Aslam; Hina Shahid

    2016-01-01

    Cloud computing is one of the latest and greatest developments in IT. Marketers at many big companies use cloud computing terms in their marketing campaigns to seem impressive so that they can attract clients and customers. Cloud computing is overall a philosophy and design concept, and it is both much more complicated and much simpler than it appears. The basic underlying thing that cloud computing does is to separate the applications from the operating systems and the software from the hardware that runs eve...

  14. Augmented reality system using lidar point cloud data for displaying dimensional information of objects on mobile phones

    OpenAIRE

    Gupta, S.; Lohani, B.

    2014-01-01

    Mobile augmented reality is the next-generation technology for visualising the 3D real world intelligently. The technology is expanding at a fast pace, upgrading the status of a smart phone to an intelligent device. The research problem identified and presented in the current work is to view the actual dimensions of various objects captured by a smart phone in real time. The proposed methodology first establishes correspondence between LiDAR point cloud data, which are stored in a ser...

  15. Climatic effects during passage of the solar system through interstellar clouds

    International Nuclear Information System (INIS)

    Talbot, R.J. Jr.; Butler, D.M.; Newman, M.J.

    1976-01-01

    It is thought likely that the solar system passes through regions where there are a large number of dense interstellar clouds. When this occurs several processes may cause significant changes in the climate of the Earth and other planets. Matters here discussed include the influences of compression of the solar wind cavity, accretion of matter by the Sun, and particulate input into the Earth's atmosphere. Gravitational energy released by the accretion of interstellar material by the Sun may enhance the solar luminosity, and considerations of terrestrial heat balance suggest that luminosity enhancements of 1% or more will produce significant variations of climate. Observational evidence suggests that there is some mechanism producing a relationship between solar wind flow and climate. One proposed mechanism is that contemporary solar wind modulation of galactic cosmic rays influences climate, and the fact that the Earth would be outside the solar wind cavity for all or part of the year may have an effect on terrestrial climate. Relatively small variations of solar UV radiation input may have perceptible influences on climate, and if a 1% variation in radiation input to the stratosphere has a significant effect then accretion may have a large impact on terrestrial conditions, even though the change in the total heat balance is negligible. With regard to dust input into the Earth's atmosphere it is estimated that during the lifetime of the solar system the mass of dust grains accreted by the Earth should have been about 10^16 to 10^18 g; the matter of evidence for their presence is discussed. It is concluded that the processes proposed have very complex implications for global weather patterns; and at present it is not possible to evaluate which, if any, will unquestionably affect the Earth's climate. (U.K.)

  16. A Karaoke System with Real-Time Media Merging and Sharing Functions for a Cloud-Computing-Integrated Mobile Device

    Directory of Open Access Journals (Sweden)

    Her-Tyan Yeh

    2013-01-01

    Full Text Available Mobile devices such as personal digital assistants (PDAs), smartphones, and tablets have increased in popularity and are extremely efficient for work-related, social, and entertainment uses. Popular entertainment services have also attracted substantial attention. Thus, relevant industries have exerted considerable effort in establishing a method by which mobile devices can be used to develop excellent and convenient entertainment services. Because cloud-computing technology is mature and possesses a strong computing processing capacity, integrating this technology into the entertainment service function of mobile devices can reduce the data load on a system and maintain mobile device performance. This study combines cloud computing with a mobile device to design a karaoke system that contains real-time media merging and sharing functions. This system enables users to download music videos (MVs) to their mobile device and to sing and record their singing by using the device. They can upload the recorded song to the cloud server, where it is merged with real-time media. Subsequently, by employing a media streaming technology, users can store their personal MVs on their mobile device or computer and instantaneously share these videos with others on the Internet. Through this process, people can instantly watch shared videos, enjoy the leisure and entertainment effects of mobile devices, and satisfy their desire for singing.

  17. Relating tropical ocean clouds to moist processes using water vapor isotope measurements

    Directory of Open Access Journals (Sweden)

    J. Lee

    2011-01-01

    Full Text Available We examine the co-variations of tropospheric water vapor, its isotopic composition, and cloud types, and relate these distributions to tropospheric mixing and distillation models using satellite observations from the Aura Tropospheric Emission Spectrometer (TES) over the summertime tropical ocean. Interpretation of these process distributions must take into account the sensitivity of the TES isotope and water vapor measurements to variations in cloud, water, and temperature amount. Consequently, comparisons are made between cloud types based on the International Satellite Cloud Climatology Project (ISCCP) classification; these are clear sky, non-precipitating (e.g., cumulus), boundary layer (e.g., stratocumulus), and precipitating clouds (e.g., regions of deep convection). In general, we find that the free-tropospheric vapor over tropical oceans does not strictly follow a Rayleigh model in which air parcels become dry and isotopically depleted through condensation. Instead, mixing processes related to convection as well as subsidence, and re-evaporation of rainfall associated with organized deep convection, all play significant roles in controlling the water vapor distribution. The relative roles of these moisture processes are examined for different tropical oceanic regions.
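
    The Rayleigh model that the abstract tests against can be written down compactly: as a parcel condenses, the isotope ratio of the remaining vapor follows R/R0 = f^(α−1), where f is the fraction of vapor remaining and α > 1 the liquid/vapor fractionation factor, so the vapor grows isotopically depleted as it dries. The sketch below uses illustrative values for α and the initial δD; they are assumptions, not numbers from the TES study.

```python
# Rayleigh distillation of water vapor isotopes: the ratio of the
# remaining vapor obeys R/R0 = f**(alpha - 1), with f the fraction of
# vapor left and alpha the liquid/vapor fractionation factor.
# alpha and the initial delta-D are illustrative, not TES values.

def rayleigh_delta(f, delta0=-80.0, alpha=1.08):
    """delta-D (per mil) of the vapor once a fraction f remains."""
    r0 = delta0 / 1000.0 + 1.0            # ratio relative to the standard
    r = r0 * f ** (alpha - 1.0)           # Rayleigh distillation
    return (r - 1.0) * 1000.0

# Vapor gets isotopically lighter (more negative delta-D) as it dries:
for f in (1.0, 0.5, 0.2):
    print(round(rayleigh_delta(f), 1))
```

    Observed vapor that is less depleted than this curve predicts, at the same humidity, is the signature of the mixing and rain re-evaporation processes the study identifies.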

  18. Cloud-based BP system integrated with CPOE improves self-management of the hypertensive patients: A randomized controlled trial.

    Science.gov (United States)

    Lee, Peisan; Liu, Ju-Chi; Hsieh, Ming-Hsiung; Hao, Wen-Rui; Tseng, Yuan-Teng; Liu, Shuen-Hsin; Lin, Yung-Kuo; Sung, Li-Chin; Huang, Jen-Hung; Yang, Hung-Yu; Ye, Jong-Shiuan; Zheng, He-Shun; Hsu, Min-Huei; Syed-Abdul, Shabbir; Lu, Richard; Nguyen, Phung-Anh; Iqbal, Usman; Huang, Chih-Wei; Jian, Wen-Shan; Li, Yu-Chuan Jack

    2016-08-01

    Less than 50% of patients with hypertensive disease manage to maintain their blood pressure (BP) within normal levels. The aim of this study is to evaluate whether a cloud BP system integrated with computerized physician order entry (CPOE) can improve BP management compared with traditional care. A randomized controlled trial was done on a random sample of 382 adults recruited from 786 patients who had been diagnosed with, and were receiving treatment for, hypertension in two district hospitals in the north of Taiwan. Physicians had access to cloud BP data from CPOE. Neither patients nor physicians were blinded to group assignment. The study was conducted over a period of seven months. At baseline, the enrollees were 50% male with a mean (SD) age of 58.18 (10.83) years. The mean sitting BP of both arms was no different. The proportion of patients with BP control at two, four and six months was significantly greater in the intervention group than in the control group. The average capture rates of blood pressure in the intervention group were also significantly higher than in the control group at all three check-points. The cloud-based BP system integrated with CPOE at the point of care achieved better BP control compared to traditional care. This system does not require any technical skills and is therefore suitable for every age group. The praise and assurance from the physicians after reviewing the cloud BP records positively reinforced both BP-measuring and medication-adherence behaviors in patients. Copyright © 2016. Published by Elsevier Ireland Ltd.

  19. Toward a web-based real-time radiation treatment planning system in a cloud computing environment.

    Science.gov (United States)

    Na, Yong Hum; Suh, Tae-Suk; Kapp, Daniel S; Xing, Lei

    2013-09-21

    To exploit the potential dosimetric advantages of intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy (VMAT), an in-depth approach is required to provide efficient computing methods. This needs to incorporate clinically related organ specific constraints, Monte Carlo (MC) dose calculations, and large-scale plan optimization. This paper describes our first steps toward a web-based real-time radiation treatment planning system in a cloud computing environment (CCE). The Amazon Elastic Compute Cloud (EC2) with a master node (named m2.xlarge containing 17.1 GB of memory, two virtual cores with 3.25 EC2 Compute Units each, 420 GB of instance storage, 64-bit platform) is used as the backbone of cloud computing for dose calculation and plan optimization. The master node is able to scale the workers on an 'on-demand' basis. MC dose calculation is employed to generate accurate beamlet dose kernels by parallel tasks. The intensity modulation optimization uses total-variation regularization (TVR) and generates piecewise constant fluence maps for each initial beam direction in a distributed manner over the CCE. The optimized fluence maps are segmented into deliverable apertures. The shape of each aperture is iteratively rectified to be a sequence of arcs using the manufacturer's constraints. The output plan file from the EC2 is sent to the simple storage service. Three de-identified clinical cancer treatment plans have been studied for evaluating the performance of the new planning platform with 6 MV flattening filter free beams (40 × 40 cm^2) from the Varian TrueBeam(TM) STx linear accelerator. A CCE leads to speed-ups of up to 14-fold for both dose kernel calculations and plan optimizations in the head and neck, lung, and prostate cancer cases considered in this study. The proposed system relies on a CCE that is able to provide an infrastructure for parallel and distributed computing. The resultant plans from the cloud computing are
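
    The total-variation regularization (TVR) step that yields piecewise-constant fluence maps can be illustrated in one dimension: minimize ||x − d||^2 + λ·Σ|x[i+1] − x[i]|, so small fluctuations flatten into plateaus while genuine jumps survive. The subgradient-descent sketch below, on a made-up noisy step profile, is illustrative only and is not the paper's optimizer or its beamlet-dose formulation.

```python
# Illustrative 1-D total-variation regularization, the idea behind
# piecewise-constant fluence maps: minimize
#   ||x - d||^2 + lam * sum(|x[i+1] - x[i]|)
# by subgradient descent.  The profile d and all parameters are made up.

def sign(v):
    return (v > 0) - (v < 0)

def tv_denoise(d, lam=1.0, step=0.05, iters=2000):
    """Subgradient descent on the TV-regularized least-squares objective."""
    x = list(d)
    for _ in range(iters):
        g = [2.0 * (x[i] - d[i]) for i in range(len(x))]   # fidelity term
        for i in range(len(x) - 1):                         # TV term
            s = sign(x[i + 1] - x[i])
            g[i] -= lam * s
            g[i + 1] += lam * s
        x = [xi - step * gi for xi, gi in zip(x, g)]
    return x

# A noisy step profile flattens toward two near-constant plateaus:
d = [0.1, -0.1, 0.05, 1.9, 2.1, 1.95]
x = tv_denoise(d)
print([round(v, 2) for v in x])
```

    In the planning system this penalty is applied per beam direction, and the resulting near-piecewise-constant maps are what make the segmentation into deliverable apertures tractable.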

  20. Toward a web-based real-time radiation treatment planning system in a cloud computing environment

    International Nuclear Information System (INIS)

    Na, Yong Hum; Kapp, Daniel S; Xing, Lei; Suh, Tae-Suk

    2013-01-01

    To exploit the potential dosimetric advantages of intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy (VMAT), an in-depth approach is required to provide efficient computing methods. This needs to incorporate clinically related organ-specific constraints, Monte Carlo (MC) dose calculations, and large-scale plan optimization. This paper describes our first steps toward a web-based real-time radiation treatment planning system in a cloud computing environment (CCE). The Amazon Elastic Compute Cloud (EC2) with a master node (instance type m2.xlarge: 17.1 GB of memory, two virtual cores with 3.25 EC2 Compute Units each, 420 GB of instance storage, 64-bit platform) is used as the backbone of cloud computing for dose calculation and plan optimization. The master node is able to scale the workers on an 'on-demand' basis. MC dose calculation is employed to generate accurate beamlet dose kernels by parallel tasks. The intensity modulation optimization uses total-variation regularization (TVR) and generates piecewise constant fluence maps for each initial beam direction in a distributed manner over the CCE. The optimized fluence maps are segmented into deliverable apertures. The shape of each aperture is iteratively rectified to be a sequence of arcs using the manufacturer's constraints. The output plan file from the EC2 is sent to the Simple Storage Service. Three de-identified clinical cancer treatment plans have been studied for evaluating the performance of the new planning platform with 6 MV flattening filter free beams (40 × 40 cm²) from the Varian TrueBeam™ STx linear accelerator. A CCE leads to speed-ups of up to 14-fold for both dose kernel calculations and plan optimizations in the head and neck, lung, and prostate cancer cases considered in this study. The proposed system relies on a CCE that is able to provide an infrastructure for parallel and distributed computing. The resultant plans from the cloud computing are
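The piecewise-constant behavior that TVR induces in the optimized fluence maps can be illustrated with a small sketch. The 1D toy problem below is our own illustration (the solver, the penalty weight, and the profile are hypothetical, not the paper's optimizer, which works on 2D fluence maps against MC dose kernels); it minimizes a quadratic data-fit term plus a total-variation penalty by subgradient descent:

```python
# Toy 1D illustration of total-variation regularization (TVR): the TV penalty
# lam * sum |x[i+1] - x[i]| pulls a noisy profile toward a piecewise-constant
# shape, as in the abstract's fluence maps. All numbers are hypothetical.

def tv_objective(x, d, lam):
    """0.5 * ||x - d||^2 (data fit) + lam * sum |x[i+1] - x[i]| (TV penalty)."""
    fit = 0.5 * sum((xi - di) ** 2 for xi, di in zip(x, d))
    tv = sum(abs(x[i + 1] - x[i]) for i in range(len(x) - 1))
    return fit + lam * tv

def sign(v):
    return (v > 0) - (v < 0)

def tv_denoise(d, lam=1.0, step=0.01, iters=5000):
    """Subgradient descent on the TV-regularized objective, starting from d."""
    x = list(d)
    for _ in range(iters):
        g = [x[i] - d[i] for i in range(len(x))]          # data-fit gradient
        for i in range(len(x) - 1):                       # TV subgradient
            s = sign(x[i + 1] - x[i])
            g[i] -= lam * s
            g[i + 1] += lam * s
        x = [x[i] - step * g[i] for i in range(len(x))]
    return x

# A noisy step profile: TVR flattens it toward two plateaus.
noisy = [0.1, -0.1, 0.05, 1.1, 0.9, 1.05, 0.95, 1.0]
smooth = tv_denoise(noisy)
```

The same objective structure, solved per beam direction in parallel, is what makes the distributed CCE formulation natural.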

  1. Cloud database development and management

    CERN Document Server

    Chao, Lee

    2013-01-01

    Nowadays, cloud computing is almost everywhere. However, one can hardly find a textbook that utilizes cloud computing for teaching database and application development. This cloud-based database development book teaches both the theory and the practice, with step-by-step instructions and examples. The book helps readers set up a cloud computing environment for teaching and learning database systems. It covers adequate conceptual content for students and IT professionals to gain the necessary knowledge and hands-on skills to set up cloud-based database systems.

  2. Automating Information Assurance for Cyber Situational Awareness within a Smart Cloud System of Systems

    Science.gov (United States)

    2014-03-01

    monitoring and protection of data such as Transport Layer Security (TLS), Secure Sockets Layer (SSL), and Internet Protocol Security (IPsec) protocols... and usage of data loss prevention software. Protocols such as TLS, SSL, and IPsec encrypt data packets for secure transportation and decryption by... Representational State Transfer; RSS, rich site summary; SA, situational awareness; SAF, Singapore Armed Forces; SoS, system of systems; SSL, secure sockets layer; S

  3. Real-time Monitoring System for Rotating Machinery with IoT-based Cloud Platform

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Haedong; Kim, Suhyun; Woo, Sunhee; Kim, Songhyun; Lee, Seungchul [Ulsan Nat’l Institute Science and Technology, Ulsan (Korea, Republic of)

    2017-06-15

    The objective of this research is to improve the efficiency of data collection from the many machine components on smart factory floors using IoT (Internet of Things) techniques and a cloud platform, and to make it easy to update outdated diagnostic schemes through online deployment from cloud resources. The short-term analysis is implemented on a micro-controller and includes machine-learning algorithms for inferring snapshot information about the machine components. For long-term analysis, time-series and high-dimensional data are used for root cause analysis by combining a cloud platform with multivariate analysis techniques. The diagnostic results are visualized in a web-based dashboard for unconstrained user access. The implementation is demonstrated to verify its performance in data acquisition and analysis for rotating machinery.

  4. Real-time Monitoring System for Rotating Machinery with IoT-based Cloud Platform

    International Nuclear Information System (INIS)

    Jeong, Haedong; Kim, Suhyun; Woo, Sunhee; Kim, Songhyun; Lee, Seungchul

    2017-01-01

    The objective of this research is to improve the efficiency of data collection from the many machine components on smart factory floors using IoT (Internet of Things) techniques and a cloud platform, and to make it easy to update outdated diagnostic schemes through online deployment from cloud resources. The short-term analysis is implemented on a micro-controller and includes machine-learning algorithms for inferring snapshot information about the machine components. For long-term analysis, time-series and high-dimensional data are used for root cause analysis by combining a cloud platform with multivariate analysis techniques. The diagnostic results are visualized in a web-based dashboard for unconstrained user access. The implementation is demonstrated to verify its performance in data acquisition and analysis for rotating machinery.

  5. Extended-range prediction trials using the global cloud/cloud-system resolving model NICAM and its new ocean-coupled version NICOCO

    Science.gov (United States)

    Miyakawa, Tomoki

    2017-04-01

    The global cloud/cloud-system resolving model NICAM and its new fully coupled version NICOCO are run on one of the world's top-tier supercomputers, the K computer. NICOCO couples the full-3D ocean component COCO of the general circulation model MIROC via the general-purpose coupler Jcup. We carried out multiple MJO simulations using NICAM and the new ocean-coupled version NICOCO to examine their extended-range MJO prediction skills and the impact of ocean coupling. NICAM performs excellently in terms of MJO prediction, maintaining valid skill up to 27 days after the model is initialized (Miyakawa et al. 2014). As is the case in most global models, ocean coupling frees the model from being anchored to the observed SST and allows the model climate to drift further from reality than the atmospheric version of the model. Thus, it is important to evaluate the model bias, and in an initial value problem such as seasonal extended-range prediction, it is essential to be able to distinguish the actual signal from the early transition of the model from the observed state to its own climatology. Since NICAM is a highly resource-demanding model, evaluation and tuning of the model climatology (on the order of years) is challenging. Here we focus on the initial 100 days to estimate the early drift of the model, and subsequently evaluate the MJO prediction skills of NICOCO. Results show that in the initial 100 days, NICOCO forms a La Niña-like SST bias relative to observation, with a warmer Maritime Continent warm pool and a cooler equatorial central Pacific. The enhanced convection over the Maritime Continent associated with this bias projects onto the real-time multivariate MJO indices (RMM; Wheeler and Hendon 2004) and contaminates the MJO skill score. However, the bias does not appear to severely degrade the MJO signal. The model maintains valid MJO prediction skill up to nearly 4 weeks when evaluated after linearly removing the early drift component estimated from

  6. Overlapping Open Clusters NGC 1750 and NGC 1758 behind the Taurus Dark Clouds. II. CCD Photometry in the Vilnius System

    Directory of Open Access Journals (Sweden)

    Straižys V.

    2003-09-01

    Full Text Available Seven-color photometry in the Vilnius system has been obtained for 420 stars down to V = 16 mag in the area containing the overlapping open clusters NGC 1750 and NGC 1758 in Taurus. Spectral and luminosity classes, color excesses, interstellar extinctions, and distances are given for 287 stars. The classification of stars is based on their reddening-free Q-parameters. Eighteen stars observed photoelectrically were used as standards. The extinction vs. distance diagram exhibits the presence of one dust cloud at a distance of 175 pc, which almost coincides with the distance of other dust clouds in the Taurus complex. The clusters NGC 1750 and NGC 1758 are found to be at the same distance of ~760 pc and may penetrate each other. Their interstellar extinction A_V is 1.06 mag, which corresponds to E(B-V) = 0.34 mag.

  7. Summary of Symposium on Cloud Systems, Hurricanes and TRMM: Celebration of Dr. Joanne Simpson's Career, The First Fifty Years

    Science.gov (United States)

    Tao, W.-K.; Adler, R.; Braun, S.; Einaudi, F.; Ferrier, B.; Halverson, J.; Heymsfield, G.; Kummerow, C.; Negri, A.; Kakar, R.; hide

    2000-01-01

    A symposium celebrating the first 50 years of Dr. Joanne Simpson's career took place at the NASA/Goddard Space Flight Center from December 1 - 3, 1999. This symposium consisted of presentations that focused on: historical and personal points of view concerning Dr. Simpson's research career, her interactions with the American Meteorological Society, and her leadership in TRMM; scientific interactions with Dr. Simpson that influenced personal research; research related to observations and modeling of clouds, cloud systems and hurricanes; and research related to the Tropical Rainfall Measuring Mission (TRMM). There were a total of 36 presentations and 103 participants from the US, Japan and Australia. The specific presentations during the symposium are summarized in this paper.

  8. Detection System of HTTP DDoS Attacks in a Cloud Environment Based on Information Theoretic Entropy and Random Forest

    Directory of Open Access Journals (Sweden)

    Mohamed Idhammad

    2018-01-01

    Full Text Available Cloud Computing services are often delivered through the HTTP protocol. This facilitates access to services and reduces costs for both providers and end-users. However, it also increases the vulnerability of Cloud services to HTTP DDoS attacks. HTTP request methods are often used to exploit web servers' vulnerabilities and create multiple scenarios of HTTP DDoS attack, such as Low and Slow or Flooding attacks. Existing HTTP DDoS detection systems are challenged by the large volumes of network traffic generated by these attacks, low detection accuracy, and high false positive rates. In this paper we present a detection system for HTTP DDoS attacks in a Cloud environment based on Information Theoretic Entropy and the Random Forest ensemble learning algorithm. A time-based sliding window algorithm is used to estimate the entropy of the network header features of the incoming network traffic. When the estimated entropy exceeds its normal range, the preprocessing and classification tasks are triggered. To assess the proposed approach, various experiments were performed on the CIDDS-001 public dataset. The proposed approach achieves satisfactory results with an accuracy of 99.54%, an FPR of 0.4%, and a running time of 18.5 s.
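The entropy stage described above can be sketched in a few lines: estimate Shannon entropy of a header feature (here, source IPs) over a time-based sliding window and flag windows whose entropy leaves a "normal" range. The thresholds, window length, and toy traffic below are our own illustration, not values from the paper:

```python
# Minimal sketch of time-windowed entropy estimation for DDoS screening.
# A single-source flood collapses source-IP entropy toward 0; a highly
# distributed flood pushes it unusually high. Thresholds are illustrative.
import math
from collections import Counter

def shannon_entropy(values):
    counts = Counter(values)
    total = len(values)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def flag_windows(packets, window=1.0, low=0.5, high=3.0):
    """packets: list of (timestamp, src_ip). Returns start times of flagged windows."""
    if not packets:
        return []
    flagged = []
    t, end = packets[0][0], packets[-1][0]
    while t <= end:
        in_win = [ip for ts, ip in packets if t <= ts < t + window]
        if in_win:
            h = shannon_entropy(in_win)
            if h < low or h > high:     # outside the "normal" entropy range
                flagged.append(t)
        t += window
    return flagged

normal = [(i * 0.2, f"10.0.0.{i % 4}") for i in range(10)]   # mixed sources
attack = [(2.0 + i * 0.02, "6.6.6.6") for i in range(100)]   # single-source flood
print(flag_windows(normal + attack))  # → [2.0, 3.0]
```

In the paper's pipeline, flagged windows would then be preprocessed and passed to the Random Forest classifier rather than reported directly.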

  9. Augmented reality system using lidar point cloud data for displaying dimensional information of objects on mobile phones

    Science.gov (United States)

    Gupta, S.; Lohani, B.

    2014-05-01

    The mobile augmented reality system is a next-generation technology for visualising the 3D real world intelligently. The technology is expanding at a fast pace, upgrading the status of a smart phone to an intelligent device. The research problem identified and presented in the current work is to view the actual dimensions of objects that are captured by a smart phone in real time. The proposed methodology first establishes correspondence between a LiDAR point cloud, stored on a server, and the image that is captured by the mobile phone. This correspondence is established using the exterior and interior orientation parameters of the mobile camera and the coordinates of the LiDAR data points which lie in the viewshed of the mobile camera. A pseudo-intensity image is generated using the LiDAR points and their intensity. The mobile image and the pseudo-intensity image are then registered using the image registration method SIFT, thereby generating a pipeline to locate the point in the point cloud corresponding to a point (pixel) on the mobile image. The second part of the method uses the point cloud data to compute dimensional information corresponding to pairs of points selected on the mobile image and overlays the dimensions on top of the image. This paper describes all steps of the proposed method. The paper uses an experimental setup to mimic the mobile phone and server system and presents some initial but encouraging results.
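The correspondence step (projecting LiDAR points through the camera's exterior and interior orientation into the mobile image) can be sketched with a simple pinhole model. All parameters below are hypothetical; the paper's actual calibration and the SIFT registration stage are not reproduced:

```python
# Sketch of projecting a LiDAR point (world frame) into the phone image using
# exterior orientation (rotation R, camera center C) and interior orientation
# (focal length f in pixels, principal point cx, cy). Illustrative values only.

def project(point, R, C, f, cx, cy):
    """Pinhole projection: x_cam = R (X - C); pixel = (f*Xc/Zc + cx, f*Yc/Zc + cy)."""
    d = [point[i] - C[i] for i in range(3)]
    cam = [sum(R[r][i] * d[i] for i in range(3)) for r in range(3)]
    Xc, Yc, Zc = cam
    if Zc <= 0:
        return None  # behind the camera, i.e., outside the viewshed
    return (f * Xc / Zc + cx, f * Yc / Zc + cy)

I3 = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]   # camera at origin looking down +Z
px = project([2.0, 1.0, 10.0], I3, [0, 0, 0], 1000.0, 320.0, 240.0)
print(px)  # → (520.0, 340.0)
```

Once a selected pixel pair is mapped back to its LiDAR points through this pipeline, the displayed dimension is simply the Euclidean distance between the two 3D points.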

  10. Usage of the hybrid encryption in a cloud instant messages exchange system

    Science.gov (United States)

    Kvyetnyy, Roman N.; Romanyuk, Olexander N.; Titarchuk, Evgenii O.; Gromaszek, Konrad; Mussabekov, Nazarbek

    2016-09-01

    A new approach to constructing cloud instant messaging presented in this article allows users to encrypt data locally by using the Diffie-Hellman key exchange protocol. The described approach makes it possible to construct a cloud service which operates only on users' encrypted messages; encryption and decryption take place locally on the user's side using symmetric AES encryption. A feature of the service is support for conferences without the need to re-encrypt messages for each participant. The article gives an example of the protocol implementation based on the ECC and RSA encryption algorithms, as well as a comparison of these implementations.
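The local-encryption idea (Diffie-Hellman shared secret, then symmetric encryption with the derived key, with the cloud seeing only ciphertext) can be sketched as follows. This is a heavily simplified illustration: the small Mersenne prime and the SHA-256 keystream (standing in for AES, which is not in the Python standard library) are assumptions for demonstration only, not the article's actual ECC/RSA-based implementations:

```python
# Toy sketch: Diffie-Hellman key agreement over a tiny prime-order group,
# then a SHA-256 counter keystream as a stand-in for AES. A real deployment
# would use a standard DH/ECC group and authenticated AES.
import hashlib
import secrets

P = 2305843009213693951   # 2**61 - 1, a Mersenne prime (toy-sized group!)
G = 3

def dh_keypair():
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def shared_key(priv, other_pub):
    secret = pow(other_pub, priv, P)          # both sides compute g^(ab) mod p
    return hashlib.sha256(str(secret).encode()).digest()

def xor_crypt(key, data):
    """CTR-style keystream from SHA-256(key || counter); encrypt == decrypt."""
    out = bytearray()
    for block in range((len(data) + 31) // 32):
        ks = hashlib.sha256(key + block.to_bytes(8, "big")).digest()
        chunk = data[block * 32:(block + 1) * 32]
        out.extend(b ^ k for b, k in zip(chunk, ks))
    return bytes(out)

a_priv, a_pub = dh_keypair()                  # Alice
b_priv, b_pub = dh_keypair()                  # Bob
key = shared_key(a_priv, b_pub)               # equals shared_key(b_priv, a_pub)
ct = xor_crypt(key, b"hello via the cloud")   # only ct ever reaches the server
print(xor_crypt(shared_key(b_priv, a_pub), ct))  # → b'hello via the cloud'
```

The conference feature in the article rests on the same structure: one shared group key, so the server never needs to re-encrypt per participant.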

  11. Geoengineering by cloud seeding: influence on sea ice and climate system

    International Nuclear Information System (INIS)

    Rasch, Philip J; Latham, John; Chen, Chih-Chieh

    2009-01-01

    General circulation model computations using a fully coupled ocean-atmosphere model indicate that increasing cloud reflectivity by seeding maritime boundary layer clouds with particles made from seawater may compensate for some of the effects on climate of increasing greenhouse gas concentrations. The chosen seeding strategy (one of many possible scenarios) can restore global averages of temperature, precipitation and sea ice to present day values, but not simultaneously. The response varies nonlinearly with the extent of seeding, and geoengineering generates local changes to important climatic features. The global tradeoffs of restoring ice cover, and cooling the planet, must be assessed alongside the local changes to climate features.

  12. Cloud-shadow removal for Unmanned Aerial System multispectral imagery based on tensor decomposition methods

    DEFF Research Database (Denmark)

    Baum, Andreas; Wang, Sheng; Garcia, Monica

    2017-01-01

    that are mosaicked into larger images to produce ortho-photomaps. Frequently, especially in northern latitudes, the images to be mosaicked have been acquired under varying irradiance conditions due to moving clouds that create artifacts in the detected signal unrelated to physical changes in vegetation properties. ... This study succeeded in removing the cloud shadow effects and image noise in UAS imagery, providing normalized reflectance. The comparison between the corrected and uncorrected images shows a significant improvement in reflectance estimation in the shadow areas. Further, analysis of vegetation indices e...

  13. RECOVER: An Automated, Cloud-Based Decision Support System for Post-Fire Rehabilitation Planning

    Science.gov (United States)

    Schnase, J. L.; Carroll, M. L.; Weber, K. T.; Brown, M. E.; Gill, R. L.; Wooten, M.; May, J.; Serr, K.; Smith, E.; Goldsby, R.; Newtoff, K.; Bradford, K.; Doyle, C.; Volker, E.; Weber, S.

    2014-11-01

    RECOVER is a site-specific decision support system that automatically brings together in a single analysis environment the information necessary for post-fire rehabilitation decision-making. After a major wildfire, law requires that the federal land management agencies certify a comprehensive plan for public safety, burned area stabilization, resource protection, and site recovery. These burned area emergency response (BAER) plans are a crucial part of our national response to wildfire disasters and depend heavily on data acquired from a variety of sources. Final plans are due within 21 days of control of a major wildfire and become the guiding document for managing the activities and budgets for all subsequent remediation efforts. There are few instances in the federal government where plans of such wide-ranging scope and importance are assembled on such short notice and translated into action more quickly. RECOVER has been designed in close collaboration with our agency partners and directly addresses their high-priority decision-making requirements. In response to a fire detection event, RECOVER uses the rapid resource allocation capabilities of cloud computing to automatically collect Earth observational data, derived decision products, and historic biophysical data so that when the fire is contained, BAER teams will have a complete and ready-to-use RECOVER dataset and GIS analysis environment customized for the target wildfire. Initial studies suggest that RECOVER can transform this information-intensive process by reducing from days to a matter of minutes the time required to assemble and deliver crucial wildfire-related data.

  14. RECOVER: An Automated Cloud-Based Decision Support System for Post-fire Rehabilitation Planning

    Science.gov (United States)

    Schnase, John L.; Carroll, Mark; Weber, K. T.; Brown, Molly E.; Gill, Roger L.; Wooten, Margaret; May J.; Serr, K.; Smith, E.; Goldsby, R.; hide

    2014-01-01

    RECOVER is a site-specific decision support system that automatically brings together in a single analysis environment the information necessary for post-fire rehabilitation decision-making. After a major wildfire, law requires that the federal land management agencies certify a comprehensive plan for public safety, burned area stabilization, resource protection, and site recovery. These burned area emergency response (BAER) plans are a crucial part of our national response to wildfire disasters and depend heavily on data acquired from a variety of sources. Final plans are due within 21 days of control of a major wildfire and become the guiding document for managing the activities and budgets for all subsequent remediation efforts. There are few instances in the federal government where plans of such wide-ranging scope and importance are assembled on such short notice and translated into action more quickly. RECOVER has been designed in close collaboration with our agency partners and directly addresses their high-priority decision-making requirements. In response to a fire detection event, RECOVER uses the rapid resource allocation capabilities of cloud computing to automatically collect Earth observational data, derived decision products, and historic biophysical data so that when the fire is contained, BAER teams will have a complete and ready-to-use RECOVER dataset and GIS analysis environment customized for the target wildfire. Initial studies suggest that RECOVER can transform this information-intensive process by reducing from days to a matter of minutes the time required to assemble and deliver crucial wildfire-related data.

  15. Cloud Computing Security

    OpenAIRE

    Ngongang, Guy

    2011-01-01

    This project aimed to show that it is possible to use a network intrusion detection system in the cloud. Security in the cloud is a concern nowadays, and security professionals are still finding means to make cloud computing more secure. First of all, the installation of ESX 4.0, vCenter Server, and vCenter Lab Manager on server hardware was successful in building the platform. This allowed the creation and deployment of many virtual servers. Those servers have operating systems and a...

  16. The cloud radiative feedback of a midlatitude squall line system and implication for climate study

    International Nuclear Information System (INIS)

    Chin, H.N.S.

    1992-01-01

    The main objectives of this study are (1) to study the impact of longwave and shortwave radiation on the thermodynamic and kinematic structure of a midlatitude squall line; and (2) to explore the influence of specifically including the ice phase in the cloud-radiation feedback mechanism for climate models.

  17. GABE: A Cloud Brokerage System for Service Selection, Accountability and Enforcement

    Science.gov (United States)

    Sundareswaran, Smitha

    2014-01-01

    Much like its meteorological counterpart, "Cloud Computing" is an amorphous agglomeration of entities. It is amorphous in that the exact layout of the servers, the load balancers, and their functions are neither known nor fixed. It is an agglomerate in that multiple service providers and vendors often coordinate to form a multitenant system…

  18. Cloud-Based RFID Mutual Authentication Protocol without Leaking Location Privacy to the Cloud

    OpenAIRE

    Dong, Qingkuan; Tong, Jiaqing; Chen, Yuan

    2015-01-01

    With the rapid development of the IoT (Internet of Things) and cloud computing, cloud-based RFID systems are attracting more attention. Users can reduce the cost of deploying and maintaining an RFID system by purchasing cloud services. However, the security threats to cloud-based RFID systems are more serious than those to traditional RFID systems. In cloud-based RFID systems, the connection between the reader and the cloud database is not secure, and the cloud service provider is not trusted. Th...

  19. Atmospheric diffusion of large clouds

    Energy Technology Data Exchange (ETDEWEB)

    Crawford, T. V. [Univ. of California, Lawrence Radiation Lab., Livermore, California (United States)

    1967-07-01

    Clouds of pollutants travel within a coordinate system that is fixed to the earth's surface, and they diffuse and grow within a coordinate system fixed to the cloud's center. This paper discusses an approach to predicting the cloud's properties, within the latter coordinate system, on space scales of a few hundred meters to a few hundred kilometers and for time periods of a few days. A numerical cloud diffusion model is presented which starts with a cloud placed arbitrarily within the troposphere. Similarity theories of atmospheric turbulence are used to predict the horizontal diffusivity as a function of initial cloud size, turbulent atmospheric dissipation, and time. Vertical diffusivity is input as a function of time and height; therefore, diurnal variations of turbulent diffusion in the boundary layer, the effects of temperature inversions, etc. can be modeled. Nondiffusive cloud depletion mechanisms, such as dry deposition, washout, and radioactive decay, are also part of this numerical model. An effluent cloud, produced by a reactor run at the Nuclear Rocket Development Station, Nevada, is discussed in this paper. Measurements on this cloud, over a period of two days, are compared to calculations with the above numerical cloud diffusion model. In general, there is agreement, within a factor of two, for airborne concentrations, cloud horizontal area, surface air concentrations, and dry deposition as the airborne concentration decreased by seven orders of magnitude during the two-day period. (author)
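The growth stage of such a model can be sketched as a Gaussian puff whose horizontal variance grows under a similarity-theory diffusivity. The Richardson 4/3-law form used here and all constants are illustrative assumptions, not the paper's actual parameterization:

```python
# Gaussian-puff sketch: horizontal diffusivity K = c * eps**(1/3) * sigma**(4/3)
# (a similarity-theory form depending on cloud size and turbulent dissipation
# eps, consistent with the abstract's inputs), integrated forward in time.
import math

def grow_sigma(sigma0, eps, hours, c=0.5, dt=60.0):
    """Integrate d(sigma^2)/dt = 2K with K = c * eps^(1/3) * sigma^(4/3)."""
    var = sigma0 ** 2
    for _ in range(int(hours * 3600 / dt)):
        k = c * eps ** (1 / 3) * var ** (2 / 3)   # sigma^(4/3) == var^(2/3)
        var += 2.0 * k * dt
    return math.sqrt(var)

def peak_concentration(q, sigma_h, sigma_z):
    """Center concentration of a Gaussian puff carrying total mass q."""
    return q / ((2 * math.pi) ** 1.5 * sigma_h ** 2 * sigma_z)

s6 = grow_sigma(500.0, 1e-4, 6.0)    # a 500 m cloud after 6 h (illustrative)
s48 = grow_sigma(500.0, 1e-4, 48.0)  # ...after two days
# Peak concentration falls by orders of magnitude as the cloud spreads,
# qualitatively matching the large decrease reported in the abstract
# (which also includes depletion by deposition, washout, and decay).
```

Vertical diffusivity would enter the same loop as a prescribed function of time and height, which is how the model captures diurnal boundary-layer variations and inversions.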

  20. Green symbiotic cloud communications

    CERN Document Server

    Mustafa, H D; Desai, Uday B; Baveja, Brij Mohan

    2017-01-01

    This book intends to change the perception of modern day telecommunications. Communication systems, usually perceived as "dumb pipes" carrying information/data from one point to another, are evolved into intelligently communicating smart systems. The book introduces the new field of cloud communications. The concept, theory, and architecture of this new field are discussed. The book lays down nine design postulates that form the basis of the development of a first-of-its-kind cloud communication paradigm entitled Green Symbiotic Cloud Communications, or GSCC. The proposed design postulates are formulated in a generic way to form the backbone for the development of systems and technologies of the future. The book can be used to develop courses that serve as an essential part of a graduate curriculum in computer science and electrical engineering. Such courses can be independent or part of high-level research courses. The book will also be of interest to a wide range of readers including b...

  1. Real-Time Cloud-Based Health Tracking and Monitoring System in Designed Boundary for Cardiology Patients

    Directory of Open Access Journals (Sweden)

    Aamir Shahzad

    2018-01-01

    Full Text Available Telemonitoring is not a new term in information technology (IT); it has long been employed to remotely monitor the health of patients who are not located in common care settings such as hospitals or medical centers. For that purpose, wearable medical sensors, such as electrocardiography sensors, blood pressure sensors, and glucometers, have commonly been used to acquire real-time information from remotely located patients; the medical information is then carried, via the Internet, to perform medical diagnosis and the corresponding treatments. As in other IT sectors, there has been tremendous progress in the medical sector (and in telemonitoring systems), changing the protection of human life against several chronic diseases, and a patient's medical information can now be accessed wirelessly via Wi-Fi and cellular systems. Further, with the advent of cloud computing technology, medical systems are now more efficient and scalable in processing, storing, and accessing medical information, with minimal development costs. This study is a further enhancement to track and monitor real-time medical information, bounded within an authorized area, through the modeling of private cloud computing. The private cloud-based environment, called a bounded telemonitoring system, is designed for patient health monitoring to acquire the real-time medical information of patients who reside within the boundary of the medical center, both inside and outside medical wards. A new wireless sensor network scenario is designed and modeled to monitor patients' health information around the clock, 24 hours a day. This research offers a new secure approach to medical information access and gives directions for future developments in medical systems.

  2. Business resilience system (BRS) driven through Boolean, fuzzy logics and cloud computation real and near real time analysis and decision making system

    CERN Document Server

    Zohuri, Bahman

    2017-01-01

    This book provides a technical approach to a Business Resilience System, with its Risk Atom and Processing Data Point, based on fuzzy logic and cloud computation in real time. Its purpose and objectives define a clear set of expectations for organizations and enterprises so that their network systems and supply chains are totally resilient and protected against cyber-attacks, man-made threats, and natural disasters. These enterprises include financial, organizational, homeland security, and supply chain operations with multi-point manufacturing across the world. Market shares and marketing advantages are expected to result from the implementation of the system. The collected information and defined objectives form the basis for monitoring and analyzing the data through cloud computation, and will guarantee their survivability against any unexpected threats. This book will be useful for advanced undergraduate and graduate students in the field of computer engineering, and for engineers who work for manufacturing com...

  3. Models of evaluating efficiency and risks on integration of cloud-base IT-services of the machine-building enterprise: a system approach

    Science.gov (United States)

    Razumnikov, S.; Kurmanbay, A.

    2016-04-01

    The present paper suggests a system approach to evaluating the effectiveness and risks resulting from the integration of cloud-based services in a machine-building enterprise. This approach makes it possible to evaluate a set of enterprise IT applications and choose the applications to be migrated to the cloud with regard to specific business requirements, the technological strategy, and the willingness to accept risk.

  4. Aerosols, clouds, and precipitation in the North Atlantic trades observed during the Barbados aerosol cloud experiment – Part 1: Distributions and variability

    Directory of Open Access Journals (Sweden)

    E. Jung

    2016-07-01

    clouds were less than 1 km deep. Clouds tend to precipitate when the cloud is thicker than 500–600 m. Distributions of cloud field characteristics (depth, radar reflectivity, Doppler velocity, precipitation) were well identified in the reflectivity–velocity diagram from the cloud radar observations. Two types of precipitation features were observed for shallow marine cumulus clouds that may impact the boundary layer differently: first, classic cloud-base precipitation, where precipitation shafts were observed to emanate from the cloud base; second, cloud-top precipitation, where precipitation shafts emanated mainly near the cloud tops, sometimes accompanied by precipitation near the cloud base. The second type of precipitation was more frequently observed during the experiment. Only 42–44 % of the clouds sampled were non-precipitating throughout the entire cloud layer; the rest of the clouds showed precipitation somewhere in the cloud, predominantly closer to the cloud top.

  5. Research computing in a distributed cloud environment

    International Nuclear Information System (INIS)

    Fransham, K; Agarwal, A; Armstrong, P; Bishop, A; Charbonneau, A; Desmarais, R; Hill, N; Gable, I; Gaudet, S; Goliath, S; Impey, R; Leavett-Brown, C; Ouellete, J; Paterson, M; Pritchet, C; Penfold-Brown, D; Podaima, W; Schade, D; Sobie, R J

    2010-01-01

    The recent increase in availability of Infrastructure-as-a-Service (IaaS) computing clouds provides a new way for researchers to run complex scientific applications. However, using cloud resources for a large number of research jobs requires significant effort and expertise. Furthermore, running jobs on many different clouds presents even more difficulty. In order to make it easy for researchers to deploy scientific applications across many cloud resources, we have developed a virtual machine resource manager (Cloud Scheduler) for distributed compute clouds. In response to a user's job submission to a batch system, the Cloud Scheduler manages the distribution and deployment of user-customized virtual machines across multiple clouds. We describe the motivation for and implementation of a distributed cloud using the Cloud Scheduler that is spread across both commercial and dedicated private sites, and present some early results of scientific data analysis using the system.
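The scheduling idea described above (watch a batch queue and start user-customized VMs on whichever IaaS cloud has spare capacity) can be sketched in miniature. The class names, capacities, and greedy policy below are our own illustration; the real Cloud Scheduler drives actual IaaS APIs and richer placement logic:

```python
# Hedged sketch of a Cloud Scheduler-style placement loop: each queued job
# gets its VM booted on the first cloud with free capacity; jobs that cannot
# be placed stay queued. All names and numbers are invented for illustration.

class Cloud:
    def __init__(self, name, capacity):
        self.name, self.capacity, self.running = name, capacity, []

    def has_room(self):
        return len(self.running) < self.capacity

    def boot(self, vm_image):
        self.running.append(vm_image)
        return f"{self.name}:{vm_image}"

def schedule(jobs, clouds):
    """Greedy placement across multiple clouds; returns (placements, pending)."""
    placements, pending = [], []
    for job in jobs:
        target = next((c for c in clouds if c.has_room()), None)
        if target:
            placements.append((job["id"], target.boot(job["vm_image"])))
        else:
            pending.append(job["id"])   # stays queued until capacity frees up
    return placements, pending

clouds = [Cloud("private-site", 2), Cloud("commercial-ec2", 1)]
jobs = [{"id": i, "vm_image": "analysis-vm"} for i in range(4)]
placed, queued = schedule(jobs, clouds)
print(placed)  # → [(0, 'private-site:analysis-vm'), (1, 'private-site:analysis-vm'), (2, 'commercial-ec2:analysis-vm')]
print(queued)  # → [3]
```

Mixing a dedicated private site with a commercial cloud, as in the paper, amounts to putting both in the `clouds` list with their respective capacities.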

  6. Zen of cloud learning cloud computing by examples on Microsoft Azure

    CERN Document Server

    Bai, Haishi

    2014-01-01

    Zen of Cloud: Learning Cloud Computing by Examples on Microsoft Azure provides comprehensive coverage of the essential theories behind cloud computing and the Windows Azure cloud platform. Sharing the author's insights gained while working at Microsoft's headquarters, it presents nearly 70 end-to-end examples with step-by-step guidance on implementing typical cloud-based scenarios.The book is organized into four sections: cloud service fundamentals, cloud solutions, devices and cloud, and system integration and project management. Each chapter contains detailed exercises that provide readers w

  7. CLOUD-BASED VS DESKTOP-BASED PROPERTY MANAGEMENT SYSTEMS IN HOTEL

    OpenAIRE

    Mustafa GULMEZ; Edina AJANOVIC; Ismail KARAYUN

    2015-01-01

    Even though keeping up with modern developments in the IT sector is crucial for the success and competitiveness of a hotel, it is usually very hard for new technologies to be accepted and implemented. This is the case with cloud technology, for which opinions among hoteliers are divided between those who think that it is just another fashion trend, unnecessary to take into consideration, and those who believe that it helps in performing daily operations more easily, leaving space for ...

  8. Designing a Patient Monitoring System Using Cloud and Semantic Web Technologies

    OpenAIRE

    Chryssa Thermolia; Ekaterini S. Bei; Stelios Sotiriadis; Kostas Stravoskoufos; Euripides G.M. Petrakis

    2015-01-01

    Moving into a new era of healthcare, new tools and devices are developed to extend and improve health services, such as remote patient monitoring and risk prevention. In this concept, Internet of Things (IoT) and Cloud Computing present great advantages by providing remote and efficient services, as well as cooperation between patients, clinicians, researchers and other health professionals. This paper focuses on patients suffering from bipolar disorder, a brain disorder ...

  9. ADSLANF: A negotiation framework for cloud management systems using a bulk negotiation behavioral learning approach

    OpenAIRE

    RAJAVEL, RAJKUMAR; THANGARATHINAM, MALA

    2017-01-01

    One of the major challenges in cloud computing is the development of a service-level agreement (SLA) negotiation framework using an intelligent third-party broker negotiation strategy. Current frameworks exploit various negotiation strategies using game theoretic, heuristic, and argumentation-based approaches for obtaining optimal negotiation with a better success rate (negotiation commitment). However, these approaches fail to optimize the negotiation round (NR), total negotiatio...

  10. Implementation of a micro-physical scheme for warm clouds in the meteorological model 'MERCURE': Application to cooling tower plumes and to orographic precipitation

    International Nuclear Information System (INIS)

    Bouzereau, Emmanuel

    2004-01-01

    A two-moment semi-spectral warm micro-physical scheme has been implemented inside the meteorological model 'MERCURE'. A new formulation of the buoyancy flux is proposed, which is coherent with the corrigendum of Mellor (1977) but differs from Bougeault (1981). The non-precipitating cloud microphysics is validated by comparing numerical simulations of fifteen cases of cooling tower plumes with data from a measurement campaign at Bugey in 1980. Satisfactory results are obtained for the plume shapes, the temperature and vertical velocity fields, and the droplet spectra, although the liquid water contents tend to be overestimated. The precipitating cloud microphysics is tested by reproducing the academic orographic precipitation cases of Chaumerliac et al. (1987) and Richard and Chaumerliac (1989). The simulations allow a check of the action of the different micro-physical terms. (author) [fr]

  11. Technical Challenges and Lessons from the Migration of the GLOBE Data and Information System to Utilize Cloud Computing Service

    Science.gov (United States)

    Moses, J. F.; Memarsadeghi, N.; Overoye, D.; Littlefield, B.

    2016-12-01

    The Global Learning and Observation to Benefit the Environment (GLOBE) Data and Information System supports an international science and education program with capabilities to accept local environment observations and to archive, display and visualize them along with global satellite observations. Since its inception twenty years ago, the Web and database system has been upgraded periodically to accommodate changes in technology and the steady growth of GLOBE's education community and collection of observations. Recently, near the end-of-life of the system hardware, new commercial computer platform options were explored and a decision was made to utilize Cloud services. The GLOBE DIS has now been fully deployed and maintained using Amazon Cloud services for over two years. This paper reviews the early risks, actual challenges, and some unexpected findings as a result of the GLOBE DIS migration. We describe the plans, cost drivers and estimates, highlight adjustments that were made, and suggest improvements. We present the trade studies for provisioning, load balancing, networks, processing, storage, as well as production, staging and backup systems. We outline the migration team's skills and required level of effort for the transition, and the resulting changes in the overall maintenance and operations activities. Examples include incremental adjustments to processing capacity and frequency of backups, and efforts previously expended on hardware maintenance that were refocused onto application-specific enhancements.

  12. Technical Challenges and Lessons from the Migration of the GLOBE Data and Information System to Utilize Cloud Computing Service

    Science.gov (United States)

    Moses, John F.; Memarsadeghi, Nargess; Overoye, David; Littlefield, Brian

    2017-01-01

    The Global Learning and Observation to Benefit the Environment (GLOBE) Data and Information System supports an international science and education program with capabilities to accept local environment observations and to archive, display and visualize them along with global satellite observations. Since its inception twenty years ago, the Web and database system has been upgraded periodically to accommodate changes in technology and the steady growth of GLOBE's education community and collection of observations. Recently, near the end-of-life of the system hardware, new commercial computer platform options were explored and a decision was made to utilize Cloud services. The GLOBE DIS has now been fully deployed and maintained using Amazon Cloud services for over two years. This paper reviews the early risks, actual challenges, and some unexpected findings as a result of the GLOBE DIS migration. We describe the plans, cost drivers and estimates, highlight adjustments that were made, and suggest improvements. We present the trade studies for provisioning, load balancing, networks, processing, storage, as well as production, staging and backup systems. We outline the migration team's skills and required level of effort for the transition, and the resulting changes in the overall maintenance and operations activities. Examples include incremental adjustments to processing capacity and frequency of backups, and efforts previously expended on hardware maintenance that were refocused onto application-specific enhancements.

  13. Cloud Cover

    Science.gov (United States)

    Schaffhauser, Dian

    2012-01-01

    This article features a major statewide initiative in North Carolina that is showing how a consortium model can minimize risks for districts and help them exploit the advantages of cloud computing. Edgecombe County Public Schools in Tarboro, North Carolina, intends to exploit a major cloud initiative being refined in the state and involving every…

  14. Cloud Control

    Science.gov (United States)

    Ramaswami, Rama; Raths, David; Schaffhauser, Dian; Skelly, Jennifer

    2011-01-01

    For many IT shops, the cloud offers an opportunity not only to improve operations but also to align themselves more closely with their schools' strategic goals. The cloud is not a plug-and-play proposition, however--it is a complex, evolving landscape that demands one's full attention. Security, privacy, contracts, and contingency planning are all…

  15. Security Audit Compliance for Cloud Computing

    OpenAIRE

    Doelitzscher, Frank

    2014-01-01

    Cloud computing has grown rapidly over the past three years and is widely popular across today's IT landscape. In a comparative study of 250 IT decision makers at UK companies, respondents said that they already use cloud services for 61% of their systems. Cloud vendors promise "infinite scalability and resources" combined with on-demand access from everywhere. This lets cloud users quickly forget that there is still a real IT infrastructure behind a cloud. Due to virtualization and multi-ten...

  16. MULTI TENANCY SECURITY IN CLOUD COMPUTING

    OpenAIRE

    Manjinder Singh; Charanjit Singh

    2017-01-01

    The word Cloud is used as a metaphor for the Internet, based on the standardised use of a cloud-like shape to denote a network. Cloud computing is an advanced technology for resource sharing through a network at lower cost compared to other technologies. Cloud infrastructure supports various service models: IaaS, SaaS, PaaS. The term virtualization in cloud computing is very useful today. With the help of virtualization, more than one operating system is supported, with all resources, on a single piece of hardware. We can al...

  17. VMware vCloud director cookbook

    CERN Document Server

    Langenhan, Daniel

    2013-01-01

    VMware vCloud Director Cookbook adopts a cookbook-based approach. Packed with illustrations and programming examples, this book explains both simple and complex recipes in easy-to-understand language. "VMware vCloud Director Cookbook" is aimed at system administrators and technical architects moving from a virtualized environment to cloud environments. Familiarity with cloud computing platforms and some knowledge of virtualization and managing cloud environments is expected.

  18. A Survey Paper on Privacy Issue in Cloud Computing

    OpenAIRE

    Yousra Abdul Alsahib S. Aldeen; Mazleena Salleh; Mohammad Abdur Razzaque

    2015-01-01

    In the past few years, cloud computing has become one of the most popular paradigms for hosting and delivering services over the Internet. It has gained popularity by offering multiple computing services, such as cloud storage, cloud hosting, and cloud servers, for various types of businesses as well as academia. Though there are several benefits of cloud computing, it suffers from security and privacy challenges. Privacy of the cloud system is a serious concern for customers. Considering the privacy within the cloud th...

  19. The Evolution of Cloud Computing in ATLAS

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00224309; The ATLAS collaboration; Berghaus, Frank; Love, Peter; Leblanc, Matthew Edgar; Di Girolamo, Alessandro; Paterson, Michael; Gable, Ian; Sobie, Randall; Field, Laurence

    2015-01-01

    The ATLAS experiment has successfully incorporated cloud computing technology and cloud resources into its primarily grid-based model of distributed computing. Cloud R&D activities continue to mature and transition into stable production systems, while ongoing evolutionary changes are still needed to adapt and refine the approaches used, in response to changes in prevailing cloud technology. In addition, completely new developments are needed to handle emerging requirements. This work will describe the overall evolution of cloud computing in ATLAS. The current status of the VM management systems used for harnessing IAAS resources will be discussed. Monitoring and accounting systems tailored for clouds are needed to complete the integration of cloud resources within ATLAS' distributed computing framework. We are developing and deploying new solutions to address the challenge of operation in a geographically distributed multi-cloud scenario, including a system for managing VM images across multiple clouds, ...

  20. Relation of Cloud Occurrence Frequency, Overlap, and Effective Thickness Derived from CALIPSO and CloudSat Merged Cloud Vertical Profiles

    Science.gov (United States)

    Kato, Seiji; Sun-Mack, Sunny; Miller, Walter F.; Rose, Fred G.; Chen, Yan; Minnis, Patrick; Wielicki, Bruce A.

    2009-01-01

    A cloud frequency of occurrence matrix is generated using merged cloud vertical profiles derived from the Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) and Cloud Profiling Radar (CPR). The matrix contains vertical profiles of cloud occurrence frequency as a function of the uppermost cloud top. It is shown that the cloud fraction and uppermost cloud top vertical profiles can be related by a set of equations when the correlation distance of cloud occurrence, which is interpreted as an effective cloud thickness, is introduced. The underlying assumption in establishing the above relation is that cloud overlap approaches random overlap with increasing distance separating cloud layers, and that the probability of deviating from random overlap decreases exponentially with distance. One month of CALIPSO and CloudSat data supports these assumptions. However, the correlation distance sometimes becomes large, which might be an indication of precipitation. The cloud correlation distance is equivalent to the de-correlation distance introduced by Hogan and Illingworth [2000] when the cloud fractions of both layers in a two-cloud-layer system are the same.
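The exponential-decorrelation overlap the abstract builds on (Hogan and Illingworth [2000]) can be sketched numerically: with layer separation dz and correlation (de-correlation) distance L, the combined cover blends maximum overlap into random overlap. A minimal illustration, with function names of my own choosing:

```python
import math

def combined_cover(c1, c2, dz, L):
    """Combined cloud cover of two layers with fractions c1, c2,
    separated by dz, given correlation distance L."""
    alpha = math.exp(-dz / L)          # weight of maximum (correlated) overlap
    c_max = max(c1, c2)                # perfectly correlated layers
    c_rand = c1 + c2 - c1 * c2         # statistically independent layers
    return alpha * c_max + (1.0 - alpha) * c_rand
```

For two layers of fraction 0.3 each, the combined cover rises from 0.3 (zero separation, maximum overlap) toward 0.51 (large separation, random overlap), which is the behavior the correlation-distance formalism encodes.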

  1. CLOUD COMPUTING TECHNOLOGY TRENDS

    Directory of Open Access Journals (Sweden)

    Cristian IVANUS

    2014-05-01

    Cloud computing has been a tremendous innovation, through which applications became available online, accessible through an Internet connection and using any computing device (computer, smartphone or tablet). According to one of the most recent studies, conducted in 2012 by Everest Group and Cloud Connect, 57% of companies said they already use SaaS (Software as a Service) applications, and 38% reported using standard PaaS (Platform as a Service) tools. However, in most cases, the users of these solutions highlighted the fact that one of the main obstacles in the development of this technology is that, in the cloud, the application is not available without an Internet connection. The new challenge of the cloud system has now become the offline, specifically accessing SaaS applications without being connected to the Internet. This topic is directly related to user productivity within companies, as productivity growth is one of the key promises of the transformation of applications into cloud computing systems. The aim of this paper is the presentation of some important aspects related to the offline cloud system and regulatory trends in the European Union (EU).

  2. Development of Two-Moment Cloud Microphysics for Liquid and Ice Within the NASA Goddard Earth Observing System Model (GEOS-5)

    Science.gov (United States)

    Barahona, Donifan; Molod, Andrea M.; Bacmeister, Julio; Nenes, Athanasios; Gettelman, Andrew; Morrison, Hugh; Phillips, Vaughan,; Eichmann, Andrew F.

    2013-01-01

    This work presents the development of a two-moment cloud microphysics scheme within the version 5 of the NASA Goddard Earth Observing System (GEOS-5). The scheme includes the implementation of a comprehensive stratiform microphysics module, a new cloud coverage scheme that allows ice supersaturation and a new microphysics module embedded within the moist convection parameterization of GEOS-5. Comprehensive physically-based descriptions of ice nucleation, including homogeneous and heterogeneous freezing, and liquid droplet activation are implemented to describe the formation of cloud particles in stratiform clouds and convective cumulus. The effect of preexisting ice crystals on the formation of cirrus clouds is also accounted for. A new parameterization of the subgrid scale vertical velocity distribution accounting for turbulence and gravity wave motion is developed. The implementation of the new microphysics significantly improves the representation of liquid water and ice in GEOS-5. Evaluation of the model shows agreement of the simulated droplet and ice crystal effective and volumetric radius with satellite retrievals and in situ observations. The simulated global distribution of supersaturation is also in agreement with observations. It was found that when using the new microphysics the fraction of condensate that remains as liquid follows a sigmoidal increase with temperature which differs from the linear increase assumed in most models and is in better agreement with available observations. The performance of the new microphysics in reproducing the observed total cloud fraction, longwave and shortwave cloud forcing, and total precipitation is similar to the operational version of GEOS-5 and in agreement with satellite retrievals. However the new microphysics tends to underestimate the coverage of persistent low level stratocumulus. 
    Sensitivity studies showed that the simulated cloud properties are robust to moderate variations in cloud microphysical parameters.
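The finding above, that the liquid fraction of condensate follows a sigmoidal rather than linear increase with temperature, can be illustrated with a toy comparison. This is not the GEOS-5 code; the threshold temperatures and ramp width are hypothetical tuning parameters chosen only for illustration:

```python
import math

def liquid_fraction_linear(T, T_ice=235.0, T_liq=273.0):
    """Linear ramp (common model assumption): all ice below T_ice,
    all liquid above T_liq; temperatures in kelvin."""
    return min(1.0, max(0.0, (T - T_ice) / (T_liq - T_ice)))

def liquid_fraction_sigmoid(T, T0=254.0, dT=5.0):
    """Logistic (sigmoidal) ramp centered at T0 with width dT."""
    return 1.0 / (1.0 + math.exp(-(T - T0) / dT))
```

Both forms go from 0 (all ice) at cold temperatures to 1 (all liquid) at warm ones; the sigmoid transitions smoothly and steeply around its midpoint instead of at a constant slope.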

  3. Impact of Surface Active Ionic Liquids on the Cloud Points of Nonionic Surfactants and the Formation of Aqueous Micellar Two-Phase Systems.

    Science.gov (United States)

    Vicente, Filipa A; Cardoso, Inês S; Sintra, Tânia E; Lemus, Jesus; Marques, Eduardo F; Ventura, Sónia P M; Coutinho, João A P

    2017-09-21

    Aqueous micellar two-phase systems (AMTPS) hold a large potential for the cloud point extraction of biomolecules, but they are as yet poorly studied and characterized, with few phase diagrams reported for these systems, hence limiting their use in extraction processes. This work reports a systematic investigation of the effect of different surface-active ionic liquids (SAILs), covering a wide range of molecular properties, upon the clouding behavior of three nonionic Tergitol surfactants. Two different effects of the SAILs on the cloud points and mixed micelle size have been observed: ILs with a more hydrophilic character and lower critical packing parameter (CPP < 1) promote the formation of smaller micelles and concomitantly increase the cloud points; in contrast, ILs with a more hydrophobic character and higher CPP (CPP ≥ 1) induce significant micellar growth and a decrease in the cloud points. The latter effect is particularly interesting and unusual, since it was previously accepted that cloud point reduction is induced only by inorganic salts. The effects of nonionic surfactant concentration, SAIL concentration, pH, and micelle ζ potential are also studied and rationalized.
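The critical packing parameter invoked above is, in the standard (Israelachvili) definition, CPP = v / (a0 · lc): surfactant tail volume over headgroup area times tail length. A quick sketch of that arithmetic and the conventional shape regimes (the example numbers are illustrative, not from this paper):

```python
def cpp(v_tail, a_head, l_tail):
    """Critical packing parameter: tail volume v (nm^3) over
    headgroup area a0 (nm^2) times tail length lc (nm)."""
    return v_tail / (a_head * l_tail)

def aggregate_shape(p):
    """Conventional CPP shape regimes for surfactant aggregates."""
    if p < 1.0 / 3.0:
        return "spherical micelles"
    if p < 0.5:
        return "cylindrical micelles"
    if p < 1.0:
        return "bilayers/vesicles"
    return "inverted structures"
```

The abstract's dividing line maps onto this scale: hydrophilic SAILs sit well below CPP = 1 (small, curved micelles), while CPP ≥ 1 surfactants favor flat or inverted packing and drive micellar growth.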

  4. A case study for cloud based high throughput analysis of NGS data using the globus genomics system

    Directory of Open Access Journals (Sweden)

    Krithika Bhuvaneshwar

    2015-01-01

    Next generation sequencing (NGS) technologies produce massive amounts of data, requiring a powerful computational infrastructure, high quality bioinformatics software, and skilled personnel to operate the tools. We present a case study of a practical solution to this data management and analysis challenge that simplifies terabyte-scale data handling and provides advanced tools for NGS data analysis. These capabilities are implemented using the "Globus Genomics" system, which is an enhanced Galaxy workflow system made available as a service that offers users the capability to process and transfer data easily, reliably and quickly to address end-to-end NGS analysis requirements. The Globus Genomics system is built on Amazon's cloud computing infrastructure. The system takes advantage of elastic scaling of compute resources to run multiple workflows in parallel, and it also helps meet the scale-out analysis needs of modern translational genomics research.

  5. An Approach to the Optimization of Mobile Payments for the Transport System using NFC through Cloud Computing

    Directory of Open Access Journals (Sweden)

    Estevan Gómez Torres

    2017-02-01

    The use of mobile devices has gradually increased. Every day, the number of organizations adopting systems that include some type of mobile payment grows. For this reason, it is mandatory to have agile and quick systems that guarantee security and reliability, not only for the user but also for the operator. In that way, users will get a high-quality service based on mobile technologies. An analysis of NFC technology is made in this paper, taking into consideration a proposal for the development of a system including mobile payments, which could be used in the transportation system of the Metro of Quito. To assure response time and transactional security, the use of cloud computing is recommended.

  6. SUPERNOVA PROPAGATION AND CLOUD ENRICHMENT: A NEW MODEL FOR THE ORIGIN OF 60Fe IN THE EARLY SOLAR SYSTEM

    International Nuclear Information System (INIS)

    Gounelle, Matthieu; Meibom, Anders; Hennebelle, Patrick; Inutsuka, Shu-ichiro

    2009-01-01

    The radioactive isotope 60Fe (T1/2 = 1.5 Myr) was present in the early solar system. It is unlikely that it was injected directly into the nascent solar system by a single, nearby supernova (SN). It is proposed instead that it was inherited during the molecular cloud (MC) stage from several SNe belonging to previous episodes of star formation. The expected abundance of 60Fe in star-forming regions is estimated taking into account the stochasticity of the star-forming process, and it is shown that many MCs are expected to contain 60Fe (and possibly 26Al [T1/2 = 0.74 Myr]) at a level compatible with that of the nascent solar system. Therefore, no special explanation is needed to account for our solar system's formation.
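The half-lives quoted above set the timescale of the argument: the surviving fraction of a radioisotope after time t is 2^(-t / T1/2), so material enriched several Myr before solar system formation still carries measurable 60Fe. A one-line sketch of that decay arithmetic, using the paper's quoted half-lives:

```python
def surviving_fraction(t_myr, half_life_myr):
    """Fraction of a radioisotope surviving after t_myr, given its
    half-life in Myr: N/N0 = 2**(-t/T_half)."""
    return 2.0 ** (-t_myr / half_life_myr)
```

For example, after three 60Fe half-lives (4.5 Myr) one eighth of the original inventory remains, while 26Al, with its shorter half-life, decays away faster over the same interval.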

  7. Feasibility study of using the RoboEarth cloud engine for rapid mapping and tracking with small unmanned aerial systems

    Science.gov (United States)

    Li-Chee-Ming, J.; Armenakis, C.

    2014-11-01

    This paper presents the ongoing development of a small unmanned aerial mapping system (sUAMS) that in the future will track its trajectory and perform 3D mapping in near-real time. As both mapping and tracking algorithms require powerful computational capabilities and large data storage facilities, we propose to use the RoboEarth Cloud Engine (RCE) to offload heavy computation and store data in secure computing environments in the cloud. While the RCE's capabilities have been demonstrated with terrestrial robots in indoor environments, this paper explores the feasibility of using the RCE in mapping and tracking applications in outdoor environments by small UAMS. The experiments presented in this work assess the data processing strategies and evaluate the attainable tracking and mapping accuracies using the data obtained by the sUAMS. Testing was performed with an Aeryon Scout quadcopter. It flew over York University, up to approximately 40 metres above the ground. The quadcopter was equipped with a single-frequency GPS receiver providing positioning to about 3-metre accuracy, an AHRS (Attitude and Heading Reference System) estimating the attitude to about 3 degrees, and an FPV (First Person Viewing) camera. Video images captured from the onboard camera were processed using VisualSFM and SURE, which are being reformed as an Application-as-a-Service via the RCE. The 3D virtual building model of York University was used as a known environment to georeference the point cloud generated from the sUAMS' sensor data. The estimated position and orientation parameters of the video camera show increases in accuracy when compared to the sUAMS' autopilot solution, derived from the onboard GPS and AHRS. The paper presents the proposed approach and the results, along with their accuracies.

  8. Carbon Sequestration Estimation of Street Trees Based on Point Cloud from Vehicle-Borne Laser Scanning System

    Science.gov (United States)

    Zhao, Y.; Hu, Q.

    2017-09-01

    The continuous development of urban road traffic systems demands higher standards for the road ecological environment, and the ecological benefits of street trees are receiving more attention. The carbon sequestration of street trees refers to their carbon stocks, which can serve as a measurement of their ecological benefits. Estimating carbon sequestration in the traditional way is costly and inefficient. To solve these problems, a carbon sequestration estimation approach for street trees based on 3D point clouds from a vehicle-borne laser scanning system is proposed in this paper. The method can measure the geometric parameters of a street tree, including tree height, crown width, and diameter at breast height (DBH), by processing and analyzing the point cloud data of an individual tree. Four Chinese scholartree trees and four camphor trees are selected for the experiment. The root mean square error (RMSE) of tree height is 0.11 m for Chinese scholartree and 0.02 m for camphor. Crown widths in the X and Y directions, as well as the average crown width, are calculated, and the RMSE of the average crown width is 0.22 m for Chinese scholartree and 0.10 m for camphor. The last calculated parameter is DBH, whose RMSE is 0.5 cm for both Chinese scholartree and camphor. Combining the measured geometric parameters with an appropriate carbon sequestration calculation model, an individual tree's carbon sequestration can be estimated. The proposed method can help enlarge the application range of vehicle-borne laser point cloud data, improve the efficiency of estimating carbon sequestration, and support urban ecological construction and landscape management.
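The geometric parameters named in this abstract (tree height, crown width in X and Y, and the RMSE used to score them) reduce to simple extent statistics once a single tree's points are segmented. A minimal sketch, not the authors' pipeline:

```python
import math

def tree_metrics(points):
    """Extent-based parameters for one segmented tree.
    points: list of (x, y, z) tuples in metres."""
    xs, ys, zs = zip(*points)
    height = max(zs) - min(zs)          # vertical extent
    crown_x = max(xs) - min(xs)         # crown width, X direction
    crown_y = max(ys) - min(ys)         # crown width, Y direction
    return {"height": height, "crown_x": crown_x, "crown_y": crown_y,
            "crown_avg": 0.5 * (crown_x + crown_y)}

def rmse(estimates, references):
    """Root mean square error between estimated and reference values."""
    return math.sqrt(sum((e - r) ** 2 for e, r in zip(estimates, references))
                     / len(estimates))
```

DBH is omitted here because it requires fitting a circle or cylinder to the stem points at 1.3 m height rather than a simple extent; the RMSE helper is the metric reported in the abstract (0.11 m / 0.02 m for height, and so on).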

  9. Evaluation of Passive Multilayer Cloud Detection Using Preliminary CloudSat and CALIPSO Cloud Profiles

    Science.gov (United States)

    Minnis, P.; Sun-Mack, S.; Chang, F.; Huang, J.; Nguyen, L.; Ayers, J. K.; Spangenberg, D. A.; Yi, Y.; Trepte, C. R.

    2006-12-01

    During the last few years, several algorithms have been developed to detect and retrieve multilayered clouds using passive satellite data. Assessing these techniques has been difficult due to the need for active sensors such as cloud radars and lidars that can "see" through different layers of clouds. Such sensors have been available only at a few surface sites and on aircraft during field programs. With the launch of the CALIPSO and CloudSat satellites on April 28, 2006, it is now possible to observe multilayered systems all over the globe using collocated cloud radar and lidar data. As part of the A-Train, these new active sensors are also matched in time and space with passive measurements from the Aqua Moderate Resolution Imaging Spectroradiometer (MODIS) and Advanced Microwave Scanning Radiometer - EOS (AMSR-E). The Clouds and the Earth's Radiant Energy System (CERES) team has been developing and testing algorithms to detect ice-over-water overlapping cloud systems and to retrieve the cloud liquid water path (LWP) and ice water path (IWP) for those systems. One technique uses a combination of the CERES cloud retrieval algorithm applied to MODIS data and a microwave retrieval method applied to AMSR-E data. The combination of a CO2-slicing cloud retrieval technique with the CERES algorithms applied to MODIS data (Chang et al., 2005) is used to detect and analyze overlapped systems that contain thin ice clouds. A third technique uses brightness temperature differences and the CERES algorithms to detect similar overlapped systems. This paper uses preliminary CloudSat and CALIPSO data to begin a global-scale assessment of these different methods. The long-term goals are to assess and refine the algorithms to aid the development of an optimal combination of the techniques to better monitor ice and liquid water clouds in overlapped conditions.

  10. Cloud information system «Redactor.Online» for the creation of periodic scientific-educational editions

    Directory of Open Access Journals (Sweden)

    E. V. Golubev

    2016-01-01

    The article presents approaches to the creation of a cloud information system for automating the business processes related to the preparation and publication of periodic scientific and educational editions (publishing). The aim of the research is the development of the concept, models, patterns, and architecture of such a system, and the choice of its software implementation. The relevance of the development is based on the results of a study of existing technologies, used in practice, for creating electronic versions of scientific journals and other types of scientific and educational resources. It is expected that the use of cloud-based systems will reduce the time and costs of publishing houses, as well as improve the quality of published material (for example, through the use of interactive and multimedia elements, comments, and article assessment features). The description is given using the example of the cloud system Redactor.Online, developed by the small innovative enterprise «Internet-business-system» of Petrozavodsk State University. Freely distributed products such as PostgreSQL, PHP, and the Yii Framework were chosen as the means of implementation. Logically, the cloud structure of the system is a set of components based on common data sources and interacting with each other. The main components of the system are the basic (general) part, the promo-site of the system, the editorial offices of periodicals created in the system, media sites, mobile applications of the editions, and a complex for controlling the system as a whole. The central element of the system architecture is the editorial office. It provides a set of features related to preparing the journal for release and publishing it, including management of the article lifecycle (preparation of the article by the author, its review, proofreading, typesetting, translation, publication, and export to external citation indexes). The available functionality of this component is determined by the role assigned to the user.

  11. Cloud Computing Fundamentals

    Science.gov (United States)

    Furht, Borko

    In the introductory chapter we define the concept of cloud computing and cloud services, and we introduce layers and types of cloud computing. We discuss the differences between cloud computing and cloud services. New technologies that enabled cloud computing are presented next. We also discuss cloud computing features, standards, and security issues. We introduce the key cloud computing platforms, their vendors, and their offerings. We discuss cloud computing challenges and the future of cloud computing.

  12. Storm and cloud dynamics

    CERN Document Server

    Cotton, William R

    1992-01-01

    This book focuses on the dynamics of clouds and of precipitating mesoscale meteorological systems. Clouds and precipitating mesoscale systems represent some of the most important and scientifically exciting weather systems in the world. These are the systems that produce torrential rains, severe winds including downbursts and tornadoes, hail, thunder and lightning, and major snow storms. Forecasting such storms represents a major challenge since they are too small to be adequately resolved by conventional observing networks and numerical prediction models.

  13. CLOUD PARAMETERIZATIONS, CLOUD PHYSICS, AND THEIR CONNECTIONS: AN OVERVIEW

    International Nuclear Information System (INIS)

    LIU, Y.; DAUM, P.H.; CHAI, S.K.; LIU, F.

    2002-01-01

    This paper consists of three parts. The first part is concerned with the parameterization of cloud microphysics in climate models. We demonstrate the crucial importance of spectral dispersion of the cloud droplet size distribution in determining radiative properties of clouds (e.g., effective radius), and underline the necessity of specifying spectral dispersion in the parameterization of cloud microphysics. It is argued that the inclusion of spectral dispersion makes the issue of cloud parameterization essentially equivalent to that of the droplet size distribution function, bringing cloud parameterization to the forefront of cloud physics. The second part is concerned with theoretical investigations into the spectral shape of droplet size distributions in cloud physics. After briefly reviewing the mainstream theories (including entrainment and mixing theories, and stochastic theories), we discuss their deficiencies and the need for a paradigm shift from reductionist approaches to systems approaches. A systems theory that has recently been formulated by utilizing ideas from statistical physics and information theory is discussed, along with the major results derived from it. It is shown that the systems formalism not only easily explains many puzzles that have been frustrating the mainstream theories, but also reveals such new phenomena as scale-dependence of cloud droplet size distributions. The third part is concerned with the potential applications of the systems theory to the specification of spectral dispersion in terms of predictable variables and scale-dependence under different fluctuating environments
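The effective radius discussed above is the standard ratio of the third to the second moment of the droplet size distribution, which is precisely why spectral dispersion matters: a broader spectrum raises the effective radius relative to the mean radius. A minimal discrete-spectrum sketch:

```python
def effective_radius(radii, counts):
    """Effective radius of a discrete droplet spectrum: the ratio of the
    third moment to the second moment of the size distribution."""
    m3 = sum(n * r ** 3 for r, n in zip(radii, counts))
    m2 = sum(n * r ** 2 for r, n in zip(radii, counts))
    return m3 / m2
```

For a monodisperse spectrum the effective radius equals the droplet radius; splitting the same mean radius into a broad two-bin spectrum (e.g. 5 and 15 micrometres) pushes the effective radius above the mean, illustrating why ignoring spectral dispersion biases the radiative properties.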

  14. A Discovery of a Compact High Velocity Cloud-Galactic Supershell System

    Science.gov (United States)

    Park, Geumsook; Koo, Bon-Chul; Kang, Ji-hyun; Gibson, Steven J.; Peek, Joshua Eli Goldston; Douglas, Kevin A.; Korpela, Eric J.; Heiles, Carl E.

    2017-01-01

    High velocity clouds (HVCs) are neutral hydrogen (HI) gas clouds having very different radial velocities from those of the Galactic disk material. While some large HVC complexes are known to be gas streams tidally stripped from satellite galaxies of the Milky Way, there are relatively isolated HVCs of small angular size, the so-called "compact HVCs (CHVCs)", whose origin remains controversial. There are about 300 known CHVCs in the Milky Way, and many of them show a head-tail structure, implying a ram pressure interaction with the diffuse Galactic halo gas. It is, however, not clear whether CHVCs are completely dissipated in the Galactic halo to feed the multi-phase circumgalactic medium or whether they can survive their trip through the halo and collide with the Galactic disk. A colliding CHVC may leave a gigantic trail in the disk, and it has been suggested that some of the HI supershells that require ≥ 3 × 10^52 erg may be produced by the collision of such HVCs. Here we report the detection of a kiloparsec (kpc)-size supershell in the outskirts of the Milky Way with the compact HVC 040+01-282 (hereafter, CHVC040) at its geometrical center, using the "Inner-Galaxy Arecibo L-band Feed Array" HI 21 cm survey data. The morphological and physical properties of both objects suggest that CHVC040, which is either a fragment of a nearby disrupted galaxy or a cloud that originated from an intergalactic accreting flow, collided with the disk ~5 Myr ago to form the supershell. Our results show that some compact HVCs can survive their trip through the Galactic halo and inject energy and momentum into the Milky Way disk.

  15. Circumstellar Disks and Outflows in Turbulent Molecular Cloud Cores: Possible Formation Mechanism for Misaligned Systems

    Energy Technology Data Exchange (ETDEWEB)

    Matsumoto, Tomoaki [Faculty of Sustainability Studies, Hosei University, Fujimi, Chiyoda-ku, Tokyo 102-8160 (Japan); Machida, Masahiro N. [Department of Earth and Planetary Sciences, Kyushu University, Fukuoka 812-8581 (Japan); Inutsuka, Shu-ichiro, E-mail: matsu@hosei.ac.jp [Department of Physics, Nagoya University, Chikusa-ku, Nagoya 464-8602 (Japan)

    2017-04-10

    We investigate the formation of circumstellar disks and outflows subsequent to the collapse of molecular cloud cores with magnetic fields and turbulence. Numerical simulations are performed using adaptive mesh refinement to follow the evolution up to ∼1000 years after the formation of a protostar. In the simulations, circumstellar disks are formed around the protostars; those in magnetized models are considerably smaller than those in nonmagnetized models, but their size increases with time. The models with stronger magnetic fields tend to produce smaller disks. During evolution in the magnetized models, the mass ratio of a disk to its protostar is approximately constant at ∼1%–10%. The circumstellar disks are aligned according to their angular momentum, while the outflows accelerate along the magnetic field on the 10–100 au scale; this produces disks that are misaligned with the outflows. The outflows are classified into two types: a magnetocentrifugal wind and a spiral flow. In the latter, because of the geometry, the axis of rotation is misaligned with the magnetic field. The magnetic field has an internal structure in the cloud cores, which also causes misalignment between the outflows and the magnetic field on the scale of the cloud core. The distribution of the angular momentum vectors in a core also has a non-monotonic internal structure. This should create a time-dependent accretion of angular momenta onto the circumstellar disk. Therefore, the circumstellar disks are expected to change their orientation as well as their size over the long-term evolution.

  16. A Proposed Smart E-Learning System Using Cloud Computing Services: PaaS, IaaS and Web 3.0

    Directory of Open Access Journals (Sweden)

    Dr.Mona M. Nasr

    2012-09-01

    Full Text Available E-learning systems need to improve their infrastructure, which must provide the computation and storage resources those systems require. Microsoft cloud computing technologies, although in their early stages, have managed to change the way applications are developed and accessed. The objective of this paper is to combine various technologies to design an architecture for e-learning systems. Web 3.0 uses widget aggregation, intelligent retrieval, user-interest modeling, and semantic annotation. These technologies are aimed at running applications as services over the internet on a flexible infrastructure. Cloud computing provides a low-cost solution for academic institutions, their researchers, faculty, and learners. In this paper we integrate cloud computing as a platform with Web 3.0 to build intelligent e-learning systems.

  17. Gas Condensates onto a LHC Type Cryogenic Vacuum System Subjected to Electron Cloud

    CERN Multimedia

    Baglin, V

    2004-01-01

    In the Large Hadron Collider (LHC), the gas desorbed via photon stimulated molecular desorption or electron stimulated molecular desorption will be physisorbed onto the beam screen held between 5 and 20 K. Studies of the effects of the electron cloud on an LHC-type cryogenic vacuum chamber have been done with the cold bore experiment (COLDEX) installed in the CERN Super Proton Synchrotron (SPS). Experiments performed with gas condensates such as H2, H2O, CO and CO2 are described. Implications for the LHC design and operation are discussed.

  18. Cloud Computing Security: A Survey

    Directory of Open Access Journals (Sweden)

    Issa M. Khalil

    2014-02-01

    Full Text Available Cloud computing is an emerging technology paradigm that migrates current technological and computing concepts into utility-like solutions similar to electricity and water systems. Clouds bring out a wide range of benefits including configurable computing resources, economic savings, and service flexibility. However, security and privacy concerns are shown to be the primary obstacles to a wide adoption of clouds. The new concepts that clouds introduce, such as multi-tenancy, resource sharing and outsourcing, create new challenges for the security community. Addressing these challenges requires, in addition to the ability to cultivate and tune the security measures developed for traditional computing systems, proposing new security policies, models, and protocols to address the unique cloud security challenges. In this work, we provide a comprehensive study of cloud computing security and privacy concerns. We identify cloud vulnerabilities, classify known security threats and attacks, and present the state-of-the-art practices to control the vulnerabilities, neutralize the threats, and calibrate the attacks. Additionally, we investigate and identify the limitations of the current solutions and provide insights into future security perspectives. Finally, we provide a cloud security framework in which we present the various lines of defense and identify the dependency levels among them. We identify 28 cloud security threats which we classify into five categories. We also present nine general cloud attacks along with various attack incidents, and provide an effectiveness analysis of the proposed countermeasures.

  19. Cloud Computing

    CERN Document Server

    Baun, Christian; Nimis, Jens; Tai, Stefan

    2011-01-01

    Cloud computing is a buzz-word in today's information technology (IT) that nobody can escape. But what is really behind it? There are many interpretations of this term, but no standardized or even uniform definition. Instead, as a result of the multi-faceted viewpoints and the diverse interests expressed by the various stakeholders, cloud computing is perceived as a rather fuzzy concept. With this book, the authors deliver an overview of cloud computing architecture, services, and applications. Their aim is to bring readers up to date on this technology and thus to provide a common basis for d

  20. Pellet ablation and cloud flow characteristics in the JIPP T-IIU plasma with the injection-angle controllable system

    International Nuclear Information System (INIS)

    Sakakita, H.; Sato, K.N.; Liang, R.; Hamada, Y.; Ando, A.; Kano, Y.; Sakamoto, M.

    1994-01-01

    Pellet ablation and the flow characteristics of the ablation cloud have been studied in the JIPP T-IIU plasma using an injection-angle-controllable system. A new ice-pellet injection technique with controllable injection angle has been developed and installed on the JIPP T-IIU tokamak in order to vary the deposition profile of ice pellets within the plasma. The injection angle can be varied easily between two plasma shots in the course of an experiment, so that various basic experiments can be carried out by varying the pellet deposition profile. The injection angle has been varied poloidally from -6 to 6 degrees by changing the angle of the last-stage drift tube, which makes it possible to aim pellets at radii from about r = -2a/3 to r = 2a/3 of the plasma. From two-dimensional observations with CCD cameras, details of the pellet ablation structures at various injection angles have been studied, and a couple of interesting phenomena have been found. For injection angles (θ) larger than a certain value (θ ≥ 4°), a pellet penetrates straight through the plasma, leaving a trace of straight ablation cloud, as expected from the usual theoretical considerations. On the other hand, a long helical tail of ablation light has been observed for angles smaller than that value (θ ≤ 4°). (author) 4 refs., 4 figs

  1. Cloud-based services for your library a LITA guide

    CERN Document Server

    Mitchell, Erik T

    2013-01-01

    By exploring specific examples of cloud computing and virtualization, this book allows libraries considering cloud computing to start their exploration of these systems with a more informed perspective.

  2. The Development Of Windows Service Based Data Log System Using Light Dependent Resistor And Thingspeak IOT Cloud Platform

    Directory of Open Access Journals (Sweden)

    Tristan Jay P. Calaguas

    2017-03-01

    Full Text Available Microcontrollers are used in control and information processing, with wide application in agriculture, health care, commercial facilities, robotics, and education. These microcontrollers are computers on a chip, comprising input and output ports, a central processing unit, registers, and main memory, as well as communication interfaces such as an Ethernet interface, a serial interface, a High Definition Multimedia Interface, a power source, and many other interfaces found in this type of computer. In this study the researcher conceptualized an innovative application of this type of computer with the potential to serve as a tracking system for specific individuals' activities. Since some office workers complain that CCTV cameras compromise their privacy, this concept serves a similar purpose, but instead of high-exposure video data it uses a visual graph. In the first phase, the researcher designed how a simple light-dependent resistor would be applied in a school office environment, using a microcontroller as a data log system, and how this could be operated without requiring the dean or any designated office person to run it by hand during busy working hours. In the second phase, the researcher developed the proposed data log system, which acquires data through the light luminance of the fluorescent light in the dean's office and sends it to the IoT cloud platform. The researcher used fuzzy logic theory to model the operation of the proposed data log system. This study used experimental research: once the prototype was developed in the second phase, the researcher simulated its operation. As a result, the proposed data log system sends data to the ThingSpeak IoT cloud platform and displays the correct output, based on the rules, as column-graph content.
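
    The fuzzy-logic mapping from light luminance to an occupancy label can be sketched in outline. This is a hypothetical illustration, not the study's implementation: the lux thresholds, the linear membership ramp, and all function names are invented for the example.

```python
# Hypothetical sketch of a fuzzy occupancy classifier for LDR readings.
# Thresholds (LOW, HIGH) are invented calibration points, not from the study.

def fuzzy_occupancy(lux: float) -> dict:
    """Return fuzzy membership degrees for 'dark' (lights off) and
    'bright' (lights on) given a luminance reading in lux."""
    LOW, HIGH = 50.0, 300.0  # assumed anchor points of the membership ramp
    if lux <= LOW:
        bright = 0.0
    elif lux >= HIGH:
        bright = 1.0
    else:
        bright = (lux - LOW) / (HIGH - LOW)  # linear ramp between anchors
    return {"dark": 1.0 - bright, "bright": bright}

def log_entry(lux: float) -> str:
    """Defuzzify into the label that would be logged to the IoT channel."""
    memberships = fuzzy_occupancy(lux)
    return "occupied" if memberships["bright"] >= 0.5 else "vacant"

if __name__ == "__main__":
    for reading in (10.0, 175.0, 400.0):
        print(reading, fuzzy_occupancy(reading), log_entry(reading))
```

    A real deployment would post each defuzzified reading to a ThingSpeak channel over HTTP; that network step is omitted here to keep the sketch self-contained.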

  3. The influence of rain and clouds on a satellite dual frequency radar altimeter system operating at 13 and 35 GHz

    Science.gov (United States)

    Walsh, E. J.; Monaldo, F. M.; Goldhirsh, J.

    1983-01-01

    The effects of inhomogeneous spatial attenuation resulting from clouds and rain on the altimeter estimate of the range to mean sea level are modelled. It is demonstrated that typical cloud and rain attenuation variability at commonly expected spatial scales can significantly degrade altimeter range precision. Rain cell and cloud scale sizes and attenuations are considered as factors. The model simulation of altimeter signature distortion is described, and the distortion of individual radar pulse waveforms by different spatial scales of attenuation is considered. Examples of range errors found for models of a single cloud, a rain cell, and cloud streets are discussed.

  4. Software Defined Resource Orchestration System for Multitask Application in Heterogeneous Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Qi Qi

    2016-01-01

    Full Text Available Mobile cloud computing (MCC), which combines mobile computing with the cloud concept, takes the wireless access network as the transmission medium and uses mobile devices as the client. When a complicated multitask application is offloaded to the MCC environment, each task executes individually according to its own computation, storage, and bandwidth requirements. Due to user mobility, the provided resources have different performance metrics that may affect the choice of destination. Nevertheless, these heterogeneous MCC resources lack integrated management and can hardly cooperate with each other. Thus, how to choose the appropriate offload destination and orchestrate the resources for multiple tasks is a challenging problem. This paper realizes programmable resource provisioning for heterogeneous energy-constrained computing environments, where a software-defined controller is responsible for resource orchestration, offload, and migration. Resource orchestration is formulated as a multiobjective optimization problem over the metrics of energy consumption, cost, and availability. Finally, a particle swarm algorithm is used to obtain approximate optimal solutions. Simulation results show that the solutions for almost all of the studied cases reach the Pareto optimum and surpass the comparative algorithm in approximation, coverage, and execution time.
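
    The particle swarm step can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's implementation: the three metrics are collapsed into a fixed weighted sum over a toy two-dimensional decision space, and the objective terms, weights, bounds, and PSO coefficients are all invented (the paper's actual formulation is multiobjective and Pareto-based).

```python
import random

# Minimal particle swarm optimization over an invented weighted-sum
# objective combining "energy", "cost", and an availability penalty.

def objective(x, y):
    energy = (x - 0.3) ** 2       # toy energy term, minimized at x = 0.3
    cost = (y - 0.7) ** 2         # toy cost term, minimized at y = 0.7
    unavail = 0.1 * (x + y)       # toy availability penalty
    return 0.5 * energy + 0.3 * cost + 0.2 * unavail

def pso(n_particles=20, iters=100, seed=1):
    rng = random.Random(seed)     # seeded for reproducibility
    pos = [[rng.uniform(0, 1), rng.uniform(0, 1)] for _ in range(n_particles)]
    vel = [[0.0, 0.0] for _ in range(n_particles)]
    pbest = [p[:] for p in pos]   # per-particle best positions
    gbest = min(pbest, key=lambda p: objective(*p))[:]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(2):
                # inertia + cognitive pull toward pbest + social pull to gbest
                vel[i][d] = (0.6 * vel[i][d]
                             + 1.5 * rng.random() * (pbest[i][d] - pos[i][d])
                             + 1.5 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(1.0, max(0.0, pos[i][d] + vel[i][d]))
            if objective(*pos[i]) < objective(*pbest[i]):
                pbest[i] = pos[i][:]
                if objective(*pos[i]) < objective(*gbest):
                    gbest = pos[i][:]
    return gbest, objective(*gbest)

best, val = pso()
print(best, val)
```

    A Pareto-based variant would instead keep an archive of non-dominated positions rather than a single global best.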

  5. Life Cycle of Midlatitude Deep Convective Systems in a Lagrangian Framework

    Science.gov (United States)

    Feng, Zhe; Dong, Xiquan; Xie, Baike; McFarlane, Sally A.; Kennedy, Aaron; Lin, Bing; Minnis, Patrick

    2012-01-01

    Deep Convective Systems (DCSs) consist of intense convective cores (CC), large stratiform rain (SR) regions, and extensive non-precipitating anvil clouds (AC). This study focuses on the evolution of these three components and the factors that affect convective AC production. An automated satellite tracking method is used in conjunction with a recently developed multi-sensor hybrid classification to analyze the evolution of DCS structure in a Lagrangian framework over the central United States. Composite analysis from 4221 tracked DCSs during two warm seasons (May-August, 2010-2011) shows that maximum system size correlates with lifetime, and longer-lived DCSs have more extensive SR and AC. Maximum SR and AC area lag behind peak convective intensity, and the lag increases linearly from approximately 1 hour for short-lived systems to more than 3 hours for long-lived ones. The increased lag, which depends on the convective environment, suggests that changes in the overall diabatic heating structure associated with the transition from CC to SR and AC could prolong the system lifetime by sustaining stratiform cloud development. Longer-lasting systems are associated with up to 60% higher mid-tropospheric relative humidity and up to 40% stronger middle to upper tropospheric wind shear. Regression analysis shows that the areal coverage of thick AC is strongly correlated with the size of CC, updraft strength, and SR area. Ambient upper tropospheric wind speed and wind shear also play an important role in convective AC production: for systems with large AC (radius greater than 120 km) they are 24% and 20% higher, respectively, than for those with small AC (radius = 20 km).

  6. Cloud Computing

    DEFF Research Database (Denmark)

    Krogh, Simon

    2013-01-01

    with technological changes, the paradigmatic pendulum has swung between increased centralization on one side and a focus on distributed computing that pushes IT power out to end users on the other. With the introduction of outsourcing and cloud computing, centralization in large data centers is again dominating...... the IT scene. In line with the views presented by Nicolas Carr in 2003 (Carr, 2003), it is a popular assumption that cloud computing will be the next utility (like water, electricity and gas) (Buyya, Yeo, Venugopal, Broberg, & Brandic, 2009). However, this assumption disregards the fact that most IT production......), for instance, in establishing and maintaining trust between the involved parties (Sabherwal, 1999). So far, research in cloud computing has neglected this perspective and focused entirely on aspects relating to technology, economy, security and legal questions. While the core technologies of cloud computing (e...

  7. Mobile Clouds

    DEFF Research Database (Denmark)

    Fitzek, Frank; Katz, Marcos

    A mobile cloud is a cooperative arrangement of dynamically connected communication nodes sharing opportunistic resources. In this book, authors provide a comprehensive and motivating overview of this rapidly emerging technology. The book explores how distributed resources can be shared by mobile...... users in very different ways and for various purposes. The book provides many stimulating examples of resource-sharing applications. Enabling technologies for mobile clouds are also discussed, highlighting the key role of network coding. Mobile clouds have the potential to enhance communications...... performance, improve utilization of resources and create flexible platforms to share resources in very novel ways. Energy efficient aspects of mobile clouds are discussed in detail, showing how being cooperative can bring mobile users significant energy saving. The book presents and discusses multiple...

  8. Context-aware distributed cloud computing using CloudScheduler

    Science.gov (United States)

    Seuster, R.; Leavett-Brown, CR; Casteels, K.; Driemel, C.; Paterson, M.; Ring, D.; Sobie, RJ; Taylor, RP; Weldon, J.

    2017-10-01

    The distributed cloud using the CloudScheduler VM provisioning service is one of the longest running systems for HEP workloads. It has run millions of jobs for ATLAS and Belle II over the past few years using private and commercial clouds around the world. Our goal is to scale the distributed cloud to the 10,000-core level, with the ability to run any type of application (low I/O, high I/O and high memory) on any cloud. To achieve this goal, we have been implementing changes that utilize context-aware computing designs that are currently employed in the mobile communication industry. Context-awareness makes use of real-time and archived data to respond to user or system requirements. In our distributed cloud, we have many opportunistic clouds with no local HEP services, software or storage repositories. A context-aware design significantly improves the reliability and performance of our system by locating the nearest location of the required services. We describe how we are collecting and managing contextual information from our workload management systems, the clouds, the virtual machines and our services. This information is used not only to monitor the system but also to carry out automated corrective actions. We are incrementally adding new alerting and response services to our distributed cloud. This will enable us to scale the number of clouds and virtual machines. Further, a context-aware design will enable us to run analysis or high I/O applications on opportunistic clouds. We envisage an open-source HTTP data federation (for example, the DynaFed system at CERN) as a service that would provide us access to existing storage elements used by the HEP experiments.

  9. An Assessment of the Application of Pharma Cloud System to the National Health Insurance Program of Taiwan and the Result in Hospitals.

    Science.gov (United States)

    Yan, Yu-Hua; Lu, Chen-Luan

    2016-01-01

    The National Health Insurance Administration established the Pharma Cloud System in July 2014. Its purpose is to decrease therapeutic duplication and enhance public medication safety. Comparisons were made between individual hospitals and the administering branches of the National Health Insurance Bureau (NHIB) on the statistics for inquiries to the cloud medication history record system, to understand the results of the installation and advocacy of this system. The results show that (1) in 2015 there were 2,329,846 entries of cloud medication history record data collected from the branches of the NHIB and 50,224 entries from individual hospitals; (2) from January to April 2015, the inquiry rate at the branches of the NHIB was 43.2% and at individual hospitals 18.8%; (3) over the same period, the improvement rate at the branches of the NHIB was 32.5% and at individual hospitals 47.0%.

  10. Initial Field Test of a Cloud-Based Cardiac Auscultation System to Determine Murmur Etiology in Rural China.

    Science.gov (United States)

    Pyles, Lee; Hemmati, Pouya; Pan, J; Yu, Xiaoju; Liu, Ke; Wang, Jing; Tsakistos, Andreas; Zheleva, Bistra; Shao, Weiguang; Ni, Quan

    2017-04-01

    A system for the collection, distribution, and long-distance, asynchronous interpretation of cardiac auscultation has been developed and field-tested in rural China. We initiated a proof-of-concept test as a critical component of the design of a system to allow rural physicians with little experience in the evaluation of congenital heart disease (CHD) to obtain assistance in the diagnosis and management of children with significant heart disease. The project tested the hypothesis that acceptable screening of heart murmurs could be accomplished using a digital stethoscope and internet cloud transmittal to deliver phonocardiograms to an experienced observer. Of the 7993 children who underwent school-based screening in the Menghai District of Yunnan Province, People's Republic of China, 149 had a murmur noted by a screener. They had digital heart sounds and phonocardiograms collected with the HeartLink tele-auscultation system, and underwent echocardiography by a cardiology resident from the First Affiliated Hospital of Kunming Medical University. The digital phonocardiograms, stored on a cloud server, were later remotely reviewed by a board-certified American pediatric cardiologist. Fourteen of these subjects were found to have CHD confirmed by echocardiogram. Using the HeartLink system, the pediatric cardiologist identified 11 of the 14 subjects with pathological murmurs, and missed three subjects with atrial septal defects, which were incorrectly identified as venous hum or Still's murmur. In addition, ten subjects were recorded as having pathological murmurs when no CHD was confirmed by echocardiography during the field study. The overall test accuracy was 91% with 78.5% sensitivity and 92.6% specificity. This proof-of-concept study demonstrated the feasibility of differentiating pathologic murmurs due to CHD from normal functional heart murmurs with the HeartLink system. This field study is an initial step to develop a cost-effective CHD screening strategy in low
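
    The reported test statistics can be re-derived from the counts given in the abstract: 11 of 14 CHD cases correctly flagged (3 missed), and 10 false alarms among the 135 murmur subjects without CHD, leaving 125 true negatives. A short check:

```python
# Confusion-matrix metrics from the abstract's counts:
# TP = 11 CHD cases flagged, FN = 3 missed,
# FP = 10 flagged without CHD, TN = 135 - 10 = 125.

def screening_metrics(tp, fn, fp, tn):
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fn + fp + tn)
    return sensitivity, specificity, accuracy

sens, spec, acc = screening_metrics(tp=11, fn=3, fp=10, tn=125)
# Consistent (to rounding) with the abstract's reported
# 78.5% sensitivity, 92.6% specificity, and 91% accuracy.
print(f"sensitivity={sens:.3f} specificity={spec:.3f} accuracy={acc:.3f}")
```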

  11. Time-dependent, non-monotonic response of warm convective cloud fields to changes in aerosol loading

    Directory of Open Access Journals (Sweden)

    G. Dagan

    2017-06-01

    Full Text Available Large eddy simulations (LESs with bin microphysics are used here to study cloud fields' sensitivity to changes in aerosol loading and the time evolution of this response. Similarly to the known response of a single cloud, we show that the mean field properties change in a non-monotonic trend, with an optimum aerosol concentration for which the field reaches its maximal water mass or rain yield. This trend is a result of competition between processes that encourage cloud development versus those that suppress it. However, another layer of complexity is added when considering clouds' impact on the field's thermodynamic properties and how this is dependent on aerosol loading. Under polluted conditions, rain is suppressed and the non-precipitating clouds act to increase atmospheric instability. This results in warming of the lower part of the cloudy layer (in which there is net condensation and cooling of the upper part (net evaporation. Evaporation at the upper part of the cloudy layer in the polluted simulations raises humidity at these levels and thus amplifies the development of the next generation of clouds (preconditioning effect. On the other hand, under clean conditions, the precipitating clouds drive net warming of the cloudy layer and net cooling of the sub-cloud layer due to rain evaporation. These two effects act to stabilize the atmospheric boundary layer with time (consumption of the instability. The evolution of the field's thermodynamic properties affects the cloud properties in return, as shown by the migration of the optimal aerosol concentration toward higher values.

  12. ATLAS cloud R and D

    International Nuclear Information System (INIS)

    Panitkin, Sergey; Bejar, Jose Caballero; Hover, John; Zaytsev, Alexander; Megino, Fernando Barreiro; Girolamo, Alessandro Di; Kucharczyk, Katarzyna; Llamas, Ramon Medrano; Benjamin, Doug; Gable, Ian; Paterson, Michael; Sobie, Randall; Taylor, Ryan; Hendrix, Val; Love, Peter; Ohman, Henrik; Walker, Rodney

    2014-01-01

    The computing model of the ATLAS experiment was designed around the concept of grid computing and, since the start of data taking, this model has proven very successful. However, new cloud computing technologies bring attractive features to improve the operations and elasticity of scientific distributed computing. ATLAS sees grid and cloud computing as complementary technologies that will coexist at different levels of resource abstraction, and two years ago created an R and D working group to investigate the different integration scenarios. The ATLAS Cloud Computing R and D has been able to demonstrate the feasibility of offloading work from grid to cloud sites and, as of today, is able to integrate transparently various cloud resources into the PanDA workload management system. The ATLAS Cloud Computing R and D is operating various PanDA queues on private and public resources and has provided several hundred thousand CPU days to the experiment. As a result, the ATLAS Cloud Computing R and D group has gained a significant insight into the cloud computing landscape and has identified points that still need to be addressed in order to fully utilize this technology. This contribution will explain the cloud integration models that are being evaluated and will discuss ATLAS' learning during the collaboration with leading commercial and academic cloud providers.

  13. ATLAS Cloud R&D

    Science.gov (United States)

    Panitkin, Sergey; Barreiro Megino, Fernando; Caballero Bejar, Jose; Benjamin, Doug; Di Girolamo, Alessandro; Gable, Ian; Hendrix, Val; Hover, John; Kucharczyk, Katarzyna; Medrano Llamas, Ramon; Love, Peter; Ohman, Henrik; Paterson, Michael; Sobie, Randall; Taylor, Ryan; Walker, Rodney; Zaytsev, Alexander; Atlas Collaboration

    2014-06-01

    The computing model of the ATLAS experiment was designed around the concept of grid computing and, since the start of data taking, this model has proven very successful. However, new cloud computing technologies bring attractive features to improve the operations and elasticity of scientific distributed computing. ATLAS sees grid and cloud computing as complementary technologies that will coexist at different levels of resource abstraction, and two years ago created an R&D working group to investigate the different integration scenarios. The ATLAS Cloud Computing R&D has been able to demonstrate the feasibility of offloading work from grid to cloud sites and, as of today, is able to integrate transparently various cloud resources into the PanDA workload management system. The ATLAS Cloud Computing R&D is operating various PanDA queues on private and public resources and has provided several hundred thousand CPU days to the experiment. As a result, the ATLAS Cloud Computing R&D group has gained a significant insight into the cloud computing landscape and has identified points that still need to be addressed in order to fully utilize this technology. This contribution will explain the cloud integration models that are being evaluated and will discuss ATLAS' learning during the collaboration with leading commercial and academic cloud providers.

  14. Development of the regional EPR and PACS sharing system on the infrastructure of cloud computing technology controlled by patient identifier cross reference manager.

    Science.gov (United States)

    Kondoh, Hiroshi; Teramoto, Kei; Kawai, Tatsurou; Mochida, Maki; Nishimura, Motohiro

    2013-01-01

    The newly developed Oshidori-Net2 provides medical professionals with remote access to the electronic patient record systems (EPR) and PACSs of four hospitals, from different vendors, using cloud computing technology and a patient identifier cross-reference manager. Operation started in April 2012. The system was applied to patients who moved to another hospital. The objective is to show the merits and demerits of the new system.

  15. Maximize Minimum Utility Function of Fractional Cloud Computing System Based on Search Algorithm Utilizing the Mittag-Leffler Sum

    Directory of Open Access Journals (Sweden)

    Rabha W. Ibrahim

    2018-01-01

    Full Text Available The maximum min utility function (MMUF) problem is an important representative of a large class of cloud computing systems (CCS), with numerous applications in practice, especially in economics and industry. This paper introduces an effective solution-based search (SBS) algorithm for solving the MMUF problem. First, we suggest a new formula for the utility function in terms of the capacity of the cloud. We formulate the capacity in CCS by using a fractional diffeo-integral equation, which usually describes the flow of the CCS. The new formula of the utility function modifies recent active utility functions. The suggested technique first creates a high-quality initial solution by eliminating the less promising components, and then improves the quality of the achieved solution by the summation search solution (SSS). This method uses the Mittag-Leffler sum as a hash function to determine the position of the agent. Experimental results on instances commonly utilized in the literature demonstrate that the proposed algorithm competes favorably with state-of-the-art algorithms in terms of both solution quality and computational efficiency.
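
    For context, the one-parameter Mittag-Leffler function underlying the sum in the abstract is E_alpha(z) = sum over k >= 0 of z^k / Gamma(alpha*k + 1). The truncated-series sketch below illustrates the function itself (term count chosen arbitrarily); it is not the paper's hash construction.

```python
import math

# Truncated series for the one-parameter Mittag-Leffler function
# E_alpha(z) = sum_k z**k / Gamma(alpha*k + 1).
# For alpha = 1 this reduces to the exponential series.

def mittag_leffler(alpha: float, z: float, n_terms: int = 60) -> float:
    return sum(z ** k / math.gamma(alpha * k + 1) for k in range(n_terms))

print(mittag_leffler(1.0, 1.0))  # close to e
print(mittag_leffler(2.0, 1.0))  # close to cosh(1), since E_2(x) = cosh(sqrt(x))
```

    Note the truncation level matters: for larger |z| or smaller alpha, far more terms (or a different evaluation scheme) are needed for convergence.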

  16. On the integrability of a Hamiltonian reduction of a 2+1-dimensional non-isothermal rotating gas cloud system

    International Nuclear Information System (INIS)

    Rogers, C; Schief, W K

    2011-01-01

    A 2+1-dimensional version of a non-isothermal gas dynamic system with origins in the work of Ovsiannikov and Dyson on spinning gas clouds is shown to admit a Hamiltonian reduction which is completely integrable when the adiabatic index γ = 2. This nonlinear dynamical subsystem is obtained via an elliptic vortex ansatz which is intimately related to the construction of a Lax pair in the integrable case. The general solution of the gas dynamic system is derived in terms of Weierstrass elliptic functions. The latter derivation makes use of a connection with a stationary nonlinear Schrödinger equation and a Steen–Ermakov–Pinney equation, whose superposition principle is based on the classical Lamé equation.
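The Ermakov–Pinney equation mentioned above, ρ'' + ω²ρ = c/ρ³, has a classical nonlinear superposition principle: if u and v solve the linear equation y'' + ω²y = 0 with Wronskian W, then ρ = √(Au² + 2Buv + Cv²) solves the Pinney equation with c = (AC − B²)W². A numerical sanity check of this fact for constant ω = 1 (the constants A, B, C below are an arbitrary illustrative choice, unrelated to the paper's gas cloud reduction):

```python
import math

# u = cos t, v = sin t solve y'' + y = 0 with Wronskian W = 1.
# With A = 2, B = 0, C = 1 the superposition formula should satisfy
# rho'' + rho = c / rho**3 with c = A*C - B**2 = 2.
A, B, C = 2.0, 0.0, 1.0
c = A * C - B**2

def rho(t):
    u, v = math.cos(t), math.sin(t)
    return math.sqrt(A * u * u + 2 * B * u * v + C * v * v)

# Approximate rho'' by a central difference and check the residual.
h = 1e-4
t = 0.7
rho_dd = (rho(t + h) - 2 * rho(t) + rho(t - h)) / h**2
residual = rho_dd + rho(t) - c / rho(t)**3
print(abs(residual) < 1e-5)
# -> True
```

The same two-solution construction underlies the Steen–Ermakov–Pinney step in the derivation, there with the Lamé equation playing the role of the auxiliary linear equation.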

  17. Cloud-Based Smart Health Monitoring System for Automatic Cardiovascular and Fall Risk Assessment in Hypertensive Patients.

    Science.gov (United States)

    Melillo, P; Orrico, A; Scala, P; Crispino, F; Pecchia, L

    2015-10-01

    The aim of this paper is to describe the design and the preliminary validation of a platform developed to collect and automatically analyze biomedical signals for risk assessment of vascular events and falls in hypertensive patients. This m-health platform, based on cloud computing, was designed to be flexible, extensible, and transparent, and to provide proactive remote monitoring via data-mining functionalities. A retrospective study was conducted to train and test the platform. The developed system was able to predict a future vascular event within the next 12 months with an accuracy rate of 84 % and to identify fallers with an accuracy rate of 72 %. In an ongoing prospective trial, almost all the recruited patients accepted the system favorably, with a limited rate of non-adherence causing data losses (<20 %). The developed platform supported clinical decisions by processing tele-monitored data and providing quick and accurate risk assessment of vascular events and falls.

  18. The Evolution of Cloud Computing in ATLAS

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00224309; Berghaus, Frank; Brasolin, Franco; Cordeiro, Cristovao; Desmarais, Ron; Field, Laurence; Gable, Ian; Giordano, Domenico; Di Girolamo, Alessandro; Hover, John; Leblanc, Matthew Edgar; Love, Peter; Paterson, Michael; Sobie, Randall; Zaytsev, Alexandr

    2015-01-01

    The ATLAS experiment has successfully incorporated cloud computing technology and cloud resources into its primarily grid-based model of distributed computing. Cloud R&D activities continue to mature and transition into stable production systems, while ongoing evolutionary changes are still needed to adapt and refine the approaches used, in response to changes in prevailing cloud technology. In addition, completely new developments are needed to handle emerging requirements. This paper describes the overall evolution of cloud computing in ATLAS. The current status of the virtual machine (VM) management systems used for harnessing infrastructure-as-a-service (IaaS) resources is discussed. Monitoring and accounting systems tailored for clouds are needed to complete the integration of cloud resources within ATLAS' distributed computing framework. We are developing and deploying new solutions to address the challenge of operation in a geographically distributed multi-cloud scenario, including a system for ma...

  19. Concept of a Cloud Service for Data Preparation and Computational Control on Custom HPC Systems in Application to Molecular Dynamics

    Science.gov (United States)

    Puzyrkov, Dmitry; Polyakov, Sergey; Podryga, Viktoriia; Markizov, Sergey

    2018-02-01

    At the present stage of computer technology development it is possible to study the properties and processes in complex systems at molecular and even atomic levels, for example, by means of molecular dynamics methods. The most interesting problems are those related to the study of complex processes under real physical conditions. Solving such problems requires the use of high-performance computing systems of various types, for example, GRID systems and HPC clusters. Given such time-consuming computational tasks, software for automatic and unified monitoring of the computations is needed. A complex computational task can be performed over different HPC systems, which requires output-data synchronization between the storage chosen by a scientist and the HPC system used for computations. The design of the computational domain is also quite a problem: it requires complex software tools and algorithms for proper atomistic data generation on HPC systems. The paper describes a prototype of a cloud service intended for the design of large-volume atomistic systems for further detailed molecular dynamics calculations and for computational management of these calculations, and presents the part of its concept aimed at initial data generation on the HPC systems.
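The "initial data generation" step amounts to producing atom coordinates for the computational domain. A minimal sketch of the idea, using a simple cubic lattice for illustration (the function name, lattice constant, and element symbol are placeholder assumptions; the actual service presumably supports far richer atomistic structures):

```python
def cubic_lattice(nx, ny, nz, a=1.0, element="Ar"):
    """Generate (element, x, y, z) tuples for an nx*ny*nz simple cubic
    lattice with lattice constant a -- toy initial data for an MD run."""
    return [(element, i * a, j * a, k * a)
            for i in range(nx)
            for j in range(ny)
            for k in range(nz)]

atoms = cubic_lattice(3, 3, 3, a=3.4)
print(len(atoms))
# -> 27
```

For large volumes, a service like the one described would generate such data directly on the HPC system, avoiding the transfer of bulky coordinate files from the user's storage.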

  20. Concept of a Cloud Service for Data Preparation and Computational Control on Custom HPC Systems in Application to Molecular Dynamics

    Directory of Open Access Journals (Sweden)

    Puzyrkov Dmitry

    2018-01-01

    Full Text Available At the present stage of computer technology development it is possible to study the properties and processes in complex systems at molecular and even atomic levels, for example, by means of molecular dynamics methods. The most interesting problems are those related to the study of complex processes under real physical conditions. Solving such problems requires the use of high-performance computing systems of various types, for example, GRID systems and HPC clusters. Given such time-consuming computational tasks, software for automatic and unified monitoring of the computations is needed. A complex computational task can be performed over different HPC systems, which requires output-data synchronization between the storage chosen by a scientist and the HPC system used for computations. The design of the computational domain is also quite a problem: it requires complex software tools and algorithms for proper atomistic data generation on HPC systems. The paper describes a prototype of a cloud service intended for the design of large-volume atomistic systems for further detailed molecular dynamics calculations and for computational management of these calculations, and presents the part of its concept aimed at initial data generation on the HPC systems.