WorldWideScience

Sample records for sampling cloud base

  1. First Transmitted Hyperspectral Light Measurements and Cloud Properties from Recent Field Campaign Sampling Clouds Under Biomass Burning Aerosol

    Science.gov (United States)

    Leblanc, S.; Redemann, Jens; Shinozuka, Yohei; Flynn, Connor J.; Segal Rozenhaimer, Michal; Kacenelenbogen, Meloe Shenandoah; Pistone, Kristina Marie Myers; Schmidt, Sebastian; Cochrane, Sabrina

    2016-01-01

    We present a first view of data collected during a recent field campaign aimed at measuring biomass burning aerosol above clouds from airborne platforms. The NASA ObseRvations of CLouds above Aerosols and their intEractionS (ORACLES) field campaign recently concluded its first deployment, sampling clouds and the overlying aerosol layer from the airborne platform NASA P3. We present results from the Spectrometer for Sky-Scanning, Sun-Tracking Atmospheric Research (4STAR), in conjunction with the Solar Spectral Flux Radiometers (SSFR). During this deployment, 4STAR sampled transmitted solar light via both direct solar beam measurements and scattered light measurements, enabling the measurement of aerosol optical thickness and the retrieval of information on aerosol particles in addition to overlying cloud properties. We focus on the zenith-viewing scattered light measurements, which are used to retrieve cloud optical thickness, effective radius, and thermodynamic phase of clouds under a biomass burning layer. The biomass burning aerosol layer present above the clouds is a potential source of bias in cloud optical depth and effective radius retrieved from satellites. We contrast the reflection-based approach typically used by satellites with the transmission-based approach used by 4STAR during ORACLES for retrieving cloud properties. These differing approaches are expected to yield differences in retrieved properties, since light transmitted through clouds is sensitive to a different cloud volume than light reflected at cloud top. We offer a preliminary view of the implications of these differences in sampling volumes for the calculation of cloud radiative effects (CRE).

  2. Stratocumulus Cloud Top Radiative Cooling and Cloud Base Updraft Speeds

    Science.gov (United States)

    Kazil, J.; Feingold, G.; Balsells, J.; Klinger, C.

    2017-12-01

    Cloud top radiative cooling is a primary driver of turbulence in the stratocumulus-topped marine boundary layer. A functional relationship between cloud top cooling and cloud base updraft speeds may therefore exist. A correlation of cloud top radiative cooling and cloud base updraft speeds has recently been identified empirically, providing a basis for satellite retrieval of cloud base updraft speeds. Such retrievals may enable analysis of aerosol-cloud interactions using satellite observations: updraft speeds at cloud base co-determine supersaturation and therefore the activation of cloud condensation nuclei, which in turn co-determine cloud properties and precipitation formation. We use large eddy simulation and an off-line radiative transfer model to explore the relationship between cloud-top radiative cooling and cloud base updraft speeds in a marine stratocumulus cloud over the course of the diurnal cycle. We find that during daytime, at low cloud water path (CWP < 50 g m-2), cloud top cooling and cloud base updraft speeds are correlated, in agreement with the reported empirical relationship. During the night, in the absence of short-wave heating, CWP builds up (CWP > 50 g m-2) and long-wave emissions from cloud top saturate, while cloud base heating increases. In combination, cloud top cooling and cloud base updrafts become weakly anti-correlated. A functional relationship between cloud top cooling and cloud base updraft speed can hence be expected for stratocumulus clouds with a sufficiently low CWP and non-saturated long-wave emissions, in particular during daytime. At higher CWPs, in particular at night, the relationship breaks down due to saturation of long-wave emissions from cloud top.

  3. +Cloud: An Agent-Based Cloud Computing Platform

    OpenAIRE

    González, Roberto; Hernández de la Iglesia, Daniel; de la Prieta Pintado, Fernando; Gil González, Ana Belén

    2017-01-01

    Cloud computing is revolutionizing the services provided through the Internet, and is continually adapting itself in order to maintain the quality of its services. This study presents the platform +Cloud, which proposes a cloud environment for storing information and files by following the cloud paradigm. This study also presents Warehouse 3.0, a cloud-based application that has been developed to validate the services provided by +Cloud.

  4. Cloud networking understanding cloud-based data center networks

    CERN Document Server

    Lee, Gary

    2014-01-01

    Cloud Networking: Understanding Cloud-Based Data Center Networks explains the evolution of established networking technologies into distributed, cloud-based networks. Starting with an overview of cloud technologies, the book explains how cloud data center networks leverage distributed systems for network virtualization, storage networking, and software-defined networking. The author offers insider perspective to key components that make a cloud network possible such as switch fabric technology and data center networking standards. The final chapters look ahead to developments in architectures

  5. STAR FORMATION LAWS: THE EFFECTS OF GAS CLOUD SAMPLING

    International Nuclear Information System (INIS)

    Calzetti, D.; Liu, G.; Koda, J.

    2012-01-01

    Recent observational results indicate that the functional shape of the spatially resolved star formation-molecular gas density relation depends on the spatial scale considered. These results may indicate a fundamental role of sampling effects on scales that are typically only a few times larger than those of the largest molecular clouds. To investigate the impact of this effect, we construct simple models for the distribution of molecular clouds in a typical star-forming spiral galaxy and, assuming a power-law relation between star formation rate (SFR) and cloud mass, explore a range of input parameters. We confirm that the slope and the scatter of the simulated SFR-molecular gas surface density relation depend on the size of the sub-galactic region considered, due to stochastic sampling of the molecular cloud mass function, and the effect is larger for steeper relations between SFR and molecular gas. There is a general trend for all slope values to tend to ∼unity for region sizes larger than 1-2 kpc, irrespective of the input SFR-cloud relation. The region size of 1-2 kpc corresponds to the area where the cloud mass function becomes fully sampled. We quantify the effects of selection biases in data tracing the SFR, either as thresholds (i.e., clouds smaller than a given mass value do not form stars) or as backgrounds (e.g., diffuse emission unrelated to current star formation is counted toward the SFR). Apparently discordant observational results are brought into agreement via this simple model, and the comparison of our simulations with data for a few galaxies supports a steep (>1) power-law index between SFR and molecular gas.
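
    A toy Monte Carlo in Python, illustrating the sampling effect described above; the mass-function slope, SFR-cloud exponent, cloud mass range and galaxy size are invented for illustration and are not the paper's values.

      # Toy Monte Carlo: stochastic sampling of a molecular-cloud mass function
      # can change the apparent slope of the SFR - gas relation with region size.
      # All numbers here are illustrative assumptions, not values from the paper.
      import numpy as np

      rng = np.random.default_rng(0)

      # Sample cloud masses from a truncated power law dN/dM ~ M^-alpha (inverse CDF).
      alpha, m_lo, m_hi, n_clouds = 1.8, 1e4, 5e6, 20000
      u = rng.random(n_clouds)
      expo = 1.0 - alpha
      masses = (m_lo**expo + u * (m_hi**expo - m_lo**expo))**(1.0 / expo)

      # Scatter the clouds uniformly over an assumed 20 kpc x 20 kpc "galaxy".
      x, y = rng.uniform(0, 20, n_clouds), rng.uniform(0, 20, n_clouds)

      # Assume a steeper-than-linear SFR-cloud relation, SFR proportional to M^1.5.
      sfr = masses**1.5

      def fitted_slope(box_kpc):
          """Slope of log(SFR surface density) vs log(gas surface density)
          when clouds are binned into square regions of side box_kpc."""
          nbin = int(20 / box_kpc)
          gas = np.zeros((nbin, nbin))
          form = np.zeros((nbin, nbin))
          ix = np.minimum((x / box_kpc).astype(int), nbin - 1)
          iy = np.minimum((y / box_kpc).astype(int), nbin - 1)
          np.add.at(gas, (ix, iy), masses)
          np.add.at(form, (ix, iy), sfr)
          ok = (gas > 0) & (form > 0)
          return np.polyfit(np.log10(gas[ok]), np.log10(form[ok]), 1)[0]

      for box in (0.25, 0.5, 1.0, 2.0):
          print(f"region size {box:4.2f} kpc -> slope {fitted_slope(box):.2f}")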

  6. Sampling of solid particles in clouds

    International Nuclear Information System (INIS)

    Feuillebois, F.; Lasek, A.; Scibilia, M.F.

    1986-01-01

    This paper is concerned with the sampling of small solid particles from clouds by an airborne apparatus to be mounted on an airplane for meteorological investigations. In the airborne experiment the particles entering the test tube should be as representative as possible of the upstream conditions ahead of the plane, in the real cloud. Due to the inertia of the particles, the proportion of the different sizes of particles entering the test tube depends on the location of the tube mouth. We present a method of calculating the real concentration in particles of different sizes, using the results of measurements executed during the flight of an airplane in a cloud. Two geometries are considered: the nose of the airplane, represented schematically by a hemisphere, and a wing represented by a (2D) Joukowski profile which matches well a NACA 0015 profile on its leading edge

  7. On Cloud-based Oversubscription

    OpenAIRE

    Householder, Rachel; Arnold, Scott; Green, Robert

    2014-01-01

    Rising trends in the number of customers turning to the cloud for their computing needs has made effective resource allocation imperative for cloud service providers. In order to maximize profits and reduce waste, providers have started to explore the role of oversubscribing cloud resources. However, the benefits of cloud-based oversubscription are not without inherent risks. This paper attempts to unveil the incentives, risks, and techniques behind oversubscription in a cloud infrastructure....

  8. Relationship between cloud radiative forcing, cloud fraction and cloud albedo, and new surface-based approach for determining cloud albedo

    OpenAIRE

    Y. Liu; W. Wu; M. P. Jensen; T. Toto

    2011-01-01

    This paper focuses on three interconnected topics: (1) the quantitative relationship between surface shortwave cloud radiative forcing, cloud fraction, and cloud albedo; (2) a surface-based approach for measuring cloud albedo; (3) multiscale (diurnal, annual and inter-annual) variations and covariations of surface shortwave cloud radiative forcing, cloud fraction, and cloud albedo. An analytical expression is first derived to quantify the relationship between cloud radiative forcing, cloud fractio...

  9. Cloud GIS Based Watershed Management

    Science.gov (United States)

    Bediroğlu, G.; Colak, H. E.

    2017-11-01

    In this study, we generated a Cloud GIS based watershed management system using a Cloud Computing architecture. Cloud GIS is used as SAAS (Software as a Service) and DAAS (Data as a Service). We applied GIS analysis on the cloud to test SAAS and deployed GIS datasets on the cloud to test DAAS. We used a hybrid cloud computing model, making use of ready web-based mapping services hosted on the cloud (world topology, satellite imagery). We uploaded the data to the system after creating geodatabases including hydrology (rivers, lakes), soil maps, climate maps, rain maps, geology and land use. The watershed of the study area was determined on the cloud using the ready-hosted topology maps. After uploading all the datasets to the system, we applied various GIS analyses and queries. The results show that Cloud GIS technology brings speed and efficiency to watershed management studies. Besides this, the system can easily be implemented for similar land analysis and management studies.

  10. Sampling hydrometeors in clouds in-situ - the replicator technique

    Science.gov (United States)

    Wex, Heike; Löffler, Mareike; Griesche, Hannes; Bühl, Johannes; Stratmann, Frank; Schmitt, Carl; Dirksen, Ruud; Reichardt, Jens; Wolf, Veronika; Kuhn, Thomas; Prager, Lutz; Seifert, Patric

    2017-04-01

    For the examination of ice crystals in clouds, concerning their number concentrations, sizes and shapes, instruments mounted on fast flying aircraft are often used. One related disadvantage is possible shattering of the ice crystals on inlets, which has been improved with the introduction of the "Korolev-tip" and by accounting for inter-arrival times (Korolev et al., 2013, 2015); additionally, the typically fast flying aircraft allow only for a low spatial resolution. Alternative sampling methods have been introduced, e.g., a replicator by Miloshevich & Heymsfield (1997) and an in-situ imager by Kuhn & Heymsfield (2016). They both sample ice crystals onto an advancing stripe while ascending on a balloon, conserving the ice crystals either in formvar for later off-line analysis under a microscope (Miloshevich & Heymsfield, 1997) or imaging them upon their impaction on silicone oil (Kuhn & Heymsfield, 2016), both yielding vertical profiles of different ice crystal properties. A measurement campaign was performed at the Lindenberg Meteorological Observatory of the German Meteorological Service (DWD) in Germany in October 2016, during which both types of instruments were used during balloon ascents, while ground-based lidar and cloud-radar measurements were performed simultaneously. The two ice particle sondes were operated by people from the Lulea University of Technology and from TROPOS, the latter of which was made operational only recently. Here, we will show first results of the TROPOS replicator on ice crystals sampled during one ascent, for which the collected ice crystals were analyzed off-line using a microscope. Literature: Korolev, A., E. Emery, and K. Creelman (2013), Modification and tests of particle probe tips to mitigate effects of ice shattering, J. Atmos. Ocean. Tech., 30, 690-708, 2013. Korolev, A., and P. R. Field (2015), Assessment of the performance of the inter-arrival time algorithm to identify ice shattering artifacts in cloud

  11. Cloud-Based RFID Mutual Authentication Protocol without Leaking Location Privacy to the Cloud

    OpenAIRE

    Dong, Qingkuan; Tong, Jiaqing; Chen, Yuan

    2015-01-01

    With the rapid developments of the IoT (Internet of Things) and the cloud computing, cloud-based RFID systems attract more attention. Users can reduce their cost of deploying and maintaining the RFID system by purchasing cloud services. However, the security threats of cloud-based RFID systems are more serious than those of traditional RFID systems. In cloud-based RFID systems, the connection between the reader and the cloud database is not secure and cloud service provider is not trusted. Th...

  12. pCloud: A Cloud-based Power Market Simulation Environment

    Energy Technology Data Exchange (ETDEWEB)

    Rudkevich, Aleksandr; Goldis, Evgeniy

    2012-12-02

    This research conducted by the Newton Energy Group, LLC (NEG) is dedicated to the development of pCloud: a Cloud-based Power Market Simulation Environment. pCloud offers power industry stakeholders the capability to model electricity markets and is organized around the Software as a Service (SaaS) concept -- a software application delivery model in which software is centrally hosted and provided to many users via the internet. During Phase I of this project NEG developed a prototype design for pCloud as a SaaS-based commercial service offering and a system architecture supporting that design, ensured the feasibility of key architecture elements, formed technological partnerships and negotiated commercial agreements with partners, conducted market research and other related activities, and secured funding for continued development of pCloud between the end of Phase I and the beginning of Phase II, if awarded. Based on the results of Phase I activities, NEG has established that the development of a cloud-based power market simulation environment within the Windows Azure platform is technologically feasible and can be accomplished within the budget and timeframe available through the Phase II SBIR award with additional external funding. NEG believes that pCloud has the potential to become a game-changing technology for the modeling and analysis of electricity markets. This potential is due to the following critical advantages of pCloud over its competition: - Standardized access to advanced and proven power market simulators offered by third parties. - Automated parallelization of simulations and dynamic provisioning of computing resources on the cloud. This combination of automation and scalability dramatically reduces turn-around time while offering the capability to increase the number of analyzed scenarios by a factor of 10, 100 or even 1000. - Access to ready-to-use data and to cloud-based resources leading to a reduction in software, hardware, and IT costs

  13. Cloud-based Networked Visual Servo Control

    OpenAIRE

    Wu, Haiyan; Lu, Lei; Chen, Chih-Chung; Hirche, Sandra; Kühnlenz, Kolja

    2013-01-01

    The performance of vision-based control systems, in particular of highly dynamic vision-based motion control systems, is often limited by the low sampling rate of the visual feedback caused by the long image processing time. In order to overcome this problem, the networked visual servo control, which integrates networked computational resources for cloud image processing, is considered in this article. The main contributions of this article are i) a real-time transport protocol for transmitti...

  14. A comparison of food crispness based on the cloud model.

    Science.gov (United States)

    Wang, Minghui; Sun, Yonghai; Hou, Jumin; Wang, Xia; Bai, Xue; Wu, Chunhui; Yu, Libo; Yang, Jie

    2018-02-01

    The cloud model is a typical model that transforms a qualitative concept into a quantitative description. The cloud model has rarely been used in texture studies before. The purpose of this study was to apply the cloud model to food crispness comparison. The acoustic signals of carrots, white radishes, potatoes, Fuji apples, and crystal pears were recorded during compression, and three time-domain signal characteristics were extracted: sound intensity, maximum short-time frame energy, and waveform index. The three signal characteristics and the cloud model were used to compare the crispness of the samples mentioned above. The crispness based on the Ex value of the cloud model, in descending order, was carrot > potato > white radish > Fuji apple > crystal pear. To verify the results of the acoustic signals, mechanical measurement and sensory evaluation were conducted. The results of the two verification experiments confirmed the feasibility of the cloud model. The microstructures of the five samples were also analyzed; the microstructure parameters were negatively related with crispness. The cloud model method can be used for crispness comparison of different kinds of foods. The method is more accurate than traditional methods such as mechanical measurement and sensory evaluation. The cloud model method can also be applied extensively to other texture studies. © 2017 Wiley Periodicals, Inc.
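
    A minimal sketch of the backward cloud generator that estimates the cloud model's numerical characteristics (expectation Ex, entropy En, hyper-entropy He) from a sample; the formulas follow the standard normal cloud model, and the sound-intensity values below are invented for illustration.

      # Backward cloud generator: estimate Ex, En, He of the normal cloud model
      # from data samples.  Input numbers are made up; the paper ranks foods by
      # the Ex value obtained from acoustic signal features.
      import numpy as np

      def backward_cloud(samples):
          x = np.asarray(samples, dtype=float)
          ex = x.mean()                                      # expectation Ex
          en = np.sqrt(np.pi / 2.0) * np.abs(x - ex).mean()  # entropy En
          s2 = x.var(ddof=1)                                 # sample variance
          he = np.sqrt(abs(s2 - en**2))                      # hyper-entropy He
          return ex, en, he

      # Hypothetical sound-intensity readings for two foods (arbitrary units).
      carrot = [82.1, 85.3, 80.7, 84.0, 83.2]
      pear = [61.4, 59.8, 63.0, 60.5, 62.2]
      for name, data in [("carrot", carrot), ("pear", pear)]:
          ex, en, he = backward_cloud(data)
          print(f"{name}: Ex={ex:.1f}  En={en:.2f}  He={he:.2f}")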

  15. Cloud Based Applications and Platforms (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Brodt-Giles, D.

    2014-05-15

    Presentation to the Cloud Computing East 2014 Conference, where we are highlighting our cloud computing strategy, describing the platforms on the cloud (including Smartgrid.gov), and defining our process for implementing cloud based applications.

  16. Point Cloud Based Change Detection - an Automated Approach for Cloud-based Services

    Science.gov (United States)

    Collins, Patrick; Bahr, Thomas

    2016-04-01

    The fusion of stereo photogrammetric point clouds with LiDAR data or terrain information derived from SAR interferometry has a significant potential for 3D topographic change detection. In the present case study latest point cloud generation and analysis capabilities are used to examine a landslide that occurred in the village of Malin in Maharashtra, India, on 30 July 2014, and affected an area of ca. 44,000 m2. It focuses on Pléiades high resolution satellite imagery and the Airbus DS WorldDEM™ as a product of the TanDEM-X mission. This case study was performed using the COTS software package ENVI 5.3. Integration of custom processes and automation is supported by IDL (Interactive Data Language). Thus, ENVI analytics is running via the object-oriented and IDL-based ENVITask API. The pre-event topography is represented by the WorldDEM™ product, delivered with a raster of 12 m x 12 m and based on the EGM2008 geoid (called pre-DEM). For the post-event situation a Pléiades 1B stereo image pair of the AOI affected was obtained. The ENVITask "GeneratePointCloudsByDenseImageMatching" was implemented to extract passive point clouds in LAS format from the panchromatic stereo datasets: • A dense image-matching algorithm is used to identify corresponding points in the two images. • A block adjustment is applied to refine the 3D coordinates that describe the scene geometry. • Additionally, the WorldDEM™ was input to constrain the range of heights in the matching area, and subsequently the length of the epipolar line. The "PointCloudFeatureExtraction" task was executed to generate the post-event digital surface model from the photogrammetric point clouds (called post-DEM). Post-processing consisted of the following steps: • Adding the geoid component (EGM 2008) to the post-DEM. • Pre-DEM reprojection to the UTM Zone 43N (WGS-84) coordinate system and resizing. • Subtraction of the pre-DEM from the post-DEM. • Filtering and threshold based classification of
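
    A generic numpy illustration of the DEM differencing and thresholding steps described above; it stands in for the ENVI/IDL workflow only conceptually, reprojection, geoid handling and LAS gridding are assumed to have been done already, and the arrays and the 2 m threshold are invented for the example.

      # DEM differencing sketch: post-event DSM minus pre-event DEM, then a
      # threshold-based change classification.  Synthetic data for illustration.
      import numpy as np

      rng = np.random.default_rng(1)
      pre_dem = rng.normal(650.0, 5.0, size=(100, 100))     # pre-event heights [m]
      post_dem = pre_dem.copy()
      post_dem[40:60, 40:60] -= 8.0                          # synthetic landslide scar

      diff = post_dem - pre_dem                              # elevation change [m]
      change_mask = np.abs(diff) > 2.0                       # threshold classification
      print(f"changed area: {change_mask.sum()} cells, "
            f"mean drop in changed cells: {diff[change_mask].mean():.1f} m")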

  17. Peltier-based cloud chamber

    Science.gov (United States)

    Nar, Sevda Yeliz; Cakir, Altan

    2018-02-01

    Particles produced by nuclear decay, cosmic radiation and reactions can be identified through various methods. One of these methods that has been effective over the last century is the cloud chamber. The chamber makes visible the cosmic particles whose radiation we are exposed to every second. A diffusion cloud chamber is a kind of cloud chamber that is cooled by dry ice. This traditional design has some practical difficulties. In this work, a Peltier-based cloud chamber cooled by thermoelectric modules is studied. The new design provides a uniformly cooled chamber base and, moreover, has a longer lifetime than the traditional chamber in terms of observation time. This gain reduces the cost incurred each time cosmic particles are observed. The chamber is easier to use than the traditional diffusion cloud chamber. The new design is portable, easier to build, and can be used in nuclear physics experiments. In addition, it would be very useful for observing muons, which provide direct evidence for the Lorentz contraction and time dilation predicted by Einstein's principle of special relativity.

  18. Identity-Based Authentication for Cloud Computing

    Science.gov (United States)

    Li, Hongwei; Dai, Yuanshun; Tian, Ling; Yang, Haomiao

    Cloud computing is a recently developed technology for complex systems with massive-scale services shared among numerous users. Therefore, authentication of both users and services is a significant issue for the trust and security of cloud computing. The SSL Authentication Protocol (SAP), once applied in cloud computing, becomes so complicated that users face a heavy load in both computation and communication. This paper, based on the identity-based hierarchical model for cloud computing (IBHMCC) and its corresponding encryption and signature schemes, presents a new identity-based authentication protocol for cloud computing and services. Simulation testing shows that the authentication protocol is more lightweight and efficient than SAP, especially on the user side, which is made particularly lightweight. This merit, together with its great scalability, makes the model well suited to the massive-scale cloud.

  19. Cloud-based Networked Visual Servo Control

    DEFF Research Database (Denmark)

    Wu, Haiyan; Lu, Lei; Chen, Chih-Chung

    2013-01-01

    The networked visual servo control, which integrates networked computational resources for cloud image processing, is considered in this article. The main contributions of this article are i) a real-time transport protocol for transmitting large volume image data on a cloud computing platform, which enables high sampling rate visual...

  20. Cloud fraction and cloud base measurements from scanning Doppler lidar during WFIP-2

    Science.gov (United States)

    Bonin, T.; Long, C.; Lantz, K. O.; Choukulkar, A.; Pichugina, Y. L.; McCarty, B.; Banta, R. M.; Brewer, A.; Marquis, M.

    2017-12-01

    The second Wind Forecast Improvement Project (WFIP-2) consisted of an 18-month field deployment of a variety of instrumentation with the principal objective of validating and improving NWP forecasts for wind energy applications in complex terrain. As part of the set of instrumentation, several scanning Doppler lidars were installed across the study domain, primarily to measure profiles of the mean wind and turbulence at high resolution within the planetary boundary layer. In addition to these measurements, Doppler lidar observations can be used to directly quantify the cloud fraction and cloud base, since clouds appear as a high backscatter return. These supplementary measurements of clouds can then be used to validate cloud cover and other properties in NWP output. Herein, statistics of the cloud fraction and cloud base height from the duration of WFIP-2 are presented. Additionally, these cloud fraction estimates from Doppler lidar are compared with similar measurements from a Total Sky Imager and Radiative Flux Analysis (RadFlux) retrievals at the Wasco site. During mostly cloudy to overcast conditions, estimates of the cloud radiating temperature from the RadFlux methodology are also compared with Doppler lidar measured cloud base height.
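
    A minimal sketch (Python/numpy) of how cloud base and cloud fraction can be derived from lidar backscatter profiles by thresholding, as described above; the synthetic profiles and the detection threshold are invented and do not represent the WFIP-2 processing.

      # Cloud base = first range gate whose backscatter exceeds a threshold;
      # cloud fraction = share of profiles with such a detection.
      import numpy as np

      rng = np.random.default_rng(2)
      ranges = np.arange(100.0, 4000.0, 30.0)               # range gates [m]
      n_prof = 500
      backscatter = rng.lognormal(mean=-13.0, sigma=0.3, size=(n_prof, ranges.size))
      # Put a "cloud" (strong return) near 1.5 km in 60% of the profiles.
      cloudy = rng.random(n_prof) < 0.6
      backscatter[cloudy, np.searchsorted(ranges, 1500.0)] *= 1e3

      threshold = 1e-4                                       # assumed detection threshold
      hit = backscatter > threshold
      has_cloud = hit.any(axis=1)
      cloud_base = np.where(has_cloud, ranges[hit.argmax(axis=1)], np.nan)

      print(f"cloud fraction: {has_cloud.mean():.2f}")
      print(f"median cloud base: {np.nanmedian(cloud_base):.0f} m")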

  1. Development of methods for inferring cloud thickness and cloud-base height from satellite radiance data

    Science.gov (United States)

    Smith, William L., Jr.; Minnis, Patrick; Alvarez, Joseph M.; Uttal, Taneil; Intrieri, Janet M.; Ackerman, Thomas P.; Clothiaux, Eugene

    1993-01-01

    Cloud-top height is a major factor determining the outgoing longwave flux at the top of the atmosphere. The downwelling radiation from the cloud strongly affects the cooling rate within the atmosphere and the longwave radiation incident at the surface. Thus, determination of cloud-base temperature is important for proper calculation of fluxes below the cloud. Cloud-base altitude is also an important factor in aircraft operations. Cloud-top height or temperature can be derived in a straightforward manner using satellite-based infrared data. Cloud-base temperature, however, is not observable from the satellite, but is related to the height, phase, and optical depth of the cloud in addition to other variables. This study uses surface and satellite data taken during the First ISCCP Regional Experiment (FIRE) Phase-2 Intensive Field Observation (IFO) period (13 Nov. - 7 Dec. 1991) to improve techniques for deriving cloud-base height from conventional satellite data.

  2. Curvature computation in volume-of-fluid method based on point-cloud sampling

    Science.gov (United States)

    Kassar, Bruno B. M.; Carneiro, João N. E.; Nieckele, Angela O.

    2018-01-01

    This work proposes a novel approach to compute interface curvature in multiphase flow simulation based on the Volume of Fluid (VOF) method. It is well documented in the literature that curvature and normal vector computation in VOF may lack accuracy, mainly due to abrupt changes in the volume fraction field across the interfaces. This may degrade the interface tension force estimates, often resulting in inaccurate results for interface tension dominated flows. Many techniques have been presented over recent years to enhance the accuracy of normal vector and curvature estimates, including height functions, parabolic fitting of the volume fraction, reconstructed distance functions, coupling the Level Set method with VOF, and convolving the volume fraction field with smoothing kernels, among others. We propose a novel technique based on a representation of the interface by a cloud of points. The curvatures and the interface normal vectors are computed geometrically at each point of the cloud and projected onto the Eulerian grid in a Front-Tracking manner. Results are compared to benchmark data and a significant reduction in spurious currents as well as an improvement in the pressure jump are observed. The method was developed in the open source suite OpenFOAM® by extending its standard VOF implementation, the interFoam solver.
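
    A small 2D sketch of estimating curvature geometrically from a local cloud of interface points via a least-squares (Kasa) circle fit; this is a generic stand-in for the idea, not the authors' OpenFOAM implementation.

      # Fit a circle to neighbouring interface points and take curvature = 1/R.
      import numpy as np

      def circle_fit_curvature(pts):
          """Least-squares circle through an (n, 2) array of points; returns 1/R."""
          x, y = pts[:, 0], pts[:, 1]
          A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
          b = x**2 + y**2
          (cx, cy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
          radius = np.sqrt(c + cx**2 + cy**2)
          return 1.0 / radius

      # Noisy points sampled from a circle of radius 0.5 -> expected curvature ~2.
      rng = np.random.default_rng(3)
      theta = np.linspace(0, np.pi / 3, 25)
      pts = 0.5 * np.column_stack([np.cos(theta), np.sin(theta)])
      pts += rng.normal(0, 1e-3, pts.shape)
      print(f"estimated curvature: {circle_fit_curvature(pts):.3f} (exact 2.0)")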

  3. Macrophysical and optical properties of midlatitude cirrus clouds from four ground-based lidars and collocated CALIOP observations

    Energy Technology Data Exchange (ETDEWEB)

    Dupont, Jean-Charles; Haeffelin, M.; Morille, Y.; Noel, V.; Keckhut, P.; Winker, D.; Comstock, Jennifer M.; Chervet, P.; Roblin, A.

    2010-05-27

    Ground-based lidar and CALIOP datasets gathered over four mid-latitude sites, two US and two French sites, are used to evaluate the consistency of cloud macrophysical and optical property climatologies that can be derived from such datasets. The consistency in average cloud height (both base and top height) between the CALIOP and ground datasets ranges from -0.4 km to +0.5 km. The cloud geometrical thickness distributions vary significantly between the different datasets, due in part to the original vertical resolutions of the lidar profiles. Average cloud geometrical thicknesses vary from 1.2 to 1.9 km, i.e. by more than 50%. Cloud optical thickness distributions in the subvisible, semi-transparent and moderate intervals differ by more than 50% between ground and space-based datasets. The cirrus clouds with optical thickness below 0.1 (not included in historical cloud climatologies) represent 30-50% of the non-opaque cirrus class. The differences in average cloud base altitude between ground and CALIOP datasets of 0.0-0.1 km, 0.0-0.2 km and 0.0-0.2 km can be attributed to irregular sampling of seasonal variations in the ground-based data, to day-night differences in detection capabilities by CALIOP, and to the restriction to situations without low-level clouds in ground-based data, respectively. The cloud geometrical thicknesses are not affected by irregular sampling of seasonal variations in the ground-based data, while up to 0.0-0.2 km and 0.1-0.3 km differences can be attributed to day-night differences in detection capabilities by CALIOP, and to the restriction to situations without low-level clouds in ground-based data, respectively.

  4. Cloud-based Virtual Organization Engineering

    Directory of Open Access Journals (Sweden)

    Liviu Gabriel CRETU

    2012-01-01

    Nowadays we may notice that SOA has arrived at its maturity stage and Cloud Computing brings the next paradigm shift regarding the software delivery business model. In such a context, we consider that there is a need for frameworks to guide the creation, execution and management of virtual organizations (VO) based on services from different Clouds. This paper introduces the main components of such a framework, which innovatively combines the principles of event-driven SOA, REST and the ISO/IEC 42010:2007 multiple views and viewpoints in order to provide the required methodology for Cloud-based virtual organization (Cloud-VO) engineering. The framework considers the resource concept found in software architectures like REST or RDF as the basic building block of Cloud-VO and makes use of resources' URIs to create the Cloud-VO's resource allocation matrix. While the matrix is used to declare activity-resource relationships, the resource catalogue concept is introduced as a way to describe a resource in one place, using as many viewpoints as needed, and then to reuse that description for the creation or simulation of different VOs.
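
    A toy illustration of the resource allocation matrix idea described above, mapping Cloud-VO activities to resource URIs; the URIs and activity names are invented placeholders, not part of the paper.

      # Activity-resource allocation as a dict of dicts keyed by resource URI.
      allocation = {
          "order-processing": {"https://cloud-a.example/api/erp": True,
                               "https://cloud-b.example/storage/orders": True},
          "reporting": {"https://cloud-b.example/storage/orders": True},
      }

      def resources_for(activity):
          """Return the resource URIs allocated to one VO activity."""
          return [uri for uri, used in allocation.get(activity, {}).items() if used]

      print(resources_for("order-processing"))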

  5. The use of marine cloud water samples as a diagnostic tool for aqueous chemistry, cloud microphysical processes and dynamics

    Science.gov (United States)

    Crosbie, E.; Ziemba, L. D.; Moore, R.; Shook, M.; Jordan, C.; Thornhill, K. L., II; Winstead, E.; Shingler, T.; Brown, M.; MacDonald, A. B.; Dadashazar, H.; Sorooshian, A.; Weiss-Penzias, P. S.; Anderson, B.

    2017-12-01

    Clouds play several roles in the Earth's climate system. In addition to their clear significance to the hydrological cycle, they strongly modulate the shortwave and longwave radiative balance of the atmosphere, with subsequent feedback on the atmospheric circulation. Furthermore, clouds act as a conduit for the fate and emergence of important trace chemical species and are the predominant removal mechanism for atmospheric aerosols. Marine boundary layer clouds cover large swaths of the global oceans. Because of their global significance, they have attracted significant attention into understanding how changes in aerosols are translated into changes in cloud macro- and microphysical properties. The circular nature of the influence of clouds-on-aerosols and aerosols-on-clouds has been used to explain the chaotic patterns often seen in marine clouds, however, this feedback also presents a substantial hurdle in resolving the uncertain role of anthropogenic aerosols on climate. Here we discuss ways in which the chemical constituents found in cloud water can offer insight into the physical and chemical processes inherent in marine clouds, through the use of aircraft measurements. We focus on observational data from cloud water samples collected during flights conducted over the remote North Atlantic and along coastal California across multiple campaigns. We explore topics related to aqueous processing, wet scavenging and source apportionment.

  6. A Location-Based Interactive Model of Internet of Things and Cloud (IoT-Cloud) for Mobile Cloud Computing Applications.

    Science.gov (United States)

    Dinh, Thanh; Kim, Younghan; Lee, Hyukjoon

    2017-03-01

    This paper presents a location-based interactive model of Internet of Things (IoT) and cloud integration (IoT-cloud) for mobile cloud computing applications, in comparison with the periodic sensing model. In the latter, sensing collections are performed without awareness of sensing demands. Sensors are required to report their sensing data periodically regardless of whether or not there are demands for their sensing services. This leads to unnecessary energy loss due to redundant transmission. In the proposed model, IoT-cloud provides sensing services on demand based on interest and location of mobile users. By taking advantages of the cloud as a coordinator, sensing scheduling of sensors is controlled by the cloud, which knows when and where mobile users request for sensing services. Therefore, when there is no demand, sensors are put into an inactive mode to save energy. Through extensive analysis and experimental results, we show that the location-based model achieves a significant improvement in terms of network lifetime compared to the periodic model.
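
    A conceptual Python sketch of the on-demand sensing idea described above: the cloud keeps a registry of sensor locations and activates only sensors near a user's request; the classes, coordinates and 1 km radius are illustrative assumptions, not the paper's protocol.

      # Cloud-side scheduling: wake only sensors within a radius of the request.
      import math

      class SensorRegistry:
          def __init__(self):
              self.sensors = {}          # sensor id -> (x_km, y_km)

          def register(self, sid, x, y):
              self.sensors[sid] = (x, y)

          def schedule(self, user_x, user_y, radius_km=1.0):
              """Return ids of sensors to activate for a request at (user_x, user_y)."""
              return [sid for sid, (x, y) in self.sensors.items()
                      if math.hypot(x - user_x, y - user_y) <= radius_km]

      registry = SensorRegistry()
      registry.register("s1", 0.2, 0.1)
      registry.register("s2", 5.0, 4.0)
      print(registry.schedule(0.0, 0.0))   # only the nearby sensor ("s1") is woken up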

  7. Graph based techniques for tag cloud generation

    DEFF Research Database (Denmark)

    Leginus, Martin; Dolog, Peter; Lage, Ricardo Gomes

    2013-01-01

    A tag cloud is one of the navigation aids for exploring documents; tag clouds also link documents through user-defined terms. We explore various graph based techniques to improve tag cloud generation. Moreover, we introduce relevance measures based on underlying data such as ratings ... or citation counts for improved measurement of the relevance of tag clouds. We show that, on the given data sets, our approach outperforms the state-of-the-art baseline methods with respect to such relevance by 41 % on the Movielens dataset and by 11 % on the Bibsonomy data set....
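
    A small sketch of graph-based tag selection in the spirit described above: tags ranked by co-occurrence degree weighted by an underlying relevance signal (here, average item rating); the graph, ratings and cut-off are invented and this is not the authors' algorithm.

      # Rank tags by (co-occurrence degree) x (average rating of tagged items).
      from collections import defaultdict

      cooccurrence = {                     # tag -> set of co-occurring tags
          "jazz": {"music", "vinyl"},
          "music": {"jazz", "vinyl", "live"},
          "vinyl": {"jazz", "music"},
          "live": {"music"},
      }
      avg_rating = {"jazz": 4.5, "music": 4.0, "vinyl": 3.5, "live": 2.0}

      def top_tags(k=3):
          score = defaultdict(float)
          for tag, neighbours in cooccurrence.items():
              score[tag] = len(neighbours) * avg_rating.get(tag, 1.0)
          return sorted(score, key=score.get, reverse=True)[:k]

      print(top_tags())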

  8. A novel technique for extracting clouds base height using ground based imaging

    Directory of Open Access Journals (Sweden)

    E. Hirsch

    2011-01-01

    The height of a cloud in the atmospheric column is a key parameter in its characterization. Several remote sensing techniques (passive and active, either ground-based or on space-borne platforms) and in-situ measurements are routinely used in order to estimate the top and base heights of clouds. In this article we present a novel method that combines thermal imaging from the ground with a sounded wind profile in order to derive the cloud base height. This method is independent of cloud type, making it efficient for both low boundary layer and high clouds. In addition, using thermal imaging ensures extraction of cloud features during daytime as well as at nighttime. The proposed technique was validated by comparison to active sounding by ceilometers (which is a standard ground-based method), to lifted condensation level (LCL) calculations, and to MODIS products obtained from space. As with all passive remote sensing techniques, the proposed method extracts only the height of the lowest cloud layer, thus upper cloud layers are not detected. Nevertheless, the information derived from this method can be complementary to space-borne cloud top measurements when deep-convective clouds are present. Unlike techniques such as LCL, this method is not limited to boundary layer clouds, and can extract the cloud base height at any level, as long as sufficient thermal contrast exists between the radiative temperatures of the cloud and its surrounding air parcel. Another advantage of the proposed method is its simplicity and modest power needs, making it particularly suitable for field measurements and deployment at remote locations. Our method can be further simplified for use with a visible CCD or CMOS camera (although nighttime clouds will not be observed).
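
    A minimal sketch of the lifted condensation level (LCL) estimate used above as one of the validation references, based on the common approximation of roughly 125 m of LCL height per degree of surface temperature/dew-point spread; the input values are made up.

      # Approximate LCL height from surface temperature and dew point.
      def lcl_height_m(temp_c, dewpoint_c):
          """Approximate LCL height above ground [m] from surface T and Td [deg C]."""
          return 125.0 * (temp_c - dewpoint_c)

      print(f"LCL ~ {lcl_height_m(28.0, 21.0):.0f} m for T=28 C, Td=21 C")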

  9. Analyzing cloud base at local and regional scales to understand tropical montane cloud forest vulnerability to climate change

    Science.gov (United States)

    Van Beusekom, Ashley E.; González, Grizelle; Scholl, Martha A.

    2017-01-01

    The degree to which cloud immersion provides water in addition to rainfall, suppresses transpiration, and sustains tropical montane cloud forests (TMCFs) during rainless periods is not well understood. Climate and land use changes represent a threat to these forests if cloud base altitude rises as a result of regional warming or deforestation. To establish a baseline for quantifying future changes in cloud base, we installed a ceilometer at 100 m altitude in the forest upwind of the TMCF that occupies an altitude range from ∼ 600 m to the peaks at 1100 m in the Luquillo Mountains of eastern Puerto Rico. Airport Automated Surface Observing System (ASOS) ceilometer data, radiosonde data, and Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) satellite data were obtained to investigate seasonal cloud base dynamics, the altitude of the trade-wind inversion (TWI), and typical cloud thickness for the surrounding Caribbean region. Cloud base is rarely quantified near mountains, so these results represent a first look at seasonal and diurnal cloud base dynamics for the TMCF. From May 2013 to August 2016, cloud base was lowest during the midsummer dry season, and cloud bases were lower than the mountaintops as often in the winter dry season as in the wet seasons. The lowest cloud bases most frequently occurred at elevations higher than 600 m, from 740 to 964 m. The low cloud base altitudes at the Luquillo forest were higher than those at six other sites in the Caribbean by ∼ 200–600 m, highlighting the importance of site selection for measuring topographic influence on cloud height. Proximity to the oceanic cloud system where shallow cumulus clouds are seasonally invariant in altitude and cover, along with local trade-wind orographic lifting and cloud formation, may explain the dry season low clouds. The results indicate that climate change threats to low-elevation TMCFs are not limited to the dry season; changes in synoptic-scale weather patterns

  10. A Location-Based Interactive Model of Internet of Things and Cloud (IoT-Cloud for Mobile Cloud Computing Applications

    Directory of Open Access Journals (Sweden)

    Thanh Dinh

    2017-03-01

    This paper presents a location-based interactive model of Internet of Things (IoT) and cloud integration (IoT-cloud) for mobile cloud computing applications, in comparison with the periodic sensing model. In the latter, sensing collections are performed without awareness of sensing demands. Sensors are required to report their sensing data periodically regardless of whether or not there are demands for their sensing services. This leads to unnecessary energy loss due to redundant transmission. In the proposed model, IoT-cloud provides sensing services on demand based on interest and location of mobile users. By taking advantages of the cloud as a coordinator, sensing scheduling of sensors is controlled by the cloud, which knows when and where mobile users request for sensing services. Therefore, when there is no demand, sensors are put into an inactive mode to save energy. Through extensive analysis and experimental results, we show that the location-based model achieves a significant improvement in terms of network lifetime compared to the periodic model.

  11. Open Source Cloud-Based Technologies for Bim

    Science.gov (United States)

    Logothetis, S.; Karachaliou, E.; Valari, E.; Stylianidis, E.

    2018-05-01

    This paper presents a Cloud-based open source system for storing and processing data from a 3D survey approach. More specifically, we provide an online service for viewing, storing and analysing BIM. Cloud technologies were used to develop a web interface as a BIM data centre, which can handle large BIM data using a server. The server can be accessed by many users through various electronic devices anytime and anywhere, so they can view online 3D models using browsers. Nowadays, Cloud computing is progressively engaged in facilitating BIM-based collaboration between the multiple stakeholders and disciplinary groups involved in complicated Architectural, Engineering and Construction (AEC) projects. Besides, the development of Open Source Software (OSS) has been growing rapidly and its use tends to become unified. Although BIM and Cloud technologies are widely known and used, there is a lack of integrated open source Cloud-based platforms able to support all stages of BIM processes. The present research aims to create an open source Cloud-based BIM system that is able to handle geospatial data. In this effort, only open source tools will be used; from the starting point of creating the 3D model with FreeCAD to its online presentation through BIMserver. Python plug-ins will be developed to link the two software packages and will be distributed freely to a large community of professionals for their use. The research work will be completed by benchmarking four Cloud-based BIM systems: Autodesk BIM 360, BIMserver, Graphisoft BIMcloud and Onuma System, which present remarkable results.

  12. OPEN SOURCE CLOUD-BASED TECHNOLOGIES FOR BIM

    Directory of Open Access Journals (Sweden)

    S. Logothetis

    2018-05-01

    This paper presents a Cloud-based open source system for storing and processing data from a 3D survey approach. More specifically, we provide an online service for viewing, storing and analysing BIM. Cloud technologies were used to develop a web interface as a BIM data centre, which can handle large BIM data using a server. The server can be accessed by many users through various electronic devices anytime and anywhere, so they can view online 3D models using browsers. Nowadays, Cloud computing is progressively engaged in facilitating BIM-based collaboration between the multiple stakeholders and disciplinary groups involved in complicated Architectural, Engineering and Construction (AEC) projects. Besides, the development of Open Source Software (OSS) has been growing rapidly and its use tends to become unified. Although BIM and Cloud technologies are widely known and used, there is a lack of integrated open source Cloud-based platforms able to support all stages of BIM processes. The present research aims to create an open source Cloud-based BIM system that is able to handle geospatial data. In this effort, only open source tools will be used; from the starting point of creating the 3D model with FreeCAD to its online presentation through BIMserver. Python plug-ins will be developed to link the two software packages and will be distributed freely to a large community of professionals for their use. The research work will be completed by benchmarking four Cloud-based BIM systems: Autodesk BIM 360, BIMserver, Graphisoft BIMcloud and Onuma System, which present remarkable results.

  13. Exploring the Effects of Cloud Vertical Structure on Cloud Microphysical Retrievals based on Polarized Reflectances

    Science.gov (United States)

    Miller, D. J.; Zhang, Z.; Platnick, S. E.; Ackerman, A. S.; Cornet, C.; Baum, B. A.

    2013-12-01

    A polarized cloud reflectance simulator was developed by coupling an LES cloud model with a polarized radiative transfer model to assess the capabilities of polarimetric cloud retrievals. With future remote sensing campaigns like NASA's Aerosols/Clouds/Ecosystems (ACE) planning to feature advanced polarimetric instruments, it is important for the cloud remote sensing community to understand the retrievable information available and the related systematic/methodological limitations. The cloud retrieval simulator we have developed allows us to probe these important questions in a realistic test bed. Our simulator utilizes a polarized adding-doubling radiative transfer model and an LES cloud field from a DHARMA simulation (Ackerman et al. 2004) with cloud properties based on the stratocumulus clouds observed during the DYCOMS-II field campaign. In this study we will focus on how the vertical structure of cloud microphysics can influence polarized cloud effective radius retrievals. Numerous previous studies have explored how retrievals based on total reflectance are affected by cloud vertical structure (Platnick 2000, Chang and Li 2002), but no such studies exist for the effects of vertical structure on polarized retrievals. Unlike the total cloud reflectance, which is predominantly multiply scattered light, the polarized reflectance is primarily the result of singly scattered photons. Thus the polarized reflectance is sensitive to only the uppermost region of the cloud (small optical depth). The vertical structure of cloud microphysics, an important influence on the development of cloud droplets, can therefore potentially be studied with polarimetric retrievals.

  14. Comparison of cloud optical depth and cloud mask applying BRDF model-based background surface reflectance

    Science.gov (United States)

    Kim, H. W.; Yeom, J. M.; Woo, S. H.

    2017-12-01

    Over thin cloud regions, a satellite simultaneously detects reflectance from the thin clouds and from the land surface. Since this mixed reflectance is not the exact cloud information, the background surface reflectance should be eliminated to accurately distinguish thin clouds such as cirrus. In previous research, Kim et al. (2017) developed a cloud masking algorithm using the Geostationary Ocean Color Imager (GOCI), one of the significant instruments on the Communication, Ocean, and Meteorology Satellite (COMS). Although GOCI has 8 spectral channels covering only the visible and near-infrared spectral ranges, the cloud masking gives quantitatively reasonable results when compared with the MODIS cloud mask (Collection 6 MYD35). In particular, validation with Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) data showed that this cloud masking algorithm is especially suited to thin cloud detection, because the method concentrates on eliminating background surface effects from the top-of-atmosphere (TOA) reflectance. By applying the difference between the TOA reflectance and the bi-directional reflectance distribution function (BRDF) model-based background surface reflectance, both thick and thin cloud areas can be discriminated without the infrared channels that are usually used for cloud detection. Moreover, when the cloud mask result was used as input for the BRDF model simulation and the optimized BRDF model-based surface reflectance was then used for optimized cloud masking, the probability of detection (POD) was higher than the POD of the original cloud mask. In this study, we examine the correlation between cloud optical depth (COD) and the cloud mask result. Cloud optical depth mostly depends on cloud thickness and on the characteristics and size of the cloud contents. COD ranges from less than 0.1 for thin clouds to over 1000 for huge cumulus due to scattering by droplets. With
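
    A minimal numpy sketch of the thresholding idea described above (TOA reflectance minus BRDF-modelled background surface reflectance); the arrays and the 0.05 threshold are invented, and the actual GOCI algorithm is more involved.

      # Flag pixels whose TOA reflectance exceeds the modelled background by a margin.
      import numpy as np

      rng = np.random.default_rng(4)
      surface_brdf = rng.uniform(0.05, 0.15, size=(50, 50))  # modelled background
      toa = surface_brdf + rng.normal(0.0, 0.005, size=(50, 50))
      toa[10:20, 10:20] += 0.12                               # synthetic cloud patch

      cloud_mask = (toa - surface_brdf) > 0.05                # assumed threshold
      print(f"flagged cloudy pixels: {cloud_mask.sum()}")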

  15. A Location-Based Interactive Model of Internet of Things and Cloud (IoT-Cloud) for Mobile Cloud Computing Applications †

    Science.gov (United States)

    Dinh, Thanh; Kim, Younghan; Lee, Hyukjoon

    2017-01-01

    This paper presents a location-based interactive model of Internet of Things (IoT) and cloud integration (IoT-cloud) for mobile cloud computing applications, in comparison with the periodic sensing model. In the latter, sensing collections are performed without awareness of sensing demands. Sensors are required to report their sensing data periodically regardless of whether or not there are demands for their sensing services. This leads to unnecessary energy loss due to redundant transmission. In the proposed model, IoT-cloud provides sensing services on demand based on interest and location of mobile users. By taking advantages of the cloud as a coordinator, sensing scheduling of sensors is controlled by the cloud, which knows when and where mobile users request for sensing services. Therefore, when there is no demand, sensors are put into an inactive mode to save energy. Through extensive analysis and experimental results, we show that the location-based model achieves a significant improvement in terms of network lifetime compared to the periodic model. PMID:28257067

  16. An investigation of cloud base height in Chiang Mai

    Science.gov (United States)

    Peengam, S.; Tohsing, K.

    2017-09-01

    Clouds play a very important role in the variation of surface solar radiation and in rain formation. To understand this role, it is necessary to know the physical and geometrical properties of clouds. However, clouds vary with location and time, which makes it difficult to obtain their properties. In this work, a ceilometer was installed at a station of the Royal Rainmaking and Agricultural Aviation Department in Chiang Mai (17.80° N, 98.43° E) in order to measure cloud base height. The cloud base height data from this instrument were compared with those obtained from a LiDAR, a more sophisticated instrument installed at the same site. It was found that the cloud base heights from the two instruments were in reasonable agreement, with a root mean square difference (RMSD) and mean bias difference (MBD) of 19.21% and 1.58%, respectively. Afterward, a six-month period (August 2016-January 2017) of data from the ceilometer was analyzed. The results show that the mean cloud base height during this period is 1.5 km, meaning that most clouds are in the category of low-level cloud.
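
    A short sketch of the two comparison statistics quoted above, computed relative to the mean of the reference data (one common convention; the paper may normalise differently); the sample heights are invented.

      # Relative root mean square difference and mean bias difference in percent.
      import numpy as np

      def rmsd_mbd_percent(test, ref):
          test, ref = np.asarray(test, float), np.asarray(ref, float)
          diff = test - ref
          rmsd = 100.0 * np.sqrt(np.mean(diff**2)) / ref.mean()
          mbd = 100.0 * diff.mean() / ref.mean()
          return rmsd, mbd

      ceilometer = [1.4, 1.6, 1.2, 1.8, 1.5]   # cloud base heights [km]
      lidar = [1.3, 1.5, 1.3, 1.7, 1.4]
      rmsd, mbd = rmsd_mbd_percent(ceilometer, lidar)
      print(f"RMSD = {rmsd:.1f}%  MBD = {mbd:.1f}%")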

  17. Cloud-based solution to identify statistically significant MS peaks differentiating sample categories.

    Science.gov (United States)

    Ji, Jun; Ling, Jeffrey; Jiang, Helen; Wen, Qiaojun; Whitin, John C; Tian, Lu; Cohen, Harvey J; Ling, Xuefeng B

    2013-03-23

    Mass spectrometry (MS) has evolved to become the primary high throughput tool for proteomics-based biomarker discovery. Multiple challenges in protein MS data analysis remain: management of large-scale and complex data sets; MS peak identification and indexing; and high dimensional peak differential analysis with concurrent statistical tests and false discovery rate (FDR) control. "Turnkey" solutions are needed for biomarker investigations to rapidly process MS data sets and identify statistically significant peaks for subsequent validation. Here we present an efficient and effective solution, which provides experimental biologists easy access to "cloud" computing capabilities to analyze MS data. The web portal can be accessed at http://transmed.stanford.edu/ssa/. The presented web application supports online uploading and analysis of large-scale MS data through a simple user interface. This bioinformatic tool will facilitate the discovery of potential protein biomarkers using MS.
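
    A sketch of a concurrent-test FDR control step of the kind referred to above, using the Benjamini-Hochberg procedure on per-peak p-values; the p-values and the 5% level are illustrative and this is not necessarily the portal's exact procedure.

      # Benjamini-Hochberg: reject the k smallest p-values, where k is the largest
      # index with p_(k) <= k * alpha / m.
      import numpy as np

      def benjamini_hochberg(pvals, alpha=0.05):
          """Return a boolean array marking p-values significant at FDR level alpha."""
          p = np.asarray(pvals, float)
          order = np.argsort(p)
          m = p.size
          thresh = alpha * (np.arange(1, m + 1) / m)
          passed = p[order] <= thresh
          significant = np.zeros(m, dtype=bool)
          if passed.any():
              k = np.nonzero(passed)[0].max()
              significant[order[:k + 1]] = True
          return significant

      peak_pvals = [0.001, 0.012, 0.030, 0.040, 0.20, 0.55]
      print(benjamini_hochberg(peak_pvals))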

  18. Using Cloud-based Storage Technologies for Earth Science Data

    Science.gov (United States)

    Michaelis, A.; Readey, J.; Votava, P.

    2016-12-01

    Cloud-based infrastructure may offer several key benefits, including scalability, built-in redundancy and reduced total cost of ownership as compared with a traditional data center approach. However, most of the tools and software systems developed for NASA data repositories were not developed with a cloud-based infrastructure in mind and do not fully take advantage of commonly available cloud-based technologies. Object storage services are provided through all the leading public (Amazon Web Services, Microsoft Azure, Google Cloud, etc.) and private (OpenStack) clouds, and may provide a more cost-effective means of storing large data collections online. We describe a system that utilizes object storage rather than traditional file-system-based storage to vend earth science data. The system described is not only cost effective, but shows superior performance for running many different analytics tasks in the cloud. To enable compatibility with existing tools and applications, we outline client libraries that are API compatible with existing libraries for HDF5 and NetCDF4. Performance of the system is demonstrated using cloud services running on Amazon Web Services.
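
    A hedged sketch of the object-storage pattern described above, storing one array chunk as an object using the standard boto3 S3 client; the bucket and key names are hypothetical, and credentials, region configuration and an existing bucket are assumed.

      # Store and retrieve a serialised array chunk as an S3 object.
      import io
      import numpy as np
      import boto3

      s3 = boto3.client("s3")
      chunk = np.random.rand(256, 256).astype("float32")     # one data chunk

      buf = io.BytesIO()
      np.save(buf, chunk)                                    # serialise the array
      s3.put_object(Bucket="example-earth-science-data",     # hypothetical bucket name
                    Key="dataset/variable/chunk_000.npy",
                    Body=buf.getvalue())

      obj = s3.get_object(Bucket="example-earth-science-data",
                          Key="dataset/variable/chunk_000.npy")
      restored = np.load(io.BytesIO(obj["Body"].read()))
      print(restored.shape)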

  19. Security Framework for Agent-Based Cloud Computing

    Directory of Open Access Journals (Sweden)

    K Venkateshwaran

    2015-06-01

    Agents can play a key role in bringing suitable cloud services to customers based on their requirements. In agent-based cloud computing, the agent performs negotiation, coordination, cooperation and collaboration on behalf of the customer to make decisions in an efficient manner. However, agent-based cloud computing has some security issues, such as (a) the addition of a malicious agent to the cloud environment, which could demolish the process by attacking other agents; (b) denial of service created by flooding attacks on other involved agents; and (c) misuse of some of the exceptions in the agent interaction protocol, such as Not-Understood and Cancel_Meta, which may lead to terminating the connection of all the other agents participating in the negotiating services. This paper proposes algorithms to solve these issues and to ensure that there will be no intervention of any malicious activities during agent interaction.

  20. A Cloud Based Data Integration Framework

    OpenAIRE

    Jiang, Nan; Xu, Lai; Vrieze, Paul; Lim, Mian-Guan; Jarabo, Oscar

    2012-01-01

    Virtual enterprise (VE) relies on resource sharing and collaboration across geographically dispersed and dynamically allied businesses in order to better respond to market opportunities. It is generally considered that effective data integration and management is crucial to realise the value of VE. This paper describes a cloud-based data integration framework that can be used for supporting VE to discover, explore and respond more emerging ...

  1. Cloud-Based Mobile Learning

    Directory of Open Access Journals (Sweden)

    Alexandru BUTOI

    2013-01-01

    As cloud technologies are widely studied and mobile technologies are evolving, new directions for the development of mobile learning tools deployed on the cloud are proposed. M-Learning is treated as part of the ubiquitous learning paradigm and is a pervasive extension of E-Learning technologies. Development of such learning tools requires specific development strategies for effectively abstracting pedagogical principles at the software design and implementation level. The current paper explores an interdisciplinary approach to the design and development of cloud-based M-Learning tools by mapping a specific development strategy used for educational programs onto a software prototyping strategy. In order for such instruments to be effective from the learning outcome point of view, the evaluation process must be rigorous, so we propose a metric model for expressing the trainee’s overall learning experience with evaluated levels of interactivity, content presentation and graphical user interface usability.

  2. Electrical signature in polar night cloud base variations

    International Nuclear Information System (INIS)

    Harrison, R Giles; Ambaum, Maarten H P

    2013-01-01

    Layer clouds are globally extensive. Their lower edges are charged negatively by the fair weather atmospheric electricity current flowing vertically through them. Using polar winter surface meteorological data from Sodankylä (Finland) and Halley (Antarctica), we find that when meteorological diurnal variations are weak, an appreciable diurnal cycle, on average, persists in the cloud base heights, detected using a laser ceilometer. The diurnal cloud base heights from both sites correlate more closely with the Carnegie curve of global atmospheric electricity than with local meteorological measurements. The cloud base sensitivities are indistinguishable between the northern and southern hemispheres, averaging a (4.0 ± 0.5) m rise for a 1% change in the fair weather electric current density. This suggests that the global fair weather current, which is affected by space weather, cosmic rays and the El Niño Southern Oscillation, is linked with layer cloud properties. (letter)
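
    A worked use of the sensitivity quoted above, a (4.0 ± 0.5) m rise in cloud base per 1% change in fair-weather current density, applied to an assumed 5% current change.

      # Convert a fractional current-density change into an expected cloud-base rise.
      sensitivity_m_per_percent = 4.0       # from the abstract
      uncertainty_m_per_percent = 0.5       # from the abstract
      change_percent = 5.0                  # assumed example change

      rise = sensitivity_m_per_percent * change_percent
      rise_err = uncertainty_m_per_percent * change_percent
      print(f"expected cloud-base rise: {rise:.0f} +/- {rise_err:.0f} m")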

  3. Cloud-based Architecture Capabilities Summary Report

    Energy Technology Data Exchange (ETDEWEB)

    Vang, Leng [Idaho National Lab. (INL), Idaho Falls, ID (United States); Prescott, Steven R [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2014-09-01

    In the collaborative scientific research arena, it is important to have an environment where analysts have shared access to information, documents and software tools, and are able to accurately maintain and track historical changes in models. A new cloud-based environment would be accessible remotely from anywhere, regardless of computing platform, provided that the platform has Internet access and proper browser capabilities. Information stored in this environment would be restricted based on user-assigned credentials. This report reviews the development of a Cloud-based Architecture Capabilities (CAC) web portal for PRA tools.

  4. Developing cloud-based Business Process Management (BPM): a survey

    Science.gov (United States)

    Mercia; Gunawan, W.; Fajar, A. N.; Alianto, H.; Inayatulloh

    2018-03-01

    In today’s highly competitive business environment, modern enterprises face difficulties in cutting unnecessary costs, eliminating waste and delivering substantial benefits to the organization. Companies are increasingly turning to a more flexible IT environment to help them realize this goal. For this reason, the article applies cloud-based Business Process Management (BPM), which enables organizations to focus on modeling, monitoring and process management. Cloud-based BPM consists of business processes, business information and IT resources, which help build real-time intelligence systems based on business management and cloud technology. Cloud computing is a paradigm that involves procuring dynamically measurable resources over the internet as an IT resource service. A cloud-based BPM service addresses common problems faced by traditional BPM, especially in promoting flexible, event-driven business processes that exploit opportunities in the marketplace.

  5. A multi-sensor study of the impact of ground-based glaciogenic seeding on orographic clouds and precipitation

    Science.gov (United States)

    Pokharel, Binod

    This dissertation examines reflectivity data from three different radar systems, as well as airborne and ground-based in situ particle imaging data, to study the impact of ground-based glaciogenic seeding on orographic clouds and precipitation formed over the mountains in southern Wyoming. The data for this study come from the AgI Seeding Cloud Impact Investigation (ASCII) field campaign conducted over the Sierra Madre mountains in 2012 (ASCII-12) and over the Medicine Bow mountains in 2013 (ASCII-13) in the context of the Wyoming Weather Modification Pilot Project (WWMPP). The campaigns were supported by a network of ground-based instruments, including a microwave radiometer, two profiling Ka-band Micro Rain Radars (MRRs), a Doppler on Wheels (DOW), rawinsondes, a Cloud Particle Imager, and a Parsivel disdrometer. The University of Wyoming King Air with profiling Wyoming Cloud Radar (WCR) conducted nine successful flights in ASCII-12, and eight flights in ASCII-13. WCR profiles from these flights are combined with those from seven other flights, which followed the same geographically-fixed pattern in 2008-09 (pre-ASCII) over the Medicine Bow range. All sampled storms were relatively shallow, with low-level air forced over the target mountain, and cold enough to support ice initiation by silver iodide (AgI) nuclei in cloud. Three detailed case studies are conducted, each with different atmospheric conditions and different cloud and snow growth properties: one case (21 Feb 2012) is stratiform, with strong winds and cloud droplets too small to enable snow growth by accretion (riming). A second case (13 Feb 2012) contains shallow convective cells. Clouds in the third case study (22 Feb 2012) are stratiform but contain numerous large droplets (mode ~35 μm in diameter), large enough for ice particle growth by riming. These cases and all others, each with a treated period following an untreated period, show that a clear seeding signature is not immediately apparent

  6. A cloud-based system for automatic glaucoma screening.

    Science.gov (United States)

    Fengshou Yin; Damon Wing Kee Wong; Ying Quan; Ai Ping Yow; Ngan Meng Tan; Gopalakrishnan, Kavitha; Beng Hai Lee; Yanwu Xu; Zhuo Zhang; Jun Cheng; Jiang Liu

    2015-08-01

    In recent years, there has been increasing interest in the use of automatic computer-based systems for the detection of eye diseases, including glaucoma. However, these systems are usually standalone software with basic functions only, limiting their usage on a large scale. In this paper, we introduce an online cloud-based system for automatic glaucoma screening through the use of medical-image-based pattern classification technologies. It is designed in a hybrid cloud pattern to offer both accessibility and enhanced security. Raw data, including the patient's medical condition and fundus image, and the resultant medical reports are collected and distributed through the public cloud tier. In the private cloud tier, automatic analysis and assessment of colour retinal fundus images are performed. The ubiquitous anywhere-access nature of the system through the cloud platform facilitates a more efficient and cost-effective means of glaucoma screening, allowing the disease to be detected earlier and enabling timely intervention and more efficient disease management.

  7. The Suitability of Cloud-Based Speech Recognition Engines for Language Learning

    Science.gov (United States)

    Daniels, Paul; Iwago, Koji

    2017-01-01

    As online automatic speech recognition (ASR) engines become more accurate and more widely implemented with CALL (computer-assisted language learning) software, it becomes important to evaluate the effectiveness and the accuracy of these recognition engines using authentic speech samples. This study investigates two of the most prominent cloud-based speech recognition engines--Apple's…

  8. A novel approach to Lagrangian sampling of marine boundary layer cloud and aerosol in the northeast Pacific: case studies from CSET

    Science.gov (United States)

    Mohrmann, J.; Albrecht, B. A.; Bretherton, C. S.; Ghate, V. P.; Zuidema, P.; Wood, R.

    2015-12-01

    The Cloud System Evolution in the Trades (CSET) field campaign took place during July/August 2015 with the purpose of characterizing the cloud, aerosol and thermodynamic properties of the northeast Pacific marine boundary layer. One major science goal of the campaign was to observe a Lagrangian transition from thin stratocumulus (Sc) upwind near California to trade cumulus (Cu) nearer to Hawaii. Cloud properties were observed from the NSF/NCAR Gulfstream V research plane (GV) using the HIAPER Cloud Radar (HCR) and the HIAPER Spectral Resolution Lidar (HSRL), among other instrumentation. Aircraft observations were complemented by a suite of satellite-derived products. To observe the evolution of airmasses over the course of two days, upwind regions were sampled on an outbound flight from Sacramento, CA, to Kona, HI. The sampled airmasses were then tracked using HYSPLIT trajectories based on GFS model forecasts, and the return flight to California was planned to intercept those airmasses, using satellite observations to track cloud evolution in the interim. This approach required that trajectories were reasonably stable up to 3 days prior to final sampling, and also that forecast trajectories were in agreement with post-flight analysis and visual cloud feature tracking. The extent to which this was realised, and hence the validity of this new approach to Lagrangian airmass observation, is assessed here. We also present results showing that a Sc-Cu airmass transition was consistently observed during the CSET study using measurements from research flights and satellite.

  9. Overview of Boundary Layer Clouds Using Satellite and Ground-Based Measurements

    Science.gov (United States)

    Xi, B.; Dong, X.; Wu, P.; Qiu, S.

    2017-12-01

    A comprehensive summary of boundary layer cloud properties based on several of our recent studies will be presented. The analyses include global cloud fractions and cloud macro-/microphysical properties based on satellite measurements using both CERES-MODIS and CloudSat/CALIPSO data products; the annual/seasonal/diurnal variations of stratocumulus clouds over different climate regions (mid-latitude land, mid-latitude ocean, and the Arctic) using DOE ARM ground-based measurements at the Southern Great Plains (SGP), Azores (GRW), and North Slope of Alaska (NSA) sites; the impact of environmental conditions on the formation and dissipation of marine boundary layer clouds over the Azores site; and the characterization of Arctic mixed-phase cloud structure and the environmental conditions favorable for the formation/maintenance of mixed-phase clouds over the NSA site. Though the presentation covers a wide range of topics, we will focus on the representativeness of the ground-based measurements over different climate regions, the evaluation of satellite-retrieved cloud properties using these ground-based measurements, and understanding the uncertainties of both satellite and ground-based retrievals and measurements.

  10. A physically based algorithm for non-blackbody correction of the cloud top temperature for the convective clouds

    Science.gov (United States)

    Wang, C.; Luo, Z. J.; Chen, X.; Zeng, X.; Tao, W.; Huang, X.

    2012-12-01

    Cloud top temperature is a key parameter to retrieve in the remote sensing of convective clouds, but passive remote sensing cannot directly measure the temperature at the cloud top. Here we explore a synergistic way of estimating cloud top temperature by making use of simultaneous passive and active remote sensing of clouds (in this case, CloudSat and MODIS). The weighting function of the MODIS 11 μm band is explicitly calculated by feeding cloud hydrometeor profiles from CloudSat retrievals and temperature and humidity profiles from the ECMWF ERA-Interim reanalysis into a radiative transfer model. Among 19,699 tropical deep convective clouds observed by CloudSat in 2008, the average effective emission level (EEL, where the weighting function attains its maximum) is at optical depth 0.91 with a standard deviation of 0.33. Furthermore, the vertical gradient of CloudSat radar reflectivity, an indicator of the fuzziness of the convective cloud top, is linearly proportional to d_{CTH-EEL}, the distance between the EEL of the 11 μm channel and the cloud top height (CTH) determined by CloudSat, when d_{CTH-EEL}<0.6 km; beyond 0.6 km, the distance has little sensitivity to the vertical gradient of CloudSat radar reflectivity. Based on these findings, we derive a formula between the fuzziness of the cloud top region, which is measurable by CloudSat, and the MODIS 11 μm brightness temperature, assuming that the difference between the effective emission temperature and the 11 μm brightness temperature is proportional to the cloud top fuzziness. This formula is verified using deep convective cloud profiles simulated by the Goddard Cumulus Ensemble model. We further discuss the application of this formula in estimating cloud top buoyancy as well as the error characteristics of the radiative calculation within such deep convective clouds.
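
    The correction described above can be sketched in outline. The illustration below assumes hypothetical coefficients a and b for the linear, fuzziness-proportional correction term, and the function name corrected_cloud_top_temperature is ours; neither is taken from the paper.

        # Illustrative sketch only: the abstract relates the cloud-top reflectivity
        # gradient ("fuzziness") to the difference between the effective emission
        # temperature and the MODIS 11-um brightness temperature. The coefficients
        # a and b below are hypothetical placeholders that would, in practice, be
        # fitted to collocated CloudSat/MODIS observations.

        def corrected_cloud_top_temperature(bt11_k, dz_dbz_per_km, a=0.0, b=-1.5):
            """Apply a linear, fuzziness-based non-blackbody correction (K) to the
            11-um brightness temperature; dz_dbz_per_km is the vertical gradient of
            radar reflectivity near cloud top (dBZ/km)."""
            delta_t = a + b * dz_dbz_per_km   # correction proportional to fuzziness
            return bt11_k + delta_t

        # Hypothetical usage
        print(corrected_cloud_top_temperature(bt11_k=210.0, dz_dbz_per_km=4.0))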

  11. Comparison of Cloud Base Height Derived from a Ground-Based Infrared Cloud Measurement and Two Ceilometers

    Directory of Open Access Journals (Sweden)

    Lei Liu

    2015-01-01

    Full Text Available The cloud base heights (CBHs) derived from the whole-sky infrared cloud-measuring system (WSIRCMS) and two ceilometers (Vaisala CL31 and CL51) from November 1, 2011, to June 12, 2012, at the Chinese Meteorological Administration (CMA) Beijing Observatory Station are analysed. Significant differences can be found by comparing the measurements of the different instruments. More precisely, the cloud occurrence retrieved from the CL31 is 3.8% higher than that from the CL51, while the WSIRCMS cloud occurrence is 3.6% higher than that of the ceilometers. More than 75.5% of the differences between the two ceilometers are within ±200 m and about 89.5% within ±500 m, while only 30.7% of the differences between the WSIRCMS and the ceilometers are within ±500 m and about 55.2% within ±1000 m. These differences may be caused by the measurement principles and the CBH retrieval algorithms. A combination of a laser ceilometer and an infrared cloud instrument is recommended to improve the capability for determining cloud occurrence and retrieving CBHs.
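
    The agreement statistics quoted above (fractions of CBH differences within ±200 m, ±500 m and ±1000 m) can be computed in a few lines. The sketch below uses synthetic arrays as stand-ins for the real instrument time series; the variable names and noise levels are hypothetical.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 1000
        cbh_cl31 = rng.uniform(500, 6000, n)            # hypothetical CL31 cloud-base heights (m)
        cbh_cl51 = cbh_cl31 + rng.normal(0, 150, n)     # hypothetical CL51 heights (m)
        cbh_wsircms = cbh_cl31 + rng.normal(0, 800, n)  # hypothetical infrared-system heights (m)

        def fraction_within(a, b, limit_m):
            """Fraction of paired observations whose difference lies within +/-limit_m."""
            return np.mean(np.abs(a - b) <= limit_m)

        print("CL31 vs CL51 within +/-200 m:", fraction_within(cbh_cl31, cbh_cl51, 200))
        print("CL31 vs CL51 within +/-500 m:", fraction_within(cbh_cl31, cbh_cl51, 500))
        print("WSIRCMS vs ceilometer within +/-500 m:", fraction_within(cbh_wsircms, cbh_cl31, 500))
        print("WSIRCMS vs ceilometer within +/-1000 m:", fraction_within(cbh_wsircms, cbh_cl31, 1000))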

  12. ID based cryptography for secure cloud data storage

    OpenAIRE

    Kaaniche , Nesrine; Boudguiga , Aymen; Laurent , Maryline

    2013-01-01

    International audience; This paper addresses the security issues of storing sensitive data in a cloud storage service and the need for users to trust the commercial cloud providers. It proposes a cryptographic scheme for cloud storage, based on an original usage of ID-Based Cryptography. Our solution has several advantages. First, it provides secrecy for encrypted data which are stored in public servers. Second, it offers controlled data access and sharing among users, so that unauthorized us...

  13. Cloud-Based Data Storage

    Science.gov (United States)

    Waters, John K.

    2011-01-01

    The vulnerability and inefficiency of backing up data on-site is prompting school districts to switch to more secure, less troublesome cloud-based options. District auditors are pushing for a better way to back up their data than the on-site, tape-based system that had been used for years. About three years ago, Hendrick School District in…

  14. SnowCloud - a Framework to Predict Streamflow in Snowmelt-dominated Watersheds Using Cloud-based Computing

    Science.gov (United States)

    Sproles, E. A.; Crumley, R. L.; Nolin, A. W.; Mar, E.; Lopez-Moreno, J. J.

    2017-12-01

    Streamflow in snowy mountain regions is extraordinarily challenging to forecast, and prediction efforts are hampered by the lack of timely snow data—particularly in data-sparse regions. SnowCloud is a prototype web-based framework that integrates remote sensing, cloud computing, interactive mapping tools, and a hydrologic model to offer a new paradigm for delivering key data to water resource managers. We tested the skill of SnowCloud to forecast monthly streamflow with one month lead time in three snow-dominated headwaters. These watersheds represent a range of precipitation/runoff schemes: the Río Elqui in northern Chile (200 mm/yr, entirely snowmelt); the John Day River, Oregon, USA (635 mm/yr, primarily snowmelt); and the Río Aragon in northern Spain (850 mm/yr, snowmelt dominated). Model skill corresponded to snowpack contribution, with Nash-Sutcliffe efficiencies of 0.86, 0.52, and 0.21, respectively. SnowCloud does not require the user to possess advanced programming skills or proprietary software. We access NASA's MOD10A1 snow cover product to calculate the snow metrics globally using Google Earth Engine's geospatial analysis and cloud computing service. The analytics and forecast tools are provided through a web-based portal that requires only internet access and minimal training. To test the efficacy of SnowCloud we provided the tools and a series of tutorials in English and Spanish to water resource managers in Chile, Spain, and the United States. Participants assessed their user experience and provided feedback, and the results of our multi-cultural assessment are also presented. While our results focus on SnowCloud, they outline methods to develop cloud-based tools that function effectively across cultures and languages. Our approach also addresses the primary challenges of science-based computing: human resource limitations, infrastructure costs, and expensive proprietary software. These challenges are particularly problematic in developing
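
    The Nash-Sutcliffe efficiency (NSE) used above to report forecast skill (0.86, 0.52 and 0.21) is a standard metric; a minimal implementation follows, with hypothetical observed and simulated streamflow values for illustration only.

        import numpy as np

        def nash_sutcliffe(observed, simulated):
            """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)."""
            observed = np.asarray(observed, dtype=float)
            simulated = np.asarray(simulated, dtype=float)
            return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

        obs = [12.0, 30.0, 55.0, 40.0, 18.0]   # hypothetical monthly streamflow (m^3/s)
        sim = [10.0, 28.0, 60.0, 35.0, 20.0]   # hypothetical one-month-ahead forecasts
        print(f"NSE = {nash_sutcliffe(obs, sim):.2f}")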

  15. Professional SharePoint 2010 Cloud-Based Solutions

    CERN Document Server

    Fox, Steve; Stubbs, Paul; Follette, Donovan

    2011-01-01

    An authoritative guide to extending SharePoint's power with cloud-based services If you want to be part of the next major shift in the IT industry, you'll want this book. Melding two of the hottest trends in the industry—the widespread popularity of the SharePoint collaboration platform and the rapid rise of cloud computing—this practical guide shows developers how to extend their SharePoint solutions with the cloud's almost limitless capabilities. See how to get started, discover smart ways to leverage cloud data and services through Azure, start incorporating Twitter or LinkedIn

  16. A Reputation-Based Identity Management Model for Cloud Computing

    Directory of Open Access Journals (Sweden)

    Lifa Wu

    2015-01-01

    Full Text Available In the field of cloud computing, most research on identity management has concentrated on protecting user data. However, users typically leave a trail when they access cloud services, and the resulting user traceability can potentially lead to the leakage of sensitive user information. Meanwhile, malicious users can do harm to cloud providers through the use of pseudonyms. To solve these problems, we introduce a reputation mechanism and design a reputation-based identity management model for cloud computing. In the model, pseudonyms are generated based on a reputation signature so as to guarantee the untraceability of pseudonyms, and a mechanism that calculates user reputation is proposed, which helps cloud service providers to identify malicious users. Analysis verifies that the model can ensure that users access cloud services anonymously and that cloud providers assess the credibility of users effectively without violating user privacy.

  17. Reliability Analysis Based on a Jump Diffusion Model with Two Wiener Processes for Cloud Computing with Big Data

    Directory of Open Access Journals (Sweden)

    Yoshinobu Tamura

    2015-06-01

    Full Text Available At present, many cloud services are managed using open source software, such as OpenStack and Eucalyptus, because of the unified management of data, cost reduction, quick delivery and work savings. The operation phase of cloud computing has unique features, such as the provisioning processes, network-based operation and the diversity of data, because it changes depending on many external factors. We propose a jump diffusion model with two-dimensional Wiener processes in order to consider the interesting aspects of network traffic and big data in cloud computing. In particular, we assess the stability of cloud software by using the sample paths obtained from the jump diffusion model with two-dimensional Wiener processes. Moreover, we discuss the optimal maintenance problem based on the proposed jump diffusion model. Furthermore, we analyze actual data to show numerical examples of dependability optimization based on the software maintenance cost, considering big data on cloud computing.
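
    To illustrate the kind of stochastic model referred to above, the sketch below simulates one sample path of a generic jump-diffusion process driven by two independent Wiener processes plus a compound Poisson jump term. It is not the authors' parameterization; all coefficients (mu, sigma1, sigma2, jump_rate, jump_scale) are hypothetical.

        import numpy as np

        rng = np.random.default_rng(2)
        T, n_steps = 1.0, 1000
        dt = T / n_steps
        mu, sigma1, sigma2 = 0.5, 0.3, 0.2     # drift and the two diffusion coefficients
        jump_rate, jump_scale = 3.0, 0.1       # Poisson jump intensity and jump-size scale

        x = np.empty(n_steps + 1)
        x[0] = 0.0
        for k in range(n_steps):
            dw1, dw2 = rng.normal(0.0, np.sqrt(dt), size=2)   # two independent Wiener increments
            n_jumps = rng.poisson(jump_rate * dt)             # number of jumps in this step
            jumps = rng.normal(0.0, jump_scale, size=n_jumps).sum()
            x[k + 1] = x[k] + mu * dt + sigma1 * dw1 + sigma2 * dw2 + jumps

        print("terminal value of one sample path:", x[-1])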

  18. NASA Cloud-Based Climate Data Services

    Science.gov (United States)

    McInerney, M. A.; Schnase, J. L.; Duffy, D. Q.; Tamkin, G. S.; Strong, S.; Ripley, W. D., III; Thompson, J. H.; Gill, R.; Jasen, J. E.; Samowich, B.; Pobre, Z.; Salmon, E. M.; Rumney, G.; Schardt, T. D.

    2012-12-01

    Cloud-based scientific data services are becoming an important part of NASA's mission. Our technological response is built around the concept of specialized virtual climate data servers, repetitive cloud provisioning, image-based deployment and distribution, and virtualization-as-a-service (VaaS). A virtual climate data server (vCDS) is an Open Archive Information System (OAIS) compliant, iRODS-based data server designed to support a particular type of scientific data collection. iRODS is data grid middleware that provides policy-based control over collection-building, managing, querying, accessing, and preserving large scientific data sets. We have deployed vCDS Version 1.0 in the Amazon EC2 cloud using S3 object storage and are using the system to deliver a subset of NASA's Intergovernmental Panel on Climate Change (IPCC) data products to the latest CentOS federated version of Earth System Grid Federation (ESGF), which is also running in the Amazon cloud. vCDS-managed objects are exposed to ESGF through FUSE (Filesystem in User Space), which presents a POSIX-compliant filesystem abstraction to applications such as the ESGF server that require such an interface. A vCDS manages data as a distinguished collection for a person, project, lab, or other logical unit. A vCDS can manage a collection across multiple storage resources using rules and microservices to enforce collection policies. And a vCDS can federate with other vCDSs to manage multiple collections over multiple resources, thereby creating what can be thought of as an ecosystem of managed collections. With the vCDS approach, we are trying to enable the full information lifecycle management of scientific data collections and make tractable the task of providing diverse climate data services. In this presentation, we describe our approach, experiences, lessons learned, and plans for the future. Figure captions: (A) vCDS/ESG system stack. (B) Conceptual architecture for NASA cloud-based data services.

  19. Methodology for cloud-based design of robots

    Science.gov (United States)

    Ogorodnikova, O. M.; Vaganov, K. A.; Putimtsev, I. D.

    2017-09-01

    This paper presents some important results from the cloud-based design of a robot arm by a group of students. A methodology for cloud-based design was developed and used to initiate an interdisciplinary project on the research and development of a specific manipulator. All project data files were hosted by the Ural Federal University data center. The 3D (three-dimensional) model of the robot arm was created using Siemens PLM (Product Lifecycle Management) software and structured as a complex mechatronic product by means of the Siemens Teamcenter thin client; all processes were performed in the cloud. The robot arm was designed to load blanks of up to 1 kg into the work space of a milling machine for student research.

  20. AN QUALITY BASED ENHANCEMENT OF USER DATA PROTECTION VIA FUZZY RULE BASED SYSTEMS IN CLOUD ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    R Poorva Devi

    2016-04-01

    Full Text Available So far in cloud computing, individual customers access and consume an enormous number of services through the web, offered by a cloud service provider (CSP). Although the cloud offers security-as-a-service to its clients, people are still reluctant to use services from cloud vendors. Many solutions, security components and measurements address the cloud security issue, yet only about 79.2% of the desired security outcome has been obtained by scientists, researchers and the wider cloud-based academic community. To overcome the cloud security problem, the proposed model, “Quality based enhancement of user data protection via fuzzy rule based systems in cloud environment”, helps cloud clients access cloud resources through remote monitoring management (RMMM), and the services currently being requested and consumed by cloud users can be better analyzed with a managed service provider (MSP) rather than a traditional CSP. Normally, people try to secure their own private data by applying key management and cryptographic computations, which again leads to security problems. The proposed approach provides a good-quality security outcome by making use of fuzzy rule based systems (constraint and conclusion segments) in the cloud environment. By using this technique, users may obtain an efficient security outcome through simulation with the Apache CloudStack simulator.

  1. Feasibility and demonstration of a cloud-based RIID analysis system

    Energy Technology Data Exchange (ETDEWEB)

    Wright, Michael C., E-mail: wrightmc@ornl.gov [Oak Ridge National Laboratory, Oak Ridge, TN 37831 (United States); Hertz, Kristin L.; Johnson, William C. [Sandia National Laboratories, Livermore, CA 94551 (United States); Sword, Eric D.; Younkin, James R. [Oak Ridge National Laboratory, Oak Ridge, TN 37831 (United States); Sadler, Lorraine E. [Sandia National Laboratories, Livermore, CA 94551 (United States)

    2015-06-01

    A significant limitation in the operational utility of handheld and backpack radioisotope identifiers (RIIDs) is the inability of their onboard algorithms to accurately and reliably identify the isotopic sources of the measured gamma-ray energy spectrum. A possible solution is to move the spectral analysis computations to an external device, the cloud, where significantly greater capabilities are available. The implementation and demonstration of a prototype cloud-based RIID analysis system have shown this type of system to be feasible with currently available communication and computational technology. A system study has shown that the potential user community could derive significant benefits from an appropriately implemented cloud-based analysis system and has identified the design and operational characteristics required by the users and stakeholders for such a system. A general description of the hardware and software necessary to implement reliable cloud-based analysis, the value of the cloud expressed by the user community, and the aspects of the cloud implemented in the demonstrations are discussed. - Highlights: • A prototype cloud-based RIID analysis system was implemented and demonstrated. • A cloud-based system was shown to be feasible with currently available technology. • A system study identified the operational characteristics required by the users. • The system study showed that the user community could derive significant benefit. • An architecture was defined for field testing by users in relevant environments.

  2. Feasibility and demonstration of a cloud-based RIID analysis system

    International Nuclear Information System (INIS)

    Wright, Michael C.; Hertz, Kristin L.; Johnson, William C.; Sword, Eric D.; Younkin, James R.; Sadler, Lorraine E.

    2015-01-01

    A significant limitation in the operational utility of handheld and backpack radioisotope identifiers (RIIDs) is the inability of their onboard algorithms to accurately and reliably identify the isotopic sources of the measured gamma-ray energy spectrum. A possible solution is to move the spectral analysis computations to an external device, the cloud, where significantly greater capabilities are available. The implementation and demonstration of a prototype cloud-based RIID analysis system have shown this type of system to be feasible with currently available communication and computational technology. A system study has shown that the potential user community could derive significant benefits from an appropriately implemented cloud-based analysis system and has identified the design and operational characteristics required by the users and stakeholders for such a system. A general description of the hardware and software necessary to implement reliable cloud-based analysis, the value of the cloud expressed by the user community, and the aspects of the cloud implemented in the demonstrations are discussed. - Highlights: • A prototype cloud-based RIID analysis system was implemented and demonstrated. • A cloud-based system was shown to be feasible with currently available technology. • A system study identified the operational characteristics required by the users. • The system study showed that the user community could derive significant benefit. • An architecture was defined for field testing by users in relevant environments

  3. Evaluating the Usage of Cloud-Based Collaboration Services through Teamwork

    Science.gov (United States)

    Qin, Li; Hsu, Jeffrey; Stern, Mel

    2016-01-01

    With the proliferation of cloud computing for both organizational and educational use, cloud-based collaboration services are transforming how people work in teams. The authors investigated the determinants of the usage of cloud-based collaboration services including teamwork quality, computer self-efficacy, and prior experience, as well as its…

  4. Optimising TCP for cloud-based mobile networks

    DEFF Research Database (Denmark)

    Artuso, Matteo; Christiansen, Henrik Lehrmann

    2016-01-01

    Cloud-based mobile networks are foreseen to be a technological enabler for the next generation of mobile networks. Their design requires substantial research as they pose unique challenges, especially from the point of view of additional delays in the fronthaul network. Commonly used network...... implementations of 3 popular operating systems are investigated in our network model. The results on the most influential parameters are used to design an optimized TCP for cloud-based mobile networks....

  5. Research on cloud-based remote measurement and analysis system

    Science.gov (United States)

    Gao, Zhiqiang; He, Lingsong; Su, Wei; Wang, Can; Zhang, Changfan

    2015-02-01

    The promising potential of cloud computing and its convergence with technologies such as cloud storage, cloud push and mobile computing allows for the creation and delivery of new types of cloud services. Combining these ideas, this paper presents a cloud-based remote measurement and analysis system. The system mainly consists of three parts: a signal acquisition client, a web server deployed as a cloud service, and a remote client. The system is a dedicated website developed using ASP.NET and Flex RIA technology, which resolves the trade-off between the two monitoring modes, B/S and C/S. The platform supplies customer condition monitoring and data analysis services over the Internet and is deployed on a cloud server. The signal acquisition device is responsible for collecting data (sensor data, audio, video, etc.) and pushes the monitoring data to the cloud storage database regularly. Data acquisition equipment in this system only needs data collection and network functions, as provided by smartphones and smart sensors. The system's scale can be adjusted dynamically according to the number of applications and users, so it does not waste resources. As a representative case study, we developed a prototype system based on the Ali cloud service using a rotor test rig as the research object. Experimental results demonstrate that the proposed system architecture is feasible.

  6. CloudLM: a Cloud-based Language Model for Machine Translation

    Directory of Open Access Journals (Sweden)

    Ferrández-Tordera Jorge

    2016-04-01

    Full Text Available Language models (LMs) are an essential element in statistical approaches to natural language processing for tasks such as speech recognition and machine translation (MT). The advent of big data leads to the availability of massive amounts of data to build LMs, and in fact, for the most prominent languages, using current techniques and hardware, it is not feasible to train LMs with all the data available nowadays. At the same time, it has been shown that the more data is used for an LM the better the performance, e.g. for MT, without any indication yet of reaching a plateau. This paper presents CloudLM, an open-source cloud-based LM intended for MT, which allows querying of distributed LMs. CloudLM relies on Apache Solr and provides the functionality of state-of-the-art language modelling (it builds upon KenLM), while allowing massive LMs to be queried (as the use of local memory is drastically reduced), at the expense of slower decoding speed.

  7. Big data mining analysis method based on cloud computing

    Science.gov (United States)

    Cai, Qing Qiu; Cui, Hong Gang; Tang, Hao

    2017-08-01

    In the era of information explosion, big data are extremely large, discrete and unstructured or semi-structured, far beyond the scope of what traditional data management methods can handle. With the arrival of the cloud computing era, cloud computing provides a new technical way to analyze massive data, which can effectively solve the problem that traditional data mining methods cannot cope with massive data. This paper introduces the meaning and characteristics of cloud computing, analyzes the advantages of using cloud computing technology for data mining, designs an association rule mining algorithm based on the MapReduce parallel processing architecture, and carries out experimental verification. The parallel association rule mining algorithm based on a cloud computing platform can greatly improve the execution speed of data mining.
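
    To make the map/reduce pattern behind such parallel association-rule mining concrete, the sketch below counts frequent item pairs with a mapper that emits candidate pairs and a reducer that sums their counts. It runs in a single process purely for illustration; a real deployment would distribute the mappers and reducers over a Hadoop/MapReduce cluster, and the transaction data shown are hypothetical.

        from collections import Counter
        from itertools import combinations

        transactions = [                      # hypothetical transaction database
            {"milk", "bread", "butter"},
            {"milk", "bread"},
            {"bread", "butter"},
            {"milk", "butter"},
        ]

        def mapper(transaction):
            """Emit (pair, 1) for every item pair in one transaction."""
            for pair in combinations(sorted(transaction), 2):
                yield pair, 1

        def reducer(mapped):
            """Sum the counts emitted by all mappers."""
            counts = Counter()
            for pair, one in mapped:
                counts[pair] += one
            return counts

        mapped = (kv for t in transactions for kv in mapper(t))
        support = reducer(mapped)
        min_support = 2
        print({pair: c for pair, c in support.items() if c >= min_support})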

  8. Automatic markerless registration of point clouds with semantic-keypoint-based 4-points congruent sets

    Science.gov (United States)

    Ge, Xuming

    2017-08-01

    The coarse registration of point clouds from urban building scenes has become a key topic in applications of terrestrial laser scanning technology. Sampling-based algorithms in the random sample consensus (RANSAC) model have emerged as mainstream solutions to address coarse registration problems. In this paper, we propose a novel combined solution to automatically align two markerless point clouds from building scenes. Firstly, the method segments non-ground points from ground points. Secondly, the proposed method detects feature points from each cross section and then obtains semantic keypoints by connecting feature points with specific rules. Finally, the detected semantic keypoints from two point clouds act as inputs to a modified 4PCS algorithm. Examples are presented and the results compared with those of K-4PCS to demonstrate the main contributions of the proposed method, which are the extension of the original 4PCS to handle heavy datasets and the use of semantic keypoints to improve K-4PCS in relation to registration accuracy and computational efficiency.
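
    Once keypoint correspondences between two building point clouds are available, a rigid transform can be estimated; the sketch below uses the standard SVD-based (Kabsch) least-squares method rather than the modified 4PCS algorithm described above, and the point arrays are hypothetical.

        import numpy as np

        def rigid_transform(source_pts, target_pts):
            """Least-squares rotation R and translation t with target ~= R @ source + t."""
            src_c = source_pts.mean(axis=0)
            tgt_c = target_pts.mean(axis=0)
            H = (source_pts - src_c).T @ (target_pts - tgt_c)
            U, _, Vt = np.linalg.svd(H)
            R = Vt.T @ U.T
            if np.linalg.det(R) < 0:          # guard against a reflection solution
                Vt[-1, :] *= -1
                R = Vt.T @ U.T
            t = tgt_c - R @ src_c
            return R, t

        # Hypothetical matched keypoints: the target is a rotated + translated copy of the source.
        src = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 2.0, 0.0], [0.0, 0.0, 3.0]])
        theta = np.radians(30)
        R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                           [np.sin(theta),  np.cos(theta), 0.0],
                           [0.0,            0.0,           1.0]])
        tgt = src @ R_true.T + np.array([5.0, -2.0, 0.5])
        R, t = rigid_transform(src, tgt)
        print("recovered translation:", np.round(t, 3))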

  9. Route Assessment for Unmanned Aerial Vehicle Based on Cloud Model

    Directory of Open Access Journals (Sweden)

    Xixia Sun

    2014-01-01

    Full Text Available An integrated route assessment approach based on the cloud model is proposed in this paper, in which various sources of uncertainty are preserved and modeled by cloud theory. Firstly, a systematic criteria framework incorporating models for scoring subcriteria is developed. Then, the cloud model is introduced to represent linguistic variables, and the survivability probability histogram of each route is converted into normal clouds by cloud transformation, enabling both randomness and fuzziness in the assessment environment to be managed simultaneously. Finally, a new way to measure the similarity between two normal clouds, satisfying reflexivity, symmetry, transitivity, and overlapping, is proposed. Experimental results demonstrate that the proposed route assessment approach outperforms a fuzzy-logic-based assessment approach with regard to feasibility, reliability, and consistency with human thinking.

  10. Feasibility and demonstration of a cloud-based RIID analysis system

    Science.gov (United States)

    Wright, Michael C.; Hertz, Kristin L.; Johnson, William C.; Sword, Eric D.; Younkin, James R.; Sadler, Lorraine E.

    2015-06-01

    A significant limitation in the operational utility of handheld and backpack radioisotope identifiers (RIIDs) is the inability of their onboard algorithms to accurately and reliably identify the isotopic sources of the measured gamma-ray energy spectrum. A possible solution is to move the spectral analysis computations to an external device, the cloud, where significantly greater capabilities are available. The implementation and demonstration of a prototype cloud-based RIID analysis system have shown this type of system to be feasible with currently available communication and computational technology. A system study has shown that the potential user community could derive significant benefits from an appropriately implemented cloud-based analysis system and has identified the design and operational characteristics required by the users and stakeholders for such a system. A general description of the hardware and software necessary to implement reliable cloud-based analysis, the value of the cloud expressed by the user community, and the aspects of the cloud implemented in the demonstrations are discussed.

  11. Characterization of organic matter in cloud waters sampled at the puy de Dôme mountain using FT-ICR-MS

    Science.gov (United States)

    Bianco, A.; Chaumerliac, N.; Vaitilingom, M.; Deguillaume, L.; Bridoux, M. C.

    2017-12-01

    The chemical composition of organic matter in cloud water is highly complex. The organic species result from dissolution from the gas phase or from the soluble fraction of the particle phase; they are also produced by aqueous-phase reactivity. Several low-molecular-weight organic species have been quantified, such as aldehydes and carboxylic acids. Recently, amino acids were also detected in cloud water, and their presence is related to the presence of microorganisms. Compounds presenting similarities with the high-molecular-weight organic substances, or HULIS, found in aerosols were also observed in clouds. Overall, these studies mainly focused on individual compounds or functional groups rather than the complex mixture at the molecular level. This study presents a non-targeted approach to characterize the organic matter in clouds. Samples were collected at the puy de Dôme Mountain (France). Two cloud water samples (June & July 2016) were analyzed using high resolution mass spectrometry (ESI-FT-ICR-MS 9.4T). A reversed-phase solid phase extraction (SPE) procedure was performed to concentrate the dissolved organic matter components. Composer (v.1.5.3) software was used to filter the mass spectral data, externally recalibrate the dataset and calculate all possible formulas for the detected anions. The first cloud sample (June) resulted from an air mass coming from the north (North Sea), while the second one (July) resulted from an air mass coming from the west (Atlantic Ocean). Thus, both cloud events derived from marine air masses but were characterized by different hydrogen peroxide concentrations and dissolved organic carbon contents and were sampled at different periods during the day. Elemental compositions of 6487 and 3284 unique molecular species were identified in the two samples, respectively. Nitrogen-containing compounds (CHNO compounds), sulfur-containing compounds (CHOS & CHNOS compounds) and other oxygen-containing compounds (CHO compounds) with molecular weights up to 800 Da were detected

  12. Learners’ views about cloud computing-based group activities

    Directory of Open Access Journals (Sweden)

    Yildirim Serkan

    2017-01-01

    Full Text Available Thanks to their use independent of time and place during the software development process and the easier access to information they provide with mobile technologies, cloud-based environments have attracted the attention of the education world, and this technology has started to be used in various activities. In this study, for programming education, the effects of extracurricular group assignments in cloud-based environments on learners were evaluated in terms of group work satisfaction, ease of use and user satisfaction. A total of 100 students, 34 men and 66 women, participated in the study during a computer programming course lasting eight weeks. Participants were divided into groups of at least three people, considering the advantages of cooperative learning in programming education. In this study, carried out in both conventional and cloud-based environments, a between-groups factorial design was used as the research design. The data, collected by questionnaires on opinions of group work, were examined with quantitative analysis methods. According to the results, extracurricular learning activities carried out as group work created satisfaction; however, perceptions of ease of use of the environment and user satisfaction were only partly positive. Despite similar overall perceptions, male participants found cloud-based environments easier to use. Variables such as class level, satisfaction, and computer and internet usage time did not have any effect on satisfaction or perceptions of ease of use. Evening class students stated that they found cloud-based learning environments easy to use and were more satisfied with these environments, as well as happier with group work, than daytime students.

  13. Security of Heterogeneous Content in Cloud Based Library Information Systems Using an Ontology Based Approach

    Directory of Open Access Journals (Sweden)

    Mihai DOINEA

    2014-01-01

    Full Text Available As in any domain that involves the use of software, library information systems take advantage of cloud computing. The paper highlights the main aspects of cloud-based systems, describing some public solutions provided by the most important players on the market. Topics related to content security in cloud-based services are tackled in order to emphasize the requirements that must be met by these types of systems. A cloud-based implementation of a Library Information System is presented, and some adjacent tools used together with it to provide digital content and metadata links are described. In a cloud-based Library Information System, security is approached by means of ontologies. Aspects such as content security in terms of digital rights are presented, and a methodology for security optimization is proposed.

  14. The governance of cloud based Supply Chain Collaborations

    NARCIS (Netherlands)

    Chandra, Dissa Riandaso; van Hillegersberg, Jos

    2015-01-01

    Despite the promising benefits of cloud computing in enabling efficient, sustainable and agile Supply Chain Collaborations (SCCs), this service does not eliminate governance challenges in SCCs. Cloud-based SCCs may flounder without a proper understanding of how to govern inter-organizational

  15. Crowdsourcing cloud-based software development

    CERN Document Server

    Li, Wei; Tsai, Wei-Tek; Wu, Wenjun

    2015-01-01

    This book presents the latest research on the software crowdsourcing approach to develop large and complex software in a cloud-based platform. It develops the fundamental principles, management organization and processes, and a cloud-based infrastructure to support this new software development approach. The book examines a variety of issues in software crowdsourcing processes, including software quality, costs, diversity of solutions, and the competitive nature of crowdsourcing processes. Furthermore, the book outlines a research roadmap of this emerging field, including all the key technology and management issues for the foreseeable future. Crowdsourcing, as demonstrated by Wikipedia and Facebook for online web applications, has shown promising results for a variety of applications, including healthcare, business, gold mining exploration, education, and software development. Software crowdsourcing is emerging as a promising solution to designing, developing and maintaining software. Preliminary software cr...

  16. Definition of "banner clouds" based on time lapse movies

    OpenAIRE

    Schween , J. H.; Kuettner , J.; Reinert , D.; Reuder , J.; Wirth , V.

    2007-01-01

    International audience; Banner clouds appear on the leeward side of a mountain and resemble a banner or a flag. This article provides a comprehensive definition of "banner clouds". It is based primarily on an extensive collection of time lapse movies, but previous attempts at an explanation of this phenomenon are also taken into account. The following ingredients are considered essential: the cloud must be attached to the mountain but not appear on the windward side; the cloud must originate ...

  17. Cloud Collaboration: Cloud-Based Instruction for Business Writing Class

    Science.gov (United States)

    Lin, Charlie; Yu, Wei-Chieh Wayne; Wang, Jenny

    2014-01-01

    Cloud computing technologies, such as Google Docs, Adobe Creative Cloud, Dropbox, and Microsoft Windows Live, have become increasingly appreciated as next-generation digital learning tools. Cloud computing technologies encourage students' active engagement, collaboration, and participation in their learning, facilitate group work, and support…

  18. Supporting reputation based trust management enhancing security layer for cloud service models

    Science.gov (United States)

    Karthiga, R.; Vanitha, M.; Sumaiya Thaseen, I.; Mangaiyarkarasi, R.

    2017-11-01

    In the existing system, the trust between cloud providers and consumers is inadequate for establishing a service level agreement, even though consumer feedback is a good basis for assessing the overall reliability of cloud services. Researchers have recognized that trust can be managed, and security provided, based on feedback collected from participants. In this work, a face recognition system helps to identify users effectively. We use an image comparison algorithm in which the user's face is captured at registration time and stored in a database; at login, the captured image is compared with the sample image already stored in the database, and if the two images match, the user is identified. When confidential data are outsourced to the cloud, data owners become concerned about the confidentiality of their data, and encrypting the data before outsourcing is regarded as an important means of preserving user data privacy against the cloud server. To keep the data secure, we use the AES algorithm. Symmetric-key algorithms rely on a shared key, so keeping the data secret requires keeping this key secret, and only a user holding the key can decrypt the data.
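
    A minimal sketch of the symmetric encrypt-before-upload step described above, assuming the PyCryptodome library; the key handling is deliberately simplistic and the mode choice (AES-GCM) is ours, not necessarily the authors'.

        from Crypto.Cipher import AES
        from Crypto.Random import get_random_bytes

        key = get_random_bytes(32)                       # 256-bit secret key, must be kept private
        plaintext = b"confidential user record"

        cipher = AES.new(key, AES.MODE_GCM)              # GCM provides confidentiality + integrity
        ciphertext, tag = cipher.encrypt_and_digest(plaintext)

        # Decryption by the key holder; decrypt_and_verify raises ValueError on tampering.
        decipher = AES.new(key, AES.MODE_GCM, nonce=cipher.nonce)
        recovered = decipher.decrypt_and_verify(ciphertext, tag)
        assert recovered == plaintext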

  19. Modeling, Design, and Implementation of a Cloud Workflow Engine Based on Aneka

    OpenAIRE

    Zhou, Jiantao; Sun, Chaoxin; Fu, Weina; Liu, Jing; Jia, Lei; Tan, Hongyan

    2014-01-01

    This paper presents a Petri net-based model for cloud workflow which plays a key role in industry. Three kinds of parallelisms in cloud workflow are characterized and modeled. Based on the analysis of the modeling, a cloud workflow engine is designed and implemented in Aneka cloud environment. The experimental results validate the effectiveness of our approach of modeling, design, and implementation of cloud workflow.

  20. Performance Isolation in Cloud-Based Big Data Architectures

    NARCIS (Netherlands)

    Tekinerdogan, B.; Oral, Alp

    2017-01-01

    Cloud-based big data systems usually have many different tenants that require access to the server's functionality. In a nonisolated cloud system, the different tenants can freely use the resources of the server. Hereby, disruptive tenants who exceed their limits can easily cause degradation of

  1. Inverted Polarity Thunderstorms Linked with Elevated Cloud Base Height

    Science.gov (United States)

    Cummins, K. L.; Williams, E.

    2016-12-01

    The great majority of thunderstorms worldwide exhibit gross positive dipole structure, produce intracloud lightning that reduces this positive dipole (positive intracloud flashes), and produce negative cloud-to-ground lightning from the lower negative end of this dipole. During the STEPS experiment in 2000, much new evidence for thunderstorms (or cells within multi-cellular storms) with inverted polarity came to light, both from balloon soundings of the electric field and from LMA analysis. Many of the storms with inverted polarity cells developed in eastern Colorado. Fleenor et al. (2009) followed up after STEPS to document a dominance of positive-polarity CG lightning in many of these cases. In the present study, surface thermodynamic observations (temperature and dew point temperature) have been used to estimate the cloud base heights and temperatures at the time of the Fleenor et al. lightning observations. It was found that when more than 90% of the observed CG lightning within a storm was negative, the cloud base heights were low (2000 m AGL or lower, and warmer, with T > 10 C), and when more than 90% of the observed CG lightning within a storm was positive, the cloud base heights were high (3000 m AGL or higher, and colder, with T < 10 C). Storms with mixed polarity were generally associated with intermediate cloud base heights. These findings on inverted polarity thunderstorms are remarkably consistent with results in other parts of the world where strong instability prevails in the presence of high cloud base height: the plateau regions of China (Liu et al., 1989; Qie et al., 2005), and pre-monsoon India (Pawar et al., 2016), particularly when mixed polarity cases are excluded. Calculations of adiabatic cloud water content for lifting from near 0 C cast some doubt on earlier speculation (Williams et al., 2005) that the graupel particles in these inverted polarity storms attain a wet growth condition, and so exhibit positive charging following laboratory experiments. This
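
    The cloud-base estimate from surface temperature and dew point mentioned above is commonly approximated by the lifting condensation level; the abstract does not state which formula was used, so the sketch below only illustrates the widely used rule of thumb of roughly 125 m of cloud-base height per degree of dew-point depression.

        def cloud_base_height_agl_m(t_surface_c, dewpoint_c):
            """Approximate LCL height (m AGL) from the surface dew-point depression."""
            return 125.0 * (t_surface_c - dewpoint_c)

        # Hypothetical soundings: a warm, moist case vs. a dry, high-based case.
        print(cloud_base_height_agl_m(25.0, 18.0))   # ~875 m AGL, low cloud base
        print(cloud_base_height_agl_m(30.0, 2.0))    # ~3500 m AGL, high cloud base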

  2. An improved approach for flow-based cloud point extraction.

    Science.gov (United States)

    Frizzarin, Rejane M; Rocha, Fábio R P

    2014-04-11

    Novel strategies are proposed to circumvent the main drawbacks of flow-based cloud point extraction (CPE). The surfactant-rich phase (SRP) was directly retained in the optical path of the spectrophotometric cell, thus avoiding its dilution prior to the measurement and yielding higher sensitivity. Solenoid micro-pumps were exploited to improve mixing by the pulsed flow and also to modulate the flow-rate for retention and removal of the SRP, thus avoiding the elution step, often carried out with organic solvents. The heat released and the increase of the salt concentration provided by an on-line neutralization reaction were exploited to induce the cloud point without an external heating device. These innovations were demonstrated by the spectrophotometric determination of iron, yielding a linear response from 10 to 200 μg L(-1) with a coefficient of variation of 2.3% (n=7). Detection limit and sampling rate were estimated at 5 μg L(-1) (95% confidence level) and 26 samples per hour, respectively. The enrichment factor was 8.9 and the procedure consumed only 6 μg of TAN and 390 μg of Triton X-114 per determination. At the 95% confidence level, the results obtained for freshwater samples agreed with the reference procedure and those obtained for digests of bovine muscle, rice flour, brown bread and tort lobster agreed with the certified reference values. The proposed procedure thus shows advantages in relation to previously proposed approaches for flow-based CPE, being a fast and environmentally friendly alternative for on-line separation and pre-concentration. Copyright © 2014 Elsevier B.V. All rights reserved.

  3. Foundations of Blueprint for Cloud-based Service Engineering

    OpenAIRE

    Nguyen, D.K.

    2011-01-01

    Current cloud-based service offerings are often provided as one-size-fits-all solutions and give little or no room for customization. This limits the ability of application developers to pick and choose offerings from multiple software, platform and infrastructure service providers and to configure them dynamically and in an optimal fashion to address their application requirements. Furthermore, combining different independent cloud-based services necessitates a uniform description format that f

  4. Comparison of cloud top heights derived from FY-2 meteorological satellites with heights derived from ground-based millimeter wavelength cloud radar

    Science.gov (United States)

    Wang, Zhe; Wang, Zhenhui; Cao, Xiaozhong; Tao, Fa

    2018-01-01

    Clouds are currently observed by both ground-based and satellite remote sensing techniques. Each technique has its own strengths and weaknesses depending on the observation method, instrument performance and the methods used for retrieval. It is important to study synergistic cloud measurements to improve the reliability of the observations and to verify the different techniques. The FY-2 geostationary orbiting meteorological satellites continuously observe the sky over China. Their cloud top temperature product can be processed to retrieve the cloud top height (CTH). The ground-based millimeter wavelength cloud radar can acquire information about the vertical structure of clouds-such as the cloud base height (CBH), CTH and the cloud thickness-and can continuously monitor changes in the vertical profiles of clouds. The CTHs were retrieved using both cloud top temperature data from the FY-2 satellites and the cloud radar reflectivity data for the same time period (June 2015 to May 2016) and the resulting datasets were compared in order to evaluate the accuracy of CTH retrievals using FY-2 satellites. The results show that the concordance rate of cloud detection between the two datasets was 78.1%. Higher consistencies were obtained for thicker clouds with larger echo intensity and for more continuous clouds. The average difference in the CTH between the two techniques was 1.46 km. The difference in CTH between low- and mid-level clouds was less than that for high-level clouds. An attenuation threshold of the cloud radar for rainfall was 0.2 mm/min; a rainfall intensity below this threshold had no effect on the CTH. The satellite CTH can be used to compensate for the attenuation error in the cloud radar data.

  5. An Elliptic Curve Based Schnorr Cloud Security Model in Distributed Environment

    Directory of Open Access Journals (Sweden)

    Vinothkumar Muthurajan

    2016-01-01

    Full Text Available Cloud computing requires the security upgrade in data transmission approaches. In general, key-based encryption/decryption (symmetric and asymmetric) mechanisms ensure the secure data transfer between the devices. The symmetric key mechanisms (pseudorandom function) provide minimum protection level compared to asymmetric key (RSA, AES, and ECC) schemes. The presence of expired content and the irrelevant resources cause unauthorized data access adversely. This paper investigates how the integrity and secure data transfer are improved based on the Elliptic Curve based Schnorr scheme. This paper proposes a virtual machine based cloud model with Hybrid Cloud Security Algorithm (HCSA) to remove the expired content. The HCSA-based auditing improves the malicious activity prediction during the data transfer. The duplication in the cloud server degrades the performance of EC-Schnorr based encryption schemes. This paper utilizes the blooming filter concept to avoid the cloud server duplication. The combination of EC-Schnorr and blooming filter efficiently improves the security performance. The comparative analysis between proposed HCSA and the existing Distributed Hash Table (DHT) regarding execution time, computational overhead, and auditing time with auditing requests and servers confirms the effectiveness of HCSA in the cloud security model creation.

  6. An Elliptic Curve Based Schnorr Cloud Security Model in Distributed Environment.

    Science.gov (United States)

    Muthurajan, Vinothkumar; Narayanasamy, Balaji

    2016-01-01

    Cloud computing requires the security upgrade in data transmission approaches. In general, key-based encryption/decryption (symmetric and asymmetric) mechanisms ensure the secure data transfer between the devices. The symmetric key mechanisms (pseudorandom function) provide minimum protection level compared to asymmetric key (RSA, AES, and ECC) schemes. The presence of expired content and the irrelevant resources cause unauthorized data access adversely. This paper investigates how the integrity and secure data transfer are improved based on the Elliptic Curve based Schnorr scheme. This paper proposes a virtual machine based cloud model with Hybrid Cloud Security Algorithm (HCSA) to remove the expired content. The HCSA-based auditing improves the malicious activity prediction during the data transfer. The duplication in the cloud server degrades the performance of EC-Schnorr based encryption schemes. This paper utilizes the blooming filter concept to avoid the cloud server duplication. The combination of EC-Schnorr and blooming filter efficiently improves the security performance. The comparative analysis between proposed HCSA and the existing Distributed Hash Table (DHT) regarding execution time, computational overhead, and auditing time with auditing requests and servers confirms the effectiveness of HCSA in the cloud security model creation.
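
    The Bloom ("blooming") filter used above for duplicate avoidance can be sketched as a compact bit array plus several hash functions that flag items which are probably already stored. The implementation below is only an illustration; the sizes, hash construction and class name are hypothetical and not taken from the paper.

        import hashlib

        class BloomFilter:
            def __init__(self, size_bits=8192, n_hashes=4):
                self.size = size_bits
                self.n_hashes = n_hashes
                self.bits = bytearray(size_bits // 8)

            def _positions(self, item: bytes):
                # Derive n_hashes bit positions from salted SHA-256 digests.
                for i in range(self.n_hashes):
                    digest = hashlib.sha256(i.to_bytes(2, "big") + item).digest()
                    yield int.from_bytes(digest[:8], "big") % self.size

            def add(self, item: bytes):
                for pos in self._positions(item):
                    self.bits[pos // 8] |= 1 << (pos % 8)

            def probably_contains(self, item: bytes) -> bool:
                return all(self.bits[pos // 8] & (1 << (pos % 8)) for pos in self._positions(item))

        store = BloomFilter()
        store.add(b"block-hash-001")
        print(store.probably_contains(b"block-hash-001"))   # True (rare false positives possible)
        print(store.probably_contains(b"block-hash-999"))   # almost certainly False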

  7. QUALITY ASSESSMENT AND COMPARISON OF SMARTPHONE AND LEICA C10 LASER SCANNER BASED POINT CLOUDS

    Directory of Open Access Journals (Sweden)

    B. Sirmacek

    2016-06-01

    Full Text Available 3D urban models are valuable for urban map generation, environment monitoring, safety planning and educational purposes. For 3D measurement of urban structures, generally airborne laser scanning sensors or multi-view satellite images are used as a data source. However, close-range sensors (such as terrestrial laser scanners) and low cost cameras (which can generate point clouds based on photogrammetry) can provide denser sampling of 3D surface geometry. Unfortunately, terrestrial laser scanning sensors are expensive and trained persons are needed to use them for point cloud acquisition. A potentially effective 3D model can be generated based on a low cost smartphone sensor. Herein, we show examples of using smartphone camera images to generate 3D models of urban structures. We compare a smartphone based 3D model of an example structure with a terrestrial laser scanning point cloud of the structure. This comparison gives us the opportunity to discuss the differences in terms of geometrical correctness, as well as the advantages, disadvantages and limitations in data acquisition and processing. We also discuss how smartphone based point clouds can help to solve further problems with 3D urban model generation in a practical way. We show that terrestrial laser scanning point clouds which do not have color information can be colored using smartphones. The experiments, discussions and scientific findings might be insightful for future studies in the fast, easy and low-cost 3D urban model generation field.

  8. Quality Assessment and Comparison of Smartphone and Leica C10 Laser Scanner Based Point Clouds

    Science.gov (United States)

    Sirmacek, Beril; Lindenbergh, Roderik; Wang, Jinhu

    2016-06-01

    3D urban models are valuable for urban map generation, environment monitoring, safety planning and educational purposes. For 3D measurement of urban structures, generally airborne laser scanning sensors or multi-view satellite images are used as a data source. However, close-range sensors (such as terrestrial laser scanners) and low cost cameras (which can generate point clouds based on photogrammetry) can provide denser sampling of 3D surface geometry. Unfortunately, terrestrial laser scanning sensors are expensive and trained persons are needed to use them for point cloud acquisition. A potentially effective 3D model can be generated based on a low cost smartphone sensor. Herein, we show examples of using smartphone camera images to generate 3D models of urban structures. We compare a smartphone based 3D model of an example structure with a terrestrial laser scanning point cloud of the structure. This comparison gives us the opportunity to discuss the differences in terms of geometrical correctness, as well as the advantages, disadvantages and limitations in data acquisition and processing. We also discuss how smartphone based point clouds can help to solve further problems with 3D urban model generation in a practical way. We show that terrestrial laser scanning point clouds which do not have color information can be colored using smartphones. The experiments, discussions and scientific findings might be insightful for future studies in the fast, easy and low-cost 3D urban model generation field.
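
    A common way to quantify the geometric agreement between a photogrammetric smartphone point cloud and a terrestrial laser scan is a cloud-to-cloud nearest-neighbour comparison. The sketch below assumes the two clouds are already co-registered and uses synthetic stand-in data; it is not the authors' processing chain.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
# Stand-ins for the two point clouds (n x 3 arrays of x, y, z in metres).
tls_points = rng.uniform(0, 10, size=(5000, 3))                            # laser scanner reference
phone_points = tls_points[:2000] + rng.normal(0, 0.05, size=(2000, 3))     # noisier, sparser model

# Cloud-to-cloud distances: for each smartphone point, distance to the nearest TLS point.
tree = cKDTree(tls_points)
dist, _ = tree.query(phone_points, k=1)

print(f"mean C2C distance: {dist.mean()*100:.1f} cm, "
      f"95th percentile: {np.percentile(dist, 95)*100:.1f} cm")
```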

  9. Development and calibration of a ground-based active collector for cloud- and fogwater

    Energy Technology Data Exchange (ETDEWEB)

    Kins, L.; Junkermann, W.; Meixner, F.X.; Muller, K.P.; Ehhalt, D.H.

    1986-04-01

    In spring 1985, field experiments were started to study the scavenging processes of atmospheric trace substances. Besides the chemical analysis of precipitation samples, these studies required simultaneous collection of cloud water for chemical analysis. In particular, a ground-based cloud water collector was needed, suitable for use on the top of a TV tower. Existing designs of ground-based cloud or fogwater samplers can be divided into two general classes: a) passive collectors, which utilize the ambient wind to impact the droplets on the collection surface; b) active collectors, which accelerate the droplets to a certain velocity as they approach the collection surface. In one existing design, Teflon strings are extended between two disks which are 1 m apart. The disadvantage of this collector, for these experiments, was that the collector strings are always exposed to the ambient air, so that contamination by aerosol impact during dry periods cannot be excluded. Furthermore, because of the length of the strings, impacted droplets need a certain time to drain off, during which they remain exposed to the ambient air stream and continue to scavenge trace gases.

  10. Trust-Enhanced Cloud Service Selection Model Based on QoS Analysis.

    Science.gov (United States)

    Pan, Yuchen; Ding, Shuai; Fan, Wenjuan; Li, Jing; Yang, Shanlin

    2015-01-01

    Cloud computing technology plays a very important role in many areas, such as in the construction and development of the smart city. Meanwhile, numerous cloud services appear on the cloud-based platform. Therefore, how to select trustworthy cloud services remains a significant problem in such platforms, and it has been extensively investigated owing to the ever-growing needs of users. However, trust relationships in social networks have not been taken into account in existing methods of cloud service selection and recommendation. In this paper, we propose a cloud service selection model based on the trust-enhanced similarity. Firstly, the direct, indirect, and hybrid trust degrees are measured based on the interaction frequencies among users. Secondly, we estimate the overall similarity by combining the experience usability measured based on Jaccard's Coefficient and the numerical distance computed by the Pearson Correlation Coefficient. Then, by using the trust degree to modify the basic similarity, we obtain a trust-enhanced similarity. Finally, we utilize the trust-enhanced similarity to find similar trusted neighbors and predict the missing QoS values as the basis of cloud service selection and recommendation. The experimental results show that our approach is able to obtain optimal results via adjusting parameters and exhibits high effectiveness. The cloud service rankings produced by our model also have better QoS properties than other methods in the comparison experiments.
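
    As a rough illustration of the similarity computation described above, the sketch below combines a Jaccard-based experience usability with a Pearson correlation and then blends in a trust degree derived from interaction frequencies. The alpha-weighted combination, the toy QoS values and the interaction counts are assumptions for illustration, not the paper's exact formulas.

```python
import numpy as np

def jaccard(u_services, v_services):
    """Experience usability: overlap of the cloud services two users have invoked."""
    u, v = set(u_services), set(v_services)
    return len(u & v) / len(u | v) if u | v else 0.0

def pearson(u_qos, v_qos):
    """Numerical closeness of QoS values on co-invoked services."""
    if len(u_qos) < 2:
        return 0.0
    return float(np.corrcoef(u_qos, v_qos)[0, 1])

# Hypothetical data: QoS values observed by two users on services s1..s4.
qos_u = {"s1": 0.8, "s2": 0.6, "s3": 0.9}
qos_v = {"s2": 0.5, "s3": 0.8, "s4": 0.7}
common = sorted(set(qos_u) & set(qos_v))

basic_sim = jaccard(qos_u, qos_v) * pearson([qos_u[s] for s in common],
                                            [qos_v[s] for s in common])

# Trust degree from interaction frequencies (direct interactions only, for brevity).
interactions_uv, interactions_u_total = 12, 40
trust_uv = interactions_uv / interactions_u_total

# Trust-enhanced similarity: basic similarity modified by the trust degree.
alpha = 0.5  # weighting between trust and similarity (assumed, not from the paper)
enhanced_sim = alpha * trust_uv + (1 - alpha) * basic_sim
print(round(enhanced_sim, 3))
```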

  11. Definition of "banner clouds" based on time lapse movies

    Directory of Open Access Journals (Sweden)

    J. H. Schween

    2007-01-01

    Full Text Available Banner clouds appear on the leeward side of a mountain and resemble a banner or a flag. This article provides a comprehensive definition of "banner clouds". It is based primarily on an extensive collection of time lapse movies, but previous attempts at an explanation of this phenomenon are also taken into account. The following ingredients are considered essential: the cloud must be attached to the mountain but not appear on the windward side; the cloud must originate from condensation of water vapour contained in the air (rather than consist of blowing snow); the cloud must be persistent; and the cloud must not be of convective nature. The definition is illustrated and discussed with the help of still images and time lapse movies taken at Mount Zugspitze in the Bavarian Alps.

  12. 16 year climatology of cirrus clouds over a tropical station in southern India using ground and space-based lidar observations

    Science.gov (United States)

    Pandit, A. K.; Gadhavi, H. S.; Venkat Ratnam, M.; Raghunath, K.; Rao, S. V. B.; Jayaraman, A.

    2015-06-01

    A 16-year (1998-2013) climatology of cirrus clouds and their macrophysical (base height, top height and geometrical thickness) and optical properties (cloud optical thickness) observed using a ground-based lidar over Gadanki (13.5° N, 79.2° E), India, is presented. The climatology obtained from the ground-based lidar is compared with the climatology obtained from seven and a half years (June 2006-December 2013) of Cloud-Aerosol LIdar with Orthogonal Polarization (CALIOP) observations. A very good agreement is found between the two climatologies in spite of their opposite viewing geometries and differences in sampling frequencies. Nearly 50-55% of cirrus clouds were found to possess a geometrical thickness of less than 2 km. The ground-based lidar is found to detect a larger number of sub-visible clouds than CALIOP, which has implications for global warming studies as sub-visible cirrus clouds have significant positive radiative forcing. Cirrus clouds with mid-cloud temperatures between -50 and -70 °C have a mean geometrical thickness greater than 2 km, in contrast to the earlier reported value of 1.7 km. Trend analyses reveal a statistically significant increase in the altitude of sub-visible cirrus clouds, which is consistent with recent climate model simulations. Also, the fraction of sub-visible cirrus clouds is found to be increasing during the last sixteen years (1998 to 2013), which has implications for the temperature and water vapour budget in the tropical tropopause layer.

  13. Generic-distributed framework for cloud services marketplace based on unified ontology

    Directory of Open Access Journals (Sweden)

    Samer Hasan

    2017-11-01

    Full Text Available Cloud computing is a pattern for delivering ubiquitous and on demand computing resources based on a pay-as-you-use financial model. Typically, cloud providers advertise cloud service descriptions in various formats on the Internet. On the other hand, cloud consumers use available search engines (Google and Yahoo) to explore cloud service descriptions and find the adequate service. Unfortunately, general purpose search engines are not designed to provide a small and complete set of results, which makes the process a big challenge. This paper presents a generic-distributed framework for a cloud services marketplace to automate the cloud services discovery and selection process and remove the barriers between service providers and consumers. Additionally, this work implements two instances of the generic framework by adopting two different matching algorithms, namely a dominant and recessive attributes algorithm borrowed from gene science and a semantic similarity algorithm based on a unified cloud service ontology. Finally, this paper presents a unified cloud services ontology and models real-life cloud services according to the proposed ontology. To the best of the authors' knowledge, this is the first attempt to build a cloud services marketplace where cloud providers and cloud consumers can trade cloud services as utilities. In comparison with existing work, the semantic approach reduced the execution time by 20% and maintained the same values for all other parameters. On the other hand, the dominant and recessive attributes approach reduced the execution time by 57% but showed a lower value for recall.

  14. Generic-distributed framework for cloud services marketplace based on unified ontology.

    Science.gov (United States)

    Hasan, Samer; Valli Kumari, V

    2017-11-01

    Cloud computing is a pattern for delivering ubiquitous and on demand computing resources based on a pay-as-you-use financial model. Typically, cloud providers advertise cloud service descriptions in various formats on the Internet. On the other hand, cloud consumers use available search engines (Google and Yahoo) to explore cloud service descriptions and find the adequate service. Unfortunately, general purpose search engines are not designed to provide a small and complete set of results, which makes the process a big challenge. This paper presents a generic-distributed framework for a cloud services marketplace to automate the cloud services discovery and selection process and remove the barriers between service providers and consumers. Additionally, this work implements two instances of the generic framework by adopting two different matching algorithms, namely a dominant and recessive attributes algorithm borrowed from gene science and a semantic similarity algorithm based on a unified cloud service ontology. Finally, this paper presents a unified cloud services ontology and models real-life cloud services according to the proposed ontology. To the best of the authors' knowledge, this is the first attempt to build a cloud services marketplace where cloud providers and cloud consumers can trade cloud services as utilities. In comparison with existing work, the semantic approach reduced the execution time by 20% and maintained the same values for all other parameters. On the other hand, the dominant and recessive attributes approach reduced the execution time by 57% but showed a lower value for recall.

  15. The Study of Pallet Pooling Information Platform Based on Cloud Computing

    Directory of Open Access Journals (Sweden)

    Jia-bin Li

    2018-01-01

    Full Text Available Effective implementation of a pallet pooling system needs a strong information platform to support it. Through an analysis of the existing pallet pooling information platform (PPIP), the paper pointed out that the existing studies of PPIP are mainly based on traditional IT infrastructures and technologies, which have software, hardware, resource utilization, and process restrictions. Because the advantages of cloud computing technology, such as strong computing power, high flexibility, and low cost, meet the requirements of the PPIP well, this paper gave a PPIP architecture of two parts based on cloud computing: the user client and the cloud services. The cloud services include three layers, which are IaaS, PaaS, and SaaS. Finally, a method for deploying the PPIP based on cloud computing is proposed.

  16. Integration of cloud-based storage in BES III computing environment

    International Nuclear Information System (INIS)

    Wang, L; Hernandez, F; Deng, Z

    2014-01-01

    We present on-going work that aims to evaluate the suitability of cloud-based storage as a supplement to the Lustre file system for storing experimental data for the BES III physics experiment and as a backend for storing files belonging to individual members of the collaboration. In particular, we discuss our findings regarding the support of cloud-based storage in the software stack of the experiment. We report on our development work that improves the support of CERN's ROOT data analysis framework and allows efficient remote access to data through several cloud storage protocols. We also present our efforts providing the experiment with efficient command line tools for navigating and interacting with cloud storage-based data repositories both from interactive sessions and grid jobs.

  17. CLUSTOM-CLOUD: In-Memory Data Grid-Based Software for Clustering 16S rRNA Sequence Data in the Cloud Environment.

    Directory of Open Access Journals (Sweden)

    Jeongsu Oh

    Full Text Available High-throughput sequencing can produce hundreds of thousands of 16S rRNA sequence reads corresponding to different organisms present in the environmental samples. Typically, analysis of microbial diversity in bioinformatics starts from pre-processing followed by clustering 16S rRNA reads into relatively fewer operational taxonomic units (OTUs). The OTUs are reliable indicators of microbial diversity and greatly accelerate the downstream analysis time. However, existing hierarchical clustering algorithms that are generally more accurate than greedy heuristic algorithms struggle with large sequence datasets. To keep pace with the rapid rise in sequencing data, we present CLUSTOM-CLOUD, which is the first distributed sequence clustering program based on In-Memory Data Grid (IMDG) technology-a distributed data structure to store all data in the main memory of multiple computing nodes. The IMDG technology helps CLUSTOM-CLOUD to enhance both its capability of handling larger datasets and its computational scalability better than its ancestor, CLUSTOM, while maintaining high accuracy. Clustering speed of CLUSTOM-CLOUD was evaluated on published 16S rRNA human microbiome sequence datasets using the small laboratory cluster (10 nodes) and under the Amazon EC2 cloud-computing environments. Under the laboratory environment, it required only ~3 hours to process dataset of size 200 K reads regardless of the complexity of the human microbiome data. In turn, one million reads were processed in approximately 20, 14, and 11 hours when utilizing 20, 30, and 40 nodes on the Amazon EC2 cloud-computing environment. The running time evaluation indicates that CLUSTOM-CLOUD can handle much larger sequence datasets than CLUSTOM and is also a scalable distributed processing system. The comparative accuracy test using 16S rRNA pyrosequences of a mock community shows that CLUSTOM-CLOUD achieves higher accuracy than DOTUR, mothur, ESPRIT-Tree, UCLUST and Swarm. CLUSTOM-CLOUD

  18. CLUSTOM-CLOUD: In-Memory Data Grid-Based Software for Clustering 16S rRNA Sequence Data in the Cloud Environment.

    Science.gov (United States)

    Oh, Jeongsu; Choi, Chi-Hwan; Park, Min-Kyu; Kim, Byung Kwon; Hwang, Kyuin; Lee, Sang-Heon; Hong, Soon Gyu; Nasir, Arshan; Cho, Wan-Sup; Kim, Kyung Mo

    2016-01-01

    High-throughput sequencing can produce hundreds of thousands of 16S rRNA sequence reads corresponding to different organisms present in the environmental samples. Typically, analysis of microbial diversity in bioinformatics starts from pre-processing followed by clustering 16S rRNA reads into relatively fewer operational taxonomic units (OTUs). The OTUs are reliable indicators of microbial diversity and greatly accelerate the downstream analysis time. However, existing hierarchical clustering algorithms that are generally more accurate than greedy heuristic algorithms struggle with large sequence datasets. To keep pace with the rapid rise in sequencing data, we present CLUSTOM-CLOUD, which is the first distributed sequence clustering program based on In-Memory Data Grid (IMDG) technology-a distributed data structure to store all data in the main memory of multiple computing nodes. The IMDG technology helps CLUSTOM-CLOUD to enhance both its capability of handling larger datasets and its computational scalability better than its ancestor, CLUSTOM, while maintaining high accuracy. Clustering speed of CLUSTOM-CLOUD was evaluated on published 16S rRNA human microbiome sequence datasets using the small laboratory cluster (10 nodes) and under the Amazon EC2 cloud-computing environments. Under the laboratory environment, it required only ~3 hours to process dataset of size 200 K reads regardless of the complexity of the human microbiome data. In turn, one million reads were processed in approximately 20, 14, and 11 hours when utilizing 20, 30, and 40 nodes on the Amazon EC2 cloud-computing environment. The running time evaluation indicates that CLUSTOM-CLOUD can handle much larger sequence datasets than CLUSTOM and is also a scalable distributed processing system. The comparative accuracy test using 16S rRNA pyrosequences of a mock community shows that CLUSTOM-CLOUD achieves higher accuracy than DOTUR, mothur, ESPRIT-Tree, UCLUST and Swarm. CLUSTOM-CLOUD is written in JAVA

  19. CLUSTOM-CLOUD: In-Memory Data Grid-Based Software for Clustering 16S rRNA Sequence Data in the Cloud Environment

    Science.gov (United States)

    Park, Min-Kyu; Kim, Byung Kwon; Hwang, Kyuin; Lee, Sang-Heon; Hong, Soon Gyu; Nasir, Arshan; Cho, Wan-Sup; Kim, Kyung Mo

    2016-01-01

    High-throughput sequencing can produce hundreds of thousands of 16S rRNA sequence reads corresponding to different organisms present in the environmental samples. Typically, analysis of microbial diversity in bioinformatics starts from pre-processing followed by clustering 16S rRNA reads into relatively fewer operational taxonomic units (OTUs). The OTUs are reliable indicators of microbial diversity and greatly accelerate the downstream analysis time. However, existing hierarchical clustering algorithms that are generally more accurate than greedy heuristic algorithms struggle with large sequence datasets. To keep pace with the rapid rise in sequencing data, we present CLUSTOM-CLOUD, which is the first distributed sequence clustering program based on In-Memory Data Grid (IMDG) technology–a distributed data structure to store all data in the main memory of multiple computing nodes. The IMDG technology helps CLUSTOM-CLOUD to enhance both its capability of handling larger datasets and its computational scalability better than its ancestor, CLUSTOM, while maintaining high accuracy. Clustering speed of CLUSTOM-CLOUD was evaluated on published 16S rRNA human microbiome sequence datasets using the small laboratory cluster (10 nodes) and under the Amazon EC2 cloud-computing environments. Under the laboratory environment, it required only ~3 hours to process dataset of size 200 K reads regardless of the complexity of the human microbiome data. In turn, one million reads were processed in approximately 20, 14, and 11 hours when utilizing 20, 30, and 40 nodes on the Amazon EC2 cloud-computing environment. The running time evaluation indicates that CLUSTOM-CLOUD can handle much larger sequence datasets than CLUSTOM and is also a scalable distributed processing system. The comparative accuracy test using 16S rRNA pyrosequences of a mock community shows that CLUSTOM-CLOUD achieves higher accuracy than DOTUR, mothur, ESPRIT-Tree, UCLUST and Swarm. CLUSTOM-CLOUD is written in

  20. Data Mining Based on Cloud-Computing Technology

    Directory of Open Access Journals (Sweden)

    Ren Ying

    2016-01-01

    Full Text Available There are performance bottlenecks and scalability problems when traditional data-mining systems are used in cloud computing. In this paper, we present a data-mining platform based on cloud computing. Compared with a traditional data mining system, this platform is highly scalable, has massive data processing capacities, is service-oriented, and has low hardware cost. This platform can support the design and applications of a wide range of distributed data-mining systems.

  1. Fluctuations in a quasi-stationary shallow cumulus cloud ensemble

    Directory of Open Access Journals (Sweden)

    M. Sakradzija

    2015-01-01

    Full Text Available We propose an approach to stochastic parameterisation of shallow cumulus clouds to represent the convective variability and its dependence on the model resolution. To collect information about the individual cloud lifecycles and the cloud ensemble as a whole, we employ a large eddy simulation (LES) model and a cloud tracking algorithm, followed by conditional sampling of clouds at the cloud-base level. In the case of a shallow cumulus ensemble, the cloud-base mass flux distribution is bimodal, due to the different shallow cloud subtypes, active and passive clouds. Each distribution mode can be approximated using a Weibull distribution, which is a generalisation of exponential distribution by accounting for the change in distribution shape due to the diversity of cloud lifecycles. The exponential distribution of cloud mass flux previously suggested for deep convection parameterisation is a special case of the Weibull distribution, which opens a way towards unification of the statistical convective ensemble formalism of shallow and deep cumulus clouds. Based on the empirical and theoretical findings, a stochastic model has been developed to simulate a shallow convective cloud ensemble. It is formulated as a compound random process, with the number of convective elements drawn from a Poisson distribution, and the cloud mass flux sampled from a mixed Weibull distribution. Convective memory is accounted for through the explicit cloud lifecycles, making the model formulation consistent with the choice of the Weibull cloud mass flux distribution function. The memory of individual shallow clouds is required to capture the correct convective variability. The resulting distribution of the subgrid convective states in the considered shallow cumulus case is scale-adaptive – the smaller the grid size, the broader the distribution.
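
    The compound random process described above can be illustrated in a few lines of code: the number of clouds is drawn from a Poisson distribution and each cloud-base mass flux from a two-mode (active/passive) Weibull mixture. All parameter values below are illustrative, not those diagnosed from the LES.

```python
import numpy as np

rng = np.random.default_rng(42)

def sample_cloud_ensemble(mean_clouds, p_active,
                          k_active, lam_active, k_passive, lam_passive):
    """Draw one realisation of the subgrid cloud-base mass flux (a compound random sum).

    Number of clouds ~ Poisson; each cloud's mass flux ~ mixture of two Weibull
    modes (active / passive shallow clouds). All parameters are illustrative.
    """
    n_clouds = rng.poisson(mean_clouds)
    active = rng.random(n_clouds) < p_active
    k = np.where(active, k_active, k_passive)        # Weibull shape per cloud
    lam = np.where(active, lam_active, lam_passive)  # Weibull scale per cloud
    mass_flux = lam * rng.weibull(k)                 # per-cloud mass flux (kg s-1)
    return mass_flux.sum()

# Distribution of the total subgrid mass flux; mean_clouds shrinks with grid size.
totals = np.array([sample_cloud_ensemble(mean_clouds=50, p_active=0.4,
                                         k_active=0.9, lam_active=5e4,
                                         k_passive=0.7, lam_passive=5e3)
                   for _ in range(10000)])
print(f"mean {totals.mean():.3g}, std {totals.std():.3g} "
      "(fewer clouds per grid box -> broader relative spread)")
```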

  2. The State of Cloud-Based Biospecimen and Biobank Data Management Tools.

    Science.gov (United States)

    Paul, Shonali; Gade, Aditi; Mallipeddi, Sumani

    2017-04-01

    Biobanks are critical for collecting and managing high-quality biospecimens from donors with appropriate clinical annotation. The high-quality human biospecimens and associated data are required to better understand disease processes. Therefore, biobanks have become an important and essential resource for healthcare research and drug discovery. However, collecting and managing huge volumes of data (biospecimens and associated clinical data) necessitate that biobanks use appropriate data management solutions that can keep pace with the ever-changing requirements of research. To automate biobank data management, biobanks have been investing in traditional Laboratory Information Management Systems (LIMS). However, there are a myriad of challenges faced by biobanks in acquiring traditional LIMS. Traditional LIMS are cost-intensive and often lack the flexibility to accommodate changes in data sources and workflows. Cloud technology is emerging as an alternative that provides the opportunity to small and medium-sized biobanks to automate their operations in a cost-effective manner, even without IT personnel. Cloud-based solutions offer the advantage of heightened security, rapid scalability, dynamic allocation of services, and can facilitate collaboration between different research groups by using a shared environment on a "pay-as-you-go" basis. The benefits offered by cloud technology have resulted in the development of cloud-based data management solutions as an alternative to traditional on-premise software. After evaluating the advantages offered by cloud technology, several biobanks have started adopting cloud-based tools. Cloud-based tools provide biobanks with easy access to biospecimen data for real-time sharing with clinicians. Another major benefit realized by biobanks by implementing cloud-based applications is unlimited data storage on the cloud and automatic backups for protecting any data loss in the face of natural calamities.

  3. Analyzing cloud base at local and regional scales to understand tropical montane cloud forest vulnerability to climate change

    Science.gov (United States)

    Ashley E. Van Beusekom; Grizelle Gonzalez; Martha A. Scholl

    2017-01-01

    The degree to which cloud immersion provides water in addition to rainfall, suppresses transpiration, and sustains tropical montane cloud forests (TMCFs) during rainless periods is not well understood. Climate and land use changes represent a threat to these forests if cloud base altitude rises as a result of regional warming or deforestation. To establish a baseline...

  4. A New Method of Cloud Detection Based on Cascaded AdaBoost

    International Nuclear Information System (INIS)

    Ma, C; Chen, F; Liu, J; Duan, J

    2014-01-01

    Cloud detection in remote sensing images is a critical step in their processing. How to quickly, accurately and effectively detect clouds in remote sensing images is still a challenging issue in this area. In order to avoid the disadvantages of current algorithms, the cascaded AdaBoost classifier algorithm is successfully applied to cloud detection. A new algorithm combining a cascaded AdaBoost classifier and multiple features is proposed in this paper. First, multiple features based on color, texture and spectral information are extracted from the remote sensing image. Second, the automatic cloud detection model is obtained based on the cascaded AdaBoost algorithm. The results show that the new algorithm can determine the cloud detection model and threshold values adaptively for remote sensing training data of different resolutions. The accuracy of cloud detection is improved, so this is an effective new algorithm for cloud detection in remote sensing images.
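
    A minimal sketch of a cascaded boosting classifier for per-pixel cloud detection is given below, using scikit-learn's AdaBoost as the stage learner and an adaptively chosen threshold per stage. The synthetic features and the specific cascade settings are assumptions for illustration, not the published configuration.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

rng = np.random.default_rng(0)

# Stand-in per-pixel features (colour, texture, spectral) and labels (1 = cloud).
X = rng.normal(size=(2000, 6))
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(0, 0.5, 2000) > 0).astype(int)

def train_cascade(X, y, n_stages=3, target_recall=0.99):
    """Train a cascade of AdaBoost stages; each stage's threshold is chosen so that
    almost all cloud samples pass, and samples rejected early never reach later stages."""
    stages, idx = [], np.arange(len(y))
    for _ in range(n_stages):
        if len(np.unique(y[idx])) < 2:       # nothing left to separate
            break
        clf = AdaBoostClassifier(n_estimators=50).fit(X[idx], y[idx])
        scores = clf.predict_proba(X[idx])[:, 1]
        thr = np.quantile(scores[y[idx] == 1], 1 - target_recall)  # adaptive threshold
        stages.append((clf, thr))
        idx = idx[scores >= thr]             # only surviving samples go to the next stage
    return stages

def cascade_predict(stages, X):
    keep = np.ones(len(X), dtype=bool)
    for clf, thr in stages:
        keep &= clf.predict_proba(X)[:, 1] >= thr
    return keep.astype(int)                  # 1 = cloud pixel

stages = train_cascade(X, y)
print("detected cloud fraction:", cascade_predict(stages, X).mean())
```

    The cascade structure pays off because most non-cloud pixels are rejected by the first cheap stage, so later stages only process the ambiguous remainder.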

  5. NAPR: a Cloud-Based Framework for Neuroanatomical Age Prediction.

    Science.gov (United States)

    Pardoe, Heath R; Kuzniecky, Ruben

    2018-01-01

    The availability of cloud computing services has enabled the widespread adoption of the "software as a service" (SaaS) approach for software distribution, which utilizes network-based access to applications running on centralized servers. In this paper we apply the SaaS approach to neuroimaging-based age prediction. Our system, named "NAPR" (Neuroanatomical Age Prediction using R), provides access to predictive modeling software running on a persistent cloud-based Amazon Web Services (AWS) compute instance. The NAPR framework allows external users to estimate the age of individual subjects using cortical thickness maps derived from their own locally processed T1-weighted whole brain MRI scans. As a demonstration of the NAPR approach, we have developed two age prediction models that were trained using healthy control data from the ABIDE, CoRR, DLBS and NKI Rockland neuroimaging datasets (total N = 2367, age range 6-89 years). The provided age prediction models were trained using (i) relevance vector machines and (ii) Gaussian processes machine learning methods applied to cortical thickness surfaces obtained using Freesurfer v5.3. We believe that this transparent approach to out-of-sample evaluation and comparison of neuroimaging age prediction models will facilitate the development of improved age prediction models and allow for robust evaluation of the clinical utility of these methods.
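
    As a rough sketch of the second modelling approach mentioned above (Gaussian processes regressing age on cortical thickness features), the example below fits scikit-learn's GaussianProcessRegressor to synthetic thickness data; the feature construction and kernel choice are assumptions, not the NAPR models themselves.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Stand-in training data: per-subject cortical thickness features (e.g. mean thickness
# per region) and chronological ages; the real models use Freesurfer-derived surfaces.
n_subjects, n_regions = 300, 20
age = rng.uniform(6, 89, n_subjects)
thickness = 3.0 - 0.01 * age[:, None] + rng.normal(0, 0.1, (n_subjects, n_regions))

gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0) + WhiteKernel(),
                              normalize_y=True)
gp.fit(thickness, age)

# Out-of-sample prediction for a new scan's feature vector.
new_scan = 3.0 - 0.01 * 42 + rng.normal(0, 0.1, (1, n_regions))
pred_age, pred_std = gp.predict(new_scan, return_std=True)
print(f"predicted age: {pred_age[0]:.1f} +/- {pred_std[0]:.1f} years")
```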

  6. Analysis of cloud-based solutions on EHRs systems in different scenarios.

    Science.gov (United States)

    Fernández-Cardeñosa, Gonzalo; de la Torre-Díez, Isabel; López-Coronado, Miguel; Rodrigues, Joel J P C

    2012-12-01

    Nowadays, with the growth of wireless connections, people can access the resources hosted in the Cloud almost everywhere. In this context, organizations can take advantage of this fact, in terms of e-Health, by deploying Cloud-based solutions for e-Health services. In this paper, two Cloud-based solutions for different scenarios of Electronic Health Records (EHRs) management systems are proposed. We have researched articles published in Medline between the years 2005 and 2011 about the implementation of e-Health services based on the Cloud. In order to analyze the best scenario for the deployment of Cloud Computing, two solutions, for a large hospital and for a network of primary care health centers, have been studied. An economic estimation of the implementation cost for both scenarios has been done via the Amazon calculator tool. As a result of this analysis, two solutions are suggested depending on the scenario: to deploy a Cloud solution for a large hospital, a typical Cloud solution in which only the needed services are hired has been assumed. On the other hand, to work with several primary care centers, the implementation of a network which interconnects these centers with just one Cloud environment is suggested. Finally, a hybrid solution is considered, in which EHRs with images are hosted in the hospital or primary care centers and the rest of them are migrated to the Cloud.

  7. Continuous Extraction of Subway Tunnel Cross Sections Based on Terrestrial Point Clouds

    Directory of Open Access Journals (Sweden)

    Zhizhong Kang

    2014-01-01

    Full Text Available An efficient method for the continuous extraction of subway tunnel cross sections using terrestrial point clouds is proposed. First, the continuous central axis of the tunnel is extracted using a 2D projection of the point cloud and curve fitting using the RANSAC (RANdom SAmple Consensus) algorithm, and the axis is optimized using a global extraction strategy based on segment-wise fitting. The cross-sectional planes, which are orthogonal to the central axis, are then determined for every interval. The cross-sectional points are extracted by intersecting straight lines that rotate orthogonally around the central axis within the cross-sectional plane with the tunnel point cloud. An interpolation algorithm based on quadric parametric surface fitting, using the BaySAC (Bayesian SAmpling Consensus) algorithm, is proposed to compute the cross-sectional point when it cannot be acquired directly from the tunnel points along the extraction direction of interest. Because the standard shape of the tunnel cross section is a circle, circle fitting is implemented using RANSAC to reduce the noise. The proposed approach is tested on terrestrial point clouds that cover a 150-m-long segment of a Shanghai subway tunnel, which were acquired using a LMS VZ-400 laser scanner. The results indicate that the proposed quadric parametric surface fitting using the optimized BaySAC achieves a higher overall fitting accuracy (0.9 mm) than the accuracy (1.6 mm) obtained by the plain RANSAC. The results also show that the proposed cross section extraction algorithm can achieve high accuracy (millimeter level), which was assessed by comparing the fitted radii with the designed radius of the cross section and comparing corresponding chord lengths in different cross sections, and high efficiency (less than 3 s/section on average).
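
    The circle-fitting step can be illustrated with a compact RANSAC loop: random three-point samples define candidate circles, the candidate with the most inliers wins, and the final circle is refit on all inliers. The synthetic cross-section, tolerance and iteration count below are illustrative; this is not the authors' BaySAC implementation.

```python
import numpy as np

def fit_circle(pts):
    """Least-squares circle through the points: solve x^2 + y^2 + D*x + E*y + F = 0."""
    x, y = pts[:, 0], pts[:, 1]
    A = np.column_stack([x, y, np.ones_like(x)])
    D, E, F = np.linalg.lstsq(A, -(x**2 + y**2), rcond=None)[0]
    center = np.array([-D / 2.0, -E / 2.0])
    radius = np.sqrt(max(center @ center - F, 0.0))
    return center, radius

def ransac_circle(pts, n_iter=200, tol=0.01, seed=1):
    """RANSAC: fit circles to random 3-point samples, keep the one with most inliers."""
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(pts), dtype=bool)
    for _ in range(n_iter):
        sample = pts[rng.choice(len(pts), 3, replace=False)]
        center, r = fit_circle(sample)
        if not np.isfinite(r) or r == 0.0:
            continue                                   # degenerate (near-collinear) sample
        residuals = np.abs(np.linalg.norm(pts - center, axis=1) - r)
        inliers = residuals < tol
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return fit_circle(pts[best_inliers])               # refit on all inliers

# Synthetic cross-section: noisy points on a 2.75 m circle plus outliers (cables, rails).
rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 500)
ring = np.c_[2.75 * np.cos(theta), 2.75 * np.sin(theta)] + rng.normal(0, 0.003, (500, 2))
outliers = rng.uniform(-3, 3, (50, 2))
center, radius = ransac_circle(np.vstack([ring, outliers]))
print(f"fitted radius: {radius:.3f} m")
```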

  8. CLAAS: the CM SAF cloud property data set using SEVIRI

    Science.gov (United States)

    Stengel, M. S.; Kniffka, A. K.; Meirink, J. F. M.; Lockhoff, M. L.; Tan, J. T.; Hollmann, R. H.

    2014-04-01

    An 8-year record of satellite-based cloud properties named CLAAS (CLoud property dAtAset using SEVIRI) is presented, which was derived within the EUMETSAT Satellite Application Facility on Climate Monitoring. The data set is based on SEVIRI measurements of the Meteosat Second Generation satellites, of which the visible and near-infrared channels were intercalibrated with MODIS. Applying two state-of-the-art retrieval schemes ensures high accuracy in cloud detection, cloud vertical placement and microphysical cloud properties. These properties were further processed to provide daily to monthly averaged quantities, mean diurnal cycles and monthly histograms. In particular, the per-month histogram information enhances the insight in spatio-temporal variability of clouds and their properties. Due to the underlying intercalibrated measurement record, the stability of the derived cloud properties is ensured, which is exemplarily demonstrated for three selected cloud variables for the entire SEVIRI disc and a European subregion. All data products and processing levels are introduced and validation results indicated. The sampling uncertainty of the averaged products in CLAAS is minimized due to the high temporal resolution of SEVIRI. This is emphasized by studying the impact of reduced temporal sampling rates taken at typical overpass times of polar-orbiting instruments. In particular, cloud optical thickness and cloud water path are very sensitive to the sampling rate, which in our study amounted to systematic deviations of over 10% if only sampled once a day. The CLAAS data set facilitates many cloud related applications at small spatial scales of a few kilometres and short temporal scales of a few hours. Beyond this, the spatiotemporal characteristics of clouds on diurnal to seasonal, but also on multi-annual scales, can be studied.

  9. A new dispersive liquid-liquid microextraction using ionic liquid based microemulsion coupled with cloud point extraction for determination of copper in serum and water samples.

    Science.gov (United States)

    Arain, Salma Aslam; Kazi, Tasneem Gul; Afridi, Hassan Imran; Arain, Mariam Shahzadi; Panhwar, Abdul Haleem; Khan, Naeemullah; Baig, Jameel Ahmed; Shah, Faheem

    2016-04-01

    A simple and rapid dispersive liquid-liquid microextraction procedure based on ionic liquid assisted microemulsion (IL-µE-DLLME) combined with cloud point extraction has been developed for the preconcentration of copper (Cu(2+)) in drinking water and serum samples of adolescent female hepatitis C (HCV) patients. In this method a ternary system was developed to form a microemulsion (µE) by the phase inversion method (PIM), using the ionic liquid 1-butyl-3-methylimidazolium hexafluorophosphate ([C4mim][PF6]) and the nonionic surfactant TX-100 (as a stabilizer in aqueous media). The ionic liquid microemulsion (IL-µE) was evaluated through visual assessment, optical light microscopy and spectrophotometry. The Cu(2+) in real water and aqueous acid-digested serum samples was complexed with 8-hydroxyquinoline (oxine) and extracted into the IL-µE medium. The phase separation of the stable IL-µE was carried out by the micellar cloud point extraction approach. The influence of different parameters such as pH, oxine concentration, and centrifugation time and rate was investigated. At optimized experimental conditions, the limit of detection and enhancement factor were found to be 0.132 µg/L and 70, respectively, with a relative standard deviation <5%. In order to validate the developed method, certified reference materials (SLRS-4 Riverine water) and human serum (Sero-M10181) were analyzed. The resulting data indicated a non-significant difference between the obtained and certified values of Cu(2+). The developed procedure was successfully applied for the preconcentration and determination of trace levels of Cu(2+) in environmental and biological samples.

  10. A novel cost based model for energy consumption in cloud computing.

    Science.gov (United States)

    Horri, A; Dastghaibyfard, Gh

    2015-01-01

    Cloud data centers consume enormous amounts of electrical energy. To support green cloud computing, providers also need to minimize cloud infrastructure energy consumption while maintaining the QoS. In this study, an energy consumption model for the time-shared policy in the virtualization layer of cloud environments is proposed. The cost and energy usage of the time-shared policy were modeled in the CloudSim simulator based upon the results obtained from a real system, and the proposed model was then evaluated with different scenarios. In the proposed model, the cache interference costs were considered; these costs were based upon the size of the data. The proposed model was implemented in the CloudSim simulator, and the related simulation results indicate that the energy consumption may be considerable and that it can vary with different parameters such as the quantum parameter, data size, and the number of VMs on a host. Measured results validate the model and demonstrate that there is a tradeoff between energy consumption and QoS in the cloud environment.

  11. Research on cloud background infrared radiation simulation based on fractal and statistical data

    Science.gov (United States)

    Liu, Xingrun; Xu, Qingshan; Li, Xia; Wu, Kaifeng; Dong, Yanbing

    2018-02-01

    Clouds are an important natural phenomenon, and their radiation causes serious interference to infrared detectors. Based on fractal methods and statistical data, a method is proposed to simulate cloud backgrounds, and the cloud infrared radiation data field is assigned using satellite radiation data of clouds. A cloud infrared radiation simulation model is established using MATLAB, and it can generate cloud background infrared images for different cloud types (low cloud, middle cloud, and high cloud) in different months, bands and sensor zenith angles.

  12. Cloud-based adaptive exon prediction for DNA analysis.

    Science.gov (United States)

    Putluri, Srinivasareddy; Zia Ur Rahman, Md; Fathima, Shaik Yasmeen

    2018-02-01

    Cloud computing offers significant research and economic benefits to healthcare organisations. Cloud services provide a safe place for storing and managing large amounts of such sensitive data. Under the conventional flow of gene information, gene sequence laboratories send out raw and inferred information via the Internet to several sequence libraries. DNA sequencing storage costs will be minimised by the use of a cloud service. In this study, the authors put forward a novel genomic informatics system using Amazon Cloud Services, where genomic sequence information is stored and accessed for processing. True identification of exon regions in a DNA sequence is a key task in bioinformatics, which helps in disease identification and drug design. The three-base periodicity property of exons forms the basis of all exon identification techniques. Adaptive signal processing techniques are found to be promising in comparison with several other methods. Several adaptive exon predictors (AEPs) are developed using the variable normalised least mean square algorithm and its maximum normalised variants to reduce computational complexity. Finally, performance evaluation of the various AEPs is done based on measures such as sensitivity, specificity and precision using various standard genomic datasets taken from the National Center for Biotechnology Information genomic sequence database.
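
    A highly simplified adaptive exon predictor in the spirit described above uses a normalised LMS filter to predict each sample of a base-indicator sequence from samples three, six, ... positions earlier; regions with strong three-base periodicity yield a small prediction error. The filter length, step size and synthetic sequence below are assumptions for illustration only.

```python
import numpy as np

def nlms_period3(x, taps=4, mu=0.5, eps=1e-6):
    """Normalised LMS predictor of x[n] from x[n-3], x[n-6], ... Regions with strong
    three-base periodicity (typical of exons) give a small prediction error."""
    w = np.zeros(taps)
    err = np.ones(len(x))                    # warm-up positions keep worst-case error
    lags = 3 * np.arange(1, taps + 1)
    for n in range(lags[-1], len(x)):
        u = x[n - lags]                      # past samples spaced 3 bases apart
        e = x[n] - w @ u
        w += mu * e * u / (eps + u @ u)      # NLMS weight update
        err[n] = e
    return err

# Binary indicator sequence for base 'G' of a random DNA string with a period-3 block.
rng = np.random.default_rng(0)
dna = list(rng.choice(list("ACGT"), 3000))
dna[1000:2000] = list("ATG" * 334)[:1000]    # crude stand-in for an exon-like region
x = np.array([1.0 if b == "G" else 0.0 for b in dna])

err = nlms_period3(x)
smoothed = np.convolve(err**2, np.ones(120) / 120, mode="same")
print("most exon-like position:", int(np.argmin(smoothed)))
```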

  13. Challenges of Future VANET and Cloud-Based Approaches

    Directory of Open Access Journals (Sweden)

    Rakesh Shrestha

    2018-01-01

    Full Text Available Vehicular ad hoc networks (VANETs) have been studied intensively due to their wide variety of applications and services, such as passenger safety, enhanced traffic efficiency, and infotainment. With the evolution of technology and sudden growth in the number of smart vehicles, traditional VANETs face several technical challenges in deployment and management due to less flexibility, scalability, poor connectivity, and inadequate intelligence. Cloud computing is considered a way to satisfy these requirements in VANETs. However, next-generation VANETs will have special requirements of autonomous vehicles with high mobility, low latency, real-time applications, and connectivity, which may not be resolved by conventional cloud computing. Hence, merging of fog computing with the conventional cloud for VANETs is discussed as a potential solution for several issues in current and future VANETs. In addition, fog computing can be enhanced by integrating Software-Defined Network (SDN), which provides flexibility, programmability, and global knowledge of the network. We present two example scenarios for timely dissemination of safety messages in future VANETs based on fog and a combination of fog and SDN. We also explain the issues that need to be resolved for the deployment of three different cloud-based approaches.

  14. Providing a New Model for Discovering Cloud Services Based on Ontology

    Directory of Open Access Journals (Sweden)

    B. Heydari

    2017-12-01

    Full Text Available Due to its efficient, flexible, and dynamic substructure in information technology and its estimation of service quality parameters, cloud computing has become one of the most important issues in the computing world. Discovering cloud services has been posed as a fundamental issue in reaching high efficiency. In order to carry out their operations in the cloud, users need to request several different services, either simultaneously or according to a working routine. These services can be presented by different cloud producers or under different decision-making policies. Therefore, service management is one of the important and challenging issues in cloud computing. With the advent of the semantic web and, accordingly, of practical services in the cloud computing space, access to different kinds of applications has become possible. Ontology is the core of the semantic web and can be used to ease the process of discovering services. A new model based on ontology has been proposed in this paper. The results indicate that the proposed model explores cloud services based on user search results in less time than other models.

  15. Comparing parameterized versus measured microphysical properties of tropical convective cloud bases during the ACRIDICON–CHUVA campaign

    Directory of Open Access Journals (Sweden)

    R. C. Braga

    2017-06-01

    Full Text Available The objective of this study is to validate parameterizations that were recently developed for satellite retrievals of cloud condensation nuclei supersaturation spectra, NCCN(S), at cloud base alongside more traditional parameterizations connecting NCCN(S) with cloud base updrafts and drop concentrations. This was based on the HALO aircraft measurements during the ACRIDICON–CHUVA campaign over the Amazon region, which took place in September 2014. The properties of convective clouds were measured with a cloud combination probe (CCP), a cloud and aerosol spectrometer (CAS-DPOL), and a CCN counter onboard the HALO aircraft. An intercomparison of the cloud drop size distributions (DSDs) and the cloud water content (CWC) derived from the different instruments generally shows good agreement within the instrumental uncertainties. To this end, the directly measured cloud drop concentrations (Nd) near cloud base were compared with inferred values based on the measured cloud base updraft velocity (Wb) and NCCN(S) spectra. The measurements of Nd at cloud base were also compared with drop concentrations (Na) derived on the basis of an adiabatic assumption and obtained from the vertical evolution of cloud drop effective radius (re) above cloud base. The measurements of NCCN(S) and Wb reproduced the observed Nd within the measurement uncertainties when the old (1959) Twomey parameterization was used. The agreement between the measured and calculated Nd was only within a factor of 2 with attempts to use cloud base S, as obtained from the measured Wb, Nd, and NCCN(S). This underscores the yet unresolved challenge of aircraft measurements of S in clouds. Importantly, the vertical evolution of re with height reproduced the observation-based nearly adiabatic cloud base drop concentrations, Na. The combination of these results provides aircraft observational support for the various components of the satellite-retrieved methodology that was recently developed to
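
    The traditional chain from CCN spectrum to cloud-base drop concentration can be sketched as a simple lookup: with a power-law fit NCCN(S) = C S^k and a diagnosed cloud-base supersaturation, the predicted Nd is the CCN concentration at that supersaturation. All numbers below are illustrative placeholders, not campaign values.

```python
# Hypothetical power-law fit to the CCN-counter data: N_CCN(S) = C * S**k,
# with S in % supersaturation and N in cm^-3 (C and k are illustrative only).
C, k = 1200.0, 0.4

def n_ccn(s_percent):
    return C * s_percent**k

# Diagnosed cloud-base supersaturation (e.g. inferred from the measured updraft Wb);
# the activated drop concentration is the CCN concentration at that supersaturation.
s_base = 0.25                       # % supersaturation (assumed)
n_d_predicted = n_ccn(s_base)

# Compare with a directly measured cloud-base drop concentration from the cloud probes.
n_d_measured = 850.0                # cm^-3 (illustrative)
print(f"predicted Nd = {n_d_predicted:.0f} cm^-3, "
      f"predicted/measured = {n_d_predicted / n_d_measured:.2f}")
```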

  16. Ground-based SMART-COMMIT Measurements for Studying Aerosol and Cloud Properties

    Science.gov (United States)

    Tsay, Si-Chee

    2008-01-01

    From radiometric principles, it is expected that the retrieved properties of extensive aerosols and clouds from reflected/emitted measurements by satellite (and/or aircraft) should be consistent with those retrieved from transmitted/emitted radiance observed at the surface. Although space-borne remote sensing observations cover a large spatial domain, they are often plagued by contamination of surface signatures. Thus, ground-based in-situ and remote-sensing measurements, where signals come directly from atmospheric constituents, the sun, and/or the Earth-atmosphere interactions, provide additional information content for comparisons that confirm quantitatively the usefulness of the integrated surface, aircraft, and satellite data sets. The development and deployment of SMART-COMMIT (Surface-sensing Measurements for Atmospheric Radiative Transfer - Chemical, Optical & Microphysical Measurements of In-situ Troposphere) mobile facilities are aimed at the optimal utilization of collocated ground-based observations as constraints to yield higher fidelity satellite retrievals and to determine any sampling bias due to target conditions. To quantify the energetics of the surface-atmosphere system and the atmospheric processes, SMART-COMMIT instruments fall into three categories: flux radiometer, radiance sensor and in-situ probe. In this paper, we will demonstrate the capability of SMART-COMMIT in recent field campaigns (e.g., CRYSTAL-FACE, UAE 2, BASEASIA, NAMMA) that were designed and executed to study the compelling variability in temporal scale of both anthropogenic and natural aerosols (e.g., biomass-burning smoke, airborne dust) and cirrus clouds. We envision robust approaches in which well-collocated ground-based measurements and space-borne observations will greatly advance our knowledge of extensive aerosols and clouds.

  17. Testing a polarimetric cloud imager aboard research vessel Polarstern: comparison of color-based and polarimetric cloud detection algorithms.

    Science.gov (United States)

    Barta, András; Horváth, Gábor; Horváth, Ákos; Egri, Ádám; Blahó, Miklós; Barta, Pál; Bumke, Karl; Macke, Andreas

    2015-02-10

    Cloud cover estimation is an important part of routine meteorological observations. Cloudiness measurements are used in climate model evaluation, nowcasting solar radiation, parameterizing the fluctuations of sea surface insolation, and building energy transfer models of the atmosphere. Currently, the most widespread ground-based method to measure cloudiness is based on analyzing the unpolarized intensity and color distribution of the sky obtained by digital cameras. As a new approach, we propose that cloud detection can be aided by the additional use of skylight polarization measured by 180° field-of-view imaging polarimetry. In the fall of 2010, we tested such a novel polarimetric cloud detector aboard the research vessel Polarstern during expedition ANT-XXVII/1. One of our goals was to test the durability of the measurement hardware under the extreme conditions of a trans-Atlantic cruise. Here, we describe the instrument and compare the results of several different cloud detection algorithms, some conventional and some newly developed. We also discuss the weaknesses of our design and its possible improvements. The comparison with cloud detection algorithms developed for traditional nonpolarimetric full-sky imagers allowed us to evaluate the added value of polarimetric quantities. We found that (1) neural-network-based algorithms perform the best among the investigated schemes and (2) global information (the mean and variance of intensity), nonoptical information (e.g., sun-view geometry), and polarimetric information (e.g., the degree of polarization) improve the accuracy of cloud detection, albeit slightly.

  18. Remote sensing image segmentation based on Hadoop cloud platform

    Science.gov (United States)

    Li, Jie; Zhu, Lingling; Cao, Fubin

    2018-01-01

    To address the slow speed and poor real-time performance of remote sensing image segmentation, this paper studies a method of remote sensing image segmentation based on the Hadoop platform. On the basis of analyzing the structural characteristics of the Hadoop cloud platform and its MapReduce programming component, this paper proposes an image segmentation method based on the combination of OpenCV and the Hadoop cloud platform. Firstly, the MapReduce image processing model of the Hadoop cloud platform is designed, the input and output of images are customized, and the segmentation method for the data file is rewritten. Then the Mean Shift image segmentation algorithm is implemented. Finally, this paper carries out a segmentation experiment on a remote sensing image and uses MATLAB to implement the Mean Shift image segmentation algorithm for comparison on the same image. The experimental results show that, under the premise of ensuring a good result, the segmentation rate of remote sensing image segmentation based on the Hadoop cloud platform is greatly improved compared with the single-machine MATLAB image segmentation, and there is a great improvement in the effectiveness of image segmentation.
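
    The per-tile Mean Shift step (the part that would run inside each map task) can be sketched with OpenCV as below; the file name, window radii and the Otsu-based labelling post-step are assumptions for illustration, not the paper's exact pipeline.

```python
import cv2
import numpy as np

# Load one tile of the remote sensing image (the path is a placeholder).
img = cv2.imread("tile_0001.png")              # BGR uint8 image
if img is None:                                # fall back to synthetic data so the sketch runs
    img = np.random.randint(0, 255, (256, 256, 3), dtype=np.uint8)

# Mean Shift filtering: sp = spatial window radius, sr = colour window radius.
shifted = cv2.pyrMeanShiftFiltering(img, sp=21, sr=30)

# Simple post-step: threshold the smoothed image and label connected regions.
gray = cv2.cvtColor(shifted, cv2.COLOR_BGR2GRAY)
_, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
n_labels, labels = cv2.connectedComponents(binary)
print(f"{n_labels - 1} segments found in this tile")
```

    In a MapReduce deployment, each map task would apply this function to one image tile and the reduce step would merge the per-tile segment labels.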

  19. Teaching Thousands with Cloud-based GIS

    Science.gov (United States)

    Gould, Michael; DiBiase, David; Beale, Linda

    2016-04-01

    Educators often draw a distinction between "teaching about GIS" and "teaching with GIS." Teaching about GIS involves helping students learn what GIS is, what it does, and how it works. On the other hand, teaching with GIS involves using the technology as a means to achieve education objectives in the sciences, social sciences, professional disciplines like engineering and planning, and even the humanities. The same distinction applies to CyberGIS. Understandably, early efforts to develop CyberGIS curricula and educational resources tend to be concerned primarily with CyberGIS itself. However, if CyberGIS becomes as functional, usable and scalable as it aspires to be, teaching with CyberGIS has the potential to enable large and diverse global audiences to perform spatial analysis using hosted data, mapping and analysis services all running in the cloud. Early examples of teaching tens of thousands of students across the globe with cloud-based GIS include the massive open online courses (MOOCs) offered by Penn State University and others, as well as the series of MOOCs more recently developed and offered by Esri. In each case, ArcGIS Online was used to help students achieve educational objectives in subjects like business, geodesign, geospatial intelligence, and spatial analysis, as well as mapping. Feedback from the more than 100,000 total student participants to date, as well as from the educators and staff who supported these offerings, suggest that online education with cloud-based GIS is scalable to very large audiences. Lessons learned from the course design, development, and delivery of these early examples may be useful in informing the continuing development of CyberGIS education. While MOOCs may have passed the peak of their "hype cycle" in higher education, the phenomenon they revealed persists: namely, a global mass market of educated young adults who turn to free online education to expand their horizons. The

  20. Strengthen Cloud Computing Security with Federal Identity Management Using Hierarchical Identity-Based Cryptography

    Science.gov (United States)

    Yan, Liang; Rong, Chunming; Zhao, Gansen

    More and more companies have begun to provide different kinds of cloud computing services for Internet users; at the same time, these services also bring some security problems. Currently, the majority of cloud computing systems provide a digital identity for users to access their services, which brings some inconvenience for a hybrid cloud that includes multiple private clouds and/or public clouds. Today most cloud computing systems use asymmetric and traditional public key cryptography to provide data security and mutual authentication. Identity-based cryptography has some attractive characteristics that seem to fit well the requirements of cloud computing. In this paper, by adopting federated identity management together with hierarchical identity-based cryptography (HIBC), not only the key distribution but also the mutual authentication can be simplified in the cloud.

  1. Spatiotemporal High-Resolution Cloud Mapping with a Ground-Based IR Scanner

    Directory of Open Access Journals (Sweden)

    Benjamin Brede

    2017-01-01

    Full Text Available The high spatiotemporal variability of clouds requires automated monitoring systems. This study presents a retrieval algorithm that evaluates observations of a hemispherically scanning thermal infrared radiometer, the NubiScope, to produce georeferenced, spatially explicit cloud maps. The algorithm uses atmospheric temperature and moisture profiles and an atmospheric radiative transfer code to differentiate between cloudy and cloudless measurements. In the case of a cloud, it estimates its position by using the temperature profile and viewing geometry. The proposed algorithm was tested with 25 cloud maps generated by the Fmask algorithm from Landsat 7 images. The overall cloud detection rate ranged from 0.607 for zenith angles of 0 to 10° to 0.298 for 50–60° on a pixel basis. The overall detection of cloudless pixels was 0.987 for zenith angles of 30–40° and much more stable over the whole range of zenith angles compared to cloud detection. This demonstrates the algorithm's capability in detecting clouds, and even more so cloudless areas. Cloud-base height was best estimated up to a height of 4000 m compared to ceilometer base heights but showed large deviations above that level. This study shows the potential of the NubiScope system to produce cloud maps of high spatial and temporal resolution. Future development is needed for a more accurate determination of cloud height with thermal infrared measurements.
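
    The cloud placement step (matching a measured brightness temperature against an atmospheric temperature profile) can be sketched as a simple interpolation; the linear lapse-rate profile and the temperature value below are illustrative stand-ins for radiosonde or model data.

```python
import numpy as np

def cloud_base_height(t_cloud, heights_m, temps_k):
    """Estimate cloud-base height by finding where the atmospheric temperature
    profile matches the measured cloud brightness temperature."""
    temps_k = np.asarray(temps_k)
    heights_m = np.asarray(heights_m)
    # Interpolate height as a function of temperature (sorted so temperature increases).
    idx = np.argsort(temps_k)
    return float(np.interp(t_cloud, temps_k[idx], heights_m[idx]))

# Illustrative radiosonde-style profile: surface at 288 K, lapse rate ~6.5 K/km.
heights = np.arange(0, 10001, 250)             # m
temps = 288.0 - 0.0065 * heights               # K

t_measured = 275.0                             # K, brightness temperature of a cloudy pixel
print(f"estimated cloud-base height: {cloud_base_height(t_measured, heights, temps):.0f} m")
```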

  2. Searchable attribute-based encryption scheme with attribute revocation in cloud storage.

    Science.gov (United States)

    Wang, Shangping; Zhao, Duqiao; Zhang, Yaling

    2017-01-01

    Attribute-based encryption (ABE) is a good way to achieve flexible and secure access control to data; attribute revocation is an extension of attribute-based encryption, and keyword search is an indispensable part of cloud storage. The combination of both has an important application in cloud storage. In this paper, we construct a searchable attribute-based encryption scheme with attribute revocation in cloud storage. The keyword search in our scheme is attribute based with access control: when the search succeeds, the cloud server returns the corresponding ciphertext to the user, and the user can then decrypt the ciphertext. Besides, our scheme supports multiple-keyword search, which makes the scheme more practical. Under the assumptions of decisional bilinear Diffie-Hellman exponent (q-BDHE) and decisional Diffie-Hellman (DDH) in the selective security model, we prove that our scheme is secure.

  3. Cloud point extraction, preconcentration and spectrophotometric determination of nickel in water samples using dimethylglyoxime

    Directory of Open Access Journals (Sweden)

    Morteza Bahram

    2013-01-01

    Full Text Available A new and simple method for the preconcentration and spectrophotometric determination of trace amounts of nickel was developed by cloud point extraction (CPE). In the proposed work, dimethylglyoxime (DMG) was used as the chelating agent and Triton X-114 was selected as a non-ionic surfactant for CPE. The parameters affecting the cloud point extraction, including the pH of the sample solution, the concentrations of the chelating agent and surfactant, and the equilibration temperature and time, were optimized. Under the optimum conditions, the calibration graph was linear in the range of 10-150 ng mL-1 with a detection limit of 4 ng mL-1. The relative standard deviation for 9 replicates of 100 ng mL-1 Ni(II) was 1.04%. The interference effect of some anions and cations was studied. The method was applied to the determination of Ni(II) in water samples with satisfactory results.
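
    A minimal numeric sketch of the quantification step described above, using invented absorbance readings: a linear calibration is fitted over the stated 10-150 ng mL-1 range, an unknown sample is quantified, and the relative standard deviation of replicate results is computed.

    import numpy as np

    # Hypothetical calibration standards (ng/mL) and measured absorbances after CPE.
    conc = np.array([10, 25, 50, 75, 100, 150], dtype=float)
    absorbance = np.array([0.021, 0.052, 0.101, 0.153, 0.205, 0.304])

    # Least-squares fit of the calibration graph A = m*C + b.
    m, b = np.polyfit(conc, absorbance, 1)

    # Quantify an unknown sample from its absorbance.
    A_sample = 0.180
    C_sample = (A_sample - b) / m
    print(f"slope = {m:.5f}, intercept = {b:.5f}, unknown ~ {C_sample:.1f} ng/mL")

    # Relative standard deviation of replicates (the abstract reports 1.04 % for
    # 9 replicates of 100 ng/mL Ni(II)); these replicate values are invented.
    replicates = np.array([99.1, 100.4, 101.0, 98.8, 100.2, 99.7, 100.9, 99.5, 100.6])
    rsd = 100 * replicates.std(ddof=1) / replicates.mean()
    print(f"RSD = {rsd:.2f} %")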

  4. CloudSat Preps for Launch at Vandenberg Air Force Base, CA

    Science.gov (United States)

    2005-01-01

    The CloudSat spacecraft sits encapsulated within its Boeing Delta launch vehicle dual payload attach fitting at Vandenberg Air Force Base, Calif. CloudSat will share its ride to orbit late next month with NASA's CALIPSO spacecraft. The two spacecraft are designed to reveal the secrets of clouds and aerosols.

  5. The MSG-SEVIRI-based cloud property data record CLAAS-2

    Directory of Open Access Journals (Sweden)

    N. Benas

    2017-07-01

    Full Text Available Clouds play a central role in the Earth's atmosphere, and satellite observations are crucial for monitoring clouds and understanding their impact on the energy budget and water cycle. Within the European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT) Satellite Application Facility on Climate Monitoring (CM SAF), a new cloud property data record was derived from geostationary Meteosat Spinning Enhanced Visible and Infrared Imager (SEVIRI) measurements for the time frame 2004–2015. The resulting CLAAS-2 (CLoud property dAtAset using SEVIRI, Edition 2) data record is publicly available via the CM SAF website (https://doi.org/10.5676/EUM_SAF_CM/CLAAS/V002). In this paper we present an extensive evaluation of the CLAAS-2 cloud products, which include cloud fractional coverage, thermodynamic phase, cloud top properties, liquid/ice cloud water path and corresponding optical thickness and particle effective radius. Data validation and comparisons were performed on both level 2 (native SEVIRI grid and repeat cycle) and level 3 (daily and monthly averages and histograms) with reference datasets derived from lidar, microwave and passive imager measurements. The evaluation results show very good overall agreement, with matching spatial distributions and temporal variability and small biases attributed mainly to differences in sensor characteristics, retrieval approaches, spatial and temporal sampling and viewing geometries. No major discrepancies were found. Underpinned by the good evaluation results, CLAAS-2 demonstrates that it is fit for the envisaged applications, such as process studies of the diurnal cycle of clouds and the evaluation of regional climate models. The data record is planned to be extended and updated in the future.

  6. Evaluation of Satellite-Based Upper Troposphere Cloud Top Height Retrievals in Multilayer Cloud Conditions During TC4

    Science.gov (United States)

    Chang, Fu-Lung; Minnis, Patrick; Ayers, J. Kirk; McGill, Matthew J.; Palikonda, Rabindra; Spangenberg, Douglas A.; Smith, William L., Jr.; Yost, Christopher R.

    2010-01-01

    Upper troposphere cloud top heights (CTHs), restricted to cloud top pressures (CTPs) less than 500 hPa, inferred using four satellite retrieval methods applied to Twelfth Geostationary Operational Environmental Satellite (GOES-12) data are evaluated using measurements during the July August 2007 Tropical Composition, Cloud and Climate Coupling Experiment (TC4). The four methods are the single-layer CO2-absorption technique (SCO2AT), a modified CO2-absorption technique (MCO2AT) developed for improving both single-layered and multilayered cloud retrievals, a standard version of the Visible Infrared Solar-infrared Split-window Technique (old VISST), and a new version of VISST (new VISST) recently developed to improve cloud property retrievals. They are evaluated by comparing with ER-2 aircraft-based Cloud Physics Lidar (CPL) data taken during 9 days having extensive upper troposphere cirrus, anvil, and convective clouds. Compared to the 89% coverage by upper tropospheric clouds detected by the CPL, the SCO2AT, MCO2AT, old VISST, and new VISST retrieved CTPs less than 500 hPa in 76, 76, 69, and 74% of the matched pixels, respectively. Most of the differences are due to subvisible and optically thin cirrus clouds occurring near the tropopause that were detected only by the CPL. The mean upper tropospheric CTHs for the 9 days are 14.2 (+/- 2.1) km from the CPL and 10.7 (+/- 2.1), 12.1 (+/- 1.6), 9.7 (+/- 2.9), and 11.4 (+/- 2.8) km from the SCO2AT, MCO2AT, old VISST, and new VISST, respectively. Compared to the CPL, the MCO2AT CTHs had the smallest mean biases for semitransparent high clouds in both single-layered and multilayered situations whereas the new VISST CTHs had the smallest mean biases when upper clouds were opaque and optically thick. The biases for all techniques increased with increasing numbers of cloud layers. The transparency of the upper layer clouds tends to increase with the numbers of cloud layers.
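
    The comparison statistics quoted above (mean heights and biases relative to the CPL) reduce to simple differences over matched pixels; a sketch of that bookkeeping with made-up matched samples is shown below.

    import numpy as np

    # Hypothetical matched cloud-top heights (km) for the same pixels: the lidar
    # reference (CPL) and one satellite retrieval.
    cth_cpl = np.array([14.5, 13.8, 15.1, 12.9, 14.0, 13.2])
    cth_sat = np.array([12.3, 12.0, 13.4, 11.1, 12.6, 11.8])

    bias = cth_sat - cth_cpl
    print(f"mean CPL CTH : {cth_cpl.mean():.1f} (+/- {cth_cpl.std(ddof=1):.1f}) km")
    print(f"mean sat CTH : {cth_sat.mean():.1f} (+/- {cth_sat.std(ddof=1):.1f}) km")
    print(f"mean bias    : {bias.mean():.1f} km (negative = retrieval below the lidar)")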

  7. Auto-Scaling of Geo-Based Image Processing in an OpenStack Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Sanggoo Kang

    2016-08-01

    Full Text Available Cloud computing is a base platform for the distribution of large volumes of data and high-performance image processing on the Web. Despite wide applications in Web-based services and their many benefits, geo-spatial applications based on cloud computing technology are still developing. Auto-scaling realizes automatic scalability, i.e., the scale-out and scale-in processing of virtual servers in a cloud computing environment. This study investigates the applicability of auto-scaling to geo-based image processing algorithms by comparing the performance of a single virtual server and multiple auto-scaled virtual servers under identical experimental conditions. In this study, the cloud computing environment is built with OpenStack, and four algorithms from the Orfeo toolbox are used for practical geo-based image processing experiments. The results of all performance tests demonstrate that auto-scaling is practically significant for cloud utilization in terms of response time. Auto-scaling thus contributes to the development of web-based satellite image application services using cloud-based technologies.
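
    The scale-out/scale-in behaviour described above is typically driven by threshold rules on monitored load. The sketch below is a generic, hypothetical policy loop, not the actual OpenStack configuration used in the study; get_average_cpu, launch_instance and terminate_instance are placeholder callables.

    MIN_INSTANCES, MAX_INSTANCES = 1, 8
    SCALE_OUT_CPU, SCALE_IN_CPU = 70.0, 20.0     # percent; hypothetical thresholds

    def autoscale(get_average_cpu, launch_instance, terminate_instance, instances):
        """One evaluation cycle of a threshold-based scale-out/scale-in policy."""
        cpu = get_average_cpu()                  # mean CPU load over current workers
        if cpu > SCALE_OUT_CPU and len(instances) < MAX_INSTANCES:
            instances.append(launch_instance())  # scale out under heavy load
        elif cpu < SCALE_IN_CPU and len(instances) > MIN_INSTANCES:
            terminate_instance(instances.pop())  # scale in when idle
        return instances

    # A driver would call autoscale() periodically, e.g. once per minute:
    #   instances = autoscale(monitor.cpu, cloud.launch, cloud.terminate, instances)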

  8. Cloud-Based Model Calibration Using OpenStudio: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Hale, E.; Lisell, L.; Goldwasser, D.; Macumber, D.; Dean, J.; Metzger, I.; Parker, A.; Long, N.; Ball, B.; Schott, M.; Weaver, E.; Brackney, L.

    2014-03-01

    OpenStudio is a free, open source Software Development Kit (SDK) and application suite for performing building energy modeling and analysis. The OpenStudio Parametric Analysis Tool has been extended to allow cloud-based simulation of multiple OpenStudio models parametrically related to a baseline model. This paper describes the new cloud-based simulation functionality and presents a model calibration case study. Calibration is initiated by entering actual monthly utility bill data into the baseline model. Multiple parameters are then varied over multiple iterations to reduce the difference between actual energy consumption and model simulation results, as calculated and visualized by billing period and by fuel type. Simulations are performed in parallel using the Amazon Elastic Cloud service. This paper highlights model parameterizations (measures) used for calibration, but the same multi-nodal computing architecture is available for other purposes, for example, recommending combinations of retrofit energy saving measures using the calibrated model as the new baseline.
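
    The calibration loop described above needs a quantitative measure of the "difference between actual energy consumption and model simulation results". A common choice (assumed here; the paper does not name its metric) is the pair NMBE and CV(RMSE) computed over billing periods; the monthly values below are invented.

    import numpy as np

    # Hypothetical monthly utility-bill and simulated electricity use (kWh).
    actual = np.array([1210, 1105, 990, 870, 930, 1150, 1320, 1290, 1010, 940, 1080, 1230], float)
    simulated = np.array([1150, 1120, 1010, 900, 905, 1100, 1250, 1330, 1050, 910, 1020, 1200], float)

    n = actual.size
    residual = actual - simulated

    nmbe = 100 * residual.sum() / ((n - 1) * actual.mean())                  # net mean bias error
    cvrmse = 100 * np.sqrt((residual ** 2).sum() / (n - 1)) / actual.mean()  # CV of the RMSE

    print(f"NMBE     = {nmbe:6.2f} %")
    print(f"CV(RMSE) = {cvrmse:6.2f} %")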

  9. CLOUD-BASED PLATFORM FOR CREATING AND SHARING WEB MAPS

    Directory of Open Access Journals (Sweden)

    Jean Pierre Gatera

    2014-01-01

    Full Text Available The rise of cloud computing is one of the most important things happening in information technology today. While many things are moving into the cloud, this trend has also reached the Geographic Information System (GIS) world. For users of GIS technology, the cloud opens new possibilities for sharing web maps, applications and spatial data. The goal of this presentation/demo is to demonstrate ArcGIS Online, a cloud-based collaborative platform that allows anyone to easily and quickly create interactive web maps and share them. With ready-to-use content, apps, and templates you can produce web maps right away. And no matter what you use - desktops, browsers, smartphones, or tablets - you always have access to your content.

  10. Absorbed dose from traversing spherically symmetric, Gaussian radioactive clouds

    International Nuclear Information System (INIS)

    Thompson, J.M.; Poston, J.W.

    1999-01-01

    If a large radioactive cloud is produced, sampling may require that an airplane traverse the cloud. A method to predict the absorbed dose to the aircrew from penetrating the radioactive cloud is needed. Dose rates throughout spherically symmetric Gaussian clouds of various sizes, and the absorbed doses from traversing the clouds, were calculated. Cloud size is a dominant parameter causing dose to vary by orders of magnitude for a given dose rate measured at some distance. A method to determine cloud size, based on dose rate readings at two or more distances from the cloud center, was developed. This method, however, failed to resolve the smallest cloud sizes from measurements made at 1,000 m to 2,000 m from the cloud center.
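
    A sketch of the size-determination idea under a strong simplifying assumption (not necessarily the one used in the paper): if the dose rate at a detector is taken to be proportional to the local activity concentration of a spherically symmetric Gaussian cloud, the ratio of two readings at distances r1 and r2 from the cloud center depends only on the cloud size parameter sigma, which can then be solved for in closed form. The readings below are invented.

    import math

    def cloud_sigma(r1, d1, r2, d2):
        """Estimate the Gaussian size parameter sigma (m) from dose rates d1, d2
        measured at distances r1 < r2 from the cloud center, assuming the dose
        rate is proportional to the local concentration
        chi(r) ~ exp(-r**2 / (2 * sigma**2)) (immersion-type approximation)."""
        if d1 <= d2:
            raise ValueError("the reading closer to the center must be larger")
        return math.sqrt((r2 ** 2 - r1 ** 2) / (2.0 * math.log(d1 / d2)))

    # Hypothetical readings: 12 units at 1,000 m and 3 units at 2,000 m from the center.
    print(f"estimated sigma ~ {cloud_sigma(1000.0, 12.0, 2000.0, 3.0):.0f} m")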

  11. Streaming support for data intensive cloud-based sequence analysis.

    Science.gov (United States)

    Issa, Shadi A; Kienzler, Romeo; El-Kalioby, Mohamed; Tonellato, Peter J; Wall, Dennis; Bruggmann, Rémy; Abouelhoda, Mohamed

    2013-01-01

    Cloud computing provides a promising solution to the genomics data deluge problem resulting from the advent of next-generation sequencing (NGS) technology. Based on the concepts of "resources-on-demand" and "pay-as-you-go", scientists with no or limited infrastructure can have access to scalable and cost-effective computational resources. However, the large size of NGS data causes a significant data transfer latency from the client's site to the cloud, which presents a bottleneck for using cloud computing services. In this paper, we provide a streaming-based scheme to overcome this problem, where the NGS data is processed while being transferred to the cloud. Our scheme targets the wide class of NGS data analysis tasks, where the NGS sequences can be processed independently from one another. We also provide the elastream package that supports the use of this scheme with individual analysis programs or with workflow systems. Experiments presented in this paper show that our solution mitigates the effect of data transfer latency and saves both time and cost of computation.
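
    The core idea, processing NGS reads while they are still being transferred instead of staging the complete dataset first, can be illustrated with a small generator-based sketch. This is not the elastream API, just a toy consumer of FASTQ records arriving on any stream (for example a download pipe); the URL in the comment is purely illustrative.

    import sys

    def fastq_records(stream):
        """Yield (header, sequence, quality) as soon as each 4-line FASTQ record
        has arrived, without waiting for the whole file to be transferred."""
        while True:
            header = stream.readline()
            if not header:
                return
            seq = stream.readline().strip()
            stream.readline()                    # '+' separator line
            qual = stream.readline().strip()
            yield header.strip(), seq, qual

    def gc_content(seq):
        return (seq.count("G") + seq.count("C")) / max(len(seq), 1)

    # Example: pipe data into the script while it is still downloading, e.g.
    #   curl -s https://example.org/reads.fastq | python stream_gc.py
    if __name__ == "__main__":
        total, gc_sum = 0, 0.0
        for _, seq, _ in fastq_records(sys.stdin):
            total += 1
            gc_sum += gc_content(seq)
        if total:
            print(f"processed {total} reads, mean GC fraction = {gc_sum / total:.3f}")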

  12. Streaming Support for Data Intensive Cloud-Based Sequence Analysis

    Directory of Open Access Journals (Sweden)

    Shadi A. Issa

    2013-01-01

    Full Text Available Cloud computing provides a promising solution to the genomics data deluge problem resulting from the advent of next-generation sequencing (NGS) technology. Based on the concepts of “resources-on-demand” and “pay-as-you-go”, scientists with no or limited infrastructure can have access to scalable and cost-effective computational resources. However, the large size of NGS data causes a significant data transfer latency from the client’s site to the cloud, which presents a bottleneck for using cloud computing services. In this paper, we provide a streaming-based scheme to overcome this problem, where the NGS data is processed while being transferred to the cloud. Our scheme targets the wide class of NGS data analysis tasks, where the NGS sequences can be processed independently from one another. We also provide the elastream package that supports the use of this scheme with individual analysis programs or with workflow systems. Experiments presented in this paper show that our solution mitigates the effect of data transfer latency and saves both time and cost of computation.

  13. Streaming Support for Data Intensive Cloud-Based Sequence Analysis

    Science.gov (United States)

    Issa, Shadi A.; Kienzler, Romeo; El-Kalioby, Mohamed; Tonellato, Peter J.; Wall, Dennis; Bruggmann, Rémy; Abouelhoda, Mohamed

    2013-01-01

    Cloud computing provides a promising solution to the genomics data deluge problem resulting from the advent of next-generation sequencing (NGS) technology. Based on the concepts of “resources-on-demand” and “pay-as-you-go”, scientists with no or limited infrastructure can have access to scalable and cost-effective computational resources. However, the large size of NGS data causes a significant data transfer latency from the client's site to the cloud, which presents a bottleneck for using cloud computing services. In this paper, we provide a streaming-based scheme to overcome this problem, where the NGS data is processed while being transferred to the cloud. Our scheme targets the wide class of NGS data analysis tasks, where the NGS sequences can be processed independently from one another. We also provide the elastream package that supports the use of this scheme with individual analysis programs or with workflow systems. Experiments presented in this paper show that our solution mitigates the effect of data transfer latency and saves both time and cost of computation. PMID:23710461

  14. Best practices for implementing, testing and using a cloud-based communication system in a disaster situation.

    Science.gov (United States)

    Makowski, Dale

    2016-01-01

    This paper sets out the basics for approaching the selection and implementation of a cloud-based communication system to support a business continuity programme, including:
    • consideration for how a cloud-based communication system can enhance a business continuity programme;
    • descriptions of some of the more popular features of a cloud-based communication system;
    • options to evaluate when selecting a cloud-based communication system;
    • considerations for how to design a system to be most effective for an organisation;
    • best practices for how to conduct the initial load of data to a cloud-based communication system;
    • best practices for how to conduct an initial validation of the data loaded to a cloud-based communication system;
    • considerations for how to keep contact information in the cloud-based communication system current and accurate;
    • best practices for conducting ongoing system testing;
    • considerations for how to conduct user training;
    • review of other potential uses of a cloud-based communication system; and
    • review of other tools and features many cloud-based communication systems may offer.

  15. Simultaneous and synergistic profiling of cloud and drizzle properties using ground-based observations

    Science.gov (United States)

    Rusli, Stephanie P.; Donovan, David P.; Russchenberg, Herman W. J.

    2017-12-01

    Despite the importance of radar reflectivity (Z) measurements in the retrieval of liquid water cloud properties, it remains nontrivial to interpret Z due to the possible presence of drizzle droplets within the clouds. So far, there has been no published work that utilizes Z to identify the presence of drizzle above the cloud base in an optimized and a physically consistent manner. In this work, we develop a retrieval technique that exploits the synergy of different remote sensing systems to carry out this task and to subsequently profile the microphysical properties of the cloud and drizzle in a unified framework. This is accomplished by using ground-based measurements of Z, lidar attenuated backscatter below as well as above the cloud base, and microwave brightness temperatures. Fast physical forward models coupled to cloud and drizzle structure parameterization are used in an optimal-estimation-type framework in order to retrieve the best estimate for the cloud and drizzle property profiles. The cloud retrieval is first evaluated using synthetic signals generated from large-eddy simulation (LES) output to verify the forward models used in the retrieval procedure and the vertical parameterization of the liquid water content (LWC). From this exercise it is found that, on average, the cloud properties can be retrieved within 5 % of the mean truth. The full cloud-drizzle retrieval method is then applied to a selected ACCEPT (Analysis of the Composition of Clouds with Extended Polarization Techniques) campaign dataset collected in Cabauw, the Netherlands. An assessment of the retrieval products is performed using three independent methods from the literature; each was specifically developed to retrieve only the cloud properties, the drizzle properties below the cloud base, or the drizzle fraction within the cloud. One-to-one comparisons, taking into account the uncertainties or limitations of each retrieval, show that our results are consistent with what is derived

  16. City Hub : a cloud based IoT platform for Smart Cities

    OpenAIRE

    Lea, Rodger; Blackstock, Michael

    2014-01-01

    Cloud based Smart City hubs are an attractive approach to addressing some of the complex issues faced when deploying PaaS infrastructure for Smart Cities. In this paper we introduce the general notion of IoT hubs and then discuss our work to generalize our IoT hub as a Smart City PaaS. Two key issues are identified: support for hybrid public/private cloud and interoperability. We briefly describe our approach to these issues and discuss our experiences deploying two cloud-based Smart City h...

  17. Retrieval of liquid water cloud properties from ground-based remote sensing observations

    NARCIS (Netherlands)

    Knist, C.L.

    2014-01-01

    Accurate ground-based remotely sensed microphysical and optical properties of liquid water clouds are essential references to validate satellite-observed cloud properties and to improve cloud parameterizations in weather and climate models. This requires the evaluation of algorithms for retrieval of

  18. Diffusion and deposition of the Schooner clouds

    Energy Technology Data Exchange (ETDEWEB)

    Crawford, Todd V [Lawrence Radiation Laboratory, University of California, Livermore, CA (United States)

    1970-05-01

    Schooner was a 31-kt nuclear cratering experiment done as part of the U.S. Atomic Energy Commission's Plowshare Program. Detonation was at 0800 PST on December 8, 1968 at the Nevada Test Site. The resulting cloud had ceased its dynamic growth by about H+4 min. Two distinct parts, a base surge and a main cloud, were evident. Thereafter, further cloud growth was by diffusion and fallout as the cloud moved downwind. Aircraft sampling of the cloud at H+12.5 min revealed that the main cloud part contained about 10 times as much radioactivity as the base surge part. Later aircraft data, local fallout field measurements, and airborne particle size data indicate that the H+12.5-min cloud burdens, primarily the tungsten isotopes, were depleted by a factor of about 2, due to fallout, over the next few hours. The remaining airborne cloud burdens for each cloud were used as input to diffusion calculations. Calculated main cloud center concentrations using observed cloud sizes, cloud burdens, and meteorology agree with measurements to better than a factor of 2 over 1 1/2 days. These postshot calculations and data are about a factor of 3 higher than calculations done preshot. Base surge calculations are consistent with available data to within about a factor of 4, but the data needed to perform as complete an analysis as was done for the main cloud do not exist. Fallout, as distinguished from deposition of nonfalling debris, was important to a distance of about 500 km for the main cloud and to a distance of about 100 km for the base surge. At distances closer to ground zero, diffusion calculations under-predicted ground level concentration and deposition, but an isotopically scaled external gross gamma fallout calculation was within about a factor of 3 of the data. At larger distances downwind for the base surge, ground level exposure rate calculations and deposition for a variety of nuclides agree to within about a factor of 3 of measurements. (author)

  19. KNOWLEDGE-BASED OBJECT DETECTION IN LASER SCANNING POINT CLOUDS

    Directory of Open Access Journals (Sweden)

    F. Boochs

    2012-07-01

    Full Text Available Object identification and object processing in 3D point clouds have always posed challenges in terms of effectiveness and efficiency. In practice, this process is highly dependent on human interpretation of the scene represented by the point cloud data, as well as the set of modeling tools available for use. Such modeling algorithms are data-driven and concentrate on specific features of the objects, being accessible to numerical models. We present an approach that brings the human expert knowledge about the scene, the objects inside, and their representation by the data and the behavior of algorithms to the machine. This “understanding” enables the machine to assist human interpretation of the scene inside the point cloud. Furthermore, it allows the machine to understand possibilities and limitations of algorithms and to take this into account within the processing chain. This not only assists the researchers in defining optimal processing steps, but also provides suggestions when certain changes or new details emerge from the point cloud. Our approach benefits from the advancement in knowledge technologies within the Semantic Web framework. This advancement has provided a strong base for applications based on knowledge management. In the article we will present and describe the knowledge technologies used for our approach such as the Web Ontology Language (OWL), used for formulating the knowledge base, and the Semantic Web Rule Language (SWRL) with 3D processing and topologic built-ins, aiming to combine geometrical analysis of 3D point clouds, and specialists’ knowledge of the scene and algorithmic processing.

  20. Knowledge-Based Object Detection in Laser Scanning Point Clouds

    Science.gov (United States)

    Boochs, F.; Karmacharya, A.; Marbs, A.

    2012-07-01

    Object identification and object processing in 3D point clouds have always posed challenges in terms of effectiveness and efficiency. In practice, this process is highly dependent on human interpretation of the scene represented by the point cloud data, as well as the set of modeling tools available for use. Such modeling algorithms are data-driven and concentrate on specific features of the objects, being accessible to numerical models. We present an approach that brings the human expert knowledge about the scene, the objects inside, and their representation by the data and the behavior of algorithms to the machine. This "understanding" enables the machine to assist human interpretation of the scene inside the point cloud. Furthermore, it allows the machine to understand possibilities and limitations of algorithms and to take this into account within the processing chain. This not only assists the researchers in defining optimal processing steps, but also provides suggestions when certain changes or new details emerge from the point cloud. Our approach benefits from the advancement in knowledge technologies within the Semantic Web framework. This advancement has provided a strong base for applications based on knowledge management. In the article we will present and describe the knowledge technologies used for our approach such as Web Ontology Language (OWL), used for formulating the knowledge base and the Semantic Web Rule Language (SWRL) with 3D processing and topologic built-ins, aiming to combine geometrical analysis of 3D point clouds, and specialists' knowledge of the scene and algorithmic processing.

  1. A cloud-based multimodality case file for mobile devices.

    Science.gov (United States)

    Balkman, Jason D; Loehfelm, Thomas W

    2014-01-01

    Recent improvements in Web and mobile technology, along with the widespread use of handheld devices in radiology education, provide unique opportunities for creating scalable, universally accessible, portable image-rich radiology case files. A cloud database and a Web-based application for radiologic images were developed to create a mobile case file with reasonable usability, download performance, and image quality for teaching purposes. A total of 75 radiology cases related to breast, thoracic, gastrointestinal, musculoskeletal, and neuroimaging subspecialties were included in the database. Breast imaging cases are the focus of this article, as they best demonstrate handheld display capabilities across a wide variety of modalities. This case subset also illustrates methods for adapting radiologic content to cloud platforms and mobile devices. Readers will gain practical knowledge about storage and retrieval of cloud-based imaging data, an awareness of techniques used to adapt scrollable and high-resolution imaging content for the Web, and an appreciation for optimizing images for handheld devices. The evaluation of this software demonstrates the feasibility of adapting images from most imaging modalities to mobile devices, even in cases of full-field digital mammograms, where high resolution is required to represent subtle pathologic features. The cloud platform allows cases to be added and modified in real time by using only a standard Web browser with no application-specific software. Challenges remain in developing efficient ways to generate, modify, and upload radiologic and supplementary teaching content to this cloud-based platform. Online supplemental material is available for this article. ©RSNA, 2014.

  2. Evaluation of Decision Trees for Cloud Detection from AVHRR Data

    Science.gov (United States)

    Shiffman, Smadar; Nemani, Ramakrishna

    2005-01-01

    Automated cloud detection and tracking is an important step in assessing changes in radiation budgets associated with global climate change via remote sensing. Data products based on satellite imagery are available to the scientific community for studying trends in the Earth's atmosphere. The data products include pixel-based cloud masks that assign cloud-cover classifications to pixels. Many cloud-mask algorithms have the form of decision trees. The decision trees employ sequential tests that scientists designed based on empirical astrophysics studies and simulations. Limitations of existing cloud masks restrict our ability to accurately track changes in cloud patterns over time. In a previous study we compared automatically learned decision trees to cloud masks included in Advanced Very High Resolution Radiometer (AVHRR) data products from the year 2000. In this paper we report the replication of the study for five-year data, and for a gold standard based on surface observations performed by scientists at weather stations in the British Islands. For our sample data, the accuracy of automatically learned decision trees was greater than the accuracy of the cloud masks (p < 0.001).
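
    A minimal sketch of learning such a decision tree from labelled pixels is shown below, assuming (hypothetically) that each AVHRR pixel has been reduced to a few channel features and paired with a surface-observed cloud label; the feature definitions and data are invented.

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(0)

    # Hypothetical pixel features: [visible reflectance, IR brightness temperature (K),
    # split-window brightness temperature difference (K)].
    n = 2000
    cloudy = np.column_stack([rng.uniform(0.3, 0.9, n), rng.normal(255, 10, n), rng.normal(2.5, 1.0, n)])
    clear = np.column_stack([rng.uniform(0.0, 0.3, n), rng.normal(285, 8, n), rng.normal(0.5, 0.8, n)])

    X = np.vstack([cloudy, clear])
    y = np.concatenate([np.ones(n), np.zeros(n)])        # 1 = cloudy, 0 = clear

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

    # A shallow tree, analogous in form to the sequential threshold tests of a cloud mask.
    tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_train, y_train)
    print(f"accuracy on held-out pixels: {accuracy_score(y_test, tree.predict(X_test)):.3f}")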

  3. Development and Usage of Software as a Service for a Cloud and Non-Cloud Based Environment- An Empirical Study

    OpenAIRE

    Pratiyush Guleria Guleria; Vikas Sharma; Manish Arora

    2012-01-01

    Cloud computing is Internet-based computing, whereby shared resources, software, and information are provided to computers and other devices on demand. Cloud computing is a natural evolution of the widespread adoption of virtualization, service-oriented architecture and utility computing. The computer applications nowadays are becoming more and more complex; there is an ever increasing demand for computing resources. As this demand has risen, the concepts of cloud computing and grid computing...

  4. Cardiovascular imaging environment: will the future be cloud-based?

    Science.gov (United States)

    Kawel-Boehm, Nadine; Bluemke, David A

    2017-07-01

    In cardiovascular CT and MR imaging, large datasets have to be stored, post-processed, analyzed and distributed. Besides basic assessment of volume and function in cardiac magnetic resonance imaging, for example, more sophisticated quantitative analysis is requested, requiring specific software. Several institutions cannot afford various types of software or provide the expertise to perform sophisticated analysis. Areas covered: Various cloud services exist related to data storage and analysis specifically for cardiovascular CT and MR imaging. Instead of on-site data storage, cloud providers offer flexible storage services on a pay-per-use basis. To avoid the purchase and maintenance of specialized software for cardiovascular image analysis, e.g. to assess myocardial iron overload, MR 4D flow and fractional flow reserve, evaluation can be performed with cloud-based software by the consumer, or the complete analysis is performed by the cloud provider. However, challenges to widespread implementation of cloud services include regulatory issues regarding patient privacy and data security. Expert commentary: If patient privacy and data security are guaranteed, cloud imaging is a valuable option for coping with the storage of large image datasets and for offering sophisticated cardiovascular image analysis to institutions of all sizes.

  5. DeepSAT's CloudCNN: A Deep Neural Network for Rapid Cloud Detection from Geostationary Satellites

    Science.gov (United States)

    Kalia, S.; Li, S.; Ganguly, S.; Nemani, R. R.

    2017-12-01

    Cloud and cloud shadow detection has important applications in weather and climate studies. It is even more crucial when we introduce geostationary satellites into the field of terrestrial remote sensing. With the challenges associated with data acquired at very high frequency (10-15 mins per scan), the ability to derive an accurate cloud/shadow mask from geostationary satellite data is critical. The key to the success of most existing algorithms is spatially and temporally varying thresholds, which better capture local atmospheric and surface effects. However, the selection of a proper threshold is difficult and may lead to erroneous results. In this work, we propose a deep neural network based approach called CloudCNN to classify cloud/shadow from Himawari-8 AHI and GOES-16 ABI multispectral data. DeepSAT's CloudCNN consists of an encoder-decoder based architecture for binary-class pixel-wise segmentation. We train CloudCNN on a multi-GPU Nvidia Devbox cluster, and deploy the prediction pipeline on the NASA Earth Exchange (NEX) Pleiades supercomputer. We achieved an overall accuracy of 93.29% on test samples. Since the predictions take only a few seconds to segment a full multi-spectral GOES-16 or Himawari-8 Full Disk image, the developed framework can be used for real-time cloud detection, cyclone detection, or extreme weather event predictions.
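
    The abstract describes the architecture only as an encoder-decoder for binary pixel-wise segmentation. The sketch below is a deliberately tiny PyTorch stand-in for such a network (not the actual CloudCNN); it maps a multispectral image tensor to a per-pixel cloud probability, and the channel counts and layer sizes are invented.

    import torch
    import torch.nn as nn

    class TinyCloudSegNet(nn.Module):
        """Minimal encoder-decoder producing per-pixel cloud/no-cloud logits."""
        def __init__(self, in_channels=16):      # hypothetical number of spectral bands
            super().__init__()
            self.encoder = nn.Sequential(
                nn.Conv2d(in_channels, 32, 3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),                                      # downsample
                nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            )
            self.decoder = nn.Sequential(
                nn.ConvTranspose2d(64, 32, 2, stride=2), nn.ReLU(),   # upsample back
                nn.Conv2d(32, 1, 1),                                  # one logit per pixel
            )

        def forward(self, x):
            return self.decoder(self.encoder(x))

    model = TinyCloudSegNet()
    image = torch.randn(1, 16, 128, 128)          # a single multispectral tile
    cloud_prob = torch.sigmoid(model(image))      # per-pixel cloud probability
    print(cloud_prob.shape)                       # torch.Size([1, 1, 128, 128])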

  6. An Enhanced Erasure Code-Based Security Mechanism for Cloud Storage

    Directory of Open Access Journals (Sweden)

    Wenfeng Wang

    2014-01-01

    Full Text Available Cloud computing offers a wide range of luxuries, such as high performance, rapid elasticity, on-demand self-service, and low cost. However, data security continues to be a significant impediment in the promotion and popularization of cloud computing. To address the problem of data leakage caused by unreliable service providers and external cyber attacks, an enhanced erasure code-based security mechanism is proposed and elaborated in terms of four aspects: data encoding, data transmission, data placement, and data reconstruction, which ensure data security throughout its whole traversal into cloud storage. Based on the mechanism, we implement a secure cloud storage system (SCSS). The key design issues, including data division, construction of the generator matrix, data encoding, fragment naming, and data decoding, are also described in detail. Finally, we conduct an analysis of data availability and security, and a performance evaluation. Experimental results and analysis demonstrate that SCSS achieves high availability, strong security, and excellent performance.
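
    As a toy illustration of the encode/reconstruct idea (a real system like the one described would use a stronger erasure code such as Reed-Solomon rather than a single XOR parity), the sketch below stripes data into k fragments plus one parity fragment and recovers a single lost fragment.

    def encode(data: bytes, k: int):
        """Split data into k equal-size fragments plus one XOR parity fragment."""
        data += b"\x00" * ((-len(data)) % k)          # pad to a multiple of k
        size = len(data) // k
        fragments = [bytearray(data[i * size:(i + 1) * size]) for i in range(k)]
        parity = bytearray(size)
        for frag in fragments:
            for i, byte in enumerate(frag):
                parity[i] ^= byte
        return fragments + [parity]

    def reconstruct(fragments, lost_index):
        """Rebuild one missing fragment as the XOR of all surviving fragments."""
        size = len(next(f for f in fragments if f is not None))
        rebuilt = bytearray(size)
        for idx, frag in enumerate(fragments):
            if idx == lost_index or frag is None:
                continue
            for i, byte in enumerate(frag):
                rebuilt[i] ^= byte
        return rebuilt

    fragments = encode(b"record 0042: confidential payload", k=4)
    fragments[2] = None                               # simulate a failed or compromised node
    print(bytes(reconstruct(fragments, 2)))           # the lost data fragment is recovered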

  7. Cloud vertical profiles derived from CALIPSO and CloudSat and a comparison with MODIS derived clouds

    Science.gov (United States)

    Kato, S.; Sun-Mack, S.; Miller, W. F.; Rose, F. G.; Minnis, P.; Wielicki, B. A.; Winker, D. M.; Stephens, G. L.; Charlock, T. P.; Collins, W. D.; Loeb, N. G.; Stackhouse, P. W.; Xu, K.

    2008-05-01

    CALIPSO and CloudSat from the A-Train provide detailed information on the vertical distribution of clouds and aerosols. The vertical distribution of cloud occurrence is derived from one month of CALIPSO and CloudSat data as part of the effort of merging CALIPSO, CloudSat and MODIS with CERES data. This newly derived cloud profile is compared with the distribution of cloud top height derived from MODIS on Aqua using the cloud algorithms of the CERES project. The cloud base from MODIS is also estimated using an empirical formula based on cloud top height and optical thickness, which is used in CERES processing. While MODIS detects mid- and low-level clouds over the Arctic in April fairly well when they are the topmost cloud layer, it underestimates high-level clouds. In addition, because the CERES-MODIS cloud algorithm is not able to detect multi-layer clouds and the empirical formula significantly underestimates the depth of high clouds, the occurrence of mid- and low-level clouds is underestimated. This comparison does not consider differences in sensitivity to thin clouds, but we will impose an optical thickness threshold on the CALIPSO-derived clouds for a further comparison. The effect of such differences in the cloud profile on flux computations will also be discussed. In addition, the effect of cloud cover on the top-of-atmosphere flux over the Arctic using CERES SSF and FLASHFLUX products will be discussed.

  8. Privacy authentication using key attribute-based encryption in mobile cloud computing

    Science.gov (United States)

    Mohan Kumar, M.; Vijayan, R.

    2017-11-01

    Mobile cloud computing is becoming more popular nowadays as the number of smartphone users increases, so the security level of cloud computing has to be increased. Privacy authentication using key-attribute-based encryption helps users share data with an organization through the cloud in a secure manner, which supports business development. In privacy authentication, the sender of the data has permission to add the receivers to whom data access is provided; for all others, access is denied. In the sender application, the user can choose the file to be sent to the receivers, and the data is then encrypted using key-attribute-based encryption with the AES algorithm. The resulting ciphertext is stored in the Amazon cloud along with the key value and the receiver list.
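
    A minimal sketch of the symmetric part of such a scheme is shown below, using authenticated AES-GCM from the widely used Python cryptography package. The attribute-based wrapping of the key and its distribution to the receiver list (the contribution described above) are outside the scope of this toy example and appear only as comments.

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    # Data key for the file; in the described scheme this key would itself be
    # protected with attribute-based encryption and shared only with listed receivers.
    key = AESGCM.generate_key(bit_length=256)
    aesgcm = AESGCM(key)

    plaintext = b"report to be shared with approved receivers only"
    nonce = os.urandom(12)                       # must be unique per encryption
    associated = b"receiver-list-v1"             # authenticated but not encrypted

    ciphertext = aesgcm.encrypt(nonce, plaintext, associated)
    # ciphertext + nonce (plus the wrapped key) is what would be stored in the cloud.

    assert aesgcm.decrypt(nonce, ciphertext, associated) == plaintext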

  9. Cloud-based systems for monitoring earthquakes and other environmental quantities

    Science.gov (United States)

    Clayton, R. W.; Olson, M.; Liu, A.; Chandy, M.; Bunn, J.; Guy, R.

    2013-12-01

    There are many advantages to using a cloud-based system to record and analyze environmental quantities such as earthquakes, radiation, various gases, dust and meteorological parameters. These advantages include robustness, dynamic scalability, and reduced costs. In this paper, we present our experiences over the last three years in developing a cloud-based earthquake monitoring system (the Community Seismic Network). This network consists of over 600 sensors (accelerometers) in the Southern California region that send data directly to the Google App Engine, where they are analyzed. The system is capable of handling many other types of sensor data and generating a situation-awareness analysis as a product. Other advantages of the cloud-based system are integration with other peer networks and the ability to deploy anywhere in the world without having to build additional computing infrastructure.

  10. THE CLOUD TECHNOLOGIES IN PROFESSIONAL EDUCATION OF THE FUTURE ECONOMISTS

    Directory of Open Access Journals (Sweden)

    Yu. Dyulicheva

    2014-04-01

    Full Text Available The usage of cloud services in the professional education of future economists is investigated. The following cloud services are analyzed in the paper: 1) the cloud service gantter for project management, resource management and risk evaluation, with an example of Gantt diagram creation and project critical path determination based on the gantter cloud service; 2) the cloud service SageMath Cloud, with capabilities for using the programming languages R, Python, Cython and GAP for data analysis, with an example of constructing a linear regression model on a data sample using the language R; 3) Google’s spreadsheet-based cloud service for solving linear programming problems with the Solver tool; 4) the educational project Big Data University, the main goal of which is teaching students to handle big data based on cloud technologies, with the capability to learn the query language SQL, the language R for data analysis and methods for data warehouse preprocessing; 5) the cloud services from «1С» and «BuhSoft» for studying accounting information systems, with report creation and payroll capabilities.

  11. Multiview point clouds denoising based on interference elimination

    Science.gov (United States)

    Hu, Yang; Wu, Qian; Wang, Le; Jiang, Huanyu

    2018-03-01

    Newly emerging low-cost depth sensors offer huge potential for three-dimensional (3-D) modeling, but their high noise prevents them from obtaining accurate results. Thus, we propose a method for denoising registered multiview point clouds with high noise to solve that problem. The proposed method aims to fully use redundant information to eliminate the interferences among point clouds of different views based on an iterative procedure. In each iteration, noisy points are either deleted or moved to their weighted average targets in accordance with two cases. Simulated data and practical data captured by a Kinect v2 sensor were tested in experiments qualitatively and quantitatively. Results showed that the proposed method can effectively reduce noise and recover local features from highly noisy multiview point clouds with good robustness, compared to the truncated signed distance function and moving least squares (MLS). Moreover, the resulting low-noise point clouds can be further smoothed by MLS to achieve improved results. This study demonstrates the feasibility of obtaining fine 3-D models with high-noise devices, especially depth sensors such as the Kinect.
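
    One iteration of the "move noisy points toward a weighted average target" step can be sketched as below. This is a generic neighbourhood-averaging pass with Gaussian weights, not the interference-elimination criterion of the paper, and the radius and bandwidth are arbitrary.

    import numpy as np
    from scipy.spatial import cKDTree

    def denoise_iteration(points, radius=0.15, sigma=0.05):
        """Move each point toward the Gaussian-weighted average of its neighbours."""
        tree = cKDTree(points)
        smoothed = points.copy()
        for i, p in enumerate(points):
            neighbours = points[tree.query_ball_point(p, r=radius)]
            d2 = np.sum((neighbours - p) ** 2, axis=1)
            w = np.exp(-d2 / (2 * sigma ** 2))
            smoothed[i] = (w[:, None] * neighbours).sum(axis=0) / w.sum()
        return smoothed

    # Noisy samples of a unit sphere stand in for registered multiview scans.
    rng = np.random.default_rng(1)
    pts = rng.normal(size=(2000, 3))
    pts /= np.linalg.norm(pts, axis=1, keepdims=True)
    pts += rng.normal(scale=0.02, size=pts.shape)

    for _ in range(3):                                     # a few iterations
        pts = denoise_iteration(pts)
    print(np.abs(np.linalg.norm(pts, axis=1) - 1).mean())  # residual distance to the sphere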

  12. Improved cloud parameterization for Arctic climate simulations based on satellite data

    Science.gov (United States)

    Klaus, Daniel; Dethloff, Klaus; Dorn, Wolfgang; Rinke, Annette

    2015-04-01

    The defective representation of Arctic cloud processes and properties remains a crucial problem in climate modelling and in reanalysis products. Satellite-based cloud observations (MODIS and CPR/CALIOP) and single-column model simulations (HIRHAM5-SCM) were exploited to evaluate and improve the simulated Arctic cloud cover of the atmospheric regional climate model HIRHAM5. The ECMWF reanalysis dataset 'ERA-Interim' (ERAint) was used for the model initialization, the lateral boundary forcing as well as the dynamical relaxation inside the pan-Arctic domain. HIRHAM5 has a horizontal resolution of 0.25° and uses 40 pressure-based and terrain-following vertical levels. In comparison with the satellite observations, the HIRHAM5 control run (HH5ctrl) systematically overestimates total cloud cover, but to a lesser extent than ERAint. The underestimation of high- and mid-level clouds is strongly outweighed by the overestimation of low-level clouds. Numerous sensitivity studies with HIRHAM5-SCM suggest (1) the parameter tuning, enabling a more efficient Bergeron-Findeisen process, combined with (2) an extension of the prognostic-statistical (PS) cloud scheme, enabling the use of negatively skewed beta distributions. This improved model setup was then used in a corresponding HIRHAM5 sensitivity run (HH5sens). While the simulated high- and mid-level cloud cover is improved only to a limited extent, the large overestimation of low-level clouds can be systematically and significantly reduced, especially over sea ice. Consequently, the multi-year annual mean area average of total cloud cover with respect to sea ice is almost 14% lower than in HH5ctrl. Overall, HH5sens slightly underestimates the observed total cloud cover but shows a halved multi-year annual mean bias of 2.2% relative to CPR/CALIOP at all latitudes north of 60° N. Importantly, HH5sens produces a more realistic ratio between the cloud water and ice content. The considerably improved cloud simulation manifests in

  13. A Developed Artificial Bee Colony Algorithm Based on Cloud Model

    Directory of Open Access Journals (Sweden)

    Ye Jin

    2018-04-01

    Full Text Available The Artificial Bee Colony (ABC) algorithm is a bionic intelligent optimization method. The cloud model is an uncertainty conversion model between a qualitative concept T̃, expressed in natural language, and its quantitative expression; it integrates probability theory and fuzzy mathematics. A developed ABC algorithm based on the cloud model is proposed to enhance the accuracy of the basic ABC algorithm and avoid getting trapped in local optima by introducing a new selection mechanism, replacing the onlooker bees’ search formula and changing the scout bees’ updating formula. Experiments on CEC15 show that the new algorithm has a faster convergence speed and higher accuracy than the basic ABC and some cloud-model-based ABC variants.
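
    For readers unfamiliar with the cloud model referred to here, a common concrete form is the normal cloud generator characterised by expectation Ex, entropy En and hyper-entropy He. The minimal sketch below shows only that generator, independent of the ABC modifications proposed in the paper.

    import numpy as np

    def normal_cloud_drops(Ex, En, He, n=1000, seed=0):
        """Generate n cloud drops (x, membership) of the normal cloud model (Ex, En, He)."""
        rng = np.random.default_rng(seed)
        En_prime = np.abs(rng.normal(En, He, n))           # per-drop entropy, blurred by He
        x = rng.normal(Ex, En_prime)                       # drop positions
        mu = np.exp(-(x - Ex) ** 2 / (2 * En_prime ** 2))  # certainty degree of each drop
        return x, mu

    x, mu = normal_cloud_drops(Ex=0.0, En=1.0, He=0.1)
    print(f"drop mean ~ {x.mean():.2f}, mean membership ~ {mu.mean():.2f}")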

  14. Geometric data perturbation-based personal health record transactions in cloud computing.

    Science.gov (United States)

    Balasubramaniam, S; Kavitha, V

    2015-01-01

    Cloud computing is a new delivery model for information technology services and it typically involves the provision of dynamically scalable and often virtualized resources over the Internet. However, cloud computing raises concerns on how cloud service providers, user organizations, and governments should handle such information and interactions. Personal health records represent an emerging patient-centric model for health information exchange, and they are outsourced for storage by third parties, such as cloud providers. With these records, it is necessary for each patient to encrypt their own personal health data before uploading them to cloud servers. Current techniques for encryption primarily rely on conventional cryptographic approaches. However, key management issues remain largely unsolved with these cryptographic-based encryption techniques. We propose that personal health record transactions be managed using geometric data perturbation in cloud computing. In our proposed scheme, the personal health record database is perturbed using geometric data perturbation and outsourced to the Amazon EC2 cloud.
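
    Geometric data perturbation is commonly described as applying a random rotation, a translation and small noise to the numeric record matrix before outsourcing; the compact sketch below illustrates that idea (it is not the exact transform of the proposed scheme, and the attribute values are invented).

    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical numeric health-record attributes (rows = patients, columns = attributes).
    records = rng.normal(loc=[120.0, 80.0, 5.5], scale=[15.0, 10.0, 1.0], size=(100, 3))

    # Random orthogonal (rotation/reflection) matrix via QR decomposition.
    Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
    translation = rng.normal(scale=5.0, size=3)
    noise = rng.normal(scale=0.1, size=records.shape)

    perturbed = records @ Q + translation + noise     # what would be outsourced to the cloud

    # The data owner, who keeps (Q, translation), can approximately invert the transform.
    recovered = (perturbed - translation) @ Q.T
    print(np.abs(recovered - records).max())          # small residual due to the added noise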

  15. Geometric Data Perturbation-Based Personal Health Record Transactions in Cloud Computing

    Science.gov (United States)

    Balasubramaniam, S.; Kavitha, V.

    2015-01-01

    Cloud computing is a new delivery model for information technology services and it typically involves the provision of dynamically scalable and often virtualized resources over the Internet. However, cloud computing raises concerns on how cloud service providers, user organizations, and governments should handle such information and interactions. Personal health records represent an emerging patient-centric model for health information exchange, and they are outsourced for storage by third parties, such as cloud providers. With these records, it is necessary for each patient to encrypt their own personal health data before uploading them to cloud servers. Current techniques for encryption primarily rely on conventional cryptographic approaches. However, key management issues remain largely unsolved with these cryptographic-based encryption techniques. We propose that personal health record transactions be managed using geometric data perturbation in cloud computing. In our proposed scheme, the personal health record database is perturbed using geometric data perturbation and outsourced to the Amazon EC2 cloud. PMID:25767826

  16. Geometric Data Perturbation-Based Personal Health Record Transactions in Cloud Computing

    Directory of Open Access Journals (Sweden)

    S. Balasubramaniam

    2015-01-01

    Full Text Available Cloud computing is a new delivery model for information technology services and it typically involves the provision of dynamically scalable and often virtualized resources over the Internet. However, cloud computing raises concerns on how cloud service providers, user organizations, and governments should handle such information and interactions. Personal health records represent an emerging patient-centric model for health information exchange, and they are outsourced for storage by third parties, such as cloud providers. With these records, it is necessary for each patient to encrypt their own personal health data before uploading them to cloud servers. Current techniques for encryption primarily rely on conventional cryptographic approaches. However, key management issues remain largely unsolved with these cryptographic-based encryption techniques. We propose that personal health record transactions be managed using geometric data perturbation in cloud computing. In our proposed scheme, the personal health record database is perturbed using geometric data perturbation and outsourced to the Amazon EC2 cloud.

  17. Closing the Skill Gap of Cloud CRM Application Services in Cloud Computing for Evaluating Big Data Solutions

    Directory of Open Access Journals (Sweden)

    You-Shyang Chen

    2016-12-01

    Full Text Available Information systems (IS) continually motivate various improvements in the state of the art of issues and solutions for advanced geo-information technologies in cloud computing. Reducing IS project risks and improving organizational performance has become an important issue. This study proposes a research framework, constructed from the Stimulus-Organism-Response (S-O-R) framework, in order to address the issues comprising the stimulus of project risk, the organism of project management, and the response of organizational performance for cloud service solutions. Cloud customer relationship management (cloud CRM) experts, based on cloud computing, with many years of project management experience, were selected as the interview sample in this study. The Decision Making Trial and Evaluation Laboratory-based analytical network process (DEMATEL-based ANP, DANP) is a multiple-criteria decision-making (MCDM) analysis tool that does not require prior assumptions, and it was used to examine the dynamic relationships among project risk, project management, and organizational performance. The study results include three directions: (a) improving internal business process performance can improve the efficiency of cloud CRM project processes and activities; (b) an emphasis on financial performance management can reduce the cost of a cloud CRM project so that the project can be completed within the approved budget; (c) meeting user needs can mitigate user risk and reduce negative cloud CRM user experience. The scientific value of this study can be extended, through its research methods and framework, to the study of different projects, in order to explore project risk management and corporate performance improvements.
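
    In its standard textbook form (assumed here; the paper's exact DANP variant may differ), the DEMATEL step behind DANP normalises a direct-influence matrix and computes the total-relation matrix T = N(I - N)^-1, from which prominence and relation indices are read off. The 3x3 influence scores below are invented.

    import numpy as np

    # Hypothetical direct-influence matrix among the three constructs
    # (project risk, project management, organizational performance), scored 0-4.
    D = np.array([[0, 3, 2],
                  [2, 0, 3],
                  [1, 2, 0]], dtype=float)

    # Normalise by the largest row/column sum, then compute the total-relation matrix.
    s = max(D.sum(axis=1).max(), D.sum(axis=0).max())
    N = D / s
    T = N @ np.linalg.inv(np.eye(3) - N)

    r = T.sum(axis=1)        # influence given by each construct
    c = T.sum(axis=0)        # influence received by each construct
    print("prominence (r + c):", np.round(r + c, 3))
    print("relation   (r - c):", np.round(r - c, 3))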

  18. Aerosols, clouds, and precipitation in the North Atlantic trades observed during the Barbados aerosol cloud experiment – Part 1: Distributions and variability

    Directory of Open Access Journals (Sweden)

    E. Jung

    2016-07-01

    clouds were less than 1 km deep. Clouds tend to precipitate when the cloud is thicker than 500–600 m. Distributions of cloud field characteristics (depth, radar reflectivity, Doppler velocity, precipitation) were well identified in the reflectivity–velocity diagram from the cloud radar observations. Two types of precipitation features were observed for shallow marine cumulus clouds that may impact the boundary layer differently: first, classic cloud-base precipitation, where precipitation shafts were observed to emanate from the cloud base; second, cloud-top precipitation, where precipitation shafts emanated mainly near the cloud tops, sometimes accompanied by precipitation near the cloud base. The second type of precipitation was more frequently observed during the experiment. Only 42–44 % of the clouds sampled were non-precipitating throughout the entire cloud layer; the rest of the clouds showed precipitation somewhere in the cloud, predominantly closer to the cloud top.

  19. OpenID connect as a security service in Cloud-based diagnostic imaging systems

    Science.gov (United States)

    Ma, Weina; Sartipi, Kamran; Sharghi, Hassan; Koff, David; Bak, Peter

    2015-03-01

    The evolution of cloud computing is driving the next generation of diagnostic imaging (DI) systems. Cloud-based DI systems are able to deliver better services to patients without being constrained to their own physical facilities. However, privacy and security concerns have consistently been regarded as the major obstacle to the adoption of cloud computing in healthcare domains. Furthermore, traditional computing models and interfaces employed by DI systems are not ready for accessing diagnostic images through mobile devices. RESTful is an ideal technology for provisioning both mobile services and cloud computing. OpenID Connect, combining OpenID and OAuth together, is an emerging REST-based federated identity solution. It is one of the most promising open standards to potentially become the de facto standard for securing cloud computing and mobile applications, and has been regarded as the "Kerberos of the Cloud". We introduce OpenID Connect as an identity and authentication service in cloud-based DI systems and propose enhancements that allow for incorporating this technology within a distributed enterprise environment. The objective of this study is to offer solutions for secure radiology image sharing among DI-r (Diagnostic Imaging Repository) and heterogeneous PACS (Picture Archiving and Communication Systems) as well as mobile clients in the cloud ecosystem. Through using OpenID Connect as an open-source identity and authentication service, deploying DI-r and PACS to private or community clouds should achieve a security level equivalent to that of the traditional computing model.

  20. Cloud Study Investigators: Using NASA's CERES S'COOL in Problem-Based Learning

    Science.gov (United States)

    Moore, Susan; Popiolkowski, Gary

    2011-01-01

    This article describes how, by incorporating NASA's Students' Cloud Observations On-Line (S'COOL) project into a problem-based learning (PBL) activity, middle school students are engaged in authentic scientific research where they observe and record information about clouds and contribute ground truth data to NASA's Clouds and the Earth's…

  1. Managing the move to the cloud – analyzing the risks and opportunities of cloud-based accounting information systems

    OpenAIRE

    Asatiani, Aleksandre; Penttinen, Esko

    2015-01-01

    The accounting industry is being disrupted by the introduction of cloud-based accounting information systems (AIS) that allow for a more efficient allocation of work between the accountant and the client company. In cloud-based AIS, the accountant and the client company as well as third parties such as auditors can simultaneously work on the data in real time. This, in turn, enables a much more granular division of work between the parties. This teaching case considers Kluuvin Apteekki, a sma...

  2. Study on Cloud Computing Resource Scheduling Strategy Based on the Ant Colony Optimization Algorithm

    OpenAIRE

    Lingna He; Qingshui Li; Linan Zhu

    2012-01-01

    In order to replace traditional Internet software usage patterns and enterprise management modes, this paper proposes a new business calculation mode - cloud computing. Resource scheduling strategy is the key technology in cloud computing. Based on a study of the cloud computing system structure and mode of operation, the key research addresses the work scheduling process and resource allocation problems in cloud computing based on the ant colony algorithm. Detailed analysis and design of the...

  3. GPU-Based Point Cloud Superpositioning for Structural Comparisons of Protein Binding Sites.

    Science.gov (United States)

    Leinweber, Matthias; Fober, Thomas; Freisleben, Bernd

    2018-01-01

    In this paper, we present a novel approach to solve the labeled point cloud superpositioning problem for performing structural comparisons of protein binding sites. The solution is based on a parallel evolution strategy that operates on large populations and runs on GPU hardware. The proposed evolution strategy reduces the likelihood of getting stuck in a local optimum of the multimodal real-valued optimization problem represented by labeled point cloud superpositioning. The performance of the GPU-based parallel evolution strategy is compared to a previously proposed CPU-based sequential approach for labeled point cloud superpositioning, indicating that the GPU-based parallel evolution strategy leads to qualitatively better results and significantly shorter runtimes, with speed improvements of up to a factor of 1,500 for large populations. Binary classification tests based on the ATP, NADH, and FAD protein subsets of CavBase, a database containing putative binding sites, show average classification rate improvements from about 92 percent (CPU) to 96 percent (GPU). Further experiments indicate that the proposed GPU-based labeled point cloud superpositioning approach can be superior to traditional protein comparison approaches based on sequence alignments.
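
    The underlying optimisation problem (find a rigid transform that superimposes two point clouds) can be sketched with a very small elitist evolution strategy on random data, as below; the GPU parallelisation, point labels and CavBase-specific scoring of the paper are omitted, and all sizes are arbitrary.

    import numpy as np

    rng = np.random.default_rng(0)

    def rotation(angles):
        """3-D rotation matrix from Euler angles (z-y-x convention)."""
        a, b, c = angles
        Rz = np.array([[np.cos(a), -np.sin(a), 0], [np.sin(a), np.cos(a), 0], [0, 0, 1]])
        Ry = np.array([[np.cos(b), 0, np.sin(b)], [0, 1, 0], [-np.sin(b), 0, np.cos(b)]])
        Rx = np.array([[1, 0, 0], [0, np.cos(c), -np.sin(c)], [0, np.sin(c), np.cos(c)]])
        return Rz @ Ry @ Rx

    def cost(params, A, B):
        """Mean squared distance between transformed A and B (same point order assumed)."""
        R, t = rotation(params[:3]), params[3:]
        return np.mean(np.sum((A @ R.T + t - B) ** 2, axis=1))

    # Reference cloud B and a rotated/shifted copy A to be superimposed onto it.
    B = rng.normal(size=(50, 3))
    A = (B - 0.3) @ rotation([0.4, -0.2, 0.7])

    best, sigma = np.zeros(6), 0.5
    for _ in range(300):                        # (1+lambda)-style evolution strategy
        offspring = best + sigma * rng.normal(size=(40, 6))
        costs = np.array([cost(o, A, B) for o in offspring])
        if costs.min() < cost(best, A, B):
            best = offspring[costs.argmin()]
        sigma *= 0.99                           # simple step-size decay
    print(f"final superposition error: {cost(best, A, B):.4f}")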

  4. A Machine Learning Based Intrusion Impact Analysis Scheme for Clouds

    Directory of Open Access Journals (Sweden)

    Junaid Arshad

    2012-01-01

    Full Text Available Clouds represent a major paradigm shift, inspiring the contemporary approach to computing. They present fascinating opportunities to address dynamic user requirements with the provision of on demand expandable computing infrastructures. However, Clouds introduce novel security challenges which need to be addressed to facilitate widespread adoption. This paper is focused on one such challenge - intrusion impact analysis. In particular, we highlight the significance of intrusion impact analysis for the overall security of Clouds. Additionally, we present a machine learning based scheme to address this challenge in accordance with the specific requirements of Clouds for intrusion impact analysis. We also present rigorous evaluation performed to assess the effectiveness and feasibility of the proposed method to address this challenge for Clouds. The evaluation results demonstrate high degree of effectiveness to correctly determine the impact of an intrusion along with significant reduction with respect to the intrusion response time.

  5. Identity based Encryption and Biometric Authentication Scheme for Secure Data Access in Cloud Computing

    DEFF Research Database (Denmark)

    Cheng, Hongbing; Rong, Chunming; Tan, Zheng-Hua

    2012-01-01

    Cloud computing will be a main information infrastructure in the future; it consists of many large datacenters which are usually geographically distributed and heterogeneous. How to design a secure data access scheme for a cloud computing platform is a big challenge. In this paper, we propose a secure data access scheme based on identity-based encryption and biometric authentication for cloud computing. Firstly, we describe the security concerns of cloud computing and then propose an integrated data access scheme for cloud computing; the procedure of the proposed scheme includes parameter setup, key distribution, feature template creation, cloud data processing and secure data access control. Finally, we compare the proposed scheme with other schemes through comprehensive analysis and simulation. The results show that the proposed data access scheme is feasible and secure for cloud computing.

  6. Deadline based scheduling for data-intensive applications in clouds

    Institute of Scientific and Technical Information of China (English)

    Fu Xiong; Cang Yeliang; Zhu Lipeng; Hu Bin; Deng Song; Wang Dong

    2016-01-01

    Cloud computing emerges as a new computing pattern that can provide elastic services for any users around the world. It provides good chances to solve large scale scientific problems with fewer efforts. Application deployment remains an important issue in clouds. Appropriate scheduling mechanisms can shorten the total completion time of an application and therefore improve the quality of service (QoS) for cloud users. Unlike current scheduling algorithms which mostly focus on single task allocation, we propose a deadline based scheduling approach for data-intensive applications in clouds. It does not simply consider the total completion time of an application as the sum of all its subtasks' completion time. Not only the computation capacity of the virtual machine (VM) is considered, but also the communication delay and data access latencies are taken into account. Simulations show that our proposed approach has a decided advantage over the two other algorithms.
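
    A minimal sketch of the kind of deadline-aware placement described above is given below in Python. The VM and subtask fields, the bandwidth model and the greedy earliest-finish rule are illustrative assumptions, not the paper's exact scheduling model.

      # Greedy deadline-aware assignment of subtasks to VMs, accounting for computation
      # time, communication delay and data access latency (all fields are illustrative).
      from dataclasses import dataclass

      @dataclass
      class VM:
          name: str
          mips: float          # computation capacity (instructions per second)
          ready_time: float    # time at which the VM becomes free

      @dataclass
      class Subtask:
          name: str
          length: float        # instructions to execute
          data_in: float       # MB to transfer before the subtask can start
          latency: float       # data access latency in seconds

      def schedule(subtasks, vms, bandwidth_mb_s, deadline):
          plan = []
          for task in subtasks:
              best_vm, best_finish = None, float("inf")
              for vm in vms:
                  comm = task.data_in / bandwidth_mb_s + task.latency
                  finish = vm.ready_time + comm + task.length / vm.mips
                  if finish < best_finish:
                      best_vm, best_finish = vm, finish
              best_vm.ready_time = best_finish               # VM is busy until the subtask finishes
              plan.append((task.name, best_vm.name, best_finish))
          makespan = max(finish for _, _, finish in plan)
          return plan, makespan, makespan <= deadline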

  7. PRINCIPLES OF MODERN UNIVERSITY "ACADEMIC CLOUD" FORMATION BASED ON OPEN SOFTWARE PLATFORM

    Directory of Open Access Journals (Sweden)

    Olena H. Hlazunova

    2014-09-01

    Full Text Available In the article, approaches to the use of cloud technology in teaching higher education students are analyzed. The essence of the concept of an "academic cloud" and its structural elements are justified. A model of the academic cloud of a modern university, which operates on the basis of open software platforms, is proposed. Examples of functional software and platforms that meet students' needs for e-learning resources are given. Deployment models for a cloud-oriented environment in higher education (private cloud, infrastructure as a service, and platform as a service) are analyzed. The costs of deploying an "academic cloud" on the institution's own infrastructure and on infrastructure leased from a vendor are compared.

  8. A Wing Pod-based Millimeter Wave Cloud Radar on HIAPER

    Science.gov (United States)

    Vivekanandan, Jothiram; Tsai, Peisang; Ellis, Scott; Loew, Eric; Lee, Wen-Chau; Emmett, Joanthan

    2014-05-01

    One of the attractive features of a millimeter wave radar system is its ability to detect micron-sized particles that constitute clouds with lower than 0.1 g m-3 liquid or ice water content. Scanning or vertically-pointing ground-based millimeter wavelength radars are used to study stratocumulus (Vali et al. 1998; Kollias and Albrecht 2000) and fair-weather cumulus (Kollias et al. 2001). Airborne millimeter wavelength radars have been used for atmospheric remote sensing since the early 1990s (Pazmany et al. 1995). Airborne millimeter wavelength radar systems, such as the University of Wyoming King Air Cloud Radar (WCR) and the NASA ER-2 Cloud Radar System (CRS), have added mobility to observe clouds in remote regions and over oceans. Scientific requirements of millimeter wavelength radar are mainly driven by climate and cloud initiation studies. Survey results from the cloud radar user community indicated a common preference for a narrow beam W-band radar with polarimetric and Doppler capabilities for airborne remote sensing of clouds. For detecting small amounts of liquid and ice, it is desired to have -30 dBZ sensitivity at a 10 km range. Additional desired capabilities included a second wavelength and/or dual-Doppler winds. Modern radar technology offers various options (e.g., dual-polarization and dual-wavelength). Even though a basic fixed beam Doppler radar system with a sensitivity of -30 dBZ at 10 km is capable of satisfying cloud detection requirements, the above-mentioned additional options, namely dual-wavelength, and dual-polarization, significantly extend the measurement capabilities to further reduce any uncertainty in radar-based retrievals of cloud properties. This paper describes a novel, airborne pod-based millimeter wave radar, preliminary radar measurements and corresponding derived scientific products. Since some of the primary engineering requirements of this millimeter wave radar are that it should be deployable on an airborne platform

  9. FINDING CUBOID-BASED BUILDING MODELS IN POINT CLOUDS

    Directory of Open Access Journals (Sweden)

    W. Nguatem

    2012-07-01

    Full Text Available In this paper, we present an automatic approach for the derivation of 3D building models of level-of-detail 1 (LOD 1) from point clouds obtained from (dense) image matching or, for comparison only, from LIDAR. Our approach makes use of the predominance of vertical structures and orthogonal intersections in architectural scenes. After robustly determining the scene's vertical direction based on the 3D points, we use it as a constraint for a RANSAC-based search for vertical planes in the point cloud. The planes are further analyzed to segment reliable outlines of rectangular surfaces within these planes, which are connected to construct cuboid-based building models. We demonstrate that our approach is robust and effective over a range of real-world input data sets with varying point density, amount of noise, and outliers.
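
    The core of the plane extraction step can be sketched as follows in Python. The sketch assumes the vertical direction has already been estimated (taken here as the +z axis); the tolerances and iteration count are illustrative, and this is not the authors' implementation.

      # RANSAC search for a vertical plane in an (N, 3) point cloud, given the scene's
      # estimated vertical direction 'up'. Tolerances are illustrative.
      import numpy as np

      def ransac_vertical_plane(points, up=np.array([0.0, 0.0, 1.0]),
                                iters=1000, dist_tol=0.05, vert_tol_deg=5.0, rng=None):
          rng = rng or np.random.default_rng(0)
          best_normal, best_point, best_inliers = None, None, np.array([], dtype=int)
          max_tilt = np.cos(np.radians(90.0 - vert_tol_deg))   # vertical plane => normal nearly horizontal
          for _ in range(iters):
              p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
              normal = np.cross(p1 - p0, p2 - p0)
              norm = np.linalg.norm(normal)
              if norm < 1e-9:
                  continue                                     # degenerate (collinear) sample
              normal /= norm
              if abs(normal @ up) > max_tilt:
                  continue                                     # candidate plane is not vertical enough
              dist = np.abs((points - p0) @ normal)
              inliers = np.nonzero(dist < dist_tol)[0]
              if len(inliers) > len(best_inliers):
                  best_normal, best_point, best_inliers = normal, p0, inliers
          return best_normal, best_point, best_inliers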

  10. QoE Guarantee Scheme Based on Cooperative Cognitive Cloud and Opportunistic Weight Particle Swarm

    Directory of Open Access Journals (Sweden)

    Weihang Shi

    2015-01-01

    Full Text Available It is well known that the Internet application of cloud services may be seriously affected by the inefficiency of cloud computing and by inaccurate evaluation of quality of experience (QoE). In our paper, a QoE guarantee mechanism is proposed, based on construction algorithms for a cooperative cognitive cloud platform and an optimization algorithm for opportunity-weight particle swarm clustering. The mechanism, through the cooperation of the request-sending users and their cognitive neighbor users, combines the cooperation of sub-cloud platforms and constructs the optimal cloud platform for the different services. At the same time, the particle swarm optimization algorithm can be enhanced dynamically according to the various opportunity request weights, which optimizes the cooperative cognitive cloud platform. Finally, the QoE guarantee scheme is proposed by combining the opportunity-weight particle swarm optimization algorithm and the collaborative cognitive cloud platform. The experimental results show that the proposed mechanism is superior to the QoE guarantee scheme based on the cooperative cloud alone and to the QoE guarantee scheme based on particle swarm optimization alone, with better optimization fitness, higher cloud computing service execution efficiency, and higher throughput.

  11. Model Based Business and IT Cloud Alignment as a Cloud Offering

    OpenAIRE

    Robert Woitsch; Wilfrid Utz

    2015-01-01

    Cloud computing proved to offer flexible IT solutions. Although large enterprises may benefit from this technology by educating their IT departments, SMEs are dramatically falling behind in cloud usage and hence lose the ability to efficiently adapt their IT to their business needs. This paper introduces the project idea of the H2020 project CloudSocket, by elaborating the idea of Business Processes as a Service, where concept models and semantics are applied to align business pro...

  12. Clone-based Data Index in Cloud Storage Systems

    Directory of Open Access Journals (Sweden)

    He Jing

    2016-01-01

    Full Text Available Storage systems have been challenged by the development of cloud computing. The traditional data index cannot satisfy the requirements of cloud computing because of the huge index volumes and the need for quick response times. Meanwhile, because of the increasing size of the data index and its dynamic characteristics, previous approaches, which rebuild the index or fully back up the index before the data changes, cannot satisfy the needs of today's big data indexing. To solve these problems, we propose a double-layer index structure that overcomes the throughput limitation of a single point server. Then, a clone-based B+ tree structure is proposed to achieve high performance and adapt to dynamic environments. The experimental results show that our clone-based solution has high efficiency.

  13. OpenID Connect as a security service in cloud-based medical imaging systems.

    Science.gov (United States)

    Ma, Weina; Sartipi, Kamran; Sharghigoorabi, Hassan; Koff, David; Bak, Peter

    2016-04-01

    The evolution of cloud computing is driving the next generation of medical imaging systems. However, privacy and security concerns have been consistently regarded as the major obstacles for adoption of cloud computing by healthcare domains. OpenID Connect, combining OpenID and OAuth together, is an emerging representational state transfer-based federated identity solution. It is one of the most adopted open standards to potentially become the de facto standard for securing cloud computing and mobile applications, which is also regarded as "Kerberos of cloud." We introduce OpenID Connect as an authentication and authorization service in cloud-based diagnostic imaging (DI) systems, and propose enhancements that allow for incorporating this technology within distributed enterprise environments. The objective of this study is to offer solutions for secure sharing of medical images among diagnostic imaging repository (DI-r) and heterogeneous picture archiving and communication systems (PACS) as well as Web-based and mobile clients in the cloud ecosystem. The main objective is to use the open-source OpenID Connect single sign-on and authorization service in a user-centric manner, while deploying DI-r and PACS to private or community clouds at security levels equivalent to those of the traditional computing model.

  14. A simple method for determination of carmine in food samples based on cloud point extraction and spectrophotometric detection.

    Science.gov (United States)

    Heydari, Rouhollah; Hosseini, Mohammad; Zarabi, Sanaz

    2015-01-01

    In this paper, a simple and cost-effective method was developed for extraction and pre-concentration of carmine in food samples by using cloud point extraction (CPE) prior to its spectrophotometric determination. Carmine was extracted from aqueous solution using Triton X-100 as extracting solvent. The effects of main parameters such as solution pH, surfactant and salt concentrations, incubation time and temperature were investigated and optimized. The calibration graph was linear in the range of 0.04-5.0 μg mL(-1) of carmine in the initial solution with a regression coefficient of 0.9995. The limit of detection (LOD) and limit of quantification were 0.012 and 0.04 μg mL(-1), respectively. The relative standard deviation (RSD) at a low concentration level (0.05 μg mL(-1)) of carmine was 4.8% (n=7). Recovery values at different concentration levels were in the range of 93.7-105.8%. The obtained results demonstrate that the proposed method can be applied satisfactorily to determine carmine in food samples. Copyright © 2015 Elsevier B.V. All rights reserved.
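
    The figures of merit quoted above follow from standard calibration arithmetic; the short Python sketch below reproduces that arithmetic (linear fit, LOD = 3 x SD of the blank / slope, LOQ = 10 x SD of the blank / slope, percent recovery) on placeholder numbers, which are not the paper's data.

      # Calibration, LOD/LOQ and recovery arithmetic for a spectrophotometric method.
      import numpy as np

      conc = np.array([0.04, 0.1, 0.5, 1.0, 2.5, 5.0])           # ug/mL carmine standards (illustrative)
      absorbance = np.array([0.012, 0.030, 0.150, 0.300, 0.740, 1.490])
      blank_sd = 0.0012                                           # standard deviation of blank absorbance

      slope, intercept = np.polyfit(conc, absorbance, 1)          # linear calibration fit
      r = np.corrcoef(conc, absorbance)[0, 1]

      lod = 3 * blank_sd / slope                                  # limit of detection
      loq = 10 * blank_sd / slope                                 # limit of quantification

      def recovery_percent(found, added):
          # Percent recovery for a spiked sample.
          return 100.0 * found / added

      print(f"slope={slope:.4f}, R^2={r**2:.4f}, LOD={lod:.3f} ug/mL, LOQ={loq:.3f} ug/mL")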

  15. Cloud field classification based on textural features

    Science.gov (United States)

    Sengupta, Sailes Kumar

    1989-01-01

    An essential component in global climate research is accurate cloud cover and type determination. Of the two approaches to texture-based classification (statistical and textural), only the former is effective in the classification of natural scenes such as land, ocean, and atmosphere. In the statistical approach that was adopted, parameters characterizing the stochastic properties of the spatial distribution of grey levels in an image are estimated and then used as features for cloud classification. Two types of textural measures were used. One is based on the distribution of the grey level difference vector (GLDV), and the other on a set of textural features derived from the MaxMin cooccurrence matrix (MMCM). The GLDV method looks at the difference D of grey levels at pixels separated by a horizontal distance d and computes several statistics based on this distribution. These are then used as features in subsequent classification. The MaxMin textural features, on the other hand, are based on the MMCM, a matrix whose (I,J)th entry gives the relative frequency of occurrences of the grey level pair (I,J) that are consecutive and thresholded local extremes separated by a given pixel distance d. Textural measures are then computed based on this matrix in much the same manner as is done in texture computation using the grey level cooccurrence matrix. The database consists of 37 cloud field scenes from LANDSAT imagery using a near IR visible channel. The classification algorithm used is the well known Stepwise Discriminant Analysis. The overall accuracy was estimated by the percentage of correct classifications in each case. It turns out that both types of classifiers, at their best combination of features, and at any given spatial resolution give approximately the same classification accuracy. A neural network based classifier with a feed forward architecture and a back propagation training algorithm is used to increase the classification accuracy, using these two classes
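
    The GLDV statistics mentioned above can be computed with a few lines of Python; the sketch below assumes an 8-bit grey level image and a horizontal displacement d, and the particular set of statistics returned is an illustrative choice rather than the exact feature set of the study.

      # Grey level difference vector (GLDV) texture features for horizontal displacement d.
      import numpy as np

      def gldv_features(image, d=1, levels=256):
          # Absolute grey level differences between pixels separated horizontally by d.
          left = image[:, :-d].astype(int)
          right = image[:, d:].astype(int)
          diff = np.abs(right - left).ravel()
          p = np.bincount(diff, minlength=levels).astype(float)
          p /= p.sum()                                           # probability of each difference value
          k = np.arange(levels)
          return {
              "mean": (k * p).sum(),
              "contrast": (k ** 2 * p).sum(),
              "entropy": -(p[p > 0] * np.log2(p[p > 0])).sum(),
              "angular_second_moment": (p ** 2).sum(),
          }

      # Example: features = gldv_features(cloud_scene_patch, d=2) for an 8-bit image patch.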

  16. Introducing two Random Forest based methods for cloud detection in remote sensing images

    Science.gov (United States)

    Ghasemian, Nafiseh; Akhoondzadeh, Mehdi

    2018-07-01

    Cloud detection is a necessary phase in satellite image processing to retrieve the atmospheric and lithospheric parameters. Currently, some cloud detection methods based on the Random Forest (RF) model have been proposed, but they do not consider both spectral and textural characteristics of the image. Furthermore, they have not been tested in the presence of snow/ice. In this paper, we introduce two RF based algorithms, Feature Level Fusion Random Forest (FLFRF) and Decision Level Fusion Random Forest (DLFRF), to incorporate visible, infrared (IR) and thermal spectral and textural features (FLFRF), including Gray Level Co-occurrence Matrix (GLCM) and Robust Extended Local Binary Pattern (RELBP_CI), or visible, IR and thermal classifiers (DLFRF) for highly accurate cloud detection on remote sensing images. FLFRF first fuses visible, IR and thermal features. Thereafter, it uses the RF model to classify pixels as cloud, snow/ice and background or as thick cloud, thin cloud and background. DLFRF considers visible, IR and thermal features (both spectral and textural) separately and inserts each set of features into the RF model. Then, it holds the vote matrix of each run of the model. Finally, it fuses the classifiers using the majority vote method. To demonstrate the effectiveness of the proposed algorithms, 10 Terra MODIS and 15 Landsat 8 OLI/TIRS images with different spatial resolutions are used in this paper. Quantitative analyses are based on manually selected ground truth data. Results show that after adding RELBP_CI to the input feature set, cloud detection accuracy improves. Also, the average cloud kappa values of FLFRF and DLFRF on MODIS images (1 and 0.99) are higher than those of other machine learning methods, Linear Discriminant Analysis (LDA), Classification And Regression Tree (CART), K Nearest Neighbor (KNN) and Support Vector Machine (SVM) (0.96). The average snow/ice kappa values of FLFRF and DLFRF on MODIS images (1 and 0.85) are higher than other traditional methods. The
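
    The two fusion strategies can be sketched with scikit-learn as below; array shapes, hyperparameters and the assumption that classes are integer-coded (e.g. 0 = background, 1 = cloud, 2 = snow/ice) are illustrative, and the texture features (GLCM, RELBP_CI) are assumed to be precomputed.

      # Feature-level fusion (one forest on concatenated features) vs decision-level
      # fusion (one forest per feature set, combined by majority vote).
      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      def flf_random_forest(vis, ir, thermal, labels):
          X = np.hstack([vis, ir, thermal])                      # feature-level fusion
          return RandomForestClassifier(n_estimators=200, random_state=0).fit(X, labels)

      def dlf_random_forest(vis, ir, thermal, labels):
          # One classifier per spectral/textural feature set.
          return [RandomForestClassifier(n_estimators=200, random_state=0).fit(X, labels)
                  for X in (vis, ir, thermal)]

      def dlf_predict(models, vis, ir, thermal):
          votes = np.stack([m.predict(X) for m, X in zip(models, (vis, ir, thermal))]).astype(int)
          # Majority vote across the three classifiers for each pixel.
          return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)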

  17. Stormbow: A Cloud-Based Tool for Reads Mapping and Expression Quantification in Large-Scale RNA-Seq Studies.

    Science.gov (United States)

    Zhao, Shanrong; Prenger, Kurt; Smith, Lance

    2013-01-01

    RNA-Seq is becoming a promising replacement to microarrays in transcriptome profiling and differential gene expression study. Technical improvements have decreased sequencing costs and, as a result, the size and number of RNA-Seq datasets have increased rapidly. However, the increasing volume of data from large-scale RNA-Seq studies poses a practical challenge for data analysis in a local environment. To meet this challenge, we developed Stormbow, a cloud-based software package, to process large volumes of RNA-Seq data in parallel. The performance of Stormbow has been tested by practically applying it to analyse 178 RNA-Seq samples in the cloud. In our test, it took 6 to 8 hours to process an RNA-Seq sample with 100 million reads, and the average cost was $3.50 per sample. Utilizing Amazon Web Services as the infrastructure for Stormbow allows us to easily scale up to handle large datasets with on-demand computational resources. Stormbow is a scalable, cost effective, and open-source based tool for large-scale RNA-Seq data analysis. Stormbow can be freely downloaded and can be used out of box to process Illumina RNA-Seq datasets.

  18. Security on Cloud Revocation Authority using Identity Based Encryption

    Science.gov (United States)

    Rajaprabha, M. N.

    2017-11-01

    In the era of cloud computing, most people save their documents, files and other data in cloud storage. Because of this, security over the cloud is also important, since all of this confidential material resides in the cloud. To overcome public key infrastructure (PKI) issues, some revocable Identity Based Encryption (IBE) techniques have been introduced which eliminate the demand for PKI. One such technique introduces a key-update cloud service provider, which has two issues: high computation and communication costs, and limited scalability. To overcome these problems, we propose a system in which a Cloud Revocation Authority (CRA) is responsible for security and holds only the secret key for each user. The secret key is sent with the help of Advanced Encryption Standard (AES) security. The key is encrypted and sent to the CRA to authenticate the person who wants to share data or files or to communicate. Only with that key can the other user access the file; if a user applies an invalid key to a particular file, the information about that user and the file is sent to the administrator, who has the right to block or blacklist that person from using the system services.

  19. Smart learning services based on smart cloud computing.

    Science.gov (United States)

    Kim, Svetlana; Song, Su-Mi; Yoon, Yong-Ik

    2011-01-01

    Context-aware technologies can make e-learning services smarter and more efficient since context-aware services are based on the user's behavior. To add those technologies into existing e-learning services, a service architecture model is needed to transform the existing e-learning environment, which is situation-aware, into an environment that understands context as well. The context-awareness in e-learning may include the awareness of user profile and terminal context. In this paper, we propose a new notion of service that provides context-awareness to smart learning content in a cloud computing environment. We suggest the elastic four smarts (E4S)--smart pull, smart prospect, smart content, and smart push--concept for cloud services so that smart learning services are possible. The E4S focuses on meeting the users' needs by collecting and analyzing users' behavior, prospecting future services, building corresponding contents, and delivering the contents through the cloud computing environment. Users' behavior can be collected through mobile devices such as smart phones that have built-in sensors. As a result, the proposed smart e-learning model in a cloud computing environment provides personalized and customized learning services to its users.

  20. Smart Learning Services Based on Smart Cloud Computing

    Directory of Open Access Journals (Sweden)

    Yong-Ik Yoon

    2011-08-01

    Full Text Available Context-aware technologies can make e-learning services smarter and more efficient since context-aware services are based on the user's behavior. To add those technologies into existing e-learning services, a service architecture model is needed to transform the existing e-learning environment, which is situation-aware, into an environment that understands context as well. The context-awareness in e-learning may include the awareness of user profile and terminal context. In this paper, we propose a new notion of service that provides context-awareness to smart learning content in a cloud computing environment. We suggest the elastic four smarts (E4S)—smart pull, smart prospect, smart content, and smart push—concept for cloud services so that smart learning services are possible. The E4S focuses on meeting the users' needs by collecting and analyzing users' behavior, prospecting future services, building corresponding contents, and delivering the contents through the cloud computing environment. Users' behavior can be collected through mobile devices such as smart phones that have built-in sensors. As a result, the proposed smart e-learning model in a cloud computing environment provides personalized and customized learning services to its users.

  1. A Systematic Mapping Study of Software Architectures for Cloud Based Systems

    DEFF Research Database (Denmark)

    Chauhan, Muhammad Aufeef; Babar, Muhammad Ali

    2014-01-01

    Context: Cloud computing has gained significant attention of researchers and practitioners. This emerging paradigm is being used to provide solutions in multiple domains without huge upfront investment because of its on-demand resource-provisioning model. However, the information about how software...... of this study is to systematically identify and analyze the currently published research on the topics related to software architectures for cloud-based systems in order to identify architecture solutions for achieving quality requirements. Method: We decided to carry out a systematic mapping study to find...... as much peer-reviewed literature on the topics related to software architectures for cloud-based systems as possible. This study has been carried out by following the guidelines for conducting systematic literature reviews and systematic mapping studies as reported in the literature. Based on our paper...

  2. Influence of Ice Cloud Microphysics on Imager-Based Estimates of Earth's Radiation Budget

    Science.gov (United States)

    Loeb, N. G.; Kato, S.; Minnis, P.; Yang, P.; Sun-Mack, S.; Rose, F. G.; Hong, G.; Ham, S. H.

    2016-12-01

    A central objective of the Clouds and the Earth's Radiant Energy System (CERES) is to produce a long-term global climate data record of Earth's radiation budget from the TOA down to the surface along with the associated atmospheric and surface properties that influence it. CERES relies on a number of data sources, including broadband radiometers measuring incoming and reflected solar radiation and OLR, high-resolution spectral imagers, meteorological, aerosol and ozone assimilation data, and snow/sea-ice maps based on microwave radiometer data. While the TOA radiation budget is largely determined directly from accurate broadband radiometer measurements, the surface radiation budget is derived indirectly through radiative transfer model calculations initialized using imager-based cloud and aerosol retrievals and meteorological assimilation data. Because ice cloud particles exhibit a wide range of shapes, sizes and habits that cannot be independently retrieved a priori from passive visible/infrared imager measurements, assumptions about the scattering properties of ice clouds are necessary in order to retrieve ice cloud optical properties (e.g., optical depth) from imager radiances and to compute broadband radiative fluxes. This presentation will examine how the choice of an ice cloud particle model impacts computed shortwave (SW) radiative fluxes at the top-of-atmosphere (TOA) and surface. The ice cloud particle models considered correspond to those from prior, current and future CERES data product versions. During the CERES Edition2 (and Edition3) processing, ice cloud particles were assumed to be smooth hexagonal columns. In the Edition4, roughened hexagonal columns are assumed. The CERES team is now working on implementing in a future version an ice cloud particle model comprised of a two-habit ice cloud model consisting of roughened hexagonal columns and aggregates of roughened columnar elements. In each case, we use the same ice particle model in both the

  3. Cloud-based Jupyter Notebooks for Water Data Analysis

    Science.gov (United States)

    Castronova, A. M.; Brazil, L.; Seul, M.

    2017-12-01

    The development and adoption of technologies by the water science community to improve our ability to openly collaborate and share workflows will have a transformative impact on how we address the challenges associated with collaborative and reproducible scientific research. Jupyter notebooks offer one solution by providing an open-source platform for creating metadata-rich toolchains for modeling and data analysis applications. Adoption of this technology within the water sciences, coupled with publicly available datasets from agencies such as USGS, NASA, and EPA enables researchers to easily prototype and execute data intensive toolchains. Moreover, implementing this software stack in a cloud-based environment extends its native functionality to provide researchers a mechanism to build and execute toolchains that are too large or computationally demanding for typical desktop computers. Additionally, this cloud-based solution enables scientists to disseminate data processing routines alongside journal publications in an effort to support reproducibility. For example, these data collection and analysis toolchains can be shared, archived, and published using the HydroShare platform or downloaded and executed locally to reproduce scientific analysis. This work presents the design and implementation of a cloud-based Jupyter environment and its application for collecting, aggregating, and munging various datasets in a transparent, sharable, and self-documented manner. The goals of this work are to establish a free and open source platform for domain scientists to (1) conduct data intensive and computationally intensive collaborative research, (2) utilize high performance libraries, models, and routines within a pre-configured cloud environment, and (3) enable dissemination of research products. This presentation will discuss recent efforts towards achieving these goals, and describe the architectural design of the notebook server in an effort to support collaborative

  4. Aerosol and cloud properties derived from hyperspectral transmitted light in the southeast Atlantic sampled during field campaign deployments in 2016 and 2017

    Science.gov (United States)

    LeBlanc, S. E.; Redemann, J.; Flynn, C. J.; Segal-Rosenhaimer, M.; Kacenelenbogen, M. S.; Shinozuka, Y.; Pistone, K.; Karol, Y.; Schmidt, S.; Cochrane, S.; Chen, H.; Meyer, K.; Ferrare, R. A.; Burton, S. P.; Hostetler, C. A.; Hair, J. W.

    2017-12-01

    We present aerosol and cloud properties collected from airborne remote-sensing measurements in the southeast Atlantic during the recent NASA ObseRvations of CLouds above Aerosols and their intEractionS (ORACLES) field campaign. During the biomass burning seasons of September 2016 and August 2017, we sampled aerosol layers which overlaid marine stratocumulus clouds off the southwestern coast of Africa. We sampled these aerosol layers and the underlying clouds from the NASA P3 airborne platform with the Spectrometer for Sky-Scanning, Sun-Tracking Atmospheric Research (4STAR). Aerosol optical depth (AOD), along with trace gas content in the atmospheric column (water vapor, NO2, and O3), is obtained from the attenuation in the sun's direct beam, measured at the altitude of the airborne platform. Using hyperspectral transmitted light measurements from 4STAR, in conjunction with hyperspectral hemispheric irradiance measurements from the Solar Spectral Flux Radiometers (SSFR), we also obtained aerosol intensive properties (asymmetry parameter, single scattering albedo), aerosol size distributions, cloud optical depth (COD), cloud particle effective radius, and cloud thermodynamic phase. Aerosol intensive properties are retrieved from measurements of angularly resolved skylight and flight level spectral albedo using the inversion used with measurements from AERONET (Aerosol Robotic Network) that has been modified for airborne use. The cloud properties are obtained from 4STAR measurements of scattered light below clouds. We show a favorable initial comparison of the above-cloud AOD measured by 4STAR to this same product retrieved from measurements by the MODIS instrument on board the TERRA and AQUA satellites. The layer AOD observed above clouds will also be compared to integrated aerosol extinction profile measurements from the High Spectral Resolution Lidar-2 (HSRL-2).

  5. Observed Correlation between Aerosol and Cloud Base Height for Low Clouds at Baltimore and New York, United States

    Directory of Open Access Journals (Sweden)

    Sium Gebremariam

    2018-04-01

    Full Text Available The correlation between aerosol particulate matter with aerodynamic diameter ≤2.5 μm (PM2.5) and cloud base height (CBH) of low clouds (CBH lower than 1.5 km a.g.l.) at Baltimore and New York, United States, for an 8 year period (2007–2014) was investigated using information from the Automated Surface Observing System (ASOS) observations and collocated U.S. Environmental Protection Agency (EPA) observations. The lifting condensation level (LCL) heights were calculated and compared with the CBH. The monthly average observations show that PM2.5 decreases from 2007 to 2014 while there is no significant trend found for CBH and LCL. The variability of the LCL height agrees well with CBH but LCL height is systematically lower than CBH (~180 m lower). There was a significant negative correlation found between CBH–LCL and PM2.5. All of the cloud cases were separated into polluted and clean conditions based on the distribution of PM2.5 values. The distributions of CBH–LCL in the two groups show more cloud cases with smaller CBH–LCL in polluted conditions than in clean conditions.
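
    The CBH-LCL comparison can be illustrated with the common Espy-type approximation of roughly 125 m of LCL height per degree of dew point depression; the exact LCL formulation used in the study may differ, and the numbers below are placeholders rather than the ASOS/EPA observations.

      # Approximate LCL height from surface temperature and dew point, then correlate
      # the CBH-LCL difference with PM2.5 (all values illustrative).
      import numpy as np

      def lcl_height_m(temp_c, dewpoint_c):
          # Espy-type approximation: ~125 m per degree of dew point depression.
          return 125.0 * (np.asarray(temp_c) - np.asarray(dewpoint_c))

      temp_c = np.array([24.0, 26.0, 28.0, 27.0])                # surface air temperature (C)
      dewpt_c = np.array([18.0, 17.5, 16.0, 19.0])               # surface dew point (C)
      cbh_m = np.array([900.0, 1200.0, 1400.0, 1100.0])          # reported cloud base height (m a.g.l.)
      pm25 = np.array([12.0, 18.0, 25.0, 9.0])                   # PM2.5 in ug/m^3

      gap = cbh_m - lcl_height_m(temp_c, dewpt_c)                # CBH - LCL
      corr = np.corrcoef(pm25, gap)[0, 1]
      print(f"mean CBH-LCL = {gap.mean():.0f} m, corr(PM2.5, CBH-LCL) = {corr:.2f}")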

  6. mPano: cloud-based mobile panorama view from single picture

    Science.gov (United States)

    Li, Hongzhi; Zhu, Wenwu

    2013-09-01

    Panorama view provides people an informative and natural user experience to represent the whole scene. The advances on mobile augmented reality, mobile-cloud computing, and mobile internet can enable panorama view on mobile phone with new functionalities, such as anytime anywhere query where a landmark picture is and what the whole scene looks like. To generate and explore panorama view on mobile devices faces significant challenges due to the limitations of computing capacity, battery life, and memory size of mobile phones, as well as the bandwidth of mobile Internet connection. To address the challenges, this paper presents a novel cloud-based mobile panorama view system that can generate and view panorama-view on mobile devices from a single picture, namely "Pano". In our system, first, we propose a novel iterative multi-modal image retrieval (IMIR) approach to get spatially adjacent images using both tag and content information from the single picture. Second, we propose a cloud-based parallel server synthing approach to generate panorama view in cloud, against today's local-client synthing approach that is almost impossible for mobile phones. Third, we propose predictive-cache solution to reduce latency of image delivery from cloud server to the mobile client. We have built a real mobile panorama view system and perform experiments. The experimental results demonstrated the effectiveness of our system and the proposed key component technologies, especially for landmark images.

  7. A Highly Scalable Data Service (HSDS) using Cloud-based Storage Technologies for Earth Science Data

    Science.gov (United States)

    Michaelis, A.; Readey, J.; Votava, P.; Henderson, J.; Willmore, F.

    2017-12-01

    Cloud based infrastructure may offer several key benefits of scalability, built in redundancy, security mechanisms and reduced total cost of ownership as compared with a traditional data center approach. However, most of the tools and legacy software systems developed for online data repositories within the federal government were not developed with a cloud based infrastructure in mind and do not fully take advantage of commonly available cloud-based technologies. Moreover, services based on object storage are well established and provided by all the leading cloud service providers (Amazon Web Services, Microsoft Azure, Google Cloud, etc.), which can often provide unmatched "scale-out" capabilities and data availability to a large and growing consumer base at a price point unachievable with in-house solutions. We describe a system that utilizes object storage rather than traditional file system based storage to vend earth science data. The system described is not only cost effective, but shows a performance advantage for running many different analytics tasks in the cloud. To enable compatibility with existing tools and applications, we outline client libraries that are API compatible with existing libraries for HDF5 and NetCDF4. Performance of the system is demonstrated using cloud services running on Amazon Web Services.

  8. Design and implementation of a cloud based lithography illumination pupil processing application

    Science.gov (United States)

    Zhang, Youbao; Ma, Xinghua; Zhu, Jing; Zhang, Fang; Huang, Huijie

    2017-02-01

    Pupil parameters are important parameters for evaluating the quality of a lithography illumination system. In this paper, a cloud-based, full-featured pupil processing application is implemented. A web browser is used for the UI (User Interface), the WebSocket protocol and JSON format are used for communication between the client and the server, and the computing part is implemented on the server side, where the application integrates a variety of high-quality professional libraries, such as the image processing libraries libvips and ImageMagick and the automatic reporting system LaTeX, to support the program. The cloud-based framework takes advantage of the server's superior computing power and rich software collections, and the program can run anywhere there is a modern browser thanks to its web UI design. Compared to the traditional software operation model (purchased, licensed, shipped, downloaded, installed, maintained, and upgraded), the new cloud-based approach, which requires no installation and is easy to use and maintain, opens up a new way. Cloud-based applications are probably the future of software development.

  9. Cloud model construct for transaction-based cooperative systems ...

    African Journals Online (AJOL)

    Cloud model construct for transaction-based cooperative systems. ... procure cutting edge Information Technology infrastructure are some of the problems faced ... Results also reveal that credit cooperatives will benefit from the model by taking ...

  10. Auto-Scaling of Geo-Based Image Processing in an OpenStack Cloud Computing Environment

    OpenAIRE

    Sanggoo Kang; Kiwon Lee

    2016-01-01

    Cloud computing is a base platform for the distribution of large volumes of data and high-performance image processing on the Web. Despite wide applications in Web-based services and their many benefits, geo-spatial applications based on cloud computing technology are still developing. Auto-scaling realizes automatic scalability, i.e., the scale-out and scale-in processing of virtual servers in a cloud computing environment. This study investigates the applicability of auto-scaling to geo-bas...

  11. Automated cloud classification using a ground based infra-red camera and texture analysis techniques

    Science.gov (United States)

    Rumi, Emal; Kerr, David; Coupland, Jeremy M.; Sandford, Andrew P.; Brettle, Mike J.

    2013-10-01

    Clouds play an important role in influencing the dynamics of local and global weather and climate conditions. Continuous monitoring of clouds is vital for weather forecasting and for air-traffic control. Convective clouds such as Towering Cumulus (TCU) and Cumulonimbus clouds (CB) are associated with thunderstorms, turbulence and atmospheric instability. Human observers periodically report the presence of CB and TCU clouds during operational hours at airports and observatories; however such observations are expensive and time limited. Robust, automatic classification of cloud type using infrared ground-based instrumentation offers the advantage of continuous, real-time (24/7) data capture and the representation of cloud structure in the form of a thermal map, which can greatly help to characterise certain cloud formations. The work presented here utilised a ground based infrared (8-14 μm) imaging device mounted on a pan/tilt unit for capturing high spatial resolution sky images. These images were processed to extract 45 separate textural features using statistical and spatial frequency based analytical techniques. These features were used to train a weighted k-nearest neighbour (KNN) classifier in order to determine cloud type. Ground truth data were obtained by inspection of images captured simultaneously from a visible wavelength colour camera at the same installation, with approximately the same field of view as the infrared device. These images were classified by a trained cloud observer. Results from the KNN classifier gave an encouraging success rate. A Probability of Detection (POD) of up to 90% with a Probability of False Alarm (POFA) as low as 16% was achieved.
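
    The classification step can be sketched with scikit-learn as below, assuming the 45 texture features per sky image are already computed and that the observer labels from the co-located visible camera serve as ground truth; the value of k, the distance weighting and the cross-validation setup are illustrative choices, not the exact configuration used in the study.

      # Distance-weighted k-nearest-neighbour classification of cloud type from
      # precomputed texture features (shapes and labels are illustrative).
      from sklearn.neighbors import KNeighborsClassifier
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      def train_cloud_knn(features, labels, k=7):
          # features: (n_images, 45) texture features; labels: cloud type codes, e.g. CB / TCU / other.
          model = make_pipeline(StandardScaler(),
                                KNeighborsClassifier(n_neighbors=k, weights="distance"))
          scores = cross_val_score(model, features, labels, cv=5)
          model.fit(features, labels)
          return model, scores.mean()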

  12. An Analysis of Resilience of a Cloud Based Incident Notification Process

    OpenAIRE

    Vrieze, Paul; Xu, Lai

    2015-01-01

    Part 2: Agility and Resilience in Collaborative Networks; International audience; Cloud based Business Process Management (BPM) systems have provided SMEs with BPM in a pay-per-use manner. Previous work has focused on looking at cloud based BPM from the perspectives of distribution of data, activity and/or process engine and related issues, such as scalability of the system, security of data, distribution of data and activities. To achieve business agility, business process collaboration needs to...

  13. ROBUST AND EFFICIENT PRIVACY PRESERVING PUBLIC AUDITING FOR REGENERATING-CODE-BASED CLOUD STORAGE

    OpenAIRE

    Tessy Vincent*, Mrs.Krishnaveni.V.V

    2017-01-01

    Cloud computing is gaining more popularity because of its guaranteed services like online data storage and backup solutions, Web-based e-mail services, virtualized infrastructure etc. User is allowed to access the data stored in a cloud anytime, anywhere using internet connected device with low cost. To provide security to outsourced data in cloud storage against various corruptions, adding fault tolerance to cloud storage together with data integrity checking and failure reparation becomes c...

  14. Speciation and Determination of Low Concentration of Iron in Beer Samples by Cloud Point Extraction

    Science.gov (United States)

    Khalafi, Lida; Doolittle, Pamela; Wright, John

    2018-01-01

    A laboratory experiment is described in which students determine the concentration and speciation of iron in beer samples using cloud point extraction and absorbance spectroscopy. The basis of determination is the complexation between iron and 2-(5-bromo-2- pyridylazo)-5-diethylaminophenol (5-Br-PADAP) as a colorimetric reagent in an aqueous…

  15. A Security Monitoring Method Based on Autonomic Computing for the Cloud Platform

    Directory of Open Access Journals (Sweden)

    Jingjie Zhang

    2018-01-01

    Full Text Available With the continuous development of cloud computing, cloud security has become one of its most important issues. For example, data stored in the cloud platform may be attacked, and its security is difficult to guarantee. Therefore, we must pay attention to the issue of how to protect the data stored in the cloud. To protect data, data monitoring is a necessary process. Based on autonomic computing, we develop a cloud data monitoring system on the cloud platform, monitoring whether the data is abnormal in each cycle and analyzing the security of the data according to the monitored results. In this paper, the feasibility of the scheme is verified through simulation. The results show that the proposed method can adapt to the dynamic change of cloud platform load, and it can also accurately evaluate the degree of abnormal data. Meanwhile, by adjusting the monitoring frequency automatically, it improves the accuracy and timeliness of monitoring. Furthermore, it can reduce the monitoring cost of the system in the normal operation process.
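
    The idea of automatically adjusting the monitoring frequency can be sketched as a simple control loop in Python; the anomaly measure, thresholds and interval bounds below are illustrative assumptions, not the paper's autonomic model.

      # Monitoring loop that checks more often when the observed anomaly degree rises
      # and backs off when the data looks normal (all parameters illustrative).
      import time

      def anomaly_degree(sample, baseline_mean, baseline_std):
          # Z-score style measure of how far the observation is from the normal baseline.
          return abs(sample - baseline_mean) / max(baseline_std, 1e-9)

      def monitor(read_metric, baseline_mean, baseline_std,
                  interval=60.0, min_interval=5.0, max_interval=300.0, cycles=10):
          for _ in range(cycles):
              degree = anomaly_degree(read_metric(), baseline_mean, baseline_std)
              if degree > 3.0:
                  interval = max(min_interval, interval / 2.0)   # abnormal: monitor more frequently
              elif degree < 1.0:
                  interval = min(max_interval, interval * 1.5)   # normal: reduce monitoring cost
              print(f"anomaly degree={degree:.2f}, next check in {interval:.0f}s")
              time.sleep(interval)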

  16. Cloud-Based DDoS HTTP Attack Detection Using Covariance Matrix Approach

    Directory of Open Access Journals (Sweden)

    Abdulaziz Aborujilah

    2017-01-01

    Full Text Available In this era of technology, cloud computing has become an essential part of the IT services used in daily life. In this regard, website hosting services are gradually moving to the cloud. This adds valued new features to cloud-based websites and at the same time introduces new threats for such services. A DDoS attack is one such serious threat. A covariance matrix approach is used in this article to detect such attacks. The results were encouraging, according to confusion matrix and ROC descriptors.
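
    One way to realize a covariance-matrix detector is sketched below: a baseline covariance matrix of per-window traffic features is learned from normal traffic, and a window is flagged when its covariance matrix departs strongly from that baseline. The feature choice and threshold are illustrative assumptions, not the article's exact formulation.

      # Covariance-matrix based detection of HTTP-flood style anomalies.
      import numpy as np

      def window_covariance(window):
          # window: (n_samples, n_features) of per-second traffic features,
          # e.g. request rate, number of unique source IPs, mean payload size.
          return np.cov(window, rowvar=False)

      def train_baseline(normal_windows):
          covs = np.stack([window_covariance(w) for w in normal_windows])
          return covs.mean(axis=0), covs.std(axis=0)

      def is_attack(window, baseline_mean, baseline_std, threshold=3.0):
          deviation = np.abs(window_covariance(window) - baseline_mean) / (baseline_std + 1e-9)
          return bool(deviation.max() > threshold)           # any entry far outside the normal range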

  17. CLOUD-BASED VS DESKTOP-BASED PROPERTY MANAGEMENT SYSTEMS IN HOTEL

    Directory of Open Access Journals (Sweden)

    Mustafa GULMEZ

    2015-06-01

    Full Text Available Even though keeping up with modern developments in the IT sector is crucial for the success and competitiveness of a hotel, it is usually very hard for new technologies to be accepted and implemented. This is the case with cloud technology, on which hoteliers' opinions are divided between those who think it is just another fashion trend, unnecessary to take into consideration, and those who believe it helps in performing daily operations more easily, leaving space for more interaction with guests both in the virtual and the real world. Usage of cloud technology in hotels is still in its beginning phase, and hoteliers still have to learn more about its advantages and adequate usage for the benefit of overall hotel operations. Using the example of the hotel property management system (PMS) and a comparison between the features of its older desktop version and new web-based programs, this research aims at finding out at what stage the usage of cloud technology in hotels is and how effective it is. For this, qualitative research with semi-structured interviews with hotel managers that use one of these programs was conducted. Reasons for usage and advantages of each version are discussed.

  18. Point-Cloud Compression for Vehicle-Based Mobile Mapping Systems Using Portable Network Graphics

    Science.gov (United States)

    Kohira, K.; Masuda, H.

    2017-09-01

    A mobile mapping system is effective for capturing dense point-clouds of roads and roadside objects. Point-clouds of urban areas, residential areas, and arterial roads are useful for maintenance of infrastructure, map creation, and automatic driving. However, the data size of point-clouds measured in large areas is enormously large. A large storage capacity is required to store such point-clouds, and heavy loads will be placed on the network if point-clouds are transferred through it. Therefore, it is desirable to reduce the data size of point-clouds without deterioration of quality. In this research, we propose a novel point-cloud compression method for vehicle-based mobile mapping systems. In our compression method, point-clouds are mapped onto 2D pixels using GPS time and the parameters of the laser scanner. Then, the images are encoded in the Portable Network Graphics (PNG) format and compressed using the PNG algorithm. In our experiments, our method could efficiently compress point-clouds without deteriorating the quality.
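
    The mapping idea can be sketched as follows: each return is placed on a 2D grid whose columns come from GPS time (scan line) and whose rows come from the beam angle index, the range is quantized to 16 bits, and the grid is written as a lossless PNG. The grid layout, the 5 mm quantization step and the use of Pillow (assuming a recent version that writes uint16 arrays as 16-bit grayscale PNGs) are illustrative assumptions, not the paper's exact parameters.

      # Encode laser ranges as a 16-bit grayscale PNG and decode them back.
      import numpy as np
      from PIL import Image

      def encode_ranges_png(line_idx, angle_idx, ranges, path, range_step_m=0.005):
          # line_idx / angle_idx: integer grid coordinates derived from GPS time and the
          # scanner's angular resolution; ranges: distances in metres.
          height, width = angle_idx.max() + 1, line_idx.max() + 1
          grid = np.zeros((height, width), dtype=np.uint16)
          grid[angle_idx, line_idx] = np.clip(np.round(ranges / range_step_m), 0, 65535).astype(np.uint16)
          Image.fromarray(grid).save(path)                  # PNG applies its own lossless compression
          return grid

      def decode_ranges_png(path, range_step_m=0.005):
          grid = np.asarray(Image.open(path), dtype=np.uint16)
          return grid * range_step_m                        # zero entries mark pixels with no return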

  19. POINT-CLOUD COMPRESSION FOR VEHICLE-BASED MOBILE MAPPING SYSTEMS USING PORTABLE NETWORK GRAPHICS

    Directory of Open Access Journals (Sweden)

    K. Kohira

    2017-09-01

    Full Text Available A mobile mapping system is effective for capturing dense point-clouds of roads and roadside objects. Point-clouds of urban areas, residential areas, and arterial roads are useful for maintenance of infrastructure, map creation, and automatic driving. However, the data size of point-clouds measured in large areas is enormously large. A large storage capacity is required to store such point-clouds, and heavy loads will be placed on the network if point-clouds are transferred through it. Therefore, it is desirable to reduce the data size of point-clouds without deterioration of quality. In this research, we propose a novel point-cloud compression method for vehicle-based mobile mapping systems. In our compression method, point-clouds are mapped onto 2D pixels using GPS time and the parameters of the laser scanner. Then, the images are encoded in the Portable Network Graphics (PNG) format and compressed using the PNG algorithm. In our experiments, our method could efficiently compress point-clouds without deteriorating the quality.

  20. Cloud based emergency health care information service in India.

    Science.gov (United States)

    Karthikeyan, N; Sukanesh, R

    2012-12-01

    A hospital is a health care organization providing patient treatment by expert physicians, surgeons and equipment. A report from a health care accreditation group says that miscommunication between patients and health care providers is the reason for the gap in providing emergency medical care to people in need. In developing countries, illiteracy is a major root cause of deaths resulting from uncertain diseases, constituting a serious public health problem. Mentally affected, differently abled and unconscious patients can't communicate their medical history to medical practitioners. Also, medical practitioners can't edit or view DICOM images instantly. Our aim is to provide a palm vein pattern recognition based medical record retrieval system, using cloud computing, for the above mentioned people. Distributed computing technology is emerging in new forms such as grid computing and cloud computing. These new forms promise to deliver Information Technology (IT) as a service. In this paper, we have described how these new forms of distributed computing will be helpful for modern health care industries. Cloud computing is extending its benefits to industrial sectors, especially in medical scenarios. In cloud computing, IT-related capabilities and resources are provided as services, via distributed computing, on demand. This paper is concerned with delivering software as a service (SaaS) by means of cloud computing, with the aim of bringing the emergency health care sector under one umbrella with physically secured patient records. In framing emergency healthcare treatment, the crucial information needed to make decisions about patients is their previous health records. Thus a ubiquitous access to appropriate records is essential. Palm vein pattern recognition promises secure patient record access. Likewise our paper reveals an efficient means to view, edit or transfer the DICOM images instantly which was a challenging task for medical practitioners in the

  1. CCN Properties of Organic Aerosol Collected Below and within Marine Stratocumulus Clouds near Monterey, California

    Directory of Open Access Journals (Sweden)

    Akua Asa-Awuku

    2015-10-01

    Full Text Available The composition of aerosol from cloud droplets differs from that below cloud. Its implications for the Cloud Condensation Nuclei (CCN) activity are the focus of this study. Water-soluble organic matter from below cloud, and cloud droplet residuals off the coast of Monterey, California were collected; offline chemical composition, CCN activity and surface tension measurements coupled with Köhler Theory Analysis are used to infer the molar volume and surfactant characteristics of organics in both samples. Based on the surface tension depression of the samples, it is unlikely that the aerosol contains strong surfactants. The activation kinetics for all samples examined are consistent with rapid (NH4)2SO4 calibration aerosol. This is consistent with our current understanding of droplet kinetics for ambient CCN. However, the carbonaceous material in cloud drop residuals is far more hygroscopic than in sub-cloud aerosol, suggestive of the impact of cloud chemistry on the hygroscopic properties of organic matter.

  2. An adaptive process-based cloud infrastructure for space situational awareness applications

    Science.gov (United States)

    Liu, Bingwei; Chen, Yu; Shen, Dan; Chen, Genshe; Pham, Khanh; Blasch, Erik; Rubin, Bruce

    2014-06-01

    Space situational awareness (SSA) and defense space control capabilities are top priorities for groups that own or operate man-made spacecraft. Also, with the growing amount of space debris, there is an increase in demand for contextual understanding that necessitates the capability of collecting and processing a vast amount of sensor data. Cloud computing, which features scalable and flexible storage and computing services, has been recognized as an ideal candidate that can meet the large data contextual challenges as needed by SSA. Cloud computing consists of physical service providers and middleware virtual machines together with infrastructure, platform, and software as a service (IaaS, PaaS, SaaS) models. However, the typical Virtual Machine (VM) abstraction is on a per operating system basis, which is at too low a level and limits the flexibility of a mission application architecture. In response to this technical challenge, a novel adaptive process based cloud infrastructure for SSA applications is proposed in this paper. In addition, the details of the design rationale and a prototype are further examined. The SSA Cloud (SSAC) conceptual capability will potentially support space situation monitoring and tracking, object identification, and threat assessment. Lastly, the benefits of a more granular and flexible cloud computing resource allocation are illustrated for data processing and implementation considerations within a representative SSA system environment. We show that container-based virtualization performs better than hypervisor-based virtualization technology in an SSA scenario.

  3. A MAS-Based Cloud Service Brokering System to Respond Security Needs of Cloud Customers

    Directory of Open Access Journals (Sweden)

    Jamal Talbi

    2017-03-01

    Full Text Available Cloud computing is becoming a key factor in computer science and an important technology for many organizations to deliver different types of services. The companies which provide services to customers are called cloud service providers. The cloud users (CUs) are increasing and require secure, reliable and trustworthy cloud service providers (CSPs) from the market. So, it is a challenge for a new customer to choose a highly secure provider. This paper presents a cloud service brokering system to analyze and rank the most secure cloud service provider among the list of available providers. This model uses autonomous and flexible agents in a multi-agent system (MAS) that have intelligent behavior and suitable tools for helping the brokering system assess the security risks of the group of cloud providers, make a decision about the most secure provider, and justify the business needs of users in terms of security and reliability.

  4. Investigation of tropical cirrus cloud properties using ground based lidar measurements

    Science.gov (United States)

    Dhaman, Reji K.; Satyanarayana, Malladi; Krishnakumar, V.; Mahadevan Pillai, V. P.; Jayeshlal, G. S.; Raghunath, K.; Venkat Ratnam, M.

    2016-05-01

    Cirrus clouds play a significant role in the Earth's radiation budget. Therefore, knowledge of the geometrical and optical properties of cirrus clouds is essential for climate modeling. In this paper, cirrus cloud microphysical and optical properties are derived using ground-based lidar measurements over an inland tropical station, Gadanki (13.5°N, 79.2°E), Andhra Pradesh, India. The variation of cirrus microphysical and optical properties with mid-cloud temperature is also studied. The cirrus cloud mean height is generally observed in the range of 9-17 km with a peak occurrence at 13-14 km. The cirrus mid-cloud temperature ranges from -81°C to -46°C. The cirrus geometrical thickness ranges from 0.9-4.5 km. On cirrus occurrence days, sub-visual, thin and dense cirrus occurred 37.5%, 50% and 12.5% of the time, respectively. The monthly cirrus optical depth ranges from 0.01-0.47, but most (<80%) of the cirrus have values less than 0.1. Optical depth shows a strong dependence on cirrus geometrical thickness and mid-cloud height. The monthly mean cirrus extinction ranges from 2.8E-06 to 8E-05 and the depolarization ratio and lidar ratio vary from 0.13 to 0.77 and 2 to 52 sr, respectively. A positive correlation exists for both optical depth and extinction with the mid-cloud temperature. The lidar ratio shows a scattered behavior with mid-cloud temperature.

  5. Cloud blueprints for integrating and managing cloud federations

    NARCIS (Netherlands)

    Papazoglou, M.; Heisel, M.

    2012-01-01

    Contemporary cloud technologies face insurmountable obstacles. They follow a pull-based, producer-centric trajectory to development where cloud consumers have to ‘squeeze and bolt’ applications onto cloud APIs. They also introduce a monolithic SaaS/PaaS/IaaS stack where a one-size-fits-all mentality

  6. Cloud Point Extraction and Determination of Silver Ion in Real Sample using Bis((1H-benzo[d]imidazol-2-yl)methyl)sulfane

    Directory of Open Access Journals (Sweden)

    Farshid Ahmadi

    2011-01-01

    Full Text Available Bis((1H-benzo[d]imidazol-2-yl)methyl)sulfane (BHIS) was used as a complexing agent in cloud point extraction for the first time and applied to the selective pre-concentration of trace amounts of silver. The method is based on the extraction of silver at pH 8.0 by using the non-ionic surfactant T-X114 and bis((1H-benzo[d]imidazol-2-yl)methyl)sulfane as a chelating agent. The adopted concentrations of BHIS, Triton X-114 and HNO3, the bath temperature, and the centrifuge rate and time were optimized. A detection limit (3SDb/m) of 1.7, along with an enrichment factor of 39 for silver ion, was achieved. The high efficiency of cloud point extraction for the determination of analytes in complex matrices was demonstrated. The proposed method was successfully applied to the ultra-trace determination of silver in real samples.

  7. Stable water isotopologue ratios in fog and cloud droplets of liquid clouds are not size-dependent

    Science.gov (United States)

    Spiegel, J.K.; Aemisegger, F.; Scholl, M.; Wienhold, F.G.; Collett, J.L.; Lee, T.; van Pinxteren, D.; Mertes, S.; Tilgner, A.; Herrmann, H.; Werner, Roland A.; Buchmann, N.; Eugster, W.

    2012-01-01

    In this work, we present the first observations of stable water isotopologue ratios in cloud droplets of different sizes collected simultaneously. We address the question whether the isotope ratio of droplets in a liquid cloud varies as a function of droplet size. Samples were collected from a ground intercepted cloud (= fog) during the Hill Cap Cloud Thuringia 2010 campaign (HCCT-2010) using a three-stage Caltech Active Strand Cloud water Collector (CASCC). An instrument test revealed that no artificial isotopic fractionation occurs during sample collection with the CASCC. Furthermore, we could experimentally confirm the hypothesis that the δ values of cloud droplets of the relevant droplet sizes (μm-range) were not significantly different and thus can be assumed to be in isotopic equilibrium immediately with the surrounding water vapor. However, during the dissolution period of the cloud, when the supersaturation inside the cloud decreased and the cloud began to clear, differences in isotope ratios of the different droplet sizes tended to be larger. This is likely to result from the cloud's heterogeneity, implying that larger and smaller cloud droplets have been collected at different moments in time, delivering isotope ratios from different collection times.

  8. Stable water isotopologue ratios in fog and cloud droplets of liquid clouds are not size-dependent

    Directory of Open Access Journals (Sweden)

    J. K. Spiegel

    2012-10-01

    Full Text Available In this work, we present the first observations of stable water isotopologue ratios in cloud droplets of different sizes collected simultaneously. We address the question whether the isotope ratio of droplets in a liquid cloud varies as a function of droplet size. Samples were collected from a ground intercepted cloud (= fog) during the Hill Cap Cloud Thuringia 2010 campaign (HCCT-2010) using a three-stage Caltech Active Strand Cloud water Collector (CASCC). An instrument test revealed that no artificial isotopic fractionation occurs during sample collection with the CASCC. Furthermore, we could experimentally confirm the hypothesis that the δ values of cloud droplets of the relevant droplet sizes (μm-range) were not significantly different and thus can be assumed to be in isotopic equilibrium immediately with the surrounding water vapor. However, during the dissolution period of the cloud, when the supersaturation inside the cloud decreased and the cloud began to clear, differences in isotope ratios of the different droplet sizes tended to be larger. This is likely to result from the cloud's heterogeneity, implying that larger and smaller cloud droplets have been collected at different moments in time, delivering isotope ratios from different collection times.

  9. Study of the Effect of Temporal Sampling Frequency on DSCOVR Observations Using the GEOS-5 Nature Run Results. Part II; Cloud Coverage

    Science.gov (United States)

    Holdaway, Daniel; Yang, Yuekui

    2016-01-01

    This is the second part of a study on how temporal sampling frequency affects satellite retrievals in support of the Deep Space Climate Observatory (DSCOVR) mission. Continuing from Part 1, which looked at Earth's radiation budget, this paper presents the effect of sampling frequency on DSCOVR-derived cloud fraction. The output from NASA's Goddard Earth Observing System version 5 (GEOS-5) Nature Run is used as the "truth". The effect of temporal resolution on potential DSCOVR observations is assessed by subsampling the full Nature Run data. A set of metrics, including uncertainty and absolute error in the subsampled time series, correlation between the original and the subsamples, and Fourier analysis have been used for this study. Results show that, for a given sampling frequency, the uncertainties in the annual mean cloud fraction of the sunlit half of the Earth are larger over land than over ocean. Analysis of correlation coefficients between the subsamples and the original time series demonstrates that even though sampling at certain longer time intervals may not increase the uncertainty in the mean, the subsampled time series is further and further away from the "truth" as the sampling interval becomes larger and larger. Fourier analysis shows that the simulated DSCOVR cloud fraction has underlying periodical features at certain time intervals, such as 8, 12, and 24 h. If the data is subsampled at these frequencies, the uncertainties in the mean cloud fraction are higher. These results provide helpful insights for the DSCOVR temporal sampling strategy.
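
    A minimal sketch of the subsampling metrics described above (error in the mean, agreement with the original series, and Fourier analysis), using a synthetic hourly cloud-fraction series rather than the GEOS-5 Nature Run output:

```python
import numpy as np

rng = np.random.default_rng(0)
hours = np.arange(24 * 365)                          # one year of hourly values
# Synthetic "truth": sunlit-disk cloud fraction with a 24 h cycle plus noise.
truth = 0.6 + 0.05 * np.sin(2 * np.pi * hours / 24) + 0.02 * rng.standard_normal(hours.size)

for step in (1, 4, 8, 12, 24):                       # sampling interval in hours
    sub = truth[::step]
    held = np.repeat(sub, step)[: truth.size]        # hold each sample over its interval
    mean_err = abs(sub.mean() - truth.mean())        # error in the annual mean
    corr = np.corrcoef(truth, held)[0, 1]            # agreement with the hourly "truth"
    print(f"every {step:2d} h: mean error={mean_err:.4f}, corr={corr:.3f}")

# Fourier analysis: spectral peaks reveal periodic features (e.g. the 24 h cycle).
spec = np.abs(np.fft.rfft(truth - truth.mean()))
freq = np.fft.rfftfreq(truth.size, d=1.0)            # cycles per hour
print("dominant period (h):", round(1.0 / freq[spec.argmax()], 1))
```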

  10. A Multi-agent Supply Chain Information Coordination Mode Based on Cloud Computing

    OpenAIRE

    Wuxue Jiang; Jing Zhang; Junhuai Li

    2013-01-01

    In order to improve the efficiency and security of supply chain information coordination in a cloud computing environment, this paper proposes a supply chain information coordination mode based on cloud computing. This mode has two basic statuses: online and offline. In the online status, the cloud computing center is responsible for coordinating the whole supply chain information. In the offline status, information exchange can be realized among different nodes by u...

  11. Subtropical and Polar Cirrus Clouds Characterized by Ground-Based Lidars and CALIPSO/CALIOP Observations

    Directory of Open Access Journals (Sweden)

    Córdoba-Jabonero Carmen

    2016-01-01

    Full Text Available Cirrus clouds are products of weather processes, and their occurrence and macrophysical/optical properties can vary significantly over different regions of the world. Lidars provide height-resolved measurements with relatively good vertical and temporal resolution, making them the most suitable instruments for high-cloud observations. The aim of this work is to show the potential of lidar observations for cirrus cloud detection, in combination with a recently proposed methodology to retrieve the macrophysical and optical features of cirrus clouds. In this sense, a few case studies of cirrus clouds observed at both subtropical and polar latitudes are examined and compared to CALIPSO/CALIOP observations. Lidar measurements are carried out at two stations: the metropolitan city of Sao Paulo (MSP, Brazil, 23.3°S 46.4°W), located at subtropical latitudes, and the Belgrano II base (BEL, Argentina, 78°S 35°W) on the Antarctic continent. Optical (COD, cloud optical depth, and LR, lidar ratio) and macrophysical (top/base heights and thickness) properties of both the subtropical and polar cirrus clouds are reported. In general, subtropical cirrus clouds present lower LR values and are found at higher altitudes than those detected at polar latitudes. Cirrus clouds are detected at similar altitudes by CALIOP. However, poor agreement is found between the LR retrieved from ground-based lidars and from space-borne CALIOP measurements, likely due to the use of a fixed (or weakly variable) LR value in the CALIOP inversion procedures.

  12. RenderSelect: a Cloud Broker Framework for Cloud Renderfarm Services

    OpenAIRE

    Ruby, Annette J; Aisha, Banu W; Subash, Chandran P

    2016-01-01

    In 3D studios, animation scene files undergo a process called rendering, in which 3D wireframe models are converted into photorealistic images. As rendering is both computationally intensive and time consuming, cloud-based rendering in cloud render farms is gaining popularity among animators. Though cloud render farms offer many benefits, animators hesitate to move from their traditional offline rendering to cloud-based render ...

  13. A keyword searchable attribute-based encryption scheme with attribute update for cloud storage.

    Science.gov (United States)

    Wang, Shangping; Ye, Jian; Zhang, Yaling

    2018-01-01

    Ciphertext-policy attribute-based encryption (CP-ABE) is a new type of data encryption primitive that is well suited to cloud data storage because of its fine-grained access control. Keyword-based searchable encryption enables users to quickly find interesting data stored in a cloud server without revealing any information about the searched keywords. In this work, we provide a keyword searchable attribute-based encryption scheme with attribute update for cloud storage, which combines attribute-based encryption with keyword searchable encryption. The new scheme supports attribute update: when a user's attribute needs to be updated, only the components of that user's secret key related to the attribute need to be updated, while other users' secret keys and the ciphertexts related to this attribute need not be updated, with the help of the cloud server. In addition, we outsource the operations with high computation cost to the cloud server to reduce the user's computational burden. Moreover, our scheme is proven semantically secure against chosen ciphertext-policy and chosen plaintext attacks in the general bilinear group model, and semantically secure against chosen keyword attacks under the bilinear Diffie-Hellman (BDH) assumption.

  14. An Anomaly Detection Algorithm of Cloud Platform Based on Self-Organizing Maps

    Directory of Open Access Journals (Sweden)

    Jun Liu

    2016-01-01

    Full Text Available Virtual machines (VMs) on a cloud platform can be influenced by a variety of factors which can lead to decreased performance and downtime, affecting the reliability of the cloud platform. Traditional anomaly detection algorithms and strategies for cloud platforms have some flaws in their detection accuracy, detection speed, and adaptability. In this paper, a dynamic and adaptive anomaly detection algorithm based on Self-Organizing Maps (SOMs) for virtual machines is proposed. A unified SOM-based modeling method for the machine performance within the detection region is presented, which avoids the cost of modeling each virtual machine individually and enhances the detection speed and reliability for large-scale virtual machines on a cloud platform. The important parameters that affect the modeling speed are optimized in the SOM process to significantly improve the accuracy of the SOM modeling and therefore the anomaly detection accuracy for virtual machines.
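
    The paper's detector is not reproduced here; the sketch below only illustrates the general SOM idea under simple assumptions: train a small self-organizing map on "normal" VM metric vectors and flag samples whose quantization error (distance to the best-matching unit) exceeds a threshold. All data and parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def train_som(data, rows=8, cols=8, iters=2000, lr0=0.5, sigma0=3.0):
    """Train a tiny SOM; returns a weight grid of shape (rows, cols, n_features)."""
    w = rng.random((rows, cols, data.shape[1]))
    grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij"), axis=-1)
    for t in range(iters):
        x = data[rng.integers(len(data))]
        d = np.linalg.norm(w - x, axis=2)
        bmu = np.unravel_index(d.argmin(), d.shape)           # best-matching unit
        lr = lr0 * np.exp(-t / iters)
        sigma = sigma0 * np.exp(-t / iters)
        h = np.exp(-((grid - bmu) ** 2).sum(-1) / (2 * sigma ** 2))
        w += lr * h[..., None] * (x - w)                      # pull neighborhood toward x
    return w

def quantization_error(x, w):
    return np.linalg.norm(w - x, axis=2).min()

# Synthetic "normal" VM metrics: CPU, memory, disk I/O (already scaled to [0, 1]).
normal = rng.normal(0.5, 0.1, size=(500, 3)).clip(0, 1)
som = train_som(normal)

threshold = np.quantile([quantization_error(x, som) for x in normal], 0.99)
anomaly = np.array([0.95, 0.97, 0.90])                        # e.g. a saturated VM
print("anomaly flagged:", quantization_error(anomaly, som) > threshold)
```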

  15. Buildings and Terrain of Urban Area Point Cloud Segmentation based on PCL

    International Nuclear Information System (INIS)

    Liu, Ying; Zhong, Ruofei

    2014-01-01

    One current problem in laser radar point data classification is building and urban terrain segmentation; this paper proposes a point cloud segmentation method based on the PCL library. PCL is a large cross-platform open-source C++ programming library, which implements a large number of efficient point cloud data structures and generic algorithms for point cloud retrieval, filtering, segmentation, registration, feature extraction, curved surface reconstruction, visualization, etc. Because laser radar point clouds are large and unevenly distributed, this paper proposes organizing the data with a kd-tree; resampling the point cloud with a voxel grid filter to reduce the amount of data while preserving the shape characteristics of the cloud; and then, using the PCL segmentation module, applying the Euclidean Cluster Extraction class for three-dimensional point cloud segmentation of buildings and ground. The experimental results show that this method avoids keeping multiple copies of the data, saves storage space by calling PCL library methods and classes, shortens compilation time, and improves the running speed of the program.
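
    The authors' pipeline is C++ code against PCL; the sketch below is a rough Python analogue (an assumption, not their implementation) that voxel-downsamples a synthetic scene and then performs Euclidean-style clustering with a kd-tree-backed DBSCAN, mirroring the VoxelGrid and EuclideanClusterExtraction steps.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def voxel_downsample(points, voxel=0.5):
    """Keep one representative point (the centroid) per occupied voxel."""
    keys = np.floor(points / voxel).astype(np.int64)
    _, inverse = np.unique(keys, axis=0, return_inverse=True)
    counts = np.bincount(inverse).astype(float)
    out = np.zeros((inverse.max() + 1, 3))
    for dim in range(3):
        out[:, dim] = np.bincount(inverse, weights=points[:, dim]) / counts
    return out

# Hypothetical LiDAR scene: two "buildings" plus scattered ground returns.
rng = np.random.default_rng(0)
scene = np.vstack([
    rng.normal([0, 0, 5], 1.0, (4000, 3)),      # building A
    rng.normal([30, 10, 8], 1.5, (4000, 3)),    # building B
    np.c_[rng.uniform(-20, 50, 6000), rng.uniform(-20, 30, 6000), rng.normal(0, 0.2, 6000)],
])

down = voxel_downsample(scene, voxel=0.5)
# Euclidean-style clustering: points within 1.5 m of each other join a cluster.
labels = DBSCAN(eps=1.5, min_samples=10, algorithm="kd_tree").fit_predict(down)
for lbl in np.unique(labels):
    print("cluster" if lbl >= 0 else "noise", lbl, ":", np.sum(labels == lbl), "points")
```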

  16. Machine learning based cloud mask algorithm driven by radiative transfer modeling

    Science.gov (United States)

    Chen, N.; Li, W.; Tanikawa, T.; Hori, M.; Shimada, R.; Stamnes, K. H.

    2017-12-01

    Cloud detection is a critically important first step required to derive many satellite data products. Traditional threshold based cloud mask algorithms require a complicated design process and fine tuning for each sensor, and have difficulty over snow/ice covered areas. With the advance of computational power and machine learning techniques, we have developed a new algorithm based on a neural network classifier driven by extensive radiative transfer modeling. Statistical validation results obtained by using collocated CALIOP and MODIS data show that its performance is consistent over different ecosystems and significantly better than the MODIS Cloud Mask (MOD35 C6) during the winter seasons over mid-latitude snow covered areas. Simulations using a reduced number of satellite channels also show satisfactory results, indicating its flexibility to be configured for different sensors.
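
    A toy version of the approach (a neural-network classifier trained on simulated samples instead of hand-tuned thresholds) is sketched below; the three-channel feature construction is purely illustrative and not the authors' radiative transfer model.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 20000
# Synthetic "RT-simulated" samples: [visible reflectance, 1.6 um reflectance, 11 um BT (K)].
clear_snow = np.c_[rng.uniform(0.6, 0.9, n), rng.uniform(0.05, 0.3, n), rng.uniform(240, 270, n)]
cloudy     = np.c_[rng.uniform(0.3, 0.9, n), rng.uniform(0.3, 0.9, n),  rng.uniform(210, 260, n)]
X = np.vstack([clear_snow, cloudy])
y = np.r_[np.zeros(n), np.ones(n)]                  # 0 = clear over snow, 1 = cloudy

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=300, random_state=0))
clf.fit(X_tr, y_tr)
print("held-out accuracy:", round(clf.score(X_te, y_te), 3))
```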

  17. H31G-1596: DeepSAT's CloudCNN: A Deep Neural Network for Rapid Cloud Detection from Geostationary Satellites

    Science.gov (United States)

    Kalia, Subodh; Ganguly, Sangram; Li, Shuang; Nemani, Ramakrishna R.

    2017-01-01

    Cloud and cloud shadow detection has important applications in weather and climate studies. It is even more crucial when we introduce geostationary satellites into the field of terrestrial remote sensing. Given the challenges associated with data acquired at very high frequency (10-15 min per scan), the ability to derive an accurate cloud and shadow mask from geostationary satellite data is critical. The key to success for most existing algorithms is spatially and temporally varying thresholds, which better capture local atmospheric and surface effects. However, the selection of proper thresholds is difficult and may lead to erroneous results. In this work, we propose a deep neural network based approach called CloudCNN to classify cloud and shadow from Himawari-8 AHI and GOES-16 ABI multispectral data. DeepSAT's CloudCNN consists of an encoder-decoder based architecture for binary-class pixel-wise segmentation. We train CloudCNN on a multi-GPU Nvidia Devbox cluster, and deploy the prediction pipeline on the NASA Earth Exchange (NEX) Pleiades supercomputer. We achieved an overall accuracy of 93.29% on test samples. Since the predictions take only a few seconds to segment a full multispectral GOES-16 or Himawari-8 full-disk image, the developed framework can be used for real-time cloud detection, cyclone detection, or extreme weather event prediction.

  18. Satellite retrieval of cloud condensation nuclei concentrations by using clouds as CCN chambers

    Science.gov (United States)

    Rosenfeld, Daniel; Zheng, Youtong; Hashimshoni, Eyal; Pöhlker, Mira L.; Jefferson, Anne; Pöhlker, Christopher; Yu, Xing; Zhu, Yannian; Liu, Guihua; Yue, Zhiguo; Fischman, Baruch; Li, Zhanqing; Giguzin, David; Goren, Tom; Artaxo, Paulo; Pöschl, Ulrich

    2016-01-01

    Quantifying the aerosol/cloud-mediated radiative effect at a global scale requires simultaneous satellite retrievals of cloud condensation nuclei (CCN) concentrations and cloud base updraft velocities (Wb). Hitherto, the inability to do so has been a major cause of high uncertainty regarding anthropogenic aerosol/cloud-mediated radiative forcing. This can be addressed by the emerging capability of estimating CCN and Wb of boundary layer convective clouds from an operational polar-orbiting weather satellite. Our methodology uses such clouds as an effective analog for CCN chambers. The cloud base supersaturation (S) is determined by Wb and the satellite-retrieved cloud base drop concentration (Ndb), which is the same as CCN(S). Validation against ground-based CCN instruments at Oklahoma, at Manaus, and onboard a ship in the northeast Pacific showed a retrieval accuracy of ±25% to ±30% for individual satellite overpasses. The methodology is presently limited to non-precipitating boundary layer convective clouds of at least 1 km depth that are not obscured by upper-layer clouds, including semitransparent cirrus. The restriction to solar backscattering angles of <25° limits the satellite coverage to ∼25% of the world area in a single day. PMID:26944081

  19. Exploring the relationship between a ground-based network and airborne CCN spectra observed at the cloud level

    Science.gov (United States)

    Corrigan, C.; Roberts, G. C.; Ritchie, J.; Creamean, J.; White, A. B.

    2011-12-01

    better understanding of the ability of ground-based measurements of CCN to reflect what is happening at cloud level and if selective placement of these ground sampling sites might better capture cloud relevant data.

  20. TUNNEL POINT CLOUD FILTERING METHOD BASED ON ELLIPTIC CYLINDRICAL MODEL

    Directory of Open Access Journals (Sweden)

    N. Zhu

    2016-06-01

    Full Text Available The large number of bolts and screws attached to the subway shield ring plates, along with the many metal supports and pieces of electrical equipment mounted on the tunnel walls, cause laser point cloud data to include many non-tunnel-section points (hereinafter referred to as non-points), therefore affecting the accuracy of modeling and deformation monitoring. This paper proposes a filtering method for the point cloud based on an elliptic cylindrical model. The original laser point cloud data are first projected onto a horizontal plane, and a search algorithm is given to extract the edge points of both sides, which are used further to fit the tunnel central axis. Along the axis the point cloud is segmented regionally and then fitted as a smooth elliptic cylindrical surface by iteration. This processing enables the automatic filtering of the inner-wall non-points. Experiments on two groups of data showed consistent results: the elliptic cylindrical model-based method can effectively filter out the non-points and meet the accuracy requirements for subway deformation monitoring. The method provides a new mode for the periodic monitoring of all-around tunnel section deformation in routine subway operation and maintenance.
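
    A minimal sketch of the core idea under simplifying assumptions: a single 2D cross-section instead of the full elliptic cylinder, and a plain algebraic conic fit instead of the authors' iterative fitting. Points whose residual from the fitted ellipse is large are treated as "non-points".

```python
import numpy as np

def fit_conic(xy):
    """Least-squares algebraic conic fit: A x^2 + B xy + C y^2 + D x + E y = 1."""
    x, y = xy[:, 0], xy[:, 1]
    M = np.c_[x**2, x*y, y**2, x, y]
    coef, *_ = np.linalg.lstsq(M, np.ones(len(xy)), rcond=None)
    return coef

def conic_residual(xy, coef):
    x, y = xy[:, 0], xy[:, 1]
    return np.abs(np.c_[x**2, x*y, y**2, x, y] @ coef - 1.0)

# Hypothetical cross-section: elliptic tunnel wall plus protruding fittings ("non-points").
rng = np.random.default_rng(2)
t = rng.uniform(0, 2 * np.pi, 3000)
wall = np.c_[3.0 * np.cos(t), 2.7 * np.sin(t)] + rng.normal(0, 0.01, (3000, 2))
bolts = wall[:150] * 0.93                 # fittings protruding into the tunnel interior
section = np.vstack([wall, bolts])

coef = fit_conic(wall)                    # in practice fitted iteratively / robustly
threshold = np.quantile(conic_residual(wall, coef), 0.99)
keep = conic_residual(section, coef) < threshold
print("kept", keep.sum(), "of", len(section), "points")
```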

  1. CloudAligner: A fast and full-featured MapReduce based tool for sequence mapping

    Directory of Open Access Journals (Sweden)

    Shi Weisong

    2011-06-01

    Full Text Available Abstract Background Research in genetics has developed rapidly recently due to the aid of next generation sequencing (NGS). However, massively-parallel NGS produces enormous amounts of data, which leads to storage, compatibility, scalability, and performance issues. The Cloud Computing and MapReduce framework, which utilizes hundreds or thousands of shared computers to map sequencing reads quickly and efficiently to reference genome sequences, appears to be a very promising solution for these issues. Consequently, it has been adopted by many organizations recently, and the initial results are very promising. However, since these are only initial steps toward this trend, the developed software does not provide adequate primary functions like bisulfite, pair-end mapping, etc., in on-site software such as RMAP or BS Seeker. In addition, existing MapReduce-based applications were not designed to process the long reads produced by the most recent second-generation and third-generation NGS instruments and, therefore, are inefficient. Last, it is difficult for a majority of biologists untrained in programming skills to use these tools because most were developed on Linux with a command line interface. Results To urge the trend of using Cloud technologies in genomics and prepare for advances in second- and third-generation DNA sequencing, we have built a Hadoop MapReduce-based application, CloudAligner, which achieves higher performance, covers most primary features, is more accurate, and has a user-friendly interface. It was also designed to be able to deal with long sequences. The performance gain of CloudAligner over Cloud-based counterparts (35 to 80%) mainly comes from the omission of the reduce phase. In comparison to local-based approaches, the performance gain of CloudAligner is from the partition and parallel processing of the huge reference genome as well as the reads. The source code of CloudAligner is available at http

  2. CloudAligner: A fast and full-featured MapReduce based tool for sequence mapping.

    Science.gov (United States)

    Nguyen, Tung; Shi, Weisong; Ruden, Douglas

    2011-06-06

    Research in genetics has developed rapidly recently due to the aid of next generation sequencing (NGS). However, massively-parallel NGS produces enormous amounts of data, which leads to storage, compatibility, scalability, and performance issues. The Cloud Computing and MapReduce framework, which utilizes hundreds or thousands of shared computers to map sequencing reads quickly and efficiently to reference genome sequences, appears to be a very promising solution for these issues. Consequently, it has been adopted by many organizations recently, and the initial results are very promising. However, since these are only initial steps toward this trend, the developed software does not provide adequate primary functions like bisulfite, pair-end mapping, etc., in on-site software such as RMAP or BS Seeker. In addition, existing MapReduce-based applications were not designed to process the long reads produced by the most recent second-generation and third-generation NGS instruments and, therefore, are inefficient. Last, it is difficult for a majority of biologists untrained in programming skills to use these tools because most were developed on Linux with a command line interface. To urge the trend of using Cloud technologies in genomics and prepare for advances in second- and third-generation DNA sequencing, we have built a Hadoop MapReduce-based application, CloudAligner, which achieves higher performance, covers most primary features, is more accurate, and has a user-friendly interface. It was also designed to be able to deal with long sequences. The performance gain of CloudAligner over Cloud-based counterparts (35 to 80%) mainly comes from the omission of the reduce phase. In comparison to local-based approaches, the performance gain of CloudAligner is from the partition and parallel processing of the huge reference genome as well as the reads. The source code of CloudAligner is available at http://cloudaligner.sourceforge.net/ and its web version is at http

  3. Extracting valley-ridge lines from point-cloud-based 3D fingerprint models.

    Science.gov (United States)

    Pang, Xufang; Song, Zhan; Xie, Wuyuan

    2013-01-01

    3D fingerprinting is an emerging technology with the distinct advantage of touchless operation. More important, 3D fingerprint models contain more biometric information than traditional 2D fingerprint images. However, current approaches to fingerprint feature detection usually must transform the 3D models to a 2D space through unwrapping or other methods, which might introduce distortions. A new approach directly extracts valley-ridge features from point-cloud-based 3D fingerprint models. It first applies the moving least-squares method to fit a local paraboloid surface and represent the local point cloud area. It then computes the local surface's curvatures and curvature tensors to facilitate detection of the potential valley and ridge points. The approach projects those points to the most likely valley-ridge lines, using statistical means such as covariance analysis and cross correlation. To finally extract the valley-ridge lines, it grows the polylines that approximate the projected feature points and removes the perturbations between the sampled points. Experiments with different 3D fingerprint models demonstrate this approach's feasibility and performance.
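
    A compact sketch of the local-surface step, assuming a plain least-squares quadric fit rather than the paper's moving least-squares weighting: fit z = ax² + bxy + cy² + dx + ey + f to a point's neighborhood and read the principal curvatures from the fitted coefficients.

```python
import numpy as np

def local_quadric(neighbors, center):
    """Fit z = a x^2 + b x y + c y^2 + d x + e y + f in a frame centred on `center`."""
    p = neighbors - center
    x, y, z = p[:, 0], p[:, 1], p[:, 2]
    A = np.c_[x**2, x*y, y**2, x, y, np.ones_like(x)]
    coef, *_ = np.linalg.lstsq(A, z, rcond=None)
    return coef

def principal_curvatures(coef):
    a, b, c, d, e, _ = coef
    fx, fy, fxx, fxy, fyy = d, e, 2*a, b, 2*c
    g = 1 + fx**2 + fy**2
    K = (fxx*fyy - fxy**2) / g**2                                    # Gaussian curvature
    H = ((1+fy**2)*fxx - 2*fx*fy*fxy + (1+fx**2)*fyy) / (2*g**1.5)   # mean curvature
    disc = max(H**2 - K, 0.0)
    return H + disc**0.5, H - disc**0.5                              # k1 >= k2

# Hypothetical neighborhood sampled from a ridge-like surface z = 0.5 x^2.
rng = np.random.default_rng(3)
xy = rng.uniform(-0.5, 0.5, (200, 2))
pts = np.c_[xy, 0.5 * xy[:, 0]**2 + rng.normal(0, 1e-3, 200)]

k1, k2 = principal_curvatures(local_quadric(pts, np.zeros(3)))
print(f"k1={k1:.3f}, k2={k2:.3f}")   # one strongly curved direction, one flat: ridge/valley
```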

  4. Batch Attribute-Based Encryption for Secure Clouds

    Directory of Open Access Journals (Sweden)

    Chen Yang

    2015-10-01

    Full Text Available Cloud storage is widely used by organizations due to its advantage of allowing universal access at low cost. Attribute-based encryption (ABE) is a kind of public key encryption suitable for cloud storage. The secret key of each user and the ciphertext are associated with an access policy and an attribute set, respectively; in addition to holding a secret key, one can decrypt a ciphertext only if the associated attributes match the predetermined access policy, which allows one to enforce fine-grained access control on outsourced files. One issue in existing ABE schemes is that they are designed for the users of a single organization. When one wants to share data with the users of different organizations, the owner needs to encrypt the messages for the receivers of one organization and then repeat this process for another organization. This situation deteriorates as more and more mobile devices use cloud services, since the ABE encryption process is time consuming and may quickly exhaust the power supplies of the mobile devices. In this paper, we propose a batch attribute-based encryption (BABE) approach to address this problem in a provably-secure way. With our approach, the data owner can outsource data in batches to the users of different organizations simultaneously. The data owner is allowed to decide the receiving organizations and the attributes required for decryption. Theoretical and experimental analyses show that our approach is more efficient than traditional encryption implementations in computation and communication.

  5. Urbanization Causes Increased Cloud Base Height and Decreased Fog in Coastal Southern California

    Science.gov (United States)

    Williams, A. Park; Schwartz, Rachel E.; Iacobellis, Sam; Seager, Richard; Cook, Benjamin I.; Still, Christopher J.; Husak, Gregory; Michaelsen, Joel

    2015-01-01

    Subtropical marine stratus clouds regulate coastal and global climate, but future trends in these clouds are uncertain. In coastal Southern California (CSCA), interannual variations in summer stratus cloud occurrence are spatially coherent across 24 airfields and dictated by positive relationships with stability above the marine boundary layer (MBL) and MBL height. Trends, however, have been spatially variable since records began in the mid-1900s due to differences in nighttime warming. Among CSCA airfields, differences in nighttime warming, but not daytime warming, are strongly and positively related to fraction of nearby urban cover, consistent with an urban heat island effect. Nighttime warming raises the near-surface dew point depression, which lifts the altitude of condensation and cloud base height, thereby reducing fog frequency. Continued urban warming, rising cloud base heights, and associated effects on energy and water balance would profoundly impact ecological and human systems in highly populated and ecologically diverse CSCA.
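
    The mechanism can be illustrated with the standard lifting-condensation-level rule of thumb (cloud base rises roughly 125 m per degree of dew point depression); this is a textbook approximation, not the paper's own calculation.

```python
def cloud_base_height_m(temp_c: float, dewpoint_c: float) -> float:
    """Approximate lifting condensation level: ~125 m per degree C of dew point depression."""
    return 125.0 * (temp_c - dewpoint_c)

# Nighttime urban warming of ~1 degree C with unchanged dew point lifts cloud base ~125 m.
rural = cloud_base_height_m(16.0, 13.0)
urban = cloud_base_height_m(17.0, 13.0)
print(f"rural base ~{rural:.0f} m, urban base ~{urban:.0f} m, rise ~{urban - rural:.0f} m")
```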

  6. Cloud Computing Task Scheduling Based on Cultural Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Li Jian-Wen

    2016-01-01

    Full Text Available A task scheduling strategy based on a cultural genetic algorithm (CGA) is proposed in order to improve the efficiency of task scheduling on cloud computing platforms, targeting minimization of the total time and cost of task scheduling. An improved genetic algorithm is used to construct the main population space and the knowledge space under the cultural framework; these evolve independently and in parallel, forming a mechanism of mutual promotion to dispatch cloud tasks. Meanwhile, to prevent the genetic algorithm from falling into local optima, a non-uniform mutation operator is introduced to improve the search performance of the algorithm. The experimental results show that CGA reduces the total time and lowers the cost of scheduling, and is an effective algorithm for cloud task scheduling.
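
    The sketch below is a stripped-down genetic scheduler, not the paper's cultural algorithm: chromosomes assign tasks to VMs, fitness is the makespan, and a non-uniform mutation whose rate decays over the generations helps escape local optima. The separate population and knowledge spaces of the cultural framework are omitted, and all task/VM parameters are synthetic.

```python
import numpy as np

rng = np.random.default_rng(4)
n_tasks, n_vms = 60, 6
task_len = rng.uniform(1, 10, n_tasks)          # task lengths (arbitrary units)
vm_speed = rng.uniform(1, 3, n_vms)             # VM processing speeds

def makespan(assign):
    load = np.zeros(n_vms)
    np.add.at(load, assign, task_len)           # total work per VM
    return (load / vm_speed).max()

def evolve(pop_size=40, gens=200):
    pop = rng.integers(0, n_vms, (pop_size, n_tasks))
    for gen in range(gens):
        fit = np.array([makespan(ind) for ind in pop])
        elite = pop[np.argsort(fit)[: pop_size // 2]]   # selection: keep the better half
        children = []
        while len(children) < pop_size - len(elite):
            a, b = elite[rng.integers(len(elite), size=2)]
            cut = rng.integers(1, n_tasks)              # one-point crossover
            child = np.r_[a[:cut], b[cut:]]
            # Non-uniform mutation: per-gene rate decays as generations advance.
            rate = 0.2 * (1 - gen / gens) ** 2
            mask = rng.random(n_tasks) < rate
            child[mask] = rng.integers(0, n_vms, mask.sum())
            children.append(child)
        pop = np.vstack([elite, children])
    best = min(pop, key=makespan)
    return best, makespan(best)

_, best_span = evolve()
print("best makespan:", round(best_span, 2))
```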

  7. Effects of Cloud-Based m-Learning on Student Creative Performance in Engineering Design

    Science.gov (United States)

    Chang, Yu-Shan; Chen, Si-Yi; Yu, Kuang-Chao; Chu, Yih-Hsien; Chien, Yu-Hung

    2017-01-01

    This study explored the effects of cloud-based m-learning on students' creative processes and products in engineering design. A nonequivalent pretest-posttest design was adopted, and 62 university students from Taipei City, Taiwan, were recruited as research participants in the study. The results showed that cloud-based m-learning had a positive…

  8. Zen of cloud learning cloud computing by examples on Microsoft Azure

    CERN Document Server

    Bai, Haishi

    2014-01-01

    Zen of Cloud: Learning Cloud Computing by Examples on Microsoft Azure provides comprehensive coverage of the essential theories behind cloud computing and the Windows Azure cloud platform. Sharing the author's insights gained while working at Microsoft's headquarters, it presents nearly 70 end-to-end examples with step-by-step guidance on implementing typical cloud-based scenarios.The book is organized into four sections: cloud service fundamentals, cloud solutions, devices and cloud, and system integration and project management. Each chapter contains detailed exercises that provide readers w

  9. Characterization of AVHRR global cloud detection sensitivity based on CALIPSO-CALIOP cloud optical thickness information: demonstration of results based on the CM SAF CLARA-A2 climate data record

    Science.gov (United States)

    Karlsson, Karl-Göran; Håkansson, Nina

    2018-02-01

    The sensitivity in detecting thin clouds of the cloud screening method being used in the CM SAF cloud, albedo and surface radiation data set from AVHRR data (CLARA-A2) cloud climate data record (CDR) has been evaluated using cloud information from the Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) onboard the CALIPSO satellite. The sensitivity, including its global variation, has been studied based on collocations of Advanced Very High Resolution Radiometer (AVHRR) and CALIOP measurements over a 10-year period (2006-2015). The cloud detection sensitivity has been defined as the minimum cloud optical thickness for which 50 % of clouds could be detected, with the global average sensitivity estimated to be 0.225. After using this value to reduce the CALIOP cloud mask (i.e. clouds with optical thickness below this threshold were interpreted as cloud-free cases), cloudiness results were found to be basically unbiased over most of the globe except over the polar regions where a considerable underestimation of cloudiness could be seen during the polar winter. The overall probability of detecting clouds in the polar winter could be as low as 50 % over the highest and coldest parts of Greenland and Antarctica, showing that a large fraction of optically thick clouds also remains undetected here. The study included an in-depth analysis of the probability of detecting a cloud as a function of the vertically integrated cloud optical thickness as well as of the cloud's geographical position. Best results were achieved over oceanic surfaces at mid- to high latitudes where at least 50 % of all clouds with an optical thickness down to a value of 0.075 were detected. Corresponding cloud detection sensitivities over land surfaces outside of the polar regions were generally larger than 0.2 with maximum values of approximately 0.5 over the Sahara and the Arabian Peninsula. For polar land surfaces the values were close to 1 or higher with maximum values of 4.5 for the parts
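
    The "minimum optical thickness at which 50% of clouds are detected" can be computed roughly as sketched below: bin collocated clouds by optical thickness, compute the detected fraction per bin, and interpolate where it crosses 0.5. The synthetic detection curve is illustrative only, not CLARA-A2 data.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic collocations: CALIOP cloud optical thickness (COT) and whether the imager
# detected the cloud; the logistic detection curve below is purely illustrative.
cot = rng.lognormal(mean=-1.0, sigma=1.2, size=50000)
p_detect = 1.0 / (1.0 + np.exp(-(np.log10(cot) + 0.65) / 0.15))
detected = rng.random(cot.size) < p_detect

# Fraction of clouds detected per COT bin.
bins = np.logspace(-2, 1, 40)
centers = np.sqrt(bins[:-1] * bins[1:])
frac = np.array([detected[(cot >= lo) & (cot < hi)].mean()
                 for lo, hi in zip(bins[:-1], bins[1:])])

# Sensitivity: COT at which the detected fraction first crosses 50% (linear interpolation).
i = np.argmax(frac >= 0.5)
sensitivity = centers[i - 1] + (0.5 - frac[i - 1]) * (centers[i] - centers[i - 1]) / (frac[i] - frac[i - 1])
print(f"cloud detection sensitivity ~ COT {sensitivity:.3f}")
```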

  10. Individual aerosol particles in and below clouds along a Mt. Fuji slope: Modification of sea-salt-containing particles by in-cloud processing

    Science.gov (United States)

    Ueda, S.; Hirose, Y.; Miura, K.; Okochi, H.

    2014-02-01

    Sizes and compositions of atmospheric aerosol particles can be altered by in-cloud processing by absorption/adsorption of gaseous and particulate materials and drying of aerosol particles that were formerly activated as cloud condensation nuclei. To elucidate differences of aerosol particles before and after in-cloud processing, aerosols were observed along a slope of Mt. Fuji, Japan (3776 m a.s.l.) during the summer in 2011 and 2012 using a portable laser particle counter (LPC) and an aerosol sampler. Aerosol samples for analyses of elemental compositions were obtained using a cascade impactor at top-of-cloud, in-cloud, and below-cloud altitudes. To investigate composition changes via in-cloud processing, individual particles (0.5-2 μm diameter) of samples from five cases (days) collected at different altitudes under similar backward air mass trajectory conditions were analyzed using a transmission electron microscope (TEM) equipped with an energy dispersive X-ray analyzer. For most cases (four cases), most particles at all altitudes mainly comprised sea salts: mainly Na with some S and/or Cl. Of those, in two cases, sea-salt-containing particles with Cl were found in below-cloud samples, although sea-salt-containing particles in top-of-cloud samples did not contain Cl. This result suggests that Cl in the sea salt was displaced by other cloud components. In the other two cases, sea-salt-containing particles on samples at all altitudes were without Cl. However, molar ratios of S to Na (S/Na) of the sea-salt-containing particles of top-of-cloud samples were higher than those of below-cloud samples, suggesting that sulfuric acid or sulfate was added to sea-salt-containing particles after complete displacement of Cl by absorption of SO2 or coagulation with sulfate. The additional volume of sulfuric acid in clouds for the two cases was estimated using the observed S/Na values of sea-salt-containing particles. The estimation revealed that size changes by in-cloud

  11. AN INTERACTIVE WEB-BASED ANALYSIS FRAMEWORK FOR REMOTE SENSING CLOUD COMPUTING

    Directory of Open Access Journals (Sweden)

    X. Z. Wang

    2015-07-01

    Full Text Available Spatiotemporal data, especially remote sensing data, are widely used in ecological, geographical, agriculture, and military research and applications. With the development of remote sensing technology, more and more remote sensing data are accumulated and stored in the cloud. An effective way for cloud users to access and analyse these massive spatiotemporal data in the web clients becomes an urgent issue. In this paper, we proposed a new scalable, interactive and web-based cloud computing solution for massive remote sensing data analysis. We build a spatiotemporal analysis platform to provide the end-user with a safe and convenient way to access massive remote sensing data stored in the cloud. The lightweight cloud storage system used to store public data and users’ private data is constructed based on open source distributed file system. In it, massive remote sensing data are stored as public data, while the intermediate and input data are stored as private data. The elastic, scalable, and flexible cloud computing environment is built using Docker, which is a technology of open-source lightweight cloud computing container in the Linux operating system. In the Docker container, open-source software such as IPython, NumPy, GDAL, and Grass GIS etc., are deployed. Users can write scripts in the IPython Notebook web page through the web browser to process data, and the scripts will be submitted to IPython kernel to be executed. By comparing the performance of remote sensing data analysis tasks executed in Docker container, KVM virtual machines and physical machines respectively, we can conclude that the cloud computing environment built by Docker makes the greatest use of the host system resources, and can handle more concurrent spatial-temporal computing tasks. Docker technology provides resource isolation mechanism in aspects of IO, CPU, and memory etc., which offers security guarantee when processing remote sensing data in the IPython Notebook

  12. An Interactive Web-Based Analysis Framework for Remote Sensing Cloud Computing

    Science.gov (United States)

    Wang, X. Z.; Zhang, H. M.; Zhao, J. H.; Lin, Q. H.; Zhou, Y. C.; Li, J. H.

    2015-07-01

    Spatiotemporal data, especially remote sensing data, are widely used in ecological, geographical, agriculture, and military research and applications. With the development of remote sensing technology, more and more remote sensing data are accumulated and stored in the cloud. An effective way for cloud users to access and analyse these massive spatiotemporal data in the web clients becomes an urgent issue. In this paper, we proposed a new scalable, interactive and web-based cloud computing solution for massive remote sensing data analysis. We build a spatiotemporal analysis platform to provide the end-user with a safe and convenient way to access massive remote sensing data stored in the cloud. The lightweight cloud storage system used to store public data and users' private data is constructed based on open source distributed file system. In it, massive remote sensing data are stored as public data, while the intermediate and input data are stored as private data. The elastic, scalable, and flexible cloud computing environment is built using Docker, which is a technology of open-source lightweight cloud computing container in the Linux operating system. In the Docker container, open-source software such as IPython, NumPy, GDAL, and Grass GIS etc., are deployed. Users can write scripts in the IPython Notebook web page through the web browser to process data, and the scripts will be submitted to IPython kernel to be executed. By comparing the performance of remote sensing data analysis tasks executed in Docker container, KVM virtual machines and physical machines respectively, we can conclude that the cloud computing environment built by Docker makes the greatest use of the host system resources, and can handle more concurrent spatial-temporal computing tasks. Docker technology provides resource isolation mechanism in aspects of IO, CPU, and memory etc., which offers security guarantee when processing remote sensing data in the IPython Notebook. Users can write

  13. Moving HammerCloud to CERN's private cloud

    CERN Document Server

    Barrand, Quentin

    2013-01-01

    HammerCloud is a testing framework for the Worldwide LHC Computing Grid. Currently deployed on about 20 hand-managed machines, it was desirable to move it to the Agile Infrastructure, CERN's OpenStack-based private cloud.

  14. FPFH-based graph matching for 3D point cloud registration

    Science.gov (United States)

    Zhao, Jiapeng; Li, Chen; Tian, Lihua; Zhu, Jihua

    2018-04-01

    Correspondence detection is a vital step in point cloud registration and can help to obtain a reliable initial alignment. In this paper, we put forward an advanced point-feature-based graph matching algorithm to solve the initial alignment problem of rigid 3D point cloud registration with partial overlap. Specifically, Fast Point Feature Histograms are used first to determine the initial possible correspondences. Next, a new objective function is provided to make the graph matching more suitable for partially overlapping point clouds. The objective function is optimized by a simulated annealing algorithm to obtain the final group of correct correspondences. Finally, we present a novel set partitioning method which can transform the NP-hard optimization problem into an O(n^3)-solvable one. Experiments on the Stanford and UWA public data sets indicate that our method can obtain better results in terms of both accuracy and time cost compared with other point cloud registration methods.

  15. Space Science Cloud: a Virtual Space Science Research Platform Based on Cloud Model

    Science.gov (United States)

    Hu, Xiaoyan; Tong, Jizhou; Zou, Ziming

    Through independent and cooperative science missions, the Strategic Pioneer Program (SPP) on Space Science, the new space science initiative in China approved by CAS and implemented by the National Space Science Center (NSSC), is dedicated to seeking new discoveries and new breakthroughs in space science, and thus to deepening the understanding of the universe and planet Earth. In the framework of this program, in order to support the operations of space science missions and satisfy the demand of related research activities for e-Science, NSSC is developing a virtual space science research platform based on a cloud model, namely the Space Science Cloud (SSC). To support mission demonstration, SSC integrates an interactive satellite orbit design tool, a satellite structure and payload layout design tool, a payload observation coverage analysis tool, etc., to help scientists analyze and verify space science mission designs. Another important function of SSC is supporting mission operations, which run through the space satellite data pipelines. Mission operators can acquire and process observation data, then distribute the data products to other systems or issue the data and archives with the services of SSC. In addition, SSC provides useful data, tools and models for space researchers. Several databases in the field of space science are integrated and an efficient retrieval system is being developed. Common tools for data visualization, deep processing (e.g., smoothing and filtering tools), analysis (e.g., FFT analysis tool and minimum variance analysis tool) and mining (e.g., proton event correlation analysis tool) are also integrated to help the researchers to better utilize the data. The space weather models on SSC include a magnetic storm forecast model, a multi-station middle and upper atmospheric climate model, a solar energetic particle propagation model and so on. All the services above-mentioned are based on the e-Science infrastructures of CAS, e.g. cloud storage and

  16. Privacy-Preserving and Scalable Service Recommendation Based on SimHash in a Distributed Cloud Environment

    Directory of Open Access Journals (Sweden)

    Yanwei Xu

    2017-01-01

    Full Text Available With the increasing volume of web services in the cloud environment, Collaborative Filtering (CF)-based service recommendation has become one of the most effective techniques to alleviate the heavy burden of service selection decisions on a target user. However, the service recommendation bases, that is, historical service usage data, are often distributed across different cloud platforms. Two challenges are present in such a cross-cloud service recommendation scenario. First, a cloud platform is often not willing to share its data with other cloud platforms due to privacy concerns, which severely decreases the feasibility of cross-cloud service recommendation. Second, the historical service usage data recorded in each cloud platform may update over time, which reduces the recommendation scalability significantly. In view of these two challenges, a novel privacy-preserving and scalable service recommendation approach based on SimHash, named SerRecSimHash, is proposed in this paper. Finally, through a set of experiments deployed on a real distributed service quality dataset, WS-DREAM, we validate the feasibility of our proposal in terms of recommendation accuracy and efficiency while guaranteeing privacy preservation.
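
    The full recommendation scheme is not reproduced here; the sketch below only shows the SimHash building block under simple assumptions: each user's service usage record is reduced to a fixed-length fingerprint, and similar users are found by Hamming distance, so platforms can exchange fingerprints instead of raw usage data.

```python
import hashlib
import numpy as np

def simhash(features: dict, bits: int = 64) -> int:
    """Weighted SimHash of {feature_name: weight} -> integer fingerprint."""
    v = np.zeros(bits)
    for name, weight in features.items():
        h = int(hashlib.md5(name.encode()).hexdigest(), 16)
        for i in range(bits):
            v[i] += weight if (h >> i) & 1 else -weight
    return sum(1 << i for i in range(bits) if v[i] > 0)

def hamming(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

# Hypothetical per-user records: service name -> observed quality-of-service weight.
user_a = {"svc1": 0.9, "svc2": 0.4, "svc3": 0.7}
user_b = {"svc1": 0.8, "svc2": 0.5, "svc3": 0.7}    # similar usage pattern
user_c = {"svc7": 0.2, "svc8": 0.9}                 # different pattern

fa, fb, fc = simhash(user_a), simhash(user_b), simhash(user_c)
print("a-b distance:", hamming(fa, fb), "| a-c distance:", hamming(fa, fc))
```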

  17. Thermodynamic phase profiles of optically thin midlatitude cloud and their relation to temperature

    Energy Technology Data Exchange (ETDEWEB)

    Naud, C. M.; Del Genio, Anthony D.; Haeffelin, M.; Morille, Y.; Noel, V.; Dupont, Jean-Charles; Turner, David D.; Lo, Chaomei; Comstock, Jennifer M.

    2010-06-03

    Winter cloud phase and temperature profiles derived from ground-based lidar depolarization and radiosonde measurements are analyzed for two midlatitude locations: the United States Atmospheric Radiation Measurement Program Southern Great Plains (SGP) site and the Site Instrumental de Recherche par Télédétection Atmosphérique (SIRTA) in France. Because lidars are attenuated in optically thick clouds, the dataset only includes optically thin clouds (optical thickness < 3). At SGP, 57% of the clouds observed with the lidar in the temperature range 233-273 K are either completely liquid or completely glaciated, while at SIRTA only 42% of the observed clouds are single phase, based on a depolarization ratio threshold of 11% for differentiating liquid from ice. Most optically thin mixed phase clouds show an ice layer at cloud top, and clouds with liquid at cloud top are less frequent. The relationship between ice phase occurrence and temperature only slightly changes between cloud base and top. At both sites liquid is more prevalent at colder temperatures than has been found previously in aircraft flights through frontal clouds of greater optical thicknesses. Liquid in clouds persists to colder temperatures at SGP than SIRTA. This information on the average temperatures of mixed phase clouds at both locations complements earlier passive satellite remote sensing measurements that sample cloud phase near cloud top and for a wider range of cloud optical thicknesses.

  18. Cloud Based Earth Observation Data Exploitation Platforms

    Science.gov (United States)

    Romeo, A.; Pinto, S.; Loekken, S.; Marin, A.

    2017-12-01

    In the last few years the data produced daily by several private and public Earth Observation (EO) satellites reached the order of tens of Terabytes, representing for scientists and commercial application developers both a big opportunity for their exploitation and a challenge for their management. New IT technologies, such as Big Data and cloud computing, enable the creation of web-accessible data exploitation platforms, which offer to scientists and application developers the means to access and use EO data in a quick and cost effective way. RHEA Group is particularly active in this sector, supporting the European Space Agency (ESA) in the Exploitation Platforms (EP) initiative, developing technology to build multi-cloud platforms for the processing and analysis of Earth Observation data, and collaborating with larger European initiatives such as the European Plate Observing System (EPOS) and the European Open Science Cloud (EOSC). An EP is a virtual workspace, providing a user community with access to (i) large volumes of data, (ii) an algorithm development and integration environment, (iii) processing software and services (e.g. toolboxes, visualization routines), (iv) computing resources, (v) collaboration tools (e.g. forums, wiki, etc.). When an EP is dedicated to a specific theme, it becomes a Thematic Exploitation Platform (TEP). Currently, ESA has seven TEPs in a pre-operational phase dedicated to geo-hazards monitoring and prevention, coastal zones, forestry areas, hydrology, polar regions, urban areas and food security. On the technology development side, solutions like the multi-cloud EO data processing platform provide the technology to integrate ICT resources and EO data from different vendors in a single platform. In particular it offers (i) multi-cloud data discovery, (ii) multi-cloud data management and access and (iii) multi-cloud application deployment. This platform has been demonstrated with the EGI Federated Cloud, Innovation Platform Testbed Poland

  19. CURB-BASED STREET FLOOR EXTRACTION FROM MOBILE TERRESTRIAL LIDAR POINT CLOUD

    Directory of Open Access Journals (Sweden)

    S. Ibrahim

    2012-07-01

    Full Text Available Mobile terrestrial laser scanners (MTLS) produce huge 3D point clouds describing the terrestrial surface, from which objects like different street furniture can be generated. Extraction and modelling of the street curb and the street floor from MTLS point clouds is important for many applications such as right-of-way asset inventory, road maintenance and city planning. The proposed pipeline for curb and street floor extraction consists of a sequence of five steps: organizing the 3D point cloud and nearest neighbour search; 3D density-based segmentation to segment the ground; morphological analysis to refine the ground segment; derivative-of-Gaussian filtering to detect the curb; and solving the travelling salesman problem to form a closed polygon of the curb followed by a point-in-polygon test to extract the street floor. Two mobile laser scanning datasets of different scenes are tested with the proposed pipeline. The results of the extracted curb and street floor are evaluated against truth data. The obtained detection rates for the extracted street floor for the two datasets are 95% and 96.53%. This study presents a novel approach to the detection and extraction of the road curb and the street floor from unorganized 3D point clouds captured by MTLS. It utilizes only the 3D coordinates of the point cloud.
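
    The final step of the pipeline (keeping only points inside the closed curb polygon) can be sketched as follows; the polygon and points are synthetic, and matplotlib's Path class stands in for whichever point-in-polygon test the authors used.

```python
import numpy as np
from matplotlib.path import Path

# Hypothetical closed curb polygon (x, y) obtained from the earlier pipeline steps.
curb_polygon = np.array([[0, 0], [40, 0], [40, 8], [0, 8]], dtype=float)

# Mobile LiDAR points projected onto the horizontal plane.
rng = np.random.default_rng(6)
points_xy = rng.uniform([-10, -5], [50, 15], size=(10000, 2))

street = Path(curb_polygon).contains_points(points_xy)   # point-in-polygon test
print("street-floor points:", street.sum(), "of", len(points_xy))
```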

  20. THE MAGELLANIC MOPRA ASSESSMENT (MAGMA). I. THE MOLECULAR CLOUD POPULATION OF THE LARGE MAGELLANIC CLOUD

    International Nuclear Information System (INIS)

    Wong, Tony; Chu, You-Hua; Gruendl, Robert A.; Looney, Leslie W.; Seale, Jonathan; Welty, Daniel E.; Hughes, Annie; Maddison, Sarah; Ott, Jürgen; Muller, Erik; Fukui, Yasuo; Kawamura, Akiko; Mizuno, Yoji; Pineda, Jorge L.; Bernard, Jean-Philippe; Paradis, Deborah; Henkel, Christian; Klein, Ulrich

    2011-01-01

    We present the properties of an extensive sample of molecular clouds in the Large Magellanic Cloud (LMC) mapped at 11 pc resolution in the CO(1-0) line. Targets were chosen based on a limiting CO flux and peak brightness as measured by the NANTEN survey. The observations were conducted with the ATNF Mopra Telescope as part of the Magellanic Mopra Assessment. We identify clouds as regions of connected CO emission and find that the distributions of cloud sizes, fluxes, and masses are sensitive to the choice of decomposition parameters. In all cases, however, the luminosity function of CO clouds is steeper than dN/dL ∝ L^-2, suggesting that a substantial fraction of mass is in low-mass clouds. A correlation between size and linewidth, while apparent for the largest emission structures, breaks down when those structures are decomposed into smaller structures. We argue that the correlation between virial mass and CO luminosity is the result of comparing two covariant quantities, with the correlation appearing tighter on larger scales where a size-linewidth relation holds. The virial parameter (the ratio of a cloud's kinetic to self-gravitational energy) shows a wide range of values and exhibits no clear trends with the CO luminosity or the likelihood of hosting young stellar object (YSO) candidates, casting further doubt on the assumption of virialization for molecular clouds in the LMC. Higher CO luminosity increases the likelihood of a cloud harboring a YSO candidate, and more luminous YSOs are more likely to be coincident with detectable CO emission, confirming the close link between giant molecular clouds and massive star formation.

  1. Reconciling Ground-Based and Space-Based Estimates of the Frequency of Occurrence and Radiative Effect of Clouds around Darwin, Australia

    Energy Technology Data Exchange (ETDEWEB)

    Protat, Alain; Young, Stuart; McFarlane, Sally A.; L'Ecuyer, Tristan; Mace, Gerald G.; Comstock, Jennifer M.; Long, Charles N.; Berry, Elizabeth; Delanoe, Julien

    2014-02-01

    The objective of this paper is to investigate whether estimates of the cloud frequency of occurrence and associated cloud radiative forcing as derived from ground-based and satellite active remote sensing and radiative transfer calculations can be reconciled over a well instrumented active remote sensing site located in Darwin, Australia, despite the very different viewing geometry and instrument characteristics. It is found that the ground-based radar-lidar combination at Darwin does not detect most of the cirrus clouds above 10 km (due to limited lidar detection capability and signal obscuration by low-level clouds) and that the CloudSat radar - Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) combination underreports the hydrometeor frequency of occurrence below 2 km height, due to instrument limitations at these heights. The radiative impact associated with these differences in cloud frequency of occurrence is large on the surface downwelling shortwave fluxes (ground and satellite) and the top-of atmosphere upwelling shortwave and longwave fluxes (ground). Good agreement is found for other radiative fluxes. Large differences in radiative heating rate as derived from ground and satellite radar-lidar instruments and RT calculations are also found above 10 km (up to 0.35 Kday-1 for the shortwave and 0.8 Kday-1 for the longwave). Given that the ground-based and satellite estimates of cloud frequency of occurrence and radiative impact cannot be fully reconciled over Darwin, caution should be exercised when evaluating the representation of clouds and cloud-radiation interactions in large-scale models and limitations of each set of instrumentation should be considered when interpreting model-observations differences.

  2. Feature Extraction from 3D Point Cloud Data Based on Discrete Curves

    Directory of Open Access Journals (Sweden)

    Yi An

    2013-01-01

    Full Text Available Reliable feature extraction from 3D point cloud data is an important problem in many application domains, such as reverse engineering, object recognition, industrial inspection, and autonomous navigation. In this paper, a novel method is proposed for extracting the geometric features from 3D point cloud data based on discrete curves. We extract the discrete curves from 3D point cloud data and research the behaviors of chord lengths, angle variations, and principal curvatures at the geometric features in the discrete curves. Then, the corresponding similarity indicators are defined. Based on the similarity indicators, the geometric features can be extracted from the discrete curves, which are also the geometric features of 3D point cloud data. The threshold values of the similarity indicators are taken from [0,1], which characterize the relative relationship and make the threshold setting easier and more reasonable. The experimental results demonstrate that the proposed method is efficient and reliable.
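
    A minimal sketch of the raw quantities the similarity indicators are built from (chord lengths and turning angles along a discrete curve); the indicators and thresholds of the paper itself are not reproduced.

```python
import numpy as np

def chord_lengths(curve):
    return np.linalg.norm(np.diff(curve, axis=0), axis=1)

def turning_angles(curve):
    """Angle between consecutive chords at each interior vertex (radians)."""
    d = np.diff(curve, axis=0)
    a, b = d[:-1], d[1:]
    cosang = (a * b).sum(1) / (np.linalg.norm(a, axis=1) * np.linalg.norm(b, axis=1))
    return np.arccos(np.clip(cosang, -1.0, 1.0))

# Hypothetical discrete curve: two straight segments meeting at a sharp corner.
t = np.linspace(0, 1, 50)
curve = np.vstack([np.c_[t, np.zeros_like(t)], np.c_[np.ones_like(t[1:]), t[1:]]])

angles = turning_angles(curve)
corner = np.argmax(angles)            # large angle variation flags a geometric feature
print(f"feature at vertex {corner + 1}, turning angle {np.degrees(angles[corner]):.1f} deg")
print("mean chord length:", chord_lengths(curve).mean().round(4))
```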

  3. Clustering, randomness, and regularity in cloud fields: 2. Cumulus cloud fields

    Science.gov (United States)

    Zhu, T.; Lee, J.; Weger, R. C.; Welch, R. M.

    1992-12-01

    During the last decade a major controversy has been brewing concerning the proper characterization of cumulus convection. The prevailing view has been that cumulus clouds form in clusters, in which cloud spacing is closer than that found for the overall cloud field and which maintains its identity over many cloud lifetimes. This "mutual protection hypothesis" of Randall and Huffman (1980) has been challenged by the "inhibition hypothesis" of Ramirez et al. (1990) which strongly suggests that the spatial distribution of cumuli must tend toward a regular distribution. A dilemma has resulted because observations have been reported to support both hypotheses. The present work reports a detailed analysis of cumulus cloud field spatial distributions based upon Landsat, Advanced Very High Resolution Radiometer, and Skylab data. Both nearest-neighbor and point-to-cloud cumulative distribution function statistics are investigated. The results show unequivocally that when both large and small clouds are included in the cloud field distribution, the cloud field always has a strong clustering signal. The strength of clustering is largest at cloud diameters of about 200-300 m, diminishing with increasing cloud diameter. In many cases, clusters of small clouds are found which are not closely associated with large clouds. As the small clouds are eliminated from consideration, the cloud field typically tends towards regularity. Thus it would appear that the "inhibition hypothesis" of Ramirez and Bras (1990) has been verified for the large clouds. However, these results are based upon the analysis of point processes. A more exact analysis also is made which takes into account the cloud size distributions. Since distinct clouds are by definition nonoverlapping, cloud size effects place a restriction upon the possible locations of clouds in the cloud field. The net effect of this analysis is that the large clouds appear to be randomly distributed, with only weak tendencies towards

  4. Web-based CERES Clouds QC Property Viewing Tool

    Science.gov (United States)

    Smith, R. A.; Chu, C.; Sun-Mack, S.; Chen, Y.; Heckert, E.; Minnis, P.

    2014-12-01

    This presentation will display the capabilities of a web-based CERES cloud property viewer. Terra data will be chosen for examples. It will demonstrate viewing of cloud properties in gridded global maps, histograms, time series displays, latitudinal zonal images, binned data charts, data frequency graphs, and ISCCP plots. Images can be manipulated by the user to narrow boundaries of the map as well as color bars and value ranges, compare datasets, view data values, and more. Other atmospheric studies groups will be encouraged to put their data into the underlying NetCDF data format and view their data with the tool. A laptop will hopefully be available to allow conference attendees to try navigating the tool.

  5. Blueprint template support for engineering cloud-based services

    NARCIS (Netherlands)

    Nguyen, D.K.; Lelli, F.; Taher, Y.; Parkin, M.S.; Papazoglou, M.; van den Heuvel, W.J.A.M.; Abramowicz, W.; Martín Llorente, I.; Surridge, M.; Zisman, A.; Vayssière, J.

    2011-01-01

    Current cloud-based service offerings are often provided as one-size-fits-all solutions and give little or no room for customization. This limits the ability for application developers to pick and choose offerings from multiple software, platform, infrastructure service providers and configure them

  6. Foundations of Blueprint for Cloud-based Service Engineering

    NARCIS (Netherlands)

    Nguyen, D.K.

    2011-01-01

    Current cloud-based service offerings are often provided as one-size-fits-all solution and give little or no room for customization. This limits the ability for application developers to pick and choose offerings from multiple software, platform and infrastructure service providers and configure

  7. Application of Micro-cloud point extraction for spectrophotometric determination of Malachite green, Crystal violet and Rhodamine B in aqueous samples

    Science.gov (United States)

    Ghasemi, Elham; Kaykhaii, Massoud

    2016-07-01

    A novel, green, simple and fast method was developed for the spectrophotometric determination of Malachite green, Crystal violet, and Rhodamine B in water samples based on micro-cloud point extraction (MCPE) at room temperature. This is the first report on the application of MCPE to dyes. In this method, to reach the cloud point at room temperature, the MCPE procedure was carried out in brine using Triton X-114 as a non-ionic surfactant. The factors influencing the extraction efficiency were investigated and optimized. Under the optimized conditions, calibration curves were found to be linear in the concentration ranges of 0.06-0.60 mg/L, 0.10-0.80 mg/L, and 0.03-0.30 mg/L, with enrichment factors of 29.26, 85.47 and 28.36, for Malachite green, Crystal violet, and Rhodamine B, respectively. Limits of detection were between 2.2 and 5.1 μg/L.
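
    Where a linear calibration curve and an enrichment factor are reported as above, the concentration in an unknown sample follows by inverting the fitted line and dividing by the enrichment factor of the preconcentration step. The sketch below illustrates that arithmetic; the calibration points, absorbance reading, and fitted line are hypothetical illustrations, not values from the study.

```python
import numpy as np

# Hypothetical calibration standards for one dye: concentration (mg/L) vs. absorbance.
conc_std = np.array([0.06, 0.15, 0.30, 0.45, 0.60])
abs_std = np.array([0.052, 0.128, 0.255, 0.381, 0.510])

# Least-squares fit of the calibration line A = m*C + b.
m, b = np.polyfit(conc_std, abs_std, deg=1)

def sample_concentration(absorbance, enrichment_factor):
    """Invert the calibration line, then undo the MCPE preconcentration."""
    c_extract = (absorbance - b) / m       # concentration in the surfactant-rich phase
    return c_extract / enrichment_factor   # concentration in the original water sample

# Example: an unknown sample preconcentrated with an enrichment factor of about 29.
print(f"estimated concentration: {sample_concentration(0.210, 29.26):.4f} mg/L")
```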

  8. The Geospatial Data Cloud: An Implementation of Applying Cloud Computing in Geosciences

    Directory of Open Access Journals (Sweden)

    Xuezhi Wang

    2014-11-01

    Full Text Available The rapid growth in the volume of remote sensing data and its increasing computational requirements bring huge challenges for researchers as traditional systems cannot adequately satisfy the huge demand for service. Cloud computing has the advantage of high scalability and reliability, which can provide firm technical support. This paper proposes a highly scalable geospatial cloud platform named the Geospatial Data Cloud, which is constructed based on cloud computing. The architecture of the platform is first introduced, and then two subsystems, the cloud-based data management platform and the cloud-based data processing platform, are described.  ––– This paper was presented at the First Scientific Data Conference on Scientific Research, Big Data, and Data Science, organized by CODATA-China and held in Beijing on 24-25 February, 2014.

  9. Outdoor Illegal Construction Identification Algorithm Based on 3D Point Cloud Segmentation

    Science.gov (United States)

    An, Lu; Guo, Baolong

    2018-03-01

    Illegal constructions have recently become common in urban surroundings and seriously restrict the orderly development of urban modernization. 3D point cloud data can be used to identify illegal buildings and thus address this problem effectively. This paper proposes an outdoor illegal construction identification algorithm based on 3D point cloud segmentation. Initially, in order to save memory space and reduce processing time, a lossless point cloud compression method based on a minimum spanning tree is proposed. Then, a ground point removal method based on multi-scale filtering is introduced to increase accuracy. Finally, building clusters on the ground are obtained using a region growing method and, as a result, illegal constructions can be marked. The effectiveness of the proposed algorithm is verified using a public data set collected by the International Society for Photogrammetry and Remote Sensing (ISPRS).
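
    The final clustering step described above is a region-growing pass over the ground-filtered points. The sketch below shows one minimal way such a pass can be implemented with a k-d tree neighborhood query; the search radius and minimum cluster size are illustrative assumptions, not parameters from the paper.

```python
import numpy as np
from scipy.spatial import cKDTree

def region_grow(points, radius=0.5, min_cluster_size=50):
    """Cluster 3D points by growing regions over neighbors within `radius` of each other."""
    tree = cKDTree(points)
    labels = np.full(len(points), -1, dtype=int)   # -1: unassigned, -2: noise
    current = 0
    for seed in range(len(points)):
        if labels[seed] != -1:
            continue
        stack, members = [seed], []
        labels[seed] = current
        while stack:                                # flood-fill outward from the seed point
            idx = stack.pop()
            members.append(idx)
            for nb in tree.query_ball_point(points[idx], r=radius):
                if labels[nb] == -1:
                    labels[nb] = current
                    stack.append(nb)
        if len(members) < min_cluster_size:         # too small to be a building: mark as noise
            labels[np.array(members)] = -2
        else:
            current += 1
    return labels

# Usage: labels = region_grow(ground_filtered_xyz); labels >= 0 are building candidates.
```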

  10. A Review on Broker Based Cloud Service Model

    Directory of Open Access Journals (Sweden)

    Nagarajan Rajganesh

    2016-09-01

    Full Text Available Cloud computing emerged as a utility-oriented computing paradigm that facilitates resource sharing under a pay-as-you-go model. Nowadays, cloud offerings are not limited to a fixed range of services, and anything can be shared as a service through the Internet. In this work, a detailed literature survey of cloud service discovery and composition has been conducted. A proposed architecture that includes a cloud broker is presented; it highlights the importance of suitable service selection and ranking in fulfilling the customer's service requirements. The proposed cloud broker employs techniques such as reasoning and decision-making to improve cloud service selection and composition.

  11. A Reference Architecture for a Cloud-Based Tools as a Service Workspace

    DEFF Research Database (Denmark)

    Chauhan, Aufeef; Babar, Muhammad Ali; Sheng, Quan Z.

    2015-01-01

    Software Architecture (SA) plays a critical role in developing and evolving cloud-based applications. We present a Reference Architecture (RA) for designing Cloud-based Tools as a service work SPACE (TSPACE) - a platform for provisioning chain of tools following the Software as a Service (SaaS...... evaluate the RA in terms of completeness and feasibility. Our proposed RA can provide valuable guidance and insights for designing and implementing concrete software architectures of TSPACE....

  12. Cross layer optimization for cloud-based radio over optical fiber networks

    Science.gov (United States)

    Shao, Sujie; Guo, Shaoyong; Qiu, Xuesong; Yang, Hui; Meng, Luoming

    2016-07-01

    To adapt to 5G communication, the cloud radio access network is a paradigm introduced by operators that aggregates the computational resources of all base stations into a cloud BBU pool. The interaction between RRH and BBU, and resource scheduling among BBUs in the cloud, have become more frequent and complex with growing system scale and user requirements. This promotes networking demand among RRHs and BBUs and forces the formation of elastic optical fiber switching and networking. In such a network, multiple strata of resources (radio, optical, and BBU processing) are interwoven with each other. In this paper, we propose a novel multiple stratum optimization (MSO) architecture for cloud-based radio over optical fiber networks (C-RoFN) with software defined networking. Additionally, a global evaluation strategy (GES) is introduced in the proposed architecture. MSO can enhance the responsiveness to end-to-end user demands and globally optimize radio frequency, optical spectrum, and BBU processing resources to maximize radio coverage. The feasibility and efficiency of the proposed architecture with the GES strategy are experimentally verified on an OpenFlow-enabled testbed in terms of resource occupation and path provisioning latency.

  13. A Cloud Theory-Based Trust Computing Model in Social Networks

    Directory of Open Access Journals (Sweden)

    Fengming Liu

    2016-12-01

    Full Text Available How to develop a trust management model and then efficiently control and manage nodes is an important issue in the scope of social network security. In this paper, a trust management model based on a cloud model is proposed. The cloud model uses a specific computation operator to achieve the transformation from qualitative concepts to quantitative computation; it can also express the fuzziness and randomness of subjective trust, and the relationship between them. Node trust is divided into reputation trust and transaction trust, and evaluation methods are designed for each. Firstly, a two-dimensional trust cloud evaluation model is designed, based on a node's overall and trading experience, to determine the reputation trust. The expected value reflects the average trust status of a node, while entropy and hyper-entropy are used to describe the uncertainty of trust. Secondly, calculation methods for the proposed direct transaction trust and recommendation transaction trust are given, which together yield a comprehensive transaction trust for each node. Trading strategies are then designed for nodes based on the trust cloud. Finally, the results of a simulation experiment of P2P file sharing on an experimental platform reflect the objectivity, accuracy and robustness of the proposed model, which can also effectively identify malicious or unreliable service nodes in the system. In addition, the model can be used to promote the service reliability of nodes with high credibility, by which the stability of the whole network is improved.
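
    In cloud-model trust computing of this kind, a node's trust evidence is summarized by three numerical characteristics: the expected value (Ex), the entropy (En), and the hyper-entropy (He). The sketch below shows the standard backward cloud generator that estimates these quantities from a set of trust ratings; the ratings themselves are illustrative, and the paper's two-dimensional trust cloud and its weighting of reputation versus transaction trust are not reproduced here.

```python
import numpy as np

def backward_cloud(samples):
    """Estimate (Ex, En, He) of a one-dimensional cloud model from observed trust samples."""
    x = np.asarray(samples, dtype=float)
    ex = x.mean()                                       # expected value: average trust level
    en = np.sqrt(np.pi / 2.0) * np.abs(x - ex).mean()   # entropy: fuzziness of the trust concept
    s2 = x.var(ddof=1)                                  # sample variance
    he = np.sqrt(abs(s2 - en ** 2))                     # hyper-entropy: uncertainty of the entropy
    return ex, en, he

# Illustrative trust ratings (0..1) collected for one node from past interactions.
ratings = [0.82, 0.90, 0.75, 0.88, 0.79, 0.85, 0.93, 0.80]
ex, en, he = backward_cloud(ratings)
print(f"Ex={ex:.3f}, En={en:.3f}, He={he:.3f}")
```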

  14. Analysis of the security and privacy requirements of cloud-based electronic health records systems.

    Science.gov (United States)

    Rodrigues, Joel J P C; de la Torre, Isabel; Fernández, Gonzalo; López-Coronado, Miguel

    2013-08-21

    The Cloud Computing paradigm offers eHealth systems the opportunity to enhance the features and functionality that they offer. However, moving patients' medical information to the Cloud implies several risks in terms of the security and privacy of sensitive health records. In this paper, the risks of hosting Electronic Health Records (EHRs) on the servers of third-party Cloud service providers are reviewed. To protect the confidentiality of patient information and facilitate the process, some suggestions for health care providers are made, and security issues that Cloud service providers should address in their platforms are considered. This shows that, before moving patient health records to the Cloud, security and privacy concerns must be considered by both health care providers and Cloud service providers, and the security requirements of a generic Cloud service provider are analyzed. To study the latest Cloud-based computing solutions, bibliographic material was obtained mainly from Medline sources, and direct contact was made with several Cloud service providers. Some of the security issues that should be considered by both Cloud service providers and their health care customers are role-based access, network security mechanisms, data encryption, digital signatures, and access monitoring. Furthermore, to guarantee the safety of the information and comply with privacy policies, the Cloud service provider must be compliant with various certifications and third-party requirements, such as SAS70 Type II, PCI DSS Level 1, ISO 27001, and the US Federal Information Security Management Act (FISMA). Storing sensitive information such as EHRs in the Cloud means that precautions must be taken to ensure the safety and confidentiality of the data. A relationship built on trust with the Cloud service provider is essential to ensure a transparent process. Cloud service providers must make certain that all security mechanisms are in place to avoid unauthorized access.

  15. Cloud-Based Virtual Laboratory for Network Security Education

    Science.gov (United States)

    Xu, Le; Huang, Dijiang; Tsai, Wei-Tek

    2014-01-01

    Hands-on experiments are essential for computer network security education. Existing laboratory solutions usually require significant effort to build, configure, and maintain and often do not support reconfigurability, flexibility, and scalability. This paper presents a cloud-based virtual laboratory education platform called V-Lab that provides a…

  16. Aerosol and Cloud Properties during the Cloud Cheju ABC Plume -Asian Monsoon Experiment (CAPMEX) 2008: Linking between Ground-based and UAV Measurements

    Science.gov (United States)

    Kim, S.; Yoon, S.; Venkata Ramana, M.; Ramanathan, V.; Nguyen, H.; Park, S.; Kim, M.

    2009-12-01

    The Cheju Atmospheric Brown Cloud (ABC) Plume-Monsoon Experiment (CAPMEX), comprising comprehensive ground-based measurements and a series of data-gathering flights for aerosol and cloud by specially equipped autonomous unmanned aerial vehicles (AUAVs), was conducted at Jeju (formerly Cheju), South Korea, during August-September 2008, to improve our understanding of how the reduction of anthropogenic emissions in China (the so-called "great shutdown") during and after the 2008 Beijing Summer Olympic Games affects air quality and radiation budgets, and how atmospheric brown clouds (ABCs) influence the solar radiation budget off the Asian continent. Large numbers of in-situ and remote sensing instruments at the Gosan ABC observatory and miniaturized instruments on the aircraft measure a range of properties such as the quantity of soot, size-segregated aerosol particle numbers, total particle numbers, size-segregated cloud droplet numbers (AUAV only), aerosol scattering properties (ground only), aerosol vertical distribution, column-integrated aerosol properties, and meteorological variables. By integrating ground-level and high-elevation AUAV measurements with NASA satellite observations (e.g., MODIS, CALIPSO), we investigate the long-range transport of aerosols, the impact of ABCs on clouds, and the role of biogenic and anthropogenic aerosols in cloud condensation nuclei (CCN). In this talk, we will present results from CAPMEX focusing on: (1) the characteristics of aerosol optical, physical and chemical properties at the Gosan observatory, (2) aerosol solar heating calculated from the synergy of the ground-based micro-pulse lidar and the AERONET sun/sky radiometer, and comparison with direct measurements from the UAV, and (3) aerosol-cloud interactions in conjunction with measurements by satellites and the Gosan observatory.

  17. Development of a cloud-point extraction method for copper and nickel determination in food samples

    International Nuclear Information System (INIS)

    Azevedo Lemos, Valfredo; Selis Santos, Moacy; Teixeira David, Graciete; Vasconcelos Maciel, Mardson; Almeida Bezerra, Marcos de

    2008-01-01

    A new, simple and versatile cloud-point extraction (CPE) methodology has been developed for the separation and preconcentration of copper and nickel. The metals in the initial aqueous solution were complexed with 2-(2'-benzothiazolylazo)-5-(N,N-diethyl)aminophenol (BDAP), and Triton X-114 was added as surfactant. Dilution of the surfactant-rich phase with acidified methanol was performed after phase separation, and the copper and nickel contents were measured by flame atomic absorption spectrometry. The variables affecting the cloud-point extraction were optimized using a Box-Behnken design. Under the optimum experimental conditions, enrichment factors of 29 and 25 were achieved for copper and nickel, respectively. The accuracy of the method was evaluated and confirmed by analysis of the following certified reference materials: Apple Leaves, Spinach Leaves and Tomato Leaves. The limits of detection for solid sample analysis were 0.1 μg g-1 (Cu) and 0.4 μg g-1 (Ni). The precision for 10 replicate measurements of 75 μg L-1 Cu or Ni was 6.4 and 1.0, respectively. The method has been successfully applied to the analysis of food samples.

  18. Intrusion detection in cloud computing based attack patterns and risk assessment

    Directory of Open Access Journals (Sweden)

    Ben Charhi Youssef

    2017-05-01

    Full Text Available This paper is an extension of work originally presented at SYSCO CONF. We extend our previous work by presenting initial results of the implementation of intrusion detection based on risk assessment in cloud computing. The idea focuses on a novel approach for detecting cyber-attacks on the cloud environment by analyzing attack patterns using risk assessment methodologies. The aim of our solution is to combine evidence obtained from Intrusion Detection Systems (IDS) deployed in a cloud with risk assessment related to each attack pattern. Our approach presents a new qualitative solution for analyzing each symptom, indicator and vulnerability, and for assessing the impact and likelihood of distributed, multi-step attacks directed at cloud environments. The implementation of this approach will reduce the number of false alerts and will improve the performance of the IDS.
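
    One simple way to combine IDS evidence with risk assessment along these lines is to score each alert by the estimated likelihood and impact of its matched attack pattern and escalate only the alerts above a threshold. The sketch below is a minimal illustration of that idea; the attack patterns, likelihood/impact values, and threshold are hypothetical and not taken from the paper.

```python
from dataclasses import dataclass

# Hypothetical risk table: (likelihood, impact) per attack pattern, both on a 0..1 scale.
RISK_TABLE = {
    "ssh_bruteforce":  (0.7, 0.4),
    "sql_injection":   (0.4, 0.9),
    "port_scan":       (0.9, 0.1),
    "vm_escape_probe": (0.1, 1.0),
}

@dataclass
class Alert:
    source_ip: str
    pattern: str        # attack pattern reported by the IDS

def risk_score(alert: Alert) -> float:
    """Risk = likelihood x impact of the matched attack pattern (0 if unknown)."""
    likelihood, impact = RISK_TABLE.get(alert.pattern, (0.0, 0.0))
    return likelihood * impact

def escalate(alerts, threshold=0.3):
    """Return alerts whose risk reaches the threshold, highest risk first."""
    scored = [(risk_score(a), a) for a in alerts]
    return sorted([sa for sa in scored if sa[0] >= threshold], key=lambda sa: sa[0], reverse=True)

alerts = [Alert("10.0.0.5", "port_scan"), Alert("10.0.0.9", "sql_injection")]
for score, alert in escalate(alerts):
    print(f"{alert.source_ip}: {alert.pattern} (risk {score:.2f})")
```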

  19. High-Precision Registration of Point Clouds Based on Sphere Feature Constraints

    Directory of Open Access Journals (Sweden)

    Junhui Huang

    2016-12-01

    Full Text Available Point cloud registration is a key process in multi-view 3D measurements, and its precision directly affects measurement precision. However, for point clouds with non-overlapping areas or curvature-invariant surfaces, it is difficult to achieve high precision. A high-precision registration method based on sphere feature constraints is presented in this paper to overcome this difficulty. Known sphere features with constraints are used to construct virtual overlapping areas, which provide more accurate corresponding point pairs and reduce the influence of noise. The transformation parameters between the registered point clouds are then solved by an optimization method with a weight function. In this way, the impact of large noise in the point clouds can be reduced and high-precision registration is achieved. Simulation and experiments validate the proposed method.
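
    Once corresponding sphere centers have been extracted from two scans, solving for the rigid transformation with per-correspondence weights reduces to a weighted Procrustes (Kabsch) problem with a closed-form solution. The sketch below illustrates that solution; the correspondences and weights are synthetic, and the particular weight function is an assumption rather than the one used in the paper.

```python
import numpy as np

def weighted_rigid_transform(src, dst, w):
    """Find R, t minimizing sum_i w_i * ||R @ src_i + t - dst_i||^2 (weighted Kabsch)."""
    w = np.asarray(w, dtype=float)
    w = w / w.sum()
    mu_s = (w[:, None] * src).sum(axis=0)               # weighted centroids
    mu_d = (w[:, None] * dst).sum(axis=0)
    H = (w[:, None] * (src - mu_s)).T @ (dst - mu_d)    # weighted cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflections
    R = Vt.T @ D @ U.T
    t = mu_d - R @ mu_s
    return R, t

# Synthetic sphere-center correspondences (source scan -> destination scan) and weights.
src = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])
a = np.deg2rad(10.0)
R_true = np.array([[np.cos(a), -np.sin(a), 0.0], [np.sin(a), np.cos(a), 0.0], [0.0, 0.0, 1.0]])
dst = src @ R_true.T + np.array([0.2, -0.1, 0.05])
R, t = weighted_rigid_transform(src, dst, w=[1.0, 0.8, 1.0, 0.6])
print(np.round(R, 3), np.round(t, 3))
```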

  20. MOMCC: Market-Oriented Architecture for Mobile Cloud Computing Based on Service Oriented Architecture

    OpenAIRE

    Abolfazli, Saeid; Sanaei, Zohreh; Gani, Abdullah; Shiraz, Muhammad

    2012-01-01

    The vision of augmenting the computing capabilities of mobile devices, especially smartphones, at the least cost is becoming reality by leveraging cloud computing. Cloud exploitation by mobile devices breeds a new research domain called Mobile Cloud Computing (MCC). However, issues like portability and interoperability should be addressed for mobile augmentation, which is a non-trivial task using component-based approaches. Service Oriented Architecture (SOA) is a promising design philosop...

  1. Characteristics of bacterial community in cloud water at Mt Tai: similarity and disparity under polluted and non-polluted cloud episodes

    Science.gov (United States)

    Wei, Min; Xu, Caihong; Chen, Jianmin; Zhu, Chao; Li, Jiarong; Lv, Ganglin

    2017-04-01

    cloud water and PM2.5 in the atmosphere have a negative impact on bacteria, playing a vital role in shaping microbial community structure. The major ions might provide nutrition to bacteria and directly influence the bacterial community, whereas PM2.5 in air has an indirect impact on bacterial community structure. During wet deposition, soluble particulate matter was dissolved in water droplets, resulting in elevated concentrations in cloud water. PM2.5 was possibly associated with different origins and pathways of air masses as determined by backward-trajectory source tracking, mainly related to long-range transport. This work enhanced our understanding of the characteristics of bacterial ecology in the atmospheric aqueous phase, highlighting the potential influence of environmental variables on the bacterial community in cloud processes. It may provide fundamental information on the bacterial community response in cloud water under increasing pollution. However, due to the limited sample size (13 samples) collected at the summit of Mt Tai, these issues need in-depth discussion. Further studies based on an annual series of field observation experiments and laboratory simulations will continue to track these issues.

  2. Cloud-Based Architectures for Auto-Scalable Web Geoportals towards the Cloudification of the GeoVITe Swiss Academic Geoportal

    Directory of Open Access Journals (Sweden)

    Ionuț Iosifescu-Enescu

    2017-06-01

    Full Text Available Cloud computing has redefined the way in which Spatial Data Infrastructures (SDI and Web geoportals are designed, managed, and maintained. The cloudification of a geoportal represents the migration of a full-stack geoportal application to an internet-based private or public cloud. This work introduces two generic and open cloud-based architectures for auto-scalable Web geoportals, illustrated with the use case of the cloudification efforts of the Swiss academic geoportal GeoVITe. The presented cloud-based architectural designs for auto-scalable Web geoportals consider the most important functional and non-functional requirements and are adapted to both public and private clouds. The availability of such generic cloud-based architectures advances the cloudification of academic SDIs and geoportals.

  3. CLOUD COMPUTING BASED INFORMATION SYSTEMS -PRESENT AND FUTURE

    Directory of Open Access Journals (Sweden)

    Maximilian ROBU

    2012-12-01

    Full Text Available The current economic crisis and the global recession have affected the IT market as well. A solution came from the Cloud Computing area, by optimizing IT budgets and eliminating different types of expenses (servers, licenses, and so on). Cloud Computing is an exciting and interesting phenomenon because of its relative novelty and exploding growth. Because of its rise in popularity and usage, Cloud Computing has established its role as a research topic. However, the tendency is to focus on the technical aspects of Cloud Computing, thus leaving the potential that this technology offers unexplored. With the help of this technology, new market players arise and manage to break the traditional value chain of service provision. The main focus of this paper is the business aspects of the Cloud. In particular, we will talk about the economic aspects of using Cloud Computing (when, why and how to use it), the impacts on the infrastructure, the legal issues that come from using Cloud Computing, and the scalability and partially unclear legislation.

  4. A distributed cloud-based cyberinfrastructure framework for integrated bridge monitoring

    Science.gov (United States)

    Jeong, Seongwoon; Hou, Rui; Lynch, Jerome P.; Sohn, Hoon; Law, Kincho H.

    2017-04-01

    This paper describes a cloud-based cyberinfrastructure framework for the management of the diverse data involved in bridge monitoring. Bridge monitoring involves various hardware systems, software tools and laborious activities, including, for example, a structural health monitoring (SHM) sensor network, engineering analysis programs and visual inspection. Very often, these monitoring systems, tools and activities are not coordinated, and the collected information is not shared. A well-designed integrated data management framework can support the effective use of the data and, thereby, enhance bridge management and maintenance operations. The cloud-based cyberinfrastructure framework presented herein is designed to manage not only sensor measurement data acquired from the SHM system, but also other relevant information, such as the bridge engineering model and traffic videos, in an integrated manner. For scalability and flexibility, cloud computing services and distributed database systems are employed, and the stored information can be accessed through standard web interfaces. For demonstration, the cyberinfrastructure system is implemented for the monitoring of the bridges located along the I-275 corridor in the state of Michigan.

  5. Cloud-Based Collaborative Writing and the Common Core Standards

    Science.gov (United States)

    Yim, Soobin; Warschauer, Mark; Zheng, Binbin; Lawrence, Joshua F.

    2014-01-01

    The Common Core State Standards emphasize the integration of technology skills into English Language Arts (ELA) instruction, recognizing the demand for technology-based literacy skills to be college- and career-ready. This study aims to examine how collaborative cloud-based writing is used in a Colorado school district, where one-to-one…

  6. Service quality of cloud-based applications

    CERN Document Server

    Bauer, Eric

    2014-01-01

    This book explains why applications running on cloud might not deliver the same service reliability, availability, latency and overall quality to end users as they do when the applications are running on traditional (non-virtualized, non-cloud) configurations, and explains what can be done to mitigate that risk.

  7. A privacy-preserving framework for outsourcing location-based services to the cloud

    OpenAIRE

    Zhu, Xiaojie; Ayday, Erman; Vitenberg, Roman

    2018-01-01

    Thanks to the popularity of mobile devices a large number of location-based services (LBS) have emerged. While a large number of privacy-preserving solutions for LBS have been proposed, most of these solutions do not consider the fact that LBS are typically cloud-based nowadays. Outsourcing data and computation to the cloud raises a number of significant challenges related to data confidentiality, user identity and query privacy, fine-grain access control, and query expressiveness. In this wo...

  8. Impact of Arctic sea-ice retreat on the recent change in cloud-base height during autumn

    Science.gov (United States)

    Sato, K.; Inoue, J.; Kodama, Y.; Overland, J. E.

    2012-12-01

    Cloud-base observations over the ice-free Chukchi and Beaufort Seas in autumn were conducted using a shipboard ceilometer and radiosondes during the 1999-2010 cruises of the Japanese R/V Mirai. To understand the recent change in cloud-base height over the Arctic Ocean, these cloud-base height data were compared with observations made under ice-covered conditions during SHEBA (the Surface Heat Budget of the Arctic Ocean project) in 1998. Our ice-free results showed a 30% decrease (increase) in the frequency of low clouds with a ceiling below (above) 500 m. Temperature profiles revealed that the boundary layer was well developed over the ice-free ocean in the 2000s, whereas a stable layer dominated during the ice-covered period in 1998. The change in surface boundary conditions likely resulted in the difference in cloud-base height, although it had little impact on air temperatures in the mid- and upper troposphere. Data from the 2010 R/V Mirai cruise were investigated in detail in terms of the air-sea temperature difference. This suggests that stratus clouds over the sea ice have been replaced by stratocumulus clouds with a low cloud fraction due to the decrease in static stability induced by the sea-ice retreat. The relationship between cloud-base height and air-sea temperature difference (SST-Ts) was analyzed in detail using data from a special section of the 2010 cruise. Stratus clouds near the sea surface were predominant under warm advection, whereas stratocumulus clouds with a cloud-free layer were significant under cold advection. The threshold difference between sea surface and air temperatures for distinguishing the dominant cloud types was 3 K. Anomalous upward turbulent heat fluxes associated with the sea-ice retreat have likely contributed to warming of the lower troposphere. [Figure: frequency distribution of the cloud-base height (km) detected by a ceilometer/lidar (black bars) and radiosondes (gray bars), and profiles of potential temperature.]

  9. A cloud computing based platform for sleep behavior and chronic diseases collaborative research.

    Science.gov (United States)

    Kuo, Mu-Hsing; Borycki, Elizabeth; Kushniruk, Andre; Huang, Yueh-Min; Hung, Shu-Hui

    2014-01-01

    The objective of this study is to propose a Cloud Computing based platform for collaborative research on sleep behavior and chronic disease. The platform consists of two main components: (1) a sensing bed sheet with textile sensors to automatically record a patient's sleep behaviors and vital signs, and (2) a service-oriented cloud computing architecture (SOCCA) that provides a data repository and allows for sharing and analysis of the collected data. We also describe our systematic approach to implementing the SOCCA. We believe that the new cloud-based platform can provide nursing and other health professional researchers located in differing geographic locations with a cost-effective, flexible, secure and privacy-preserving research environment.

  10. Traffic Flow Prediction Model for Large-Scale Road Network Based on Cloud Computing

    Directory of Open Access Journals (Sweden)

    Zhaosheng Yang

    2014-01-01

    Full Text Available To increase the efficiency and precision of large-scale road network traffic flow prediction, a genetic algorithm-support vector machine (GA-SVM) model based on cloud computing is proposed in this paper, building on an analysis of the characteristics and shortcomings of the genetic algorithm and the support vector machine. In the cloud computing environment, the SVM parameters are first optimized by a parallel genetic algorithm, and the optimized parallel SVM model is then used to predict traffic flow. Using traffic flow data from the Haizhu District of Guangzhou City, the proposed model was verified and compared with a serial GA-SVM model and a parallel GA-SVM model based on MPI (message passing interface). The results demonstrate that the parallel GA-SVM model based on cloud computing has higher prediction accuracy, shorter running time, and higher speedup.
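
    The core of such a GA-SVM model is a search over the SVM hyperparameters (for regression, typically C and gamma) with cross-validated prediction error as the fitness function; the cloud layer then parallelizes the fitness evaluations across nodes. The sketch below shows a minimal single-machine version of that idea using scikit-learn; the synthetic data, population size, and mutation scheme are illustrative assumptions, not the paper's configuration.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Illustrative stand-in for lagged traffic-flow features and the next-interval flow.
X = rng.normal(size=(300, 4))
y = X @ np.array([0.5, -0.2, 0.8, 0.1]) + 0.1 * rng.normal(size=300)

def fitness(individual):
    """Mean 5-fold CV score (negative MSE) of an SVR parameterized by (log C, log gamma)."""
    C, gamma = np.exp(individual)
    return cross_val_score(SVR(C=C, gamma=gamma), X, y, cv=5,
                           scoring="neg_mean_squared_error").mean()

# Tiny evolutionary loop: keep the best half of the population, refill by mutating survivors.
population = rng.uniform(low=[-2.0, -6.0], high=[6.0, 1.0], size=(12, 2))
for generation in range(10):
    scores = np.array([fitness(ind) for ind in population])
    survivors = population[np.argsort(scores)[-6:]]
    children = survivors + rng.normal(scale=0.3, size=survivors.shape)
    population = np.vstack([survivors, children])

best = max(population, key=fitness)
print("best C, gamma:", np.exp(best))
```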

  11. HOLIMO II: a digital holographic instrument for ground-based in situ observations of microphysical properties of mixed-phase clouds

    Science.gov (United States)

    Henneberger, J.; Fugal, J. P.; Stetzer, O.; Lohmann, U.

    2013-11-01

    Measurements of the microphysical properties of mixed-phase clouds with high spatial resolution are important to understand the processes inside these clouds. This work describes the design and characterization of the newly developed ground-based field instrument HOLIMO II (HOLographic Imager for Microscopic Objects II). HOLIMO II uses digital in-line holography to in situ image cloud particles in a well-defined sample volume. By an automated algorithm, two-dimensional images of single cloud particles between 6 and 250 μm in diameter are obtained and the size spectrum, the concentration and water content of clouds are calculated. By testing the sizing algorithm with monosized beads a systematic overestimation near the resolution limit was found, which has been used to correct the measurements. Field measurements from the high altitude research station Jungfraujoch, Switzerland, are presented. The measured number size distributions are in good agreement with parallel measurements by a fog monitor (FM-100, DMT, Boulder USA). The field data shows that HOLIMO II is capable of measuring the number size distribution with a high spatial resolution and determines ice crystal shape, thus providing a method of quantifying variations in microphysical properties. A case study over a period of 8 h has been analyzed, exploring the transition from a liquid to a mixed-phase cloud, which is the longest observation of a cloud with a holographic device. During the measurement period, the cloud does not completely glaciate, contradicting earlier assumptions of the dominance of the Wegener-Bergeron-Findeisen (WBF) process.

  12. HOLIMO II: a digital holographic instrument for ground-based in-situ observations of microphysical properties of mixed-phase clouds

    Science.gov (United States)

    Henneberger, J.; Fugal, J. P.; Stetzer, O.; Lohmann, U.

    2013-05-01

    Measurements of the microphysical properties of mixed-phase clouds with high spatial resolution are important to understand the processes inside these clouds. This work describes the design and characterization of the newly developed ground-based field instrument HOLIMO II (HOLographic Imager for Microscopic Objects II). HOLIMO II uses digital in-line holography to in-situ image cloud particles in a well defined sample volume. By an automated algorithm, two-dimensional images of single cloud particles between 6 and 250 μm in diameter are obtained and the size spectrum, the concentration and water content of clouds are calculated. By testing the sizing algorithm with monosized beads a systematic overestimation near the resolution limit was found, which has been used to correct the measurements. Field measurements from the high altitude research station Jungfraujoch, Switzerland, are presented. The measured number size distributions are in good agreement with parallel measurements by a fog monitor (FM-100, DMT, Boulder USA). The field data shows that HOLIMO II is capable of measuring the number size distribution with a high spatial resolution and determines ice crystal shape, thus providing a method of quantifying variations in microphysical properties. A case study over a period of 8 h has been analyzed, exploring the transition from a liquid to a mixed-phase cloud, which is the longest observation of a cloud with a holographic device. During the measurement period, the cloud does not completely glaciate, contradicting earlier assumptions of the dominance of the Wegener-Bergeron-Findeisen (WBF) process.

  13. Validation of quasi-invariant ice cloud radiative quantities with MODIS satellite-based cloud property retrievals

    International Nuclear Information System (INIS)

    Ding, Jiachen; Yang, Ping; Kattawar, George W.; King, Michael D.; Platnick, Steven; Meyer, Kerry G.

    2017-01-01

    Similarity relations applied to ice cloud radiance calculations are theoretically analyzed and numerically validated. If τ(1–ϖ) and τ(1–ϖg) are conserved, where τ is optical thickness, ϖ the single-scattering albedo, and g the asymmetry factor, it is possible that substantially different phase functions may give rise to similar radiances in both conservative and non-conservative scattering cases, particularly in the case of large optical thicknesses. In addition to theoretical analysis, this study uses operational ice cloud optical thickness retrievals from the Moderate Resolution Imaging Spectroradiometer (MODIS) Level 2 Collection 5 (C5) and Collection 6 (C6) cloud property products to verify radiative similarity relations. It is found that, if the MODIS C5 and C6 ice cloud optical thickness values are multiplied by their respective (1–ϖg) factors, the resultant products, referred to as the effective optical thicknesses, become similar with their ratio values around unity. Furthermore, the ratios of the C5 and C6 ice cloud effective optical thicknesses display an angular variation pattern similar to that of the corresponding ice cloud phase function ratios. The MODIS C5 and C6 values of ice cloud similarity parameter, defined as [(1–ϖ)/(1–ϖg)]^(1/2), also tend to be similar. - Highlights: • Similarity relations are theoretically analyzed and validated. • Similarity relations are verified with the MODIS Level 2 Collection 5 and 6 ice cloud property products. • The product of ice cloud optical thickness and (1–ϖg) is approximately invariant. • The similarity parameter derived from the MODIS ice cloud effective radius retrieval tends to be invariant.
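
    For reference, the scaling invariants discussed above can be written compactly as follows. The scaled single-scattering albedo is the standard companion quantity of the similarity transformation and is included here only for context; the similarity parameter s matches the definition quoted in the abstract.

```latex
\[
\tilde{\tau} = \tau\,(1-\varpi g), \qquad
\tilde{\varpi} = \frac{(1-g)\,\varpi}{1-\varpi g}, \qquad
s = \left[\frac{1-\varpi}{1-\varpi g}\right]^{1/2}
\]
% Two phase functions that conserve \tau(1-\varpi) and \tau(1-\varpi g), equivalently
% the effective optical thickness \tilde{\tau} and the similarity parameter s, yield
% nearly identical radiances, especially at large optical thickness.
```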

  14. A Cloud-Based System for Automatic Hazard Monitoring from Sentinel-1 SAR Data

    Science.gov (United States)

    Meyer, F. J.; Arko, S. A.; Hogenson, K.; McAlpin, D. B.; Whitley, M. A.

    2017-12-01

    Despite the all-weather capabilities of Synthetic Aperture Radar (SAR), and its high performance in change detection, the application of SAR for operational hazard monitoring was limited in the past. This has largely been due to high data costs, slow product delivery, and limited temporal sampling associated with legacy SAR systems. Only since the launch of ESA's Sentinel-1 sensors have routinely acquired and free-of-charge SAR data become available, allowing—for the first time—for a meaningful contribution of SAR to disaster monitoring. In this paper, we present recent technical advances of the Sentinel-1-based SAR processing system SARVIEWS, which was originally built to generate hazard products for volcano monitoring centers. We outline the main functionalities of SARVIEWS including its automatic database interface to Sentinel-1 holdings of the Alaska Satellite Facility (ASF), and its set of automatic processing techniques. Subsequently, we present recent system improvements that were added to SARVIEWS and allowed for a vast expansion of its hazard services; specifically: (1) In early 2017, the SARVIEWS system was migrated into the Amazon Cloud, providing access to cloud capabilities such as elastic scaling of compute resources and cloud-based storage; (2) we co-located SARVIEWS with ASF's cloud-based Sentinel-1 archive, enabling the efficient and cost effective processing of large data volumes; (3) we integrated SARVIEWS with ASF's HyP3 system (http://hyp3.asf.alaska.edu/), providing functionality such as subscription creation via API or map interface as well as automatic email notification; (4) we automated the production chains for seismic and volcanic hazards by integrating SARVIEWS with the USGS earthquake notification service (ENS) and the USGS eruption alert system. Email notifications from both services are parsed and subscriptions are automatically created when certain event criteria are met; (5) finally, SARVIEWS-generated hazard products are now

  15. On Cloud-Based Engineering of Dependable Systems

    OpenAIRE

    Alajrami, Sami

    2014-01-01

    The cloud computing paradigm is being adopted by many organizations in different application domains as it is cost effective and offers a virtually unlimited pool of resources. Engineering critical systems can benefit from clouds in attaining all dependability means: fault tolerance, fault prevention, fault removal and fault forecasting. Our research aims to investigate the potential of supporting engineering of dependable software systems with cloud computing and proposes an open, extensible...

  16. Symmetrical compression distance for arrhythmia discrimination in cloud-based big-data services.

    Science.gov (United States)

    Lillo-Castellano, J M; Mora-Jiménez, I; Santiago-Mozos, R; Chavarría-Asso, F; Cano-González, A; García-Alberola, A; Rojo-Álvarez, J L

    2015-07-01

    The current development of cloud computing is completely changing the paradigm of knowledge extraction from huge databases. An example of this technology in the cardiac arrhythmia field is the SCOOP platform, a national-level scientific cloud-based big-data service for implantable cardioverter defibrillators. In this scenario, we propose a new methodology for automatic classification of intracardiac electrograms (EGMs) in a cloud computing system, designed for minimal signal preprocessing. A new compression-based similarity measure (CSM), the so-called weighted fast compression distance, is created for low computational burden and provides better performance than other CSMs in the literature. Using simple machine learning techniques, a set of 6848 EGMs extracted from the SCOOP platform were classified into seven cardiac arrhythmia classes and one noise class, reaching nearly 90% accuracy when previous patient arrhythmia information was available and 63% otherwise, in all cases exceeding the classification accuracy provided by the majority class. Results show that this methodology can be used as a high-quality cloud computing service, providing support to physicians for improving knowledge on patient diagnosis.
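
    Compression-based similarity measures of this family rest on the idea that two signals are similar if compressing them together costs little more than compressing them separately. The weighted fast compression distance itself is specific to this work, so the sketch below shows the generic normalized compression distance (NCD) with zlib as a stand-in compressor; the example signals are synthetic.

```python
import zlib
import numpy as np

def c(data: bytes) -> int:
    """Compressed length in bytes, used as a practical stand-in for Kolmogorov complexity."""
    return len(zlib.compress(data, 9))

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance: smaller values indicate more similar inputs."""
    cx, cy, cxy = c(x), c(y), c(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

# Synthetic "electrograms": quantize waveforms to bytes before compressing.
t = np.linspace(0.0, 1.0, 512)
egm_a = np.uint8(128 + 100 * np.sin(2 * np.pi * 7 * t)).tobytes()
egm_b = np.uint8(128 + 100 * np.sin(2 * np.pi * 7 * t + 0.2)).tobytes()                # similar rhythm
egm_c = np.random.default_rng(1).integers(0, 256, size=512, dtype=np.uint8).tobytes()  # unrelated

print("similar pair:  ", round(ncd(egm_a, egm_b), 3))
print("unrelated pair:", round(ncd(egm_a, egm_c), 3))
```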

  17. Future Directions of Applying Healthcare Cloud for Home-based Chronic Disease Care

    OpenAIRE

    Hu, Yan; Eriksén, Sara; Lundberg, Jenny

    2017-01-01

    The care of chronic disease has become the main challenge for healthcare institutions around the world. To meet the growing needs of patients, moving the front desk of healthcare from hospital to home is essential. Recently, cloud computing has been applied to healthcare domain; however, adapting to and using this technology effectively for home-based care is still in its initial phase. We have proposed a conceptual hybrid cloud model for home-based chronic disease care, and have evaluated it...

  18. Determination of the impact of RGB points cloud attribute quality on color-based segmentation process

    Directory of Open Access Journals (Sweden)

    Bartłomiej Kraszewski

    2015-06-01

    Full Text Available The article presents the results of research on the effect that the radiometric quality of point cloud RGB attributes has on color-based segmentation. In the research, a point cloud with a resolution of 5 mm, acquired with a FARO Photon 120 scanner, described a fragment of an office room, and color images were taken with various digital cameras: an SLR Nikon D3X, an SLR Canon D200 integrated with the laser scanner, a compact Panasonic TZ-30, and a mobile phone camera. Color information from the images was spatially related to the point cloud in FARO Scene software. The color-based segmentation of the test data was performed with a purpose-built application named "RGB Segmentation". The application was based on the public Point Cloud Library (PCL) and allowed subsets of points fulfilling the segmentation criteria to be extracted from the source point cloud using a region growing method. Using the developed application, the segmentation of four test point clouds containing different RGB attributes from the various images was performed. The segmentation process was evaluated by comparing segments acquired with the developed application against segments extracted manually by an operator. The following items were compared: the number of obtained segments, the number of correctly identified objects, and the correctness of the segmentation process. The best segmentation correctness and most identified objects were obtained using the data with RGB attributes from the Nikon D3X images. Based on the results, it was found that the quality of the RGB attributes of the point cloud had an impact only on the number of identified objects; for the correctness of the segmentation and its error, no apparent relationship with the quality of the color information was found. Keywords: terrestrial laser scanning, color-based segmentation, RGB attribute, region growing method, digital images, point cloud

  19. Depolarization Lidar Determination of Cloud-Base Microphysical Properties

    NARCIS (Netherlands)

    Donovan, D.P.; Klein Baltink, H; Henzing, J. S.; de Roode, S.R.; Siebesma, A.P.

    2016-01-01

    The links between multiple-scattering induced depolarization and cloud microphysical properties (e.g. cloud particle number density, effective radius, water content) have long been recognised. Previous efforts to use depolarization information in a quantitative manner to retrieve cloud

  20. Cloud Computing Value Chains: Understanding Businesses and Value Creation in the Cloud

    Science.gov (United States)

    Mohammed, Ashraf Bany; Altmann, Jörn; Hwang, Junseok

    Based on the promising developments in Cloud Computing technologies in recent years, commercial computing resource services (e.g. Amazon EC2) and software-as-a-service offerings (e.g. Salesforce.com) came into existence. However, the relatively weak business exploitation, participation, and adoption of other Cloud Computing services remain the main challenges. The vague value structures seem to be hindering business adoption and the creation of sustainable business models around the technology. Using an extensive analysis of existing Cloud business models, Cloud services, stakeholder relations, market configurations and value structures, this chapter develops a reference model for value chains in the Cloud. Although this model is theoretically based on Porter's value chain theory, the proposed Cloud value chain model is upgraded to fit the diversity of business service scenarios in Cloud computing markets. Using this model, different service scenarios are explained. Our findings suggest new services, business opportunities, and policy practices for realizing more adoption and value creation paths in the Cloud.

  1. Study of the relations between cloud properties and atmospheric conditions using ground-based digital images

    Science.gov (United States)

    Bakalova, Kalinka

    The aerosol constituents of the earth atmosphere are of great significance for the radiation budget and global climate of the planet. They are the precursors of clouds that in turn play an essential role in these processes and in the hydrological cycle of the Earth. Understanding the complex aerosol-cloud interactions requires a detailed knowledge of the dynamical processes moving the water vapor through the atmosphere, and of the physical mechanisms involved in the formation and growth of cloud particles. Ground-based observations on regional and short time scale provide valuable detailed information about atmospheric dynamics and cloud properties, and are used as a complementary tool to the global satellite observations. The objective of the present paper is to study the physical properties of clouds as displayed in ground-based visible images, and juxtapose them to the specific surface and atmospheric meteorological conditions. The observations are being carried out over the urban area of the city of Sofia, Bulgaria. The data obtained from visible images of clouds enable a quantitative description of texture and morphological features of clouds such as shape, thickness, motion, etc. These characteristics are related to cloud microphysical properties. The changes of relative humidity and the horizontal visibility are considered to be representative of the variations of the type (natural/manmade) and amount of the atmospheric aerosols near the earth surface, and potentially, the cloud drop number concentration. The atmospheric dynamics is accounted for by means of the values of the atmospheric pressure, temperature, wind velocity, etc., observed at the earth's surface. The advantage of ground-based observations of clouds compared to satellite ones is in the high spatial and temporal resolution of the obtained data about the lowermost cloud layer, which in turn is sensitive to the meteorological regimes that determine cloud formation and evolution. It turns out

  2. Cloud Detection from Satellite Imagery: A Comparison of Expert-Generated and Automatically-Generated Decision Trees

    Science.gov (United States)

    Shiffman, Smadar

    2004-01-01

    Automated cloud detection and tracking is an important step in assessing global climate change via remote sensing. Cloud masks, which indicate whether individual pixels depict clouds, are included in many of the data products that are based on data acquired on board Earth satellites. Many cloud-mask algorithms have the form of decision trees, which employ sequential tests that scientists designed based on empirical astrophysics studies and astrophysics simulations. Limitations of existing cloud masks restrict our ability to accurately track changes in cloud patterns over time. In this study we explored the potential benefits of automatically learned decision trees for detecting clouds from images acquired using the Advanced Very High Resolution Radiometer (AVHRR) instrument on board the NOAA-14 weather satellite of the National Oceanic and Atmospheric Administration. We constructed three decision trees for a sample of 8 km daily AVHRR data from 2000 using a decision-tree learning procedure provided within MATLAB(R), and compared the accuracy of the decision trees to the accuracy of the cloud mask. We used ground observations collected by the National Aeronautics and Space Administration Clouds and the Earth's Radiant Energy System S'COOL project as the gold standard. For the sample data, the accuracy of the automatically learned decision trees was greater than the accuracy of the cloud masks included in the AVHRR data product.
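
    A decision-tree cloud mask of this kind is learned from labeled pixels, with per-pixel features such as channel reflectances and brightness temperatures. The sketch below uses scikit-learn rather than the MATLAB procedure mentioned in the abstract, and the channel features, synthetic labels, and tree depth are illustrative assumptions rather than the study's setup.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Illustrative per-pixel features: two visible reflectances and two brightness temperatures.
n = 5000
X = np.column_stack([
    rng.uniform(0.0, 1.0, n),       # ch1 reflectance
    rng.uniform(0.0, 1.0, n),       # ch2 reflectance
    rng.uniform(220.0, 300.0, n),   # ch4 brightness temperature (K)
    rng.uniform(220.0, 300.0, n),   # ch5 brightness temperature (K)
])
# Synthetic "ground truth": bright and cold pixels are labeled cloudy (1), others clear (0).
y = ((X[:, 0] > 0.4) & (X[:, 2] < 270.0)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, y_tr)

print("accuracy on held-out pixels:", round(tree.score(X_te, y_te), 3))
print(export_text(tree, feature_names=["ch1", "ch2", "ch4_bt", "ch5_bt"]))
```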

  3. Point Cloud Based Relative Pose Estimation of a Satellite in Close Range

    Directory of Open Access Journals (Sweden)

    Lujiang Liu

    2016-06-01

    Full Text Available Determination of the relative pose of satellites is essential in space rendezvous operations and on-orbit servicing missions. The key problems are the selection of a suitable sensor on board the chaser and efficient techniques for pose estimation. This paper aims to estimate the pose of a target satellite at close range, on the basis of its known model, by using point cloud data generated by a flash LIDAR sensor. A novel model-based pose estimation method is proposed; it includes a fast and reliable initial pose acquisition method based on global optimal searching that processes the dense point cloud data directly, and a pose tracking method based on the Iterative Closest Point algorithm. A simulation system is also presented in order to evaluate the performance of the sensor and generate simulated sensor point cloud data; it also provides the true pose of the test target so that the pose estimation error can be quantified. To investigate the effectiveness of the proposed approach and the achievable pose accuracy, numerical simulation experiments are performed; the results demonstrate the algorithm's capability to operate on point clouds directly and to handle large pose variations. A field test was also conducted, and the results show that the proposed method is effective.

  4. Optical fibre multi-parameter sensing with secure cloud based signal capture and processing

    Science.gov (United States)

    Newe, Thomas; O'Connell, Eoin; Meere, Damien; Yuan, Hongwei; Leen, Gabriel; O'Keeffe, Sinead; Lewis, Elfed

    2016-05-01

    Recent advancements in cloud computing technologies in the context of optical and optical fibre based systems are reported. The proliferation of real-time, multi-channel sensor systems represents a significant growth in data volume. This, coupled with a growing need for security, presents many challenges and a huge opportunity for an evolutionary step in the widespread application of these sensing technologies. A tiered infrastructural system approach is adopted that is designed to facilitate the delivery of optical fibre-based "SENsing as a Service - SENaaS". Within this infrastructure, novel optical sensing platforms, deployed within different environments, are interfaced with a Cloud-based backbone infrastructure which facilitates the secure collection, storage and analysis of real-time data. Feedback systems, which harness this data to effect a change within the monitored location/environment/condition, are also discussed. The cloud-based system presented here can also be used with chemical and physical sensors that require real-time data analysis, processing and feedback.

  5. Cloud-Based Parameter-Driven Statistical Services and Resource Allocation in a Heterogeneous Platform on Enterprise Environment

    Directory of Open Access Journals (Sweden)

    Sungju Lee

    2016-09-01

    Full Text Available A fundamental need of enterprise users is a cloud-based parameter-driven statistical service, which has had a substantial impact on companies worldwide. In this paper, we demonstrate statistical analysis for certain data-related criteria applied to the cloud server and compare the results. In addition, we present a statistical analysis and cloud-based resource allocation method for a heterogeneous platform environment, performing data and information analysis with consideration of the application workload and the server capacity, and subsequently propose a service prediction model using polynomial regression. In particular, our aim is to provide stable service in a given large-scale enterprise cloud computing environment. The virtual machines (VMs) for cloud-based services are assigned to each server with a special methodology that satisfies a uniform utilization distribution model, which is implemented between users and the platform and is a main idea of our cloud computing system. Based on the experimental results, we confirm that our prediction model can provide sufficient resources for statistical services to large-scale users while satisfying the uniform utilization distribution.
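
    The service prediction model described above amounts to fitting a polynomial to the observed workload and extrapolating it to decide how many VMs to provision. The sketch below shows a minimal version of that idea; the workload samples, polynomial degree, and per-VM capacity are illustrative assumptions, not figures from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative hourly request rates observed over the last 24 hours.
hours = np.arange(24)
requests = 800 + 40 * hours + 3.0 * (hours - 12) ** 2 + rng.normal(0, 30, 24)

# Fit a degree-2 polynomial regression model to the workload history.
predict = np.poly1d(np.polyfit(hours, requests, deg=2))

# Forecast the next 6 hours and derive a VM count from an assumed per-VM capacity.
per_vm_capacity = 400.0   # requests per hour that one VM can serve (assumption)
for h in range(24, 30):
    demand = predict(h)
    vms = int(np.ceil(demand / per_vm_capacity))
    print(f"hour {h}: predicted {demand:.0f} req/h -> provision {vms} VMs")
```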

  6. An Efficient and Privacy-Preserving Multiuser Cloud-Based LBS Query Scheme

    Directory of Open Access Journals (Sweden)

    Lu Ou

    2018-01-01

    Full Text Available Location-based services (LBSs) are increasingly popular in today's society. People reveal their location information to LBS providers to obtain personalized services such as map directions, restaurant recommendations, and taxi reservations. Usually, LBS providers offer user privacy protection statements to assure users that their private location information will not be given away. However, many LBSs run on third-party cloud infrastructures. It is challenging to guarantee user location privacy against curious cloud operators while still permitting users to query their own location information data. In this paper, we propose an efficient privacy-preserving cloud-based LBS query scheme for the multiuser setting. We encrypt LBS data and LBS queries with a hybrid encryption mechanism, which can efficiently implement privacy-preserving search over encrypted LBS data and is very suitable for the multiuser setting with secure and effective user enrollment and user revocation. This paper contains security analysis and performance experiments to demonstrate the privacy-preserving properties and efficiency of our proposed scheme.

  7. Design of Technical Support System for Retail Company Based on Cloud

    Directory of Open Access Journals (Sweden)

    Shao Ping

    2017-01-01

    Full Text Available With the opening of the retail electricity market in China, electricity retail companies have emerged as new participants in the power retail business and take part in the electricity market. National and local governments have subsequently introduced corresponding policies and rules, and a technical support system has become one of the necessary conditions for a retail company's market access. Retail electricity companies have started building such systems, but a standardized, complete architecture has not yet formed. This paper analyzes the business and data interaction requirements of retail electricity companies, and then designs a functional architecture based on basic, advanced and value-added applications, together with a cloud-based technical architecture. On this basis, the paper discusses the choice among private, public and hybrid cloud models and offers suggestions for rationalizing system construction, which can serve as a reference for building the technical support systems of domestic retail enterprises.

  8. International inter-rater agreement in scoring acne severity utilizing cloud-based image sharing of mobile phone photographs.

    Science.gov (United States)

    Foolad, Negar; Ornelas, Jennifer N; Clark, Ashley K; Ali, Ifrah; Sharon, Victoria R; Al Mubarak, Luluah; Lopez, Andrés; Alikhan, Ali; Al Dabagh, Bishr; Firooz, Alireza; Awasthi, Smita; Liu, Yu; Li, Chin-Shang; Sivamani, Raja K

    2017-09-01

    Cloud-based image sharing technology allows facilitated sharing of images. Cloud-based image sharing technology has not been well-studied for acne assessments or treatment preferences, among international evaluators. We evaluated inter-rater variability of acne grading and treatment recommendations among an international group of dermatologists that assessed photographs. This is a prospective, single visit photographic study to assess inter-rater agreement of acne photographs shared through an integrated mobile device, cloud-based, and HIPAA-compliant platform. Inter-rater agreements for global acne assessment and acne lesion counts were evaluated by the Kendall's coefficient of concordance while correlations between treatment recommendations and acne severity were calculated by Spearman's rank correlation coefficient. There was good agreement for the evaluation of inflammatory lesions (KCC = 0.62, P cloud-based image sharing for acne assessment. Cloud-based sharing may facilitate acne care and research among international collaborators. © 2017 The International Society of Dermatology.

  9. Design Thinking and Cloud Manufacturing: A Study of Cloud Model Sharing Platform Based on Separated Data Log

    Directory of Open Access Journals (Sweden)

    Zhe Wei

    2013-01-01

    Full Text Available To solve the product data consistency problem caused by portable systems that cannot update product data in real time in mobile environments under the mass customization production mode, a new log-based optimistic replication method for product data is presented. This paper focuses on the design thinking provider, probing into a manufacturing resource design thinking cloud platform based on manufacturing resource-locating technologies, and also discusses several application scenarios of cloud locating technologies in the manufacturing environment. The actual demand of manufacturing creates a new mode which is service-oriented and has high efficiency and low consumption. Finally, the approach differs from the crowd-sourcing application model of Local Motors. The sharing platform operator is responsible for a master plan for the platform, proposing an open interface standard and establishing a service operation mode.

  10. Efficient Multi-keyword Ranked Search over Outsourced Cloud Data based on Homomorphic Encryption

    Directory of Open Access Journals (Sweden)

    Nie Mengxi

    2016-01-01

    Full Text Available With the development of cloud computing, more and more data owners are motivated to outsource their data to cloud servers for greater flexibility and lower expenditure. Because the security of outsourced data must be guaranteed, encryption has to be applied, which makes traditional plaintext-based data utilization, such as keyword search, infeasible. Several schemes have been proposed for searching encrypted data, e.g. top-k retrieval over single or multiple keywords, but their efficiency is not high enough to be practical in cloud computing. In this paper, we propose a new scheme based on homomorphic encryption to address the challenging problem of privacy-preserving, efficient multi-keyword ranked search over outsourced cloud data. In our scheme, the inner product is adopted to measure relevance scores, and relevance feedback is used to reflect the search preferences of data users. Security analysis shows that the proposed scheme meets strict privacy requirements for a secure cloud data utilization system, and performance evaluation demonstrates that it achieves low overhead in both computation and communication.
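
    As a rough illustration of the ranking step described in this abstract, the sketch below computes inner-product relevance scores over a toy plaintext index and returns the top-k documents. The homomorphic-encryption layer that the scheme adds so the cloud never sees these vectors is deliberately omitted, and the vocabulary, weights and query are invented.

```python
# Plaintext sketch of inner-product ranking for multi-keyword search.
# In the scheme above these vectors would be encrypted homomorphically so the
# cloud can compute scores without reading them; that step is omitted here.
import numpy as np

vocab = ["cloud", "encryption", "search", "ranking", "storage"]

def to_vector(keywords, weights=None):
    v = np.zeros(len(vocab))
    for i, term in enumerate(vocab):
        if term in keywords:
            v[i] = 1.0 if weights is None else weights.get(term, 1.0)
    return v

# document index: term-weight vectors (TF-IDF-like weights, made up)
docs = {
    "doc1": to_vector({"cloud", "storage"}, {"cloud": 0.8, "storage": 0.6}),
    "doc2": to_vector({"encryption", "search"}, {"encryption": 0.9, "search": 0.7}),
    "doc3": to_vector({"cloud", "search", "ranking"},
                      {"cloud": 0.5, "search": 0.5, "ranking": 0.9}),
}

# query with relevance-feedback style weights favouring "search"
query = to_vector({"search", "ranking"}, {"search": 1.0, "ranking": 0.5})

# relevance score = inner product; return the top-k documents
k = 2
scores = {name: float(vec @ query) for name, vec in docs.items()}
top_k = sorted(scores, key=scores.get, reverse=True)[:k]
print(top_k, scores)
```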

  11. Factors influencing the organizational adoption of cloud computing: a survey among cloud workers

    Directory of Open Access Journals (Sweden)

    Mark Stieninger

    2018-01-01

    Full Text Available Cloud computing presents an opportunity for organizations to leverage affordable, scalable, and agile technologies. However, even with the demonstrated value of cloud computing, organizations have been hesitant to adopt such technologies. Based on a multi-theoretical research model, this paper provides an empirical study targeted at better understanding the adoption of cloud services. An online survey addressing the factors derived from the literature for three popular cloud application types (cloud storage, cloud mail and cloud office) was undertaken. The research model was analyzed using variance-based structural equation modelling. Results show that compatibility, relative advantage, security and trust, as well as a lower level of complexity, lead to a more positive attitude towards cloud adoption. Complexity, compatibility, image, and security and trust have direct and indirect effects on relative advantage. These factors further explain a large part of the attitude towards cloud adoption, but not of its usage.

  12. A secure medical data exchange protocol based on cloud environment.

    Science.gov (United States)

    Chen, Chin-Ling; Yang, Tsai-Tung; Shih, Tzay-Farn

    2014-09-01

    In recent years, health care technologies such as electronic medical records, which can be easily stored, have matured. However, making medical resources more convenient to access remains a concern. Although many studies have discussed medical systems, these systems still face many security challenges, the most important being patients' privacy. Therefore, we propose a secure medical data exchange protocol based on a cloud environment. Our scheme exploits the characteristics of mobile devices, allowing people to use medical resources in the cloud environment and seek medical advice conveniently.

  13. The Impact of Cloud Computing on Information Systems Agility

    Directory of Open Access Journals (Sweden)

    Mohamed Sawas

    2015-09-01

    Full Text Available As businesses encounter frequently harsh economic conditions, concepts such as outsourcing, agile and lean management, change management and cost reduction are constantly gaining attention, because they are all aimed at saving on budgets and coping with unexpected change. Technologies like cloud computing promise to turn IT, which has traditionally been viewed as a cost centre, into a source of savings, flexibility and agility for the business. The purpose of this paper is first to compile a set of attributes that govern the agility benefits that cloud computing adds to information systems, and then to develop a survey-based instrument to measure these benefits. Our research analysis employs non-probability sampling based on a combination of convenience and judgment. This approach was used to obtain a representative sample of participants from companies in various sectors such as oil & gas, banking, private, government and semi-governmental organizations. This research will enable decision makers to measure agility enhancements and hence compare the agility of information systems before and after deploying cloud computing.

  14. Cloud-based preoperative planning for total hip arthroplasty: a study of accuracy, efficiency, and compliance.

    Science.gov (United States)

    Maratt, Joseph D; Srinivasan, Ramesh C; Dahl, William J; Schilling, Peter L; Urquhart, Andrew G

    2012-08-01

    As digital radiography becomes more prevalent, several systems for digital preoperative planning have become available. The purpose of this study was to evaluate the accuracy and efficiency of an inexpensive, cloud-based digital templating system. Its accuracy was comparable with that of acetate templating, while cloud-based templating was substantially faster and more convenient than either acetate templating or locally installed software. Although this is a practical solution for this particular medical application, regulatory changes are necessary before the tremendous advantages of cloud-based storage and computing can be realized in medical research and clinical practice. Copyright 2012, SLACK Incorporated.

  15. Aerosols correction of the OMI tropospheric NO2 retrievals over cloud-free scenes: Different methodologies based on the O2-O2 477 nm band

    Science.gov (United States)

    Chimot, Julien; Vlemmix, Tim; Veefkind, Pepijn; Levelt, Pieternel

    2016-04-01

    Numerous studies have drawn attention to the complexities of retrieving tropospheric NO2 columns from satellite UltraViolet-Visible (UV-Vis) measurements in the presence of aerosols. Correction for aerosol effects will remain a challenge for the next generation of air quality satellite instruments such as TROPOMI on Sentinel-5 Precursor, Sentinel-4 and Sentinel-5. The Ozone Monitoring Instrument (OMI) has provided daily global measurements of tropospheric NO2 for more than a decade. However, aerosols are not explicitly taken into account in the current operational OMI tropospheric NO2 retrieval chain (DOMINO v2 [Boersma et al., 2011]). Our study analyses two approaches for an operational aerosol correction, based on the use of the O2-O2 477 nm band. The first approach is the cloud-model-based aerosol correction, also named "implicit aerosol correction", which is already used in the operational chain. The OMI O2-O2 cloud retrieval algorithm, based on the Differential Optical Absorption Spectroscopy (DOAS) approach, is applied both to cloudy scenes and to cloud-free scenes with aerosols present. Perturbation of the OMI cloud retrievals over scenes dominated by aerosols has been observed in recent studies [Castellanos et al., 2015; Lin et al., 2015; Lin et al., 2014]. We investigated the causes of these perturbations by (1) confronting the OMI tropospheric NO2, cloud and MODIS AQUA aerosol products, and (2) characterizing the key drivers of the net aerosol effects, compared with a cloud signal, in the UV-Vis spectra. The study focuses on large industrialised areas such as East China, over cloud-free scenes. One of the key findings is a limitation due to the coarse sampling of the cloud look-up table (LUT) employed to convert the results of the DOAS fit into effective cloud fraction and pressure. This leads to an underestimation of the tropospheric NO2 amount when particles are located at elevated altitude. A higher sampling of the

  16. A possible role of ground-based microorganisms on cloud formation in the atmosphere

    Science.gov (United States)

    Ekström, S.; Nozière, B.; Hultberg, M.; Alsberg, T.; Magnér, J.; Nilsson, E. D.; Artaxo, P.

    2010-01-01

    The formation of clouds is an important process for the atmosphere, the hydrological cycle, and climate, but some aspects of it are not completely understood. In this work, we show that microorganisms might affect cloud formation without leaving the Earth's surface by releasing biological surfactants (or biosurfactants) into the environment, which make their way into atmospheric aerosols and could significantly enhance their activation into cloud droplets. In the first part of this work, the cloud-nucleating efficiency of standard biosurfactants was characterized and found to be better than that of any aerosol material studied so far, including inorganic salts. These results identify molecular structures that give organic compounds exceptional cloud-nucleating properties. In the second part, atmospheric aerosols were sampled at different locations: a temperate coastal site, a marine site, a temperate forest, and a tropical forest. Their surface tension was measured and found to be below 30 mN/m, the lowest reported for aerosols, to our knowledge. This very low surface tension was attributed to the presence of biosurfactants, the only natural substances able to reach such low values. The presence of strong microbial surfactants in aerosols would be consistent with the organic fractions of exceptional cloud-nucleating efficiency recently found in aerosols, and with the correlations between algal blooms and cloud cover reported in the Southern Ocean. The results of this work also suggest that biosurfactants might be common in aerosols and thus of global relevance. If this is confirmed, a new role for microorganisms in the atmosphere and climate could be identified.

  17. Perceptions of Peer Review Using Cloud-Based Software

    Science.gov (United States)

    Andrichuk, Gjoa

    2016-01-01

    This study looks at the change in perception regarding the effect of peer feedback on writing skills using cloud-based software. Pre- and post-surveys were given. The students peer reviewed drafts of five sections of scientific reports using Google Docs. While students reported that they did not perceive their writing ability improved by being…

  18. Cloud-Based Technologies: Faculty Development, Support, and Implementation

    Science.gov (United States)

    Diaz, Veronica

    2011-01-01

    The number of instructional offerings in higher education that are online, blended, or web-enhanced, including courses and programs, continues to grow exponentially. Alongside the growth of e-learning, higher education has witnessed the explosion of cloud-based or Web 2.0 technologies, a term that refers to the vast array of socially oriented,…

  19. Radiative budget and cloud radiative effect over the Atlantic from ship-based observations

    Directory of Open Access Journals (Sweden)

    J. Kalisch

    2012-10-01

    Full Text Available The aim of this study is to determine cloud-type resolved cloud radiative budgets and cloud radiative effects from surface measurements of broadband radiative fluxes over the Atlantic Ocean. Furthermore, based on simultaneous observations of the state of the cloudy atmosphere, a radiative closure study has been performed by means of the ECHAM5 single column model in order to identify the model's ability to realistically reproduce the effects of clouds on the climate system.

    An extensive database of radiative and atmospheric measurements has been established along five meridional cruises of the German research icebreaker Polarstern. Besides pyranometer and pyrgeometer for downward broadband solar and thermal radiative fluxes, a sky imager and a microwave radiometer have been utilized to determine cloud fraction and cloud type on the one hand and temperature and humidity profiles as well as liquid water path for warm non-precipitating clouds on the other hand.

    Averaged over all cruise tracks, we obtain a total net (solar + thermal) radiative flux of 144 W m−2 that is dominated by the solar component. In general, the solar contribution is large for cirrus clouds and small for stratus clouds. No significant meridional dependencies were found for the surface radiation budgets and cloud effects. The strongest surface longwave cloud effects were shown in the presence of low level clouds. Clouds with a high optical density induce strong negative solar radiative effects under high solar altitudes. The mean surface net cloud radiative effect is −33 W m−2.

    For the purpose of quickly estimating the mean surface longwave, shortwave and net cloud effects in moderate, subtropical and tropical climate regimes, a new parameterisation was created, considering the total cloud amount and the solar zenith angle.

    The ECHAM5 single column model provides a surface net cloud effect that is more

  20. Method for validating cloud mask obtained from satellite measurements using ground-based sky camera.

    Science.gov (United States)

    Letu, Husi; Nagao, Takashi M; Nakajima, Takashi Y; Matsumae, Yoshiaki

    2014-11-01

    Error propagation into the atmospheric, oceanic, and land surface parameters of satellite products caused by misclassification in the cloud mask is a critical issue for improving the accuracy of those products; characterizing the accuracy of the cloud mask is therefore important for investigating its influence on satellite products. In this study, we propose a method for validating cloud masks derived from multiwavelength satellite data using ground-based sky camera (GSC) data. First, a cloud cover algorithm for GSC data was developed using a sky index and a brightness index. Then, cloud masks derived from Moderate Resolution Imaging Spectroradiometer (MODIS) data by two cloud-screening algorithms (MOD35 and CLAUDIA) were validated against the GSC cloud mask. The results indicate that MOD35 tends to classify ambiguous pixels as "cloudy," whereas CLAUDIA tends to classify them as "clear." Furthermore, the influence of the error propagation caused by misclassification in the MOD35 and CLAUDIA cloud masks on MODIS-derived reflectance, brightness temperature, and normalized difference vegetation index (NDVI) in clear and cloudy pixels was investigated using the sky camera data. The influence of the error propagation by the MOD35 cloud mask on MODIS-derived monthly mean reflectance, brightness temperature, and NDVI for clear pixels is significantly smaller than that by the CLAUDIA cloud mask, whereas the influence of the error propagation by the CLAUDIA cloud mask on MODIS-derived monthly mean cloud products for cloudy pixels is significantly smaller than that by the MOD35 cloud mask.
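
    The abstract does not give the exact sky-index and bright-index formulas used for the sky camera, so the sketch below uses a common red-blue sky index and a simple brightness test as stand-ins; the thresholds and the synthetic image are illustrative only.

```python
# Minimal sky-camera cloud-cover sketch. The (B - R)/(B + R) sky index and
# the thresholds below are assumptions, not the paper's exact algorithm.
import numpy as np

def cloud_fraction(rgb, sky_thresh=0.12, bright_thresh=240):
    """rgb: HxWx3 uint8 sky-camera image. Returns (cloud_mask, cloud_fraction)."""
    img = rgb.astype(float)
    r, b = img[..., 0], img[..., 2]
    sky_index = (b - r) / np.maximum(b + r, 1.0)   # clear blue sky -> large values
    brightness = img.mean(axis=-1)                 # thick cloud / glare -> very bright
    cloudy = (sky_index < sky_thresh) | (brightness > bright_thresh)
    return cloudy, float(cloudy.mean())

# usage with a synthetic image: left half bluish clear sky, right half grey cloud
h, w = 100, 100
img = np.zeros((h, w, 3), np.uint8)
img[:, : w // 2] = (60, 120, 220)
img[:, w // 2 :] = (180, 180, 185)
mask, cf = cloud_fraction(img)
print("cloud fraction:", cf)
```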

  1. CIMS: A Context-Based Intelligent Multimedia System for Ubiquitous Cloud Computing

    Directory of Open Access Journals (Sweden)

    Abhilash Sreeramaneni

    2015-06-01

    Full Text Available Mobile users spend a tremendous amount of time browsing multimedia content over the Internet to pursue their interests, but intensive computing tasks strain resource-constrained smart devices and shorten battery life. Mobile cloud computing is needed to address these resource limitations (memory, maintenance cost, ease of access, and computing tasks on mobile devices). Several approaches have been proposed to confront the challenges of mobile cloud computing, but difficulties remain: collecting and processing context and exchanging the results over a heavily loaded network will demand vast computation and reduce battery life in mobile devices. In this paper, we propose a context-based intelligent multimedia system (CIMS) for ubiquitous cloud computing. The main goal of this research is to reduce the computing load, storage complexity, and battery drain for mobile users by using pervasive cloud computing. To reduce the computing and storage burden on mobile devices, the cloud server groups user profiles by similarity by executing K-means clustering on users' data (context and multimedia content), and the distribution process then delivers real-time notifications to smartphone users according to their profiles. We also consider a mobile cloud offloading system, which decides the offloading actions to and from cloud servers; context-aware decision-making (CAD) tunes mobile device performance for requirements such as short response time and low energy consumption. The analysis shows that CIMS exploits cost-effective features to produce high-quality information for mobile (or smart) device users in real time, while reducing computation and storage complexity for both mobile users and cloud servers. Simulation analysis suggests that our approach is more efficient than existing approaches.
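
    As a minimal sketch of the profile-grouping step described above (assuming scikit-learn is available; the feature columns and cluster count are invented), K-means can group user context vectors so that each cluster receives its own notifications.

```python
# Sketch of the K-means user-grouping step. The feature columns
# (age, hours online, genre preferences) are made up for illustration.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# 200 hypothetical user profiles: [age, daily hours online, sports, music, news]
profiles = np.column_stack([
    rng.integers(15, 70, 200),
    rng.uniform(0.5, 8.0, 200),
    rng.random((200, 3)),
])

X = StandardScaler().fit_transform(profiles)
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)

# each cluster could then receive notifications tailored to its profile
labels = kmeans.labels_
print("users per cluster:", np.bincount(labels))
```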

  2. Efficient Resources Provisioning Based on Load Forecasting in Cloud

    Directory of Open Access Journals (Sweden)

    Rongdong Hu

    2014-01-01

    Full Text Available Cloud providers should ensure QoS while maximizing resource utilization. One optimal strategy is to allocate resources in a timely, fine-grained manner according to an application's actual resource demand, which presupposes that future load information can be obtained in advance. We propose a multi-step-ahead load forecasting method, KSwSVR, based on statistical learning theory, which suits the complex and dynamic characteristics of the cloud computing environment. It integrates an improved support vector regression algorithm and a Kalman smoother. Public trace data from multiple types of resources were used to verify its prediction accuracy, stability, and adaptability in comparison with AR, BPNN, and standard SVR. Based on the predicted results, a simple and efficient strategy is then proposed for resource provisioning. A CPU allocation experiment indicated that it can effectively reduce resource consumption while meeting service level agreement requirements.
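
    The following sketch is only a rough stand-in for KSwSVR: a standard support vector regression on lagged load values followed by a simple one-dimensional Kalman filter over the predictions, run on a synthetic CPU-load trace. The paper's improved SVR and its exact smoother are not reproduced; all parameters are illustrative.

```python
# SVR on lagged CPU-load values plus a 1-D Kalman filter over the predictions.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(1)
load = 50 + 20 * np.sin(np.arange(500) / 20.0) + rng.normal(0, 3, 500)

lags = 5
X = np.column_stack([load[i : len(load) - lags + i] for i in range(lags)])
y = load[lags:]
split = 400
svr = SVR(C=10.0, epsilon=0.5).fit(X[:split], y[:split])
raw_pred = svr.predict(X[split:])

def kalman_smooth(z, q=0.5, r=4.0):
    """1-D Kalman filter with a random-walk state model (q, r are guesses)."""
    x, p, out = z[0], 1.0, []
    for obs in z:
        p += q                      # predict
        k = p / (p + r)             # gain
        x += k * (obs - x)          # update
        p *= (1 - k)
        out.append(x)
    return np.array(out)

smoothed = kalman_smooth(raw_pred)
print("raw MAE:", np.mean(np.abs(raw_pred - y[split:])).round(2),
      "smoothed MAE:", np.mean(np.abs(smoothed - y[split:])).round(2))
```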

  3. Cooperative Optimization QoS Cloud Routing Protocol Based on Bacterial Opportunistic Foraging and Chemotaxis Perception for Mobile Internet

    Directory of Open Access Journals (Sweden)

    Shujuan Wang

    2015-01-01

    Full Text Available To strengthen mobility management in the mobile Internet and improve cloud platform resource utilization, this paper optimizes cloud routing efficiency based on opportunistic bacterial foraging bionics and puts forward a chemotaxis-perception, cooperatively optimized QoS (Quality of Service) cloud routing mechanism. The mechanism models data transmission and forwarding on the behavioral characteristics of bacterial populations, namely opportunistic feeding and bacterial motility. Drawing on bacterial drug resistance and the structure of the surrounding field, many iterations of individual and population behavior allow the bacteria to spread towards the food-gathering area with a certain probability. Finally, the QoS cloud routing path is selected and optimized based on bacterial bionic optimization and the mapping relationship between mobile Internet nodes and the iterations of bacterial population evolution. Experimental results show that, compared with standard dynamic routing schemes, the proposed scheme achieves shorter transmission delay, lower packet error ratio, lower QoS cloud routing load, and lower QoS cloud route request overhead.

  4. Statistical retrieval of thin liquid cloud microphysical properties using ground-based infrared and microwave observations

    Science.gov (United States)

    Marke, Tobias; Ebell, Kerstin; Löhnert, Ulrich; Turner, David D.

    2016-12-01

    In this article, liquid water cloud microphysical properties are retrieved by a combination of microwave and infrared ground-based observations. Clouds containing liquid water occur frequently in most climate regimes and play a significant role in their interaction with radiation: small perturbations in the amount of liquid water contained in the cloud can cause large variations in the radiative fluxes. This effect is enhanced for thin clouds (low liquid water path, LWP), making accurate retrievals of their properties crucial. Because of large relative errors when retrieving low LWP values from observations in the microwave domain, and the high sensitivity of infrared methods when the LWP is low, a synergistic retrieval based on a neural network approach is built to estimate both LWP and cloud effective radius (reff). These statistical retrievals can be applied without high computational demand but imply constraints such as prior information on cloud phase and cloud layering. The neural network retrievals are able to retrieve LWP and reff for thin clouds with mean relative errors of 9% and 17%, respectively. This is demonstrated using synthetic observations of a microwave radiometer (MWR) and a spectrally highly resolved infrared interferometer. The accuracy and robustness of the synergistic retrievals are confirmed by a low bias in a radiative closure study for the downwelling shortwave flux, even for marginally invalid scenes. Broadband infrared radiance observations, in combination with the MWR, also have the potential to retrieve LWP with higher accuracy than an MWR-only retrieval.
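
    A toy version of the synergy idea can be built with a small neural network that maps simulated microwave brightness temperatures and an infrared proxy to LWP and effective radius. The "forward model" below is a crude invention with no physical fidelity; it only demonstrates the regression setup, assuming scikit-learn is available.

```python
# Toy statistical retrieval: MLP regression from fake MWR + IR observations
# to LWP and effective radius. The synthetic forward model is not physical.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)
n = 5000
lwp = rng.uniform(1, 60, n)            # g m^-2, thin liquid clouds
reff = rng.uniform(4, 12, n)           # micrometres

# fake "observations": two microwave channels and one infrared channel, with noise
tb23 = 20 + 0.15 * lwp + rng.normal(0, 0.3, n)
tb31 = 25 + 0.25 * lwp + rng.normal(0, 0.3, n)
ir = 100 * (1 - np.exp(-lwp / (6 * reff))) + rng.normal(0, 1.0, n)

X = np.column_stack([tb23, tb31, ir])
Y = np.column_stack([lwp, reff])
Xs = StandardScaler().fit_transform(X)

model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
model.fit(Xs[:4000], Y[:4000])
pred = model.predict(Xs[4000:])
rel_err = np.abs(pred - Y[4000:]) / Y[4000:]
print("mean relative error (LWP, reff):", rel_err.mean(axis=0).round(2))
```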

  5. Quantitative Measures of Immersion in Cloud and the Biogeography of Cloud Forests

    Science.gov (United States)

    Lawton, R. O.; Nair, U. S.; Ray, D.; Regmi, A.; Pounds, J. A.; Welch, R. M.

    2010-01-01

    Sites described as tropical montane cloud forests differ greatly, in part because observers tend to differ in their opinion as to what constitutes frequent and prolonged immersion in cloud. This definitional difficulty interferes with hydrologic analyses, assessments of environmental impacts on ecosystems, and biogeographical analyses of cloud forest communities and species. Quantitative measurements of cloud immersion can be obtained on site, but the observations are necessarily spatially limited, although well-placed observers can examine 10–50 km of a mountain range under rainless conditions. Regional analyses, however, require observations at a broader scale. This chapter discusses remote sensing and modeling approaches that can provide quantitative measures of the spatiotemporal patterns of cloud cover and cloud immersion in tropical mountain ranges. These approaches integrate remote sensing tools of various spatial resolutions and frequencies of observation, digital elevation models, regional atmospheric models, and ground-based observations to provide measures of cloud cover, cloud base height, and the intersection of cloud and terrain. This combined approach was applied to the Monteverde region of northern Costa Rica to illustrate how the proportion of time the forest is immersed in cloud may vary spatially and temporally. The observed spatial variation was largely due to patterns of airflow over the mountains. The temporal variation reflected the diurnal rise and fall of the orographic cloud base, which was influenced in turn by synoptic weather conditions, the seasonal movement of the Intertropical Convergence Zone and the north-easterly trade winds. Knowledge of the proportion of the time that sites are immersed in clouds should facilitate ecological comparisons and biogeographical analyses, as well as land use planning and hydrologic assessments in areas where intensive on-site work is not feasible.

  6. On Designing a Generic Framework for Cloud-based Big Data Analytics

    OpenAIRE

    Khan, Samiya; Alam, Mansaf

    2017-01-01

    Big data analytics has gathered immense research attention lately because of its ability to harness useful information from heaps of data. Cloud computing has been adjudged as one of the best infrastructural solutions for implementation of big data analytics. This research paper proposes a five-layer model for cloud-based big data analytics that uses dew computing and edge computing concepts. Besides this, the paper also presents an approach for creation of custom big data stack by selecting ...

  7. PC-Cluster based Storage System Architecture for Cloud Storage

    OpenAIRE

    Yee, Tin Tin; Naing, Thinn Thu

    2011-01-01

    The design and architecture of a cloud storage system play a vital role in cloud computing infrastructure, improving storage capacity as well as cost effectiveness. Usually a cloud storage system provides users with efficient, elastic storage space. One of the challenges of a cloud storage system is to balance the provision of huge elastic storage capacity against the expensive investment it requires. In order to solve this issue in the cloud storage infrastructure, low ...

  8. Mobile Agent based Market Basket Analysis on Cloud

    OpenAIRE

    Waghmare, Vijayata; Mukhopadhyay, Debajyoti

    2014-01-01

    This paper describes the design and development of a location-based mobile shopping application for bakery product shops. The whole application is deployed on the cloud. The three-tier architecture consists of a front-end, middleware and a back-end. The front-end level is a location-based mobile shopping application for Android mobile devices for purchasing bakery products from nearby shops; it also displays associations among the purchased products. The middle-ware level provides a web ser...

  9. Architecture Design of Healthcare Software-as-a-Service Platform for Cloud-Based Clinical Decision Support Service.

    Science.gov (United States)

    Oh, Sungyoung; Cha, Jieun; Ji, Myungkyu; Kang, Hyekyung; Kim, Seok; Heo, Eunyoung; Han, Jong Soo; Kang, Hyunggoo; Chae, Hoseok; Hwang, Hee; Yoo, Sooyoung

    2015-04-01

    To design a cloud computing-based Healthcare Software-as-a-Service (SaaS) Platform (HSP) for delivering healthcare information services with low cost, high clinical value, and high usability. We analyzed the architecture requirements of an HSP, including the interface, business services, cloud SaaS, quality attributes, privacy and security, and multi-lingual capacity. For cloud-based SaaS services, we focused on Clinical Decision Service (CDS) content services, basic functional services, and mobile services. Microsoft's Azure cloud computing for Infrastructure-as-a-Service (IaaS) and Platform-as-a-Service (PaaS) was used. The functional and software views of an HSP were designed in a layered architecture. External systems can be interfaced with the HSP using SOAP and REST/JSON. The multi-tenancy model of the HSP was designed as a shared database, with a separate schema for each tenant through a single application, although healthcare data can be physically located on a cloud or in a hospital, depending on regulations. The CDS services were categorized into rule-based services for medications, alert registration services, and knowledge services. We expect that cloud-based HSPs will allow small and mid-sized hospitals, in addition to large-sized hospitals, to adopt information infrastructures and health information technology with low system operation and maintenance costs.
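
    A toy illustration of the "single application, shared database, separate schema per tenant" idea is sketched below, using SQLite attached databases as stand-ins for per-tenant schemas. The tenants, table and rows are invented; a production HSP would use a server-side RDBMS with real schemas and access control, as the abstract implies.

```python
# Multi-tenancy sketch: one application connection, data separated per tenant.
import sqlite3

conn = sqlite3.connect(":memory:")          # the single shared application connection
for tenant in ("hospital_a", "hospital_b"):
    conn.execute(f"ATTACH DATABASE ':memory:' AS {tenant}")
    conn.execute(f"CREATE TABLE {tenant}.alerts (patient_id TEXT, drug TEXT, message TEXT)")

def register_alert(tenant, patient_id, drug, message):
    # every query is qualified with the tenant's schema, so data stay separated
    conn.execute(f"INSERT INTO {tenant}.alerts VALUES (?, ?, ?)",
                 (patient_id, drug, message))

register_alert("hospital_a", "p001", "warfarin", "interaction with aspirin")
register_alert("hospital_b", "p042", "metformin", "renal function check due")

print(conn.execute("SELECT * FROM hospital_a.alerts").fetchall())
print(conn.execute("SELECT * FROM hospital_b.alerts").fetchall())
```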

  10. Cloud residues and interstitial aerosols from non-precipitating clouds over an industrial and urban area in northern China

    Science.gov (United States)

    Li, Weijun; Li, Peiren; Sun, Guode; Zhou, Shengzhen; Yuan, Qi; Wang, Wenxing

    2011-05-01

    Most studies of aerosol-cloud interactions have been conducted in remote locations; few have investigated the characterization of cloud condensation nuclei (CCN) over highly polluted urban and industrial areas. The present work, based on samples collected at Mt. Tai, a site in northern China affected by nearby urban and industrial air pollutant emissions, illuminates CCN properties in a polluted atmosphere. High-resolution transmission electron microscopy (TEM) was used to obtain the size, composition, and mixing state of individual cloud residues and interstitial aerosols. Most of the cloud residues displayed distinct rims which were found to consist of soluble organic matter (OM). Nearly all (91.7%) cloud residues were attributed to sulfate-related salts (the remainder was mostly coarse crustal dust particles with nitrate coatings). Half the salt particles were internally mixed with two or more refractory particles (e.g., soot, fly ash, crustal dust, CaSO4, and OM). A comparison between cloud residues and interstitial particles shows that the former contained more salts and were of larger particle size than the latter. In addition, a somewhat high number scavenging ratio of 0.54 was observed during cloud formation. Therefore, the mixtures of salts with OMs account for most of the cloud-nucleating ability of the entire aerosol population in the polluted air of northern China. We advocate that both size and composition - the two influential, controlling factors for aerosol activation - should be built into all regional climate models of China.

  11. Continued rise of the cloud advances and trends in cloud computing

    CERN Document Server

    Mahmood, Zaigham

    2014-01-01

    Cloud computing is no longer a novel paradigm, but instead an increasingly robust and established technology, yet new developments continue to emerge in this area. Continued Rise of the Cloud: Advances and Trends in Cloud Computing captures the state of the art in cloud technologies, infrastructures, and service delivery and deployment models. The book provides guidance and case studies on the development of cloud-based services and infrastructures from an international selection of expert researchers and practitioners. A careful analysis is provided of relevant theoretical frameworks, prac

  12. The Pose Estimation of Mobile Robot Based on Improved Point Cloud Registration

    Directory of Open Access Journals (Sweden)

    Yanzi Miao

    2016-03-01

    Full Text Available Because GPS is unavailable indoors, an inertial sensor is usually used to estimate the location of indoor mobile robots; however, it is difficult to achieve high-accuracy localization and control with inertial sensors alone. In this paper, a new method is proposed to estimate an indoor mobile robot pose with six degrees of freedom based on an improved 3D Normal Distributions Transform algorithm (3D-NDT). First, point cloud data are captured by a Kinect sensor and segmented according to the distance to the robot. After segmentation, the input point cloud data are processed by the Approximate Voxel Grid Filter algorithm with different voxel grid sizes. Second, initial registration and precise registration are performed according to the distance to the sensor: the most distant point cloud data use the 3D-NDT algorithm with large voxel grids for initial registration, starting from the transformation matrix given by odometry, while the closest point cloud data use the 3D-NDT algorithm with small voxel grids for precise registration. After these registrations, a final transformation matrix is obtained, from which the pose estimation problem of the indoor mobile robot is solved. Test results show that this method obtains accurate robot pose estimates and is robust.
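
    The coarse-to-fine registration idea (large voxels for distant points, small voxels for nearby points, seeded by an odometry guess) can be sketched with Open3D. Open3D has no built-in 3D-NDT, so point-to-point ICP stands in for the registration step here, and the point clouds and motion are synthetic; this is an illustration of the staging, not the paper's algorithm.

```python
# Coarse-to-fine registration sketch (ICP in place of 3D-NDT), synthetic data.
import numpy as np
import open3d as o3d

def make_cloud(points):
    pcd = o3d.geometry.PointCloud()
    pcd.points = o3d.utility.Vector3dVector(points)
    return pcd

rng = np.random.default_rng(0)
target_pts = rng.uniform(-2, 2, (2000, 3))
# source = target rotated ~5 deg about z and translated, as an unknown motion
theta = np.deg2rad(5.0)
R = np.array([[np.cos(theta), -np.sin(theta), 0],
              [np.sin(theta),  np.cos(theta), 0],
              [0, 0, 1]])
source_pts = target_pts @ R.T + np.array([0.1, 0.05, 0.0])
source, target = make_cloud(source_pts), make_cloud(target_pts)

init = np.eye(4)                                   # an odometry guess would go here
for voxel, max_dist in [(0.4, 0.8), (0.1, 0.2)]:   # coarse pass, then fine pass
    src_d = source.voxel_down_sample(voxel)
    tgt_d = target.voxel_down_sample(voxel)
    result = o3d.pipelines.registration.registration_icp(
        src_d, tgt_d, max_dist, init,
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    init = result.transformation                   # feed the coarse result into the fine pass

print("estimated pose:\n", np.round(init, 3))
```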

  13. A simple dynamic rising nuclear cloud based model of ground radioactive fallout for atmospheric nuclear explosion

    International Nuclear Information System (INIS)

    Zheng Yi

    2008-01-01

    A simple model of ground radioactive fallout from an atmospheric nuclear explosion, based on a dynamically rising nuclear cloud, is presented. The deposition of particles and the change of the initial cloud radius with time before cloud stabilization are considered, and large-scale relative diffusion theory is used after cloud stabilization. Comparison with four U.S. nuclear test cases and with DELFIC model results indicates that the model is reasonable and dependable. (authors)

  14. Implementation of Cloud based next generation sequencing data analysis in a clinical laboratory.

    Science.gov (United States)

    Onsongo, Getiria; Erdmann, Jesse; Spears, Michael D; Chilton, John; Beckman, Kenneth B; Hauge, Adam; Yohe, Sophia; Schomaker, Matthew; Bower, Matthew; Silverstein, Kevin A T; Thyagarajan, Bharat

    2014-05-23

    The introduction of next generation sequencing (NGS) has revolutionized molecular diagnostics, though several challenges remain that limit the widespread adoption of NGS testing in clinical practice. One such difficulty is the development of a robust bioinformatics pipeline that can handle the volume of data generated by high-throughput sequencing in a cost-effective manner. Analysis of sequencing data typically requires a level of computing power that is often cost-prohibitive for most clinical diagnostics laboratories. To address this challenge, our institution has developed a Galaxy-based data analysis pipeline that relies on a web-based, cloud-computing infrastructure to process NGS data and identify genetic variants. It provides the additional flexibility needed to control storage costs, resulting in a pipeline that is cost-effective on a per-sample basis, and it does not require the use of an EBS disk to run a sample. We demonstrate the validation and feasibility of implementing this bioinformatics pipeline in a molecular diagnostics laboratory. Four samples were analyzed in duplicate pairs and showed 100% concordance in the mutations identified. The pipeline is currently used in the clinic, and all identified pathogenic variants have been confirmed by Sanger sequencing, further validating the software.

  15. ASTER cloud coverage reassessment using MODIS cloud mask products

    Science.gov (United States)

    Tonooka, Hideyuki; Omagari, Kunjuro; Yamamoto, Hirokazu; Tachikawa, Tetsushi; Fujita, Masaru; Paitaer, Zaoreguli

    2010-10-01

    In the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) Project, two kinds of algorithms are used for cloud assessment in Level-1 processing. The first, based on the LANDSAT-5 TM Automatic Cloud Cover Assessment (ACCA) algorithm, is used for the subset of daytime scenes observed with only the VNIR bands and for all nighttime scenes; the second, based on the LANDSAT-7 ETM+ ACCA algorithm, is used for most daytime scenes observed with all spectral bands. However, the first algorithm does not work well because it lacks some spectral bands sensitive to cloud detection, and both algorithms have been less accurate over snow- and ice-covered areas since April 2008, when the SWIR subsystem developed problems. In addition, they perform less well for some combinations of surface type and sun elevation angle. We have therefore developed an ASTER cloud coverage reassessment system using MODIS cloud mask (MOD35) products and have reassessed cloud coverage for all archived ASTER scenes (>1.7 million scenes). All of the new cloud coverage data are included in the Image Management System (IMS) databases of the ASTER Ground Data System (GDS) and NASA's Land Processes Distributed Active Archive Center (LP DAAC), where they are used for ASTER product searches by users, and the cloud mask images are distributed to users via the Internet. Newly acquired scenes (about 400 per day) are reassessed and inserted into the IMS databases within 5 to 7 days of each scene's observation date. Validation studies for the new cloud coverage data and some mission-related analyses using those data are also presented in this paper.

  16. Dynamic virtual machine allocation policy in cloud computing complying with service level agreement using CloudSim

    Science.gov (United States)

    Aneri, Parikh; Sumathy, S.

    2017-11-01

    Cloud computing provides services over the Internet, supplying application resources and data to users on demand. Cloud computing rests on a consumer-provider model: the cloud provider supplies resources that consumers can access to build their applications according to their demand, and a cloud data center is a bulk of resources in a shared-pool architecture for cloud users to access. Virtualization is the heart of the cloud computing model; it provides virtual machines with application-specific configurations, and applications are free to choose their own configuration. On one hand there is a huge number of resources, and on the other hand a huge number of requests must be served effectively; therefore, the resource allocation and scheduling policies play a very important role in allocating and managing resources in this cloud computing model. This paper proposes a load balancing policy using the Hungarian algorithm. The Hungarian algorithm provides a dynamic load balancing policy with a monitor component, which helps to increase cloud resource utilization by monitoring the algorithm's state and altering it using artificial intelligence. CloudSim, used in this proposal, is an extensible toolkit that simulates the cloud computing environment.
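
    A minimal version of the Hungarian-algorithm assignment at the heart of the proposed policy can be written with SciPy's linear_sum_assignment. The cost model (queue wait plus execution time) and all numbers are invented, and CloudSim itself (a Java toolkit) is not involved; this only illustrates the optimal one-to-one task-to-VM assignment step.

```python
# Hungarian-algorithm load balancing sketch: minimize total completion time.
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(2)
task_length = rng.uniform(1e3, 5e3, 6)        # million instructions per task
vm_mips = rng.uniform(500, 2000, 6)           # processing speed per VM
vm_queue = rng.uniform(0.0, 3.0, 6)           # seconds of work already queued

# cost[i, j] = time for VM j to finish task i (queue wait + execution)
cost = vm_queue[None, :] + task_length[:, None] / vm_mips[None, :]

rows, cols = linear_sum_assignment(cost)      # optimal one-to-one assignment
for t, v in zip(rows, cols):
    print(f"task {t} -> VM {v}  ({cost[t, v]:.2f} s)")
print("total cost:", cost[rows, cols].sum().round(2))
```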

  17. Seasonal Bias of Retrieved Ice Cloud Optical Properties Based on MISR and MODIS Measurements

    Science.gov (United States)

    Wang, Y.; Hioki, S.; Yang, P.; Di Girolamo, L.; Fu, D.

    2017-12-01

    The precise estimation of two important cloud optical and microphysical properties, cloud particle optical thickness and cloud particle effective radius, is fundamental in the study of radiative energy budget and hydrological cycle. In retrieving these two properties, an appropriate selection of ice particle surface roughness is important because it substantially affects the single-scattering properties. At present, using a predetermined ice particle shape without spatial and temporal variations is a common practice in satellite-based retrieval. This approach leads to substantial uncertainties in retrievals. The cloud radiances measured by each of the cameras of the Multi-angle Imaging SpectroRadiometer (MISR) instrument are used to estimate spherical albedo values at different scattering angles. By analyzing the directional distribution of estimated spherical albedo values, the degree of ice particle surface roughness is estimated. With an optimal degree of ice particle roughness, cloud optical thickness and effective radius are retrieved based on a bi-spectral shortwave technique in conjunction with two Moderate Resolution Imaging Spectroradiometer (MODIS) bands centered at 0.86 and 2.13 μm. The seasonal biases of retrieved cloud optical and microphysical properties, caused by the uncertainties in ice particle roughness, are investigated by using one year of MISR-MODIS fused data.

  18. Architecture Design of Healthcare Software-as-a-Service Platform for Cloud-Based Clinical Decision Support Service

    OpenAIRE

    Oh, Sungyoung; Cha, Jieun; Ji, Myungkyu; Kang, Hyekyung; Kim, Seok; Heo, Eunyoung; Han, Jong Soo; Kang, Hyunggoo; Chae, Hoseok; Hwang, Hee; Yoo, Sooyoung

    2015-01-01

    Objectives To design a cloud computing-based Healthcare Software-as-a-Service (SaaS) Platform (HSP) for delivering healthcare information services with low cost, high clinical value, and high usability. Methods We analyzed the architecture requirements of an HSP, including the interface, business services, cloud SaaS, quality attributes, privacy and security, and multi-lingual capacity. For cloud-based SaaS services, we focused on Clinical Decision Service (CDS) content services, basic functi...

  19. Development of cloud point extraction - UV-visible spectrophotometric method for vanadium (V) determination in hydrogeochemical samples

    International Nuclear Information System (INIS)

    Durani, Smeer; Mathur, Neerja; Chowdary, G.S.

    2007-01-01

    The cloud point extraction (CPE) behavior of vanadium (V) using 5,7-dibromo-8-hydroxyquinoline (DBHQ) and Triton X-100 was investigated. Vanadium (V) was extracted with 4 ml of 0.5 mg/ml DBHQ and 6 ml of 8% (v/v) Triton X-100 at pH 3.7. A few hydrogeochemical samples were analysed for vanadium using this method. (author)

  20. Replicas Strategy and Cache Optimization of Video Surveillance Systems Based on Cloud Storage

    Directory of Open Access Journals (Sweden)

    Rongheng Li

    2018-04-01

    Full Text Available With the rapid development of video surveillance technology, and especially the popularity of cloud-based video surveillance applications, video data have begun to grow explosively. In cloud-based video surveillance systems, however, replicas occupy a large amount of storage space, and slow responses to video playback constrain system performance. In this paper, considering the characteristics of video data comprehensively, we propose a dynamic redundant replicas mechanism based on security levels that can dynamically adjust the number of replicas. Based on the location correlation between cameras, we also propose a data cache strategy to improve the response speed of data reading. Experiments illustrate that: (1) the dynamic redundant replicas mechanism saves storage space while ensuring data security; (2) the cache mechanism predicts users' playback behavior in advance and improves the response speed of data reading according to the location and time correlation of the front-end cameras; and (3) for cloud-based video surveillance, the proposed approaches significantly outperform existing methods.

  1. Development of a cloud-based Bioinformatics Training Platform.

    Science.gov (United States)

    Revote, Jerico; Watson-Haigh, Nathan S; Quenette, Steve; Bethwaite, Blair; McGrath, Annette; Shang, Catherine A

    2017-05-01

    The Bioinformatics Training Platform (BTP) has been developed to provide access to the computational infrastructure required to deliver sophisticated hands-on bioinformatics training courses. The BTP is a cloud-based solution that is in active use for delivering next-generation sequencing training to Australian researchers at geographically dispersed locations. The BTP was built to provide an easy, accessible, consistent and cost-effective approach to delivering workshops at host universities and organizations with a high demand for bioinformatics training but lacking the dedicated bioinformatics training suites required. To support broad uptake of the BTP, the platform has been made compatible with multiple cloud infrastructures. The BTP is an open-source and open-access resource. To date, 20 training workshops have been delivered to over 700 trainees at over 10 venues across Australia using the BTP. © The Author 2016. Published by Oxford University Press.

  2. Development of Web-Based Remote Desktop to Provide Adaptive User Interfaces in Cloud Platform

    OpenAIRE

    Shuen-Tai Wang; Hsi-Ya Chang

    2014-01-01

    As cloud virtualization technologies become more and more prevalent, cloud users usually face the problem of how to access virtualized remote desktops easily over the web without installing special clients. To resolve this issue, we took advantage of HTML5 technology and developed a web-based remote desktop. It permits users to access, from anywhere, the terminal running on our cloud platform. We implemented a sketch of web interfac...

  3. Task Classification Based Energy-Aware Consolidation in Clouds

    Directory of Open Access Journals (Sweden)

    HeeSeok Choi

    2016-01-01

    Full Text Available We consider a cloud data center in which the service provider supplies virtual machines (VMs) on hosts or physical machines (PMs) to its subscribers for on-demand computation. For this data center, we propose a task consolidation algorithm based on task classification (computation-intensive versus data-intensive) and resource utilization (e.g., CPU and RAM). Furthermore, we design a VM consolidation algorithm to balance task execution time and energy consumption without violating a predefined service level agreement (SLA). Unlike existing research on VM consolidation or scheduling that applies no threshold or a single-threshold scheme, we focus on a double-threshold (upper and lower) scheme for VM consolidation. More specifically, when a host operates with resource utilization below the lower threshold, all VMs on the host are scheduled for migration to other hosts and the host is then powered down; when a host operates with resource utilization above the upper threshold, one VM is migrated away to avoid reaching 100% resource utilization. Based on experimental performance evaluations with real-world traces, we show that our task classification based energy-aware consolidation algorithm (TCEA) achieves a significant energy reduction without incurring the predefined SLA violations.
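
    The double-threshold rule reads almost directly as code. The sketch below applies invented thresholds and utilization figures and only reports the intended actions; actual VM migration, task classification and SLA accounting are out of scope.

```python
# Sketch of the double-threshold consolidation decision described above.
LOWER, UPPER = 0.2, 0.8          # illustrative thresholds, not from the paper

def consolidation_actions(hosts):
    """hosts: dict host -> list of per-VM CPU utilization fractions."""
    actions = []
    for host, vms in hosts.items():
        util = sum(vms)
        if util < LOWER:
            # under-loaded: migrate everything away, then power the host down
            actions.append((host, "migrate_all_and_power_down", list(vms)))
        elif util > UPPER:
            # over-loaded: migrate one VM (here, the largest) to relieve pressure
            actions.append((host, "migrate_one", max(vms)))
        else:
            actions.append((host, "keep", None))
    return actions

hosts = {"host1": [0.05, 0.08],           # under-loaded -> drain and power down
         "host2": [0.35, 0.30, 0.25],     # over-loaded  -> migrate one VM
         "host3": [0.40, 0.20]}           # within band  -> keep
for action in consolidation_actions(hosts):
    print(action)
```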

  4. A cloud shadow detection method combined with cloud height iteration and spectral analysis for Landsat 8 OLI data

    Science.gov (United States)

    Sun, Lin; Liu, Xinyan; Yang, Yikun; Chen, TingTing; Wang, Quan; Zhou, Xueying

    2018-04-01

    Landsat 8 OLI is enhanced over prior Landsat instruments and can achieve very high cloud detection precision, but the detection of cloud shadows still faces great challenges. Geometry-based cloud shadow detection methods are considered the most effective and are being improved constantly. The Function of Mask (Fmask) cloud shadow detection method is one of the most representative geometry-based methods and has been used for cloud shadow detection with Landsat 8 OLI. However, the Fmask method estimates cloud height using fixed temperature rates, which are highly uncertain, and errors in the estimated cloud height can cause errors in cloud shadow detection over large areas. This article improves the geometry-based cloud shadow detection method for Landsat OLI in the following two respects. (1) Cloud height no longer depends on the brightness temperature of the thermal infrared band but is allowed to span a possible dynamic range from 200 m to 12,000 m. The cloud shadow is then not a specific location but a possible range, and further analysis based on the spectrum is carried out within that range to determine the cloud shadow location. This effectively avoids the cloud shadow omissions caused by errors in determining the height of a cloud. (2) Object-based and pixel-level spectral analyses are combined to detect cloud shadows, realizing cloud shadow detection at both the target scale and the pixel scale. Based on an analysis of the spectral differences between cloud shadows and typical ground objects, the best cloud shadow detection bands of Landsat 8 OLI were determined. The combined use of spectrum and shape can effectively improve the detection precision of cloud shadows produced by thin clouds. Several cloud shadow detection experiments were carried out, and the results were verified against artificial recognition. The results of these experiments indicated that this method can identify cloud shadows in different regions with correct
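
    The height-range idea in aspect (1) can be illustrated with simple solar geometry: for each candidate cloud height, the shadow of a cloud pixel is displaced away from the sun by roughly h·tan(solar zenith angle). The sketch below computes that band of candidate pixel offsets; the sun angles, pixel size and sign conventions are assumptions for illustration, not values from the paper, and the spectral tests that pick the true shadow are omitted.

```python
# Candidate shadow offsets for a range of cloud heights (geometry only).
import numpy as np

def shadow_offsets(heights_m, sun_zenith_deg, sun_azimuth_deg, pixel_size_m=30.0):
    """Return (row_offset, col_offset) in pixels for each candidate height.
    Azimuth is measured clockwise from north; the shadow falls away from the sun."""
    zen = np.deg2rad(sun_zenith_deg)
    az = np.deg2rad(sun_azimuth_deg)
    dist = heights_m * np.tan(zen)            # horizontal shadow distance (m)
    dx = -dist * np.sin(az)                   # east-west displacement (m)
    dy = -dist * np.cos(az)                   # north-south displacement (m)
    # assume a north-up image: rows increase southwards, columns eastwards
    return (np.round(-dy / pixel_size_m).astype(int),
            np.round(dx / pixel_size_m).astype(int))

heights = np.arange(200.0, 12000.0 + 1, 400.0)          # 200 m to 12,000 m
rows, cols = shadow_offsets(heights, sun_zenith_deg=35.0, sun_azimuth_deg=150.0)
# candidate shadow pixels for a cloud pixel at (r0, c0) are (r0+rows, c0+cols)
print(list(zip(rows[:5], cols[:5])))
```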

  5. A cloud computing based 12-lead ECG telemedicine service.

    Science.gov (United States)

    Hsieh, Jui-Chien; Hsu, Meng-Wei

    2012-07-28

    Due to the great variability of 12-lead ECG instruments and of medical specialists' interpretation skills, it remains a challenge to deliver rapid and accurate 12-lead ECG reports with senior cardiologists' decision-making support in emergency telecardiology. We created a new cloud and pervasive computing based 12-lead electrocardiography (ECG) service to realize ubiquitous 12-lead ECG tele-diagnosis. The service enables ECGs to be transmitted and interpreted via mobile phones, so that tele-consultation can take place while the patient is in the ambulance, between on-site clinicians and off-site senior cardiologists, or among hospitals. Most importantly, the service is convenient, efficient, and inexpensive. This cloud computing based ECG tele-consultation service extends traditional 12-lead ECG applications to the collaboration of clinicians at different locations or among hospitals. In short, it can greatly improve medical service quality and efficiency, especially for patients in rural areas, and it has been evaluated and judged useful by cardiologists in Taiwan.

  6. A Comparison of MODIS/VIIRS Cloud Masks over Ice-Bearing River: On Achieving Consistent Cloud Masking and Improved River Ice Mapping

    Directory of Open Access Journals (Sweden)

    Simon Kraatz

    2017-03-01

    Full Text Available The capability to monitor ice on rivers frequently and accurately is important, since it may make it possible to identify in time the ice accumulations that correspond to ice jams. Ice jams are dam-like structures formed from arrested ice floes and may cause rapid flooding. To inform on this potential hazard, the CREST River Ice Observing System (CRIOS) produces ice cover maps based on MODIS and VIIRS overpass data at several locations, including the Susquehanna River. CRIOS uses each platform's automatically produced cloud mask to discriminate ice- and snow-covered grid cells from clouds. However, since the cloud masks are produced from each instrument's own data, and owing to differences in detector performance, identical algorithms applied to even nearly identical instruments may produce substantially different cloud masks. Besides detector performance, cloud identification can be biased by local conditions (e.g., land cover), viewing geometry, and transient conditions (snow and ice). Snow/cloud confusion and large view angles can result in substantial overestimates of cloud and ice. This affects algorithms such as CRIOS, since false cloud cover precludes determining whether an otherwise reasonably cloud-free grid cell consists of water or ice. Especially for applications aiming to classify or monitor a location frequently, it is important to evaluate cloud masking, including false cloud detections. We present an assessment of three cloud masks via the parameter of effective revisit time. A 100 km stretch of river up to 1.6 km wide was examined with daily data sampled at 500 m resolution over 317 days during winter. Results show substantial differences between the cloud mask products, especially while the river bears ice. A contrast-based cloud screening approach was found to provide improved and consistent cloud and ice identification within the reach (95%–99% correlations, and 3%–7% mean

  7. Multi-wavelength study of two possible cloud-cloud collision regions: IRAS 02459+6029 and IRAS 22528+5936

    International Nuclear Information System (INIS)

    Li Nan; Wang Junjie

    2012-01-01

    Based on observations of 12 CO (J=2–1), we select targets from archived Infrared Astronomical Satellite (IRAS) data of IRAS 02459+6029 and IRAS 22528+5936 as samples of cloud-cloud collision, according to the criteria given by Vallee. Then we use the Midcourse Space Experiment (MSX) A band (8.28 μm) images and the NRAO VLA Sky Survey (NVSS) (1.4 GHz) continuum images to investigate the association between molecular clouds traced by the CO contour maps. The distribution of dust and ionized hydrogen shows an obvious association with the CO contour maps toward IRAS 02459+6029. However, in the possible collision region of IRAS 22528+5936, NVSS continuum radiation is not detected and the MSX sources are merely associated with the central star. The velocity fields of the two regions indicate the direction of the pressure and interaction. In addition, we have identified candidates of young stellar objects (YSOs) by using data from the Two Micron All Sky Survey (2MASS) in JHK bands expressed in a color-color diagram. The distribution of YSOs shows that the possible collision region is denser than other regions. All the evidence suggests that IRAS 02459+6029 could be an example of cloud-cloud collision, and that IRAS 22528+5936 could be two separate non-colliding clouds. (research papers)

  8. Cloud database development and management

    CERN Document Server

    Chao, Lee

    2013-01-01

    Nowadays, cloud computing is almost everywhere. However, one can hardly find a textbook that utilizes cloud computing for teaching database and application development. This cloud-based database development book teaches both the theory and practice with step-by-step instructions and examples. This book helps readers to set up a cloud computing environment for teaching and learning database systems. The book will cover adequate conceptual content for students and IT professionals to gain necessary knowledge and hands-on skills to set up cloud based database systems.

  9. Comparison of Monthly Mean Cloud Fraction and Cloud Optical depth Determined from Surface Cloud Radar, TOVS, AVHRR, and MODIS over Barrow, Alaska

    Science.gov (United States)

    Uttal, Taneil; Frisch, Shelby; Wang, Xuan-Ji; Key, Jeff; Schweiger, Axel; Sun-Mack, Sunny; Minnis, Patrick

    2005-01-01

    A one year comparison is made of mean monthly values of cloud fraction and cloud optical depth over Barrow, Alaska (71 deg 19.378 min North, 156 deg 36.934 min West) between 35 GHz radar-based retrievals, the TOVS Pathfinder Path-P product, the AVHRR APP-X product, and a MODIS based cloud retrieval product from the CERES-Team. The data sets represent largely disparate spatial and temporal scales; however, the focus of this paper is to provide a preliminary analysis of how the mean monthly values derived from these different data sets compare, and to determine how they can best be used separately and in combination to provide reliable estimates of long-term trends of changing cloud properties. The radar and satellite data sets described here incorporate Arctic-specific modifications that account for cloud detection challenges specific to the Arctic environment. The year 2000 was chosen for this initial comparison because the cloud radar data were particularly continuous and reliable that year, and all of the satellite retrievals of interest were also available for 2000. Cloud fraction was chosen as a comparison variable because accurate detection of cloud is the primary product necessary for any other cloud property retrievals. Cloud optical depth was additionally selected as it is likely the single cloud property most closely correlated with cloud influences on surface radiation budgets.

  10. An approach of point cloud denoising based on improved bilateral filtering

    Science.gov (United States)

    Zheng, Zeling; Jia, Songmin; Zhang, Guoliang; Li, Xiuzhi; Zhang, Xiangyin

    2018-04-01

    An omnidirectional mobile platform is designed for building point clouds based on an improved filtering algorithm that is employed to process the depth images. The mobile platform can move flexibly, and its control interface is convenient to use. Because the traditional bilateral filtering algorithm is time-consuming and inefficient, a novel method called local bilateral filtering (LBF) is proposed and applied to the depth images obtained by a Kinect sensor. The results show that noise removal is improved compared with standard bilateral filtering. Offline, the color images and the processed depth images are used to build point clouds. Finally, experimental results demonstrate that the method reduces the processing time of the depth images and improves the quality of the point clouds that are built.
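
    For reference, the baseline operation that LBF accelerates is shown below using OpenCV's standard bilateral filter on a synthetic noisy depth map. The paper's local variant is not reproduced, and the depth map and filter parameters are illustrative only.

```python
# Baseline bilateral filtering of a depth image (smooths noise, keeps edges).
import numpy as np
import cv2

# synthetic depth map: a near object (1.0 m) in front of a far wall (3.0 m),
# corrupted with noise such as a Kinect sensor might produce
depth = np.full((240, 320), 3.0, np.float32)
depth[80:160, 100:220] = 1.0
depth += np.random.default_rng(0).normal(0, 0.03, depth.shape).astype(np.float32)

# bilateral filter: sigmaColor is in depth units (metres), sigmaSpace in pixels
filtered = cv2.bilateralFilter(depth, d=9, sigmaColor=0.1, sigmaSpace=5.0)

print("noise std before:", round(float(depth[:60, :60].std()), 4),
      "after:", round(float(filtered[:60, :60].std()), 4))
```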

  11. Business Process as a Service Model Based Business and IT Cloud Alignment as a Cloud Offering

    OpenAIRE

    Robert Woitsch; Wilfrid Utz

    2015-01-01

    Cloud computing has proved to offer flexible IT solutions. Although large enterprises may benefit from this technology, SMEs are falling behind in cloud usage due to missing IT competence and hence lose the ability to adapt their IT efficiently to their business needs. This paper introduces the project idea of the H2020 project CloudSocket by elaborating the idea of Business Processes as a Service (BPaaS), where concept models and semantics are applied to align business processes with Cloud deplo...

  12. Comparisons of Satellite-Deduced Overlapping Cloud Properties and CALIPSO CloudSat Data

    Science.gov (United States)

    Chang, Fu-Lung; Minnis, Patrick; Lin, Bing; Sun-Mack, Sunny

    2010-01-01

    Introduction to the overlapped cloud properties derived from polar-orbiting (MODIS) and geostationary (GOES-12, -13, Meteosat-8, -9, etc.) meteorological satellites, which are produced at the NASA Langley Research Center (LaRC) cloud research & development team (NASA lead scientist: Dr. Patrick Minnis). Comparison of the LaRC CERES MODIS Edition-3 overlapped cloud properties to the CALIPSO and the CloudSat active sensing data. High clouds and overlapped clouds occur frequently as deduced by CALIPSO (44 & 25%), CloudSat (25 & 4%), and MODIS (37 & 6%). Large fractions of optically-thin cirrus and overlapped clouds are deduced from CALIPSO, but much smaller fractions are from CloudSat and MODIS. For overlapped clouds, the averaged upper-layer CTHs are about 12.8 (CALIPSO), 10.9 (CloudSat) and 10 km (MODIS), and the averaged lower-layer CTHs are about 3.6 (CALIPSO), 3.2 (CloudSat) and 3.9 km (MODIS). Based on comparisons of upper and lower-layer cloud properties as deduced from the MODIS, CALIPSO and CloudSat data, more enhanced passive satellite methods for retrieving thin cirrus and overlapped cloud properties are needed and are under development.

  13. Building a cloud based distributed active archive data center

    Science.gov (United States)

    Ramachandran, Rahul; Baynes, Katie; Murphy, Kevin

    2017-04-01

    NASA's Earth Science Data System (ESDS) Program serves as a central cog in facilitating the implementation of NASA's Earth Science strategic plan. Since 1994, the ESDS Program has committed to the full and open sharing of Earth science data obtained from NASA instruments to all users. One of the key responsibilities of the ESDS Program is to continuously evolve the entire data and information system to maximize returns on the collected NASA data. An independent review was conducted in 2015 to holistically review the EOSDIS in order to identify gaps. The review recommendations were to investigate two areas: one, whether commercial cloud providers offer potential for storage, processing, and operational efficiencies, and two, the potential development of new data access and analysis paradigms. In response, ESDS has initiated several prototypes investigating the advantages and risks of leveraging cloud computing. This poster will provide an overview of one such prototyping activity, "Cumulus". Cumulus is being designed and developed as a "native" cloud-based data ingest, archive and management system that can be used for all future NASA Earth science data streams. The long term vision for Cumulus, its requirements, overall architecture, and implementation details, as well as lessons learned from the completion of the first phase of this prototype will be covered. We envision Cumulus will foster design of new analysis/visualization tools to leverage collocated data from all of the distributed DAACs as well as elastic cloud computing resources to open new research opportunities.

  14. Wireless-Uplinks-Based Energy-Efficient Scheduling in Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Xing Liu

    2015-01-01

Full Text Available Mobile cloud computing (MCC) combines cloud computing and the mobile Internet to improve the computational capabilities of resource-constrained mobile devices (MDs). In MCC, mobile users can not only improve the computational capability of their MDs but also reduce energy consumption by offloading mobile applications to the cloud. However, MCC faces an energy-efficiency problem because of time-varying channels while offloading is being executed. In this paper, we address the issue of energy-efficient scheduling for the wireless uplink in MCC. By introducing Lyapunov optimization, we first propose a scheduling algorithm that dynamically chooses the channel on which to transmit data based on queue backlog and channel statistics. We then show that the proposed scheduling algorithm can trade off queue backlog against energy consumption in a channel-aware MCC system. Simulation results show that the proposed scheduling algorithm reduces the time-averaged energy consumption for offloading compared to an existing algorithm.
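The sketch below illustrates the generic Lyapunov drift-plus-penalty rule that underlies channel-aware scheduling of this kind: at each slot the channel maximising (queue backlog x rate - V x energy cost) is chosen. The channel model, arrival process, and the weight V are assumptions for illustration, not the paper's algorithm or parameters.

```python
import random

def choose_channel(queue_backlog, channels, V):
    """Drift-plus-penalty channel choice: maximise Q*rate - V*energy.

    `channels` is a list of (rate, energy_cost) pairs observed this slot.
    Larger V favours energy saving; larger backlog favours throughput.
    """
    return max(range(len(channels)),
               key=lambda c: queue_backlog * channels[c][0] - V * channels[c][1])

def simulate(slots=10_000, V=50.0, mean_arrival=120.0, seed=1):
    random.seed(seed)
    Q, energy = 0.0, 0.0
    for _ in range(slots):
        # Time-varying channels: (achievable rate, energy cost) per slot.
        channels = [(random.uniform(50, 300), random.uniform(0.5, 2.0))
                    for _ in range(3)]
        c = choose_channel(Q, channels, V)
        rate, cost = channels[c]
        served = min(Q, rate)
        if served > 0:                     # spend energy only when transmitting
            energy += cost
        Q = Q - served + random.expovariate(1.0 / mean_arrival)
    return Q, energy / slots

if __name__ == "__main__":
    for V in (1, 50, 500):
        backlog, avg_energy = simulate(V=V)
        print(f"V={V:4d}  final backlog={backlog:8.1f}  avg energy/slot={avg_energy:.3f}")
```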

  15. Architecture Design of Healthcare Software-as-a-Service Platform for Cloud-Based Clinical Decision Support Service

    Science.gov (United States)

    Oh, Sungyoung; Cha, Jieun; Ji, Myungkyu; Kang, Hyekyung; Kim, Seok; Heo, Eunyoung; Han, Jong Soo; Kang, Hyunggoo; Chae, Hoseok; Hwang, Hee

    2015-01-01

    Objectives To design a cloud computing-based Healthcare Software-as-a-Service (SaaS) Platform (HSP) for delivering healthcare information services with low cost, high clinical value, and high usability. Methods We analyzed the architecture requirements of an HSP, including the interface, business services, cloud SaaS, quality attributes, privacy and security, and multi-lingual capacity. For cloud-based SaaS services, we focused on Clinical Decision Service (CDS) content services, basic functional services, and mobile services. Microsoft's Azure cloud computing for Infrastructure-as-a-Service (IaaS) and Platform-as-a-Service (PaaS) was used. Results The functional and software views of an HSP were designed in a layered architecture. External systems can be interfaced with the HSP using SOAP and REST/JSON. The multi-tenancy model of the HSP was designed as a shared database, with a separate schema for each tenant through a single application, although healthcare data can be physically located on a cloud or in a hospital, depending on regulations. The CDS services were categorized into rule-based services for medications, alert registration services, and knowledge services. Conclusions We expect that cloud-based HSPs will allow small and mid-sized hospitals, in addition to large-sized hospitals, to adopt information infrastructures and health information technology with low system operation and maintenance costs. PMID:25995962

  16. Delivering Unidata Technology via the Cloud

    Science.gov (United States)

    Fisher, Ward; Oxelson Ganter, Jennifer

    2016-04-01

    Over the last two years, Docker has emerged as the clear leader in open-source containerization. Containerization technology provides a means by which software can be pre-configured and packaged into a single unit, i.e. a container. This container can then be easily deployed either on local or remote systems. Containerization is particularly advantageous when moving software into the cloud, as it simplifies the process. Unidata is adopting containerization as part of our commitment to migrate our technologies to the cloud. We are using a two-pronged approach in this endeavor. In addition to migrating our data-portal services to a cloud environment, we are also exploring new and novel ways to use cloud-specific technology to serve our community. This effort has resulted in several new cloud/Docker-specific projects at Unidata: "CloudStream," "CloudIDV," and "CloudControl." CloudStream is a docker-based technology stack for bringing legacy desktop software to new computing environments, without the need to invest significant engineering/development resources. CloudStream helps make it easier to run existing software in a cloud environment via a technology called "Application Streaming." CloudIDV is a CloudStream-based implementation of the Unidata Integrated Data Viewer (IDV). CloudIDV serves as a practical example of application streaming, and demonstrates how traditional software can be easily accessed and controlled via a web browser. Finally, CloudControl is a web-based dashboard which provides administrative controls for running docker-based technologies in the cloud, as well as providing user management. In this work we will give an overview of these three open-source technologies and the value they offer to our community.

17. Key Based Mutual Authentication (KBMA) Mechanism for Secured Access in MobiCloud Environment

    Directory of Open Access Journals (Sweden)

    Donald A. Cecil

    2016-01-01

Full Text Available Mobile Cloud Computing (MCC) fuels innovation in mobile computing and opens new pathways between mobile devices and infrastructures. There are several issues in the MCC environment, as it integrates various technologies. Among all issues, security lies at the top, and many users are unwilling to adopt cloud services because of it. This paper focuses on authentication. The objective is to provide a mechanism for authenticating all the entities involved in accessing cloud services. A mechanism called Key Based Mutual Authentication (KBMA) is proposed, which is divided into two processes, namely registration and authentication. Registration is a one-time process in which users are registered for accessing the cloud services by giving the desired unique information. The authentication process is carried out mutually to verify the identities of the device and the Cloud Service Provider (CSP). The Scyther tool is used for analysing vulnerability in terms of attacks. The results show that the proposed mechanism is resilient against various attacks.
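As a hedged illustration of the general idea of key-based mutual authentication (not the exact KBMA message flow), the sketch below has a device and a CSP each prove knowledge of a key shared at registration via an HMAC challenge-response in both directions.

```python
import hmac, hashlib, os

# Toy mutual authentication between a device and a cloud service provider (CSP)
# using a pre-shared key established at registration time.  This is a generic
# challenge-response illustration, not the paper's KBMA protocol.

def respond(shared_key: bytes, challenge: bytes) -> bytes:
    return hmac.new(shared_key, challenge, hashlib.sha256).digest()

def mutual_authenticate(device_key: bytes, csp_key: bytes) -> bool:
    # 1. CSP challenges the device.
    csp_nonce = os.urandom(16)
    device_proof = respond(device_key, csp_nonce)
    if not hmac.compare_digest(device_proof, respond(csp_key, csp_nonce)):
        return False                       # device failed to prove identity
    # 2. Device challenges the CSP.
    device_nonce = os.urandom(16)
    csp_proof = respond(csp_key, device_nonce)
    return hmac.compare_digest(csp_proof, respond(device_key, device_nonce))

if __name__ == "__main__":
    k = os.urandom(32)                     # key shared during registration
    print("legitimate pair:", mutual_authenticate(k, k))
    print("impostor device:", mutual_authenticate(os.urandom(32), k))
```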

  18. A cloud-based production system for information and service integration: an internet of things case study on waste electronics

    Science.gov (United States)

    Wang, Xi Vincent; Wang, Lihui

    2017-08-01

    Cloud computing is the new enabling technology that offers centralised computing, flexible data storage and scalable services. In the manufacturing context, it is possible to utilise the Cloud technology to integrate and provide industrial resources and capabilities in terms of Cloud services. In this paper, a function block-based integration mechanism is developed to connect various types of production resources. A Cloud-based architecture is also deployed to offer a service pool which maintains these resources as production services. The proposed system provides a flexible and integrated information environment for the Cloud-based production system. As a specific type of manufacturing, Waste Electrical and Electronic Equipment (WEEE) remanufacturing experiences difficulties in system integration, information exchange and resource management. In this research, WEEE is selected as the example of Internet of Things to demonstrate how the obstacles and bottlenecks are overcome with the help of Cloud-based informatics approach. In the case studies, the WEEE recycle/recovery capabilities are also integrated and deployed as flexible Cloud services. Supporting mechanisms and technologies are presented and evaluated towards the end of the paper.

  19. Chemical composition and mixing-state of ice residuals sampled within mixed phase clouds

    Science.gov (United States)

    Ebert, M.; Worringen, A.; Benker, N.; Mertes, S.; Weingartner, E.; Weinbruch, S.

    2010-10-01

    During an intensive campaign at the high alpine research station Jungfraujoch, Switzerland, in February/March 2006 ice particle residuals within mixed-phase clouds were sampled using the Ice-counterflow virtual impactor (Ice-CVI). Size, morphology, chemical composition, mineralogy and mixing state of the ice residual and the interstitial (i.e., non-activated) aerosol particles were analyzed by scanning and transmission electron microscopy. Ice nuclei (IN) were identified from the significant enrichment of particle groups in the ice residual (IR) samples relative to the interstitial aerosol. In terms of number lead-bearing particles are enriched by a factor of approximately 25, complex internal mixtures with silicates or metal oxides as major components by a factor of 11, and mixtures of secondary aerosol and soot (C-O-S particles) by a factor of 2. Other particle groups (sulfates, sea salt, Ca-rich particles, external silicates) observed in the ice-residual samples cannot be assigned unambiguously as IN. Between 9 and 24% of all IR are Pb-bearing particles. Pb was found as major component in around 10% of these particles (PbO, PbCl2). In the other particles, Pb was found as some 100 nm sized agglomerates consisting of 3-8 nm sized primary particles (PbS, elemental Pb). C-O-S particles are present in the IR at an abundance of 17-27%. The soot component within these particles is strongly aged. Complex internal mixtures occur in the IR at an abundance of 9-15%. Most IN identified at the Jungfraujoch station are internal mixtures containing anthropogenic components (either as main or minor constituent), and it is concluded that admixture of the anthropogenic component is responsible for the increased IN efficiency within mixed phase clouds. The mixing state appears to be a key parameter for the ice nucleation behaviour that cannot be predicted from the separate components contained within the individual particles.

  20. Chemical composition and mixing-state of ice residuals sampled within mixed phase clouds

    Directory of Open Access Journals (Sweden)

    M. Ebert

    2011-03-01

Full Text Available During an intensive campaign at the high alpine research station Jungfraujoch, Switzerland, in February/March 2006 ice particle residuals within mixed-phase clouds were sampled using the Ice-counterflow virtual impactor (Ice-CVI). Size, morphology, chemical composition, mineralogy and mixing state of the ice residual and the interstitial (i.e., non-activated) aerosol particles were analyzed by scanning and transmission electron microscopy. Ice nuclei (IN) were identified from the significant enrichment of particle groups in the ice residual (IR) samples relative to the interstitial aerosol. In terms of number lead-bearing particles are enriched by a factor of approximately 25, complex internal mixtures with silicates or metal oxides as major components by a factor of 11, and mixtures of secondary aerosol and carbonaceous material (C-O-S particles) by a factor of 2. Other particle groups (sulfates, sea salt, Ca-rich particles, external silicates) observed in the ice-residual samples cannot be assigned unambiguously as IN. Between 9 and 24% of all IR are Pb-bearing particles. Pb was found as major component in around 10% of these particles (PbO, PbCl2). In the other particles, Pb was found as some 100 nm sized agglomerates consisting of 3–8 nm sized primary particles (PbS, elemental Pb). C-O-S particles are present in the IR at an abundance of 17–27%. The soot component within these particles is strongly aged. Complex internal mixtures occur in the IR at an abundance of 9–15%. Most IN identified at the Jungfraujoch station are internal mixtures containing anthropogenic components (either as main or minor constituent), and it is concluded that admixture of the anthropogenic component is responsible for the increased IN efficiency within mixed phase clouds. The mixing state appears to be a key parameter for the ice nucleation behaviour that cannot be predicted from the sole knowledge of the main component of an individual particle.

  1. Telemedicine Based on Mobile Devices and Mobile Cloud Computing

    OpenAIRE

    Lidong Wang; Cheryl Ann Alexander

    2014-01-01

Mobile devices such as smartphones and tablets support various kinds of mobile computing and services. They can access the cloud or offload their computation-intensive parts to cloud computing resources. Mobile cloud computing (MCC) integrates cloud computing into the mobile environment, which extends mobile devices' battery lifetime, improves their data storage capacity and processing power, and improves their reliability and information security. In this paper, the applications of smartphon...

  2. Cloud computing strategies

    CERN Document Server

    Chorafas, Dimitris N

    2011-01-01

    A guide to managing cloud projects, Cloud Computing Strategies provides the understanding required to evaluate the technology and determine how it can be best applied to improve business and enhance your overall corporate strategy. Based on extensive research, it examines the opportunities and challenges that loom in the cloud. It explains exactly what cloud computing is, what it has to offer, and calls attention to the important issues management needs to consider before passing the point of no return regarding financial commitments.

  3. CIMIDx: Prototype for a Cloud-Based System to Support Intelligent Medical Image Diagnosis With Efficiency.

    Science.gov (United States)

    Bhavani, Selvaraj Rani; Senthilkumar, Jagatheesan; Chilambuchelvan, Arul Gnanaprakasam; Manjula, Dhanabalachandran; Krishnamoorthy, Ramasamy; Kannan, Arputharaj

    2015-03-27

The Internet has greatly enhanced health care, helping patients stay up-to-date on medical issues and general knowledge. Many cancer patients use the Internet for cancer diagnosis and related information. Recently, cloud computing has emerged as a new way of delivering health services, but currently there is no generic and fully automated cloud-based self-management intervention for breast cancer patients, as practical guidelines are lacking. We investigated the prevalence and predictors of cloud use for medical diagnosis among women with breast cancer, to gain insight into meaningful usage parameters for evaluating a generic, fully automated cloud-based self-intervention, by assessing how breast cancer survivors use a generic self-management model. This goal was implemented and evaluated with a new prototype called "CIMIDx", based on representative association rules that support the diagnosis of medical images (mammograms). The proposed Cloud-Based System to Support Intelligent Medical Image Diagnosis (CIMIDx) prototype includes two modules. The first is the design and development of the CIMIDx training and test cloud services. Deployed in the cloud, the prototype can be used for diagnosis and screening mammography by assessing the cancers detected, tumor sizes, histology, and stage classification accuracy. To analyze the prototype's classification accuracy, we conducted an experiment with data provided by clients. Second, by monitoring cloud server requests, CIMIDx usage statistics were recorded for the cloud-based self-intervention groups. We conducted an evaluation of CIMIDx cloud service usage, in which browsing functionalities were evaluated from the end-user's perspective. We performed several experiments to validate the CIMIDx prototype for breast health issues. The first set of experiments evaluated the diagnostic performance of the CIMIDx framework. We collected medical information from 150 breast cancer survivors from hospitals

  4. Location-Based Services and Privacy Protection Under Mobile Cloud Computing

    OpenAIRE

    Yan, Yan; Xiaohong, Hao; Wanjun, Wang

    2015-01-01

    Location-based services can provide personalized services based on location information of moving objects and have already been widely used in public safety services, transportation, entertainment and many other areas. With the rapid development of mobile communication technology and popularization of intelligent terminals, there will be great commercial prospects to provide location-based services under mobile cloud computing environment. However, the high adhesion degree of mobile terminals...

  5. Comparisons of cloud ice mass content retrieved from the radar-infrared radiometer method with aircraft data during the second international satellite cloud climatology project regional experiment (FIRE-II)

    Energy Technology Data Exchange (ETDEWEB)

    Matrosov, S.Y. [Univ. of Colorado, Boulder, CO (United States)]|[National Oceanic and Atmospheric Administration Environmental Technology Lab., Boulder, CO (United States); Heymsfield, A.J. [National Center for Atmospheric Research, Boulder, CO (United States); Kropfli, R.A.; Snider, J.B. [National Oceanic and Atmospheric Administration Environmental Technology Lab., Boulder, CO (United States)

    1996-04-01

    Comparisons of remotely sensed meteorological parameters with in situ direct measurements always present a challenge. Matching sampling volumes is one of the main problems for such comparisons. Aircraft usually collect data when flying along a horizontal leg at a speed of about 100 m/sec (or even greater). The usual sampling time of 5 seconds provides an average horizontal resolution of the order of 500 m. Estimations of vertical profiles of cloud microphysical parameters from aircraft measurements are hampered by sampling a cloud at various altitudes at different times. This paper describes the accuracy of aircraft horizontal and vertical coordinates relative to the location of the ground-based instruments.

  6. ORGANIZATION OF CLOUD COMPUTING INFRASTRUCTURE BASED ON SDN NETWORK

    Directory of Open Access Journals (Sweden)

    Alexey A. Efimenko

    2013-01-01

Full Text Available The article presents the main approaches to building cloud computing infrastructure based on SDN networks in present-day data processing centers (DPCs). The main indicators of management effectiveness for DPC network infrastructure are determined. Examples of solutions for the creation of virtual network devices are provided.

  7. Evaluation of the MiKlip decadal prediction system using satellite based cloud products

    Directory of Open Access Journals (Sweden)

    Thomas Spangehl

    2016-12-01

Full Text Available The decadal hindcast simulations performed for the Mittelfristige Klimaprognosen (MiKlip) project are evaluated using satellite-retrieved cloud parameters from the CM SAF cLoud, Albedo and RAdiation dataset from AVHRR data (CLARA-A1) provided by the EUMETSAT Satellite Application Facility on Climate Monitoring (CM SAF) and from the International Satellite Cloud Climatology Project (ISCCP). The forecast quality of two sets of hindcasts, Baseline-1-LR and Baseline-0, which use differing initialisations, is assessed. Basic evaluation focuses on multi-year ensemble mean fields and cloud-type histograms utilizing satellite simulator output. Additionally, ensemble evaluation employing analysis of variance (ANOVA), analysis rank histograms (ARH) and a deterministic correlation score is performed. Satellite simulator output is available for a subset of the full hindcast ensembles only; therefore, the raw model cloud cover is used as a complement. The new Baseline-1-LR hindcasts are closer to satellite data with respect to the simulated tropical/subtropical mean cloud cover pattern than the reference hindcasts (Baseline-0), emphasizing improvements of the new MiKlip initialisation procedure. A slightly overestimated occurrence rate of optically thick cloud types is analysed for different experiments, including hindcasts and simulations using realistic sea surface boundaries according to the Atmospheric Model Intercomparison Project (AMIP). By contrast, the evaluation of cirrus and cirrostratus clouds is complicated by observation-based uncertainties. Time series of the 3-year mean total cloud cover averaged over the tropical warm pool (TWP) region show some correlation with the CLARA-A1 cloud fractional cover. Moreover, ensemble evaluation of the Baseline-1-LR hindcasts reveals potential predictability of the 2–5 lead year averaged total cloud cover for a large part of this region when regarding the full observational period. However, the hindcasts show only

  8. A Knowledge Base for Automatic Feature Recognition from Point Clouds in an Urban Scene

    Directory of Open Access Journals (Sweden)

    Xu-Feng Xing

    2018-01-01

Full Text Available LiDAR technology can provide very detailed and highly accurate geospatial information on an urban scene for the creation of Virtual Geographic Environments (VGEs) for different applications. However, automatic 3D modeling and feature recognition from LiDAR point clouds are very complex tasks. This becomes even more complex when the data are incomplete (occlusion problem) or uncertain. In this paper, we propose to build a knowledge base comprising an ontology and semantic rules aimed at automatic feature recognition from point clouds in support of 3D modeling. First, several ontology modules are defined from different perspectives to describe an urban scene. For instance, the spatial relations module allows the formalized representation of possible topological relations extracted from point clouds. A knowledge base is then proposed that contains the different concepts, their properties and their relations, together with constraints and semantic rules. Instances and their specific relations form an urban scene and are added to the knowledge base as facts. Based on the knowledge and semantic rules, a reasoning process is carried out to extract semantic features of the objects and their components in the urban scene. Finally, several experiments are presented to show the validity of our approach in recognizing different semantic features of buildings from LiDAR point clouds.
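As a toy illustration of rule-based semantic labelling of the kind this record describes, the sketch below applies hand-written rules to attributes of planar patches extracted from a point cloud; the attributes, thresholds, and labels are invented and do not reflect the paper's ontology or rule language.

```python
import numpy as np

# Toy rule base for labelling planar patches.  Each patch is described by a
# unit normal, a mean height (m), and whether it touches the ground patch.
# The paper encodes comparable knowledge as an ontology plus semantic rules.

def label_patch(patch):
    nz = abs(patch["normal"][2])
    if nz > 0.9 and patch["mean_height"] < 0.5:
        return "ground"                    # near-horizontal plane at low height
    if nz < 0.2 and patch["touches_ground"]:
        return "wall"                      # near-vertical plane rising from ground
    if nz > 0.9 and patch["mean_height"] > 2.5:
        return "roof"                      # near-horizontal plane well above ground
    return "unknown"

patches = [
    {"normal": np.array([0.0, 0.0, 1.0]), "mean_height": 0.1, "touches_ground": True},
    {"normal": np.array([1.0, 0.0, 0.05]), "mean_height": 1.6, "touches_ground": True},
    {"normal": np.array([0.02, 0.0, 1.0]), "mean_height": 3.1, "touches_ground": False},
]
print([label_patch(p) for p in patches])   # ['ground', 'wall', 'roof']
```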

  9. Cloud-Based Applications for Organizing and Reviewing Plastic Surgery Content.

    Science.gov (United States)

    Luan, Anna; Momeni, Arash; Lee, Gordon K; Galvez, Michael G

    2015-01-01

    Cloud-based applications including Box, Dropbox, Google Drive, Evernote, Notability, and Zotero are available for smartphones, tablets, and laptops and have revolutionized the manner in which medical students and surgeons read and utilize plastic surgery literature. Here we provide an overview of the use of Cloud computing in practice and propose an algorithm for organizing the vast amount of plastic surgery literature. Given the incredible amount of data being produced in plastic surgery and other surgical subspecialties, it is prudent for plastic surgeons to lead the process of providing solutions for the efficient organization and effective integration of the ever-increasing data into clinical practice.

  10. Visual Analysis of Cloud Computing Performance Using Behavioral Lines.

    Science.gov (United States)

    Muelder, Chris; Zhu, Biao; Chen, Wei; Zhang, Hongxin; Ma, Kwan-Liu

    2016-02-29

    Cloud computing is an essential technology to Big Data analytics and services. A cloud computing system is often comprised of a large number of parallel computing and storage devices. Monitoring the usage and performance of such a system is important for efficient operations, maintenance, and security. Tracing every application on a large cloud system is untenable due to scale and privacy issues. But profile data can be collected relatively efficiently by regularly sampling the state of the system, including properties such as CPU load, memory usage, network usage, and others, creating a set of multivariate time series for each system. Adequate tools for studying such large-scale, multidimensional data are lacking. In this paper, we present a visual based analysis approach to understanding and analyzing the performance and behavior of cloud computing systems. Our design is based on similarity measures and a layout method to portray the behavior of each compute node over time. When visualizing a large number of behavioral lines together, distinct patterns often appear suggesting particular types of performance bottleneck. The resulting system provides multiple linked views, which allow the user to interactively explore the data by examining the data or a selected subset at different levels of detail. Our case studies, which use datasets collected from two different cloud systems, show that this visual based approach is effective in identifying trends and anomalies of the systems.
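A minimal sketch of one way such a per-node comparison could be computed, assuming each compute node is summarized as a multivariate time series of sampled metrics; the similarity measure used here (mean Euclidean distance after z-normalisation) is an illustrative stand-in, not the paper's measure or layout method.

```python
import numpy as np

def znorm(ts):
    """Z-normalise each metric (column) of a node's time series."""
    std = ts.std(axis=0)
    std[std == 0] = 1.0
    return (ts - ts.mean(axis=0)) / std

def node_distance(a, b):
    """Distance between two nodes' multivariate profiles (time x metrics arrays)."""
    return float(np.mean(np.linalg.norm(znorm(a) - znorm(b), axis=1)))

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    t = np.linspace(0, 4 * np.pi, 200)
    normal = np.column_stack([np.sin(t), np.cos(t)])      # e.g. CPU and memory pattern
    node_a = normal + rng.normal(0, 0.05, normal.shape)
    node_b = normal + rng.normal(0, 0.05, normal.shape)
    node_c = normal.copy()
    node_c[120:] += 1.5                                   # sustained anomaly on one node
    print("a-b (similar):  ", round(node_distance(node_a, node_b), 3))
    print("a-c (anomalous):", round(node_distance(node_a, node_c), 3))
```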

  11. Software Product Lines for Multi-Cloud Microservices-Based Applications

    OpenAIRE

    Sousa , Gustavo; Rudametkin , Walter; Duchien , Laurence

    2016-01-01

Multi-cloud computing is the use of resources and services from multiple independent cloud providers. It is used to avoid vendor lock-in, comply with location regulations, and optimize reliability, performance and costs. Microservices is an architectural style becoming increasingly used in cloud computing as it allows for better resources usage. However, building multi-cloud systems is a very complex and time consuming task, which calls for automation and supporting to...

  12. A new data collaboration service based on cloud computing security

    Science.gov (United States)

    Ying, Ren; Li, Hua-Wei; Wang, Li na

    2017-09-01

With the rapid development of cloud computing, the storage and usage of data have undergone revolutionary changes. Data owners can store data in the cloud. While bringing convenience, this also brings many new challenges to cloud data security. A key issue is how to support a secure data collaboration service that allows access to and updates of cloud data. This paper proposes a secure, efficient and extensible data collaboration service, which prevents data leaks in cloud storage, supports one-to-many encryption mechanisms, and also enables cloud data writing and fine-grained access control.

  13. Thin Cloud Detection Method by Linear Combination Model of Cloud Image

    Science.gov (United States)

    Liu, L.; Li, J.; Wang, Y.; Xiao, Y.; Zhang, W.; Zhang, S.

    2018-04-01

The existing cloud detection methods in photogrammetry often extract image features directly from remote sensing images and then use them to classify pixels as cloud or non-cloud. When the cloud is thin and small, however, these methods become inaccurate. In this paper, a linear combination model of cloud images is proposed; by using this model, the underlying surface information of remote sensing images can be removed, so the cloud detection result becomes more accurate. First, the automatic cloud detection program uses the linear combination model to separate the cloud information from the surface information in transparent cloud images, and then uses different image features to recognize the cloud parts. In consideration of computational efficiency, an AdaBoost classifier is introduced to combine the different features into a cloud classifier. AdaBoost can select the most effective features from many candidate features, so the calculation time is largely reduced. Finally, we selected a cloud detection method based on a tree structure and a multiple-feature detection method using an SVM classifier for comparison with the proposed method; the experimental data show that the proposed cloud detection program has high accuracy and fast calculation speed.
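The sketch below shows the general pattern of combining per-pixel features with an AdaBoost classifier, as this record describes; the features and training data are synthetic placeholders rather than the authors' feature set or imagery.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

# Toy stand-in for the paper's pipeline: per-pixel features (e.g. brightness,
# local variance, a band ratio) are fed to AdaBoost, whose boosted stumps
# effectively pick out the most discriminative features.
rng = np.random.default_rng(0)
n = 4000
cloud = rng.normal([0.8, 0.05, 1.2], [0.1, 0.02, 0.15], size=(n // 2, 3))
clear = rng.normal([0.4, 0.10, 0.9], [0.1, 0.05, 0.15], size=(n // 2, 3))
X = np.vstack([cloud, clear])
y = np.hstack([np.ones(n // 2), np.zeros(n // 2)])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = AdaBoostClassifier(n_estimators=100, random_state=0)  # default base: decision stumps
clf.fit(X_tr, y_tr)
print("held-out accuracy:", round(clf.score(X_te, y_te), 3))
print("feature importances:", clf.feature_importances_.round(3))
```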

  14. Cloud Computing Quality

    Directory of Open Access Journals (Sweden)

    Anamaria Şiclovan

    2013-02-01

Full Text Available Cloud computing is, and will continue to be, a new way of providing Internet services and computing. This approach builds on many existing services and technologies, such as the Internet, grid computing, and Web services. As a system, cloud computing aims to provide on-demand services at a more acceptable price and infrastructure cost. It is precisely the transition from the computer as a product to a service offered to consumers and delivered online. This paper describes the quality of cloud computing services, analyzing the advantages and characteristics it offers. It is a theoretical paper. Keywords: Cloud computing, QoS, quality of cloud computing

  15. Cloud-Based Social Media Visual Analytics Disaster Response System, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose a next-generation cloud-based social media visual analytics disaster response system that will enable decision-makers and first-responders to obtain...

  16. Study on Huizhou architecture of point cloud registration based on optimized ICP algorithm

    Science.gov (United States)

    Zhang, Runmei; Wu, Yulu; Zhang, Guangbin; Zhou, Wei; Tao, Yuqian

    2018-03-01

In view of the fact that current point cloud registration software has high hardware requirements, a heavy workload and much interactive definition, and that the source code of software with better processing results is not openly available, a two-step registration method based on normal-vector distribution features and a coarse-feature-based iterative closest point (ICP) algorithm is proposed in this paper. The method combines the fast point feature histogram (FPFH) algorithm with a model of the point cloud adjacency region and the distribution of normal vectors, sets up a local coordinate system for each key point, and obtains the transformation matrix to complete coarse registration; the coarse registration results of the two stations are then accurately registered using the ICP algorithm. Experimental results show that, compared with the traditional ICP algorithm, the method used in this paper has clear time and precision advantages for large point clouds.
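A minimal sketch of the refinement half of such a pipeline: point-to-point ICP with SVD-based rigid alignment, assuming a coarse alignment (e.g. from FPFH correspondences) has already been applied; the data and iteration settings are illustrative only.

```python
import numpy as np
from scipy.spatial import cKDTree

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src onto dst (Kabsch/SVD)."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:            # avoid reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cd - R @ cs

def icp(src, dst, iters=30):
    """Point-to-point ICP refinement of an already coarsely aligned cloud."""
    tree = cKDTree(dst)
    cur = src.copy()
    for _ in range(iters):
        _, idx = tree.query(cur)        # closest-point correspondences
        R, t = best_rigid_transform(cur, dst[idx])
        cur = cur @ R.T + t
    return cur

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    dst = rng.uniform(-1, 1, (500, 3))
    theta = np.deg2rad(10)              # small residual misalignment after coarse step
    Rz = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
    src = dst @ Rz.T + np.array([0.05, -0.03, 0.02])
    aligned = icp(src, dst)
    print("RMS error after ICP:", np.sqrt(np.mean((aligned - dst) ** 2)))
```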

  17. Risk in the Clouds?: Security Issues Facing Government Use of Cloud Computing

    Science.gov (United States)

    Wyld, David C.

    Cloud computing is poised to become one of the most important and fundamental shifts in how computing is consumed and used. Forecasts show that government will play a lead role in adopting cloud computing - for data storage, applications, and processing power, as IT executives seek to maximize their returns on limited procurement budgets in these challenging economic times. After an overview of the cloud computing concept, this article explores the security issues facing public sector use of cloud computing and looks to the risk and benefits of shifting to cloud-based models. It concludes with an analysis of the challenges that lie ahead for government use of cloud resources.

  18. Cloud System Evolution in the Trades—CSET

    Science.gov (United States)

    Albrecht, B. A.; Zuidema, P.; Bretherton, C. S.; Wood, R.; Ghate, V. P.

    2015-12-01

    The Cloud System Evolution in the Trades (CSET) study was designed to describe and explain the evolution of the boundary layer aerosol, cloud, and thermodynamic structures along trajectories within the north-Pacific trade-winds. The observational component of this study centered on 7 round-trips made by the NSF NCAR Gulfstream V (GV) between Sacramento, CA and Kona, Hawaii between 1 July and 15 August 2015. The CSET observing strategy used a Lagrangian approach to sample aerosol, cloud, and boundary layer properties upwind from the transition zone over the North Pacific and to resample these areas two days later. GFS forecast trajectories were used to plan the outbound flight to Hawaii and then updated forecast trajectories helped set the return flight plan two days later. Two key elements of the CSET observing system were the newly developed HIAPER Cloud Radar (HCR) and the HIAPER Spectral Resolution Lidar (HSRL). Together they provided unprecedented characterizations of aerosol, cloud and precipitation structures. A full suite of probes on the aircraft were used for in situ measurements of aerosol, cloud, precipitation, and turbulence properties during the low-level aircraft profiling portions of the flights. A wide range of boundary layer structures and aerosol, cloud, and precipitation conditions were observed during CSET. The cloud systems sampled included solid stratocumulus infused with smoke from Canadian wildfires, mesoscale (100-200 km) cloud-precipitation complexes, and patches of shallow cumuli in environments with accumulation mode aerosol concentrations of less than 50 cm-3. Ultra clean layers (UCLs with accumulation mode concentrations of less than 10 cm-3) were observed frequently near the top of the boundary layer and were often associated with shallow, gray (optically thin) layered clouds—features that are the subject of focused investigations by the CSET science team. The extent of aerosol, cloud, drizzle and boundary layer sampling that was

  19. Information content of OCO-2 oxygen A-band channels for retrieving marine liquid cloud properties

    Science.gov (United States)

    Richardson, Mark; Stephens, Graeme L.

    2018-03-01

    Information content analysis is used to select channels for a marine liquid cloud retrieval using the high-spectral-resolution oxygen A-band instrument on NASA's Orbiting Carbon Observatory-2 (OCO-2). Desired retrieval properties are cloud optical depth, cloud-top pressure and cloud pressure thickness, which is the geometric thickness expressed in hectopascals. Based on information content criteria we select a micro-window of 75 of the 853 functioning OCO-2 channels spanning 763.5-764.6 nm and perform a series of synthetic retrievals with perturbed initial conditions. We estimate posterior errors from the sample standard deviations and obtain ±0.75 in optical depth and ±12.9 hPa in both cloud-top pressure and cloud pressure thickness, although removing the 10 % of samples with the highest χ2 reduces posterior error in cloud-top pressure to ±2.9 hPa and cloud pressure thickness to ±2.5 hPa. The application of this retrieval to real OCO-2 measurements is briefly discussed, along with limitations and the greatest caution is urged regarding the assumption of a single homogeneous cloud layer, which is often, but not always, a reasonable approximation for marine boundary layer clouds.
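The channel selection described here rests on standard optimal-estimation information content (in the Rodgers sense). The sketch below shows the bookkeeping: Shannon information for a set of channels, and a greedy pick of the channels that add the most information. The Jacobian, prior covariance, and noise values are random placeholders, not OCO-2 quantities.

```python
import numpy as np

def shannon_information(K, Sa, Se):
    """H = 0.5 * ln det(I + Sa^{1/2} K^T Se^-1 K Sa^{1/2}), in nats."""
    Sa_sqrt = np.linalg.cholesky(Sa)
    M = Sa_sqrt.T @ K.T @ np.linalg.inv(Se) @ K @ Sa_sqrt
    _, logdet = np.linalg.slogdet(np.eye(M.shape[0]) + M)
    return 0.5 * logdet

def greedy_channel_selection(K, Sa, noise_var, n_pick):
    """Greedily add the channel that most increases H (a common micro-window strategy)."""
    picked, best_H = [], 0.0
    for _ in range(n_pick):
        best, best_H = None, -np.inf
        for c in range(K.shape[0]):
            if c in picked:
                continue
            rows = picked + [c]
            Se = np.diag(noise_var[rows])
            H = shannon_information(K[rows], Sa, Se)
            if H > best_H:
                best, best_H = c, H
        picked.append(best)
    return picked, best_H

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    K = rng.normal(size=(40, 3))           # 40 candidate channels, 3 state elements
    Sa = np.diag([1.0, 0.5, 0.2])          # prior variances of the state vector
    noise_var = rng.uniform(0.5, 2.0, 40)  # per-channel measurement noise variance
    channels, H = greedy_channel_selection(K, Sa, noise_var, n_pick=5)
    print("selected channels:", channels, " information (nats):", round(H, 2))
```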

  20. Lagrangian evolution of the marine boundary layer from the Cloud System Evolution in the Trades (CSET) campaign

    Science.gov (United States)

    Mohrmann, J.; Ghate, V. P.; McCoy, I. L.; Bretherton, C. S.; Wood, R.; Minnis, P.; Palikonda, R.

    2017-12-01

The Cloud System Evolution in the Trades (CSET) field campaign took place in July/August 2015 to study the evolution of clouds, precipitation, and aerosols in the stratocumulus-to-cumulus (Sc-Cu) transition region of the northeast Pacific marine boundary layer (MBL). Aircraft observations sampled across a wide range of cloud and aerosol conditions. The sampling strategy, in which MBL airmasses were sampled with the NSF/NCAR Gulfstream-V (HIAPER) and then resampled at their advected location two days later, resulted in a dataset of 14 paired flights suitable for Lagrangian analysis. This analysis shows that the Lagrangian coherence of long-lived species (namely CO and O3) across 48 hours is high, but that of subcloud aerosol, MBL depth, and cloud properties is limited. Geostationary satellite retrievals are compared against aircraft observations; these are combined with reanalysis data and HYSPLIT trajectories to document the Lagrangian evolution of cloud fraction, cloud droplet number concentration, liquid water path, estimated inversion strength (EIS), and MBL depth, which are used to expand upon and validate the aircraft-based analysis. Many of the trajectories sampled by the aircraft show a clear Sc-Cu transition. Although satellite cloud fraction and EIS were found to be strongly spatiotemporally correlated, changes in MBL cloud fraction along trajectories did not correlate with any measure of EIS forcing.

  1. BUSINESS INTELLIGENCE IN CLOUD

    OpenAIRE

    Celina M. Olszak

    2014-01-01

The paper reviews and critiques current research on Business Intelligence (BI) in the cloud. This review highlights that organizations face various challenges using BI in the cloud. The research objectives for this study are a conceptualization of the BI cloud issue, as well as an investigation of some benefits and risks of BI in the cloud. The study was based mainly on a critical analysis of the literature and some reports on BI cloud usage. The results of this research can be used by IT and business leaders ...

  2. Towards Constraint-based High Performance Cloud System in the Process of Cloud Computing Adoption in an Organization

    OpenAIRE

    Simalango, Mikael Fernandus; Kang, Mun-Young; Oh, Sangyoon

    2010-01-01

    Cloud computing is penetrating into various domains and environments, from theoretical computer science to economy, from marketing hype to educational curriculum and from R&D lab to enterprise IT infrastructure. Yet, the currently developing state of cloud computing leaves several issues to address and also affects cloud computing adoption by organizations. In this paper, we explain how the transition into the cloud can occur in an organization and describe the mechanism for transforming lega...

  3. Statistical Comparison of Cloud and Aerosol Vertical Properties between Two Eastern China Regions Based on CloudSat/CALIPSO Data

    Directory of Open Access Journals (Sweden)

    Yujun Qiu

    2017-01-01

Full Text Available The relationship between cloud and aerosol properties was investigated over two 4° × 4° adjacent regions in the south (R1) and in the north (R2) of eastern China. The CloudSat/CALIPSO data were used to extract the cloud and aerosol profile properties. The mean value of cloud occurrence probability (COP) was the highest in the mixed cloud layer (−40°C to 0°C) and the lowest in the warm cloud layer (>0°C). The atmospheric humidity was more statistically relevant to COP in the warm cloud layer than the aerosol condition. The differences in COP between the two regions in the mixed cloud layer and ice cloud layer (<−40°C) had good correlations with those in the aerosol extinction coefficient. A radar reflectivity factor greater than −10 dBZ occurred mainly in warm cloud layers and mixed cloud layers. A high-COP zone appeared in the above-0°C layer with cloud thicknesses of 2-3 km in both regions and in all four seasons, but the distribution of the zonal layer in R2 was more continuous than that in R1, which was consistent with the higher aerosol optical thickness in R2 than in R1 in the above-0°C layer, indicating a positive correlation between aerosol and cloud probability.

  4. Impact of different cloud deployments on real-time video applications for mobile video cloud users

    Science.gov (United States)

    Khan, Kashif A.; Wang, Qi; Luo, Chunbo; Wang, Xinheng; Grecos, Christos

    2015-02-01

The latest trend of accessing mobile cloud services through wireless network connectivity has grown globally among both entrepreneurs and home end users. Although existing public cloud service vendors such as Google, Microsoft Azure, etc. provide on-demand cloud services at affordable cost for mobile users, there are still a number of challenges in achieving high-quality mobile cloud based video applications, especially due to the bandwidth-constrained and error-prone mobile network connectivity, which is the communication bottleneck for end-to-end video delivery. In addition, existing accessible cloud networking architectures differ in terms of their implementation, services, resources, storage, pricing, support and so on, and these differences have a varied impact on the performance of cloud-based real-time video applications. Nevertheless, these challenges and impacts have not been thoroughly investigated in the literature. In our previous work, we implemented a mobile cloud network model that integrates localized and decentralized cloudlets (mini-clouds) and wireless mesh networks. In this paper, we deploy a real-time framework consisting of various existing Internet cloud networking architectures (Google Cloud, Microsoft Azure and Eucalyptus Cloud) and a cloudlet based on Ubuntu Enterprise Cloud over wireless mesh networking technology for mobile cloud end users. It is noted that accessing real-time video streaming over HTTP/HTTPS is gaining popularity among both research and industrial communities as a way to leverage the existing web services and HTTP infrastructure in the Internet. To study the performance under different deployments using different public and private cloud service providers, we employ real-time video streaming over the HTTP/HTTPS standard, and conduct experimental evaluation and in-depth comparative analysis of the impact of different deployments on the quality of service for mobile video cloud users. Empirical

  5. A cloud-based X73 ubiquitous mobile healthcare system: design and implementation.

    Science.gov (United States)

    Ji, Zhanlin; Ganchev, Ivan; O'Droma, Máirtín; Zhang, Xin; Zhang, Xueji

    2014-01-01

    Based on the user-centric paradigm for next generation networks, this paper describes a ubiquitous mobile healthcare (uHealth) system based on the ISO/IEEE 11073 personal health data (PHD) standards (X73) and cloud computing techniques. A number of design issues associated with the system implementation are outlined. The system includes a middleware on the user side, providing a plug-and-play environment for heterogeneous wireless sensors and mobile terminals utilizing different communication protocols and a distributed "big data" processing subsystem in the cloud. The design and implementation of this system are envisaged as an efficient solution for the next generation of uHealth systems.

  6. Island based radar and microwave radiometer measurements of stratus cloud parameters during the Atlantic Stratocumulus Transition Experiment (ASTEX)

    Energy Technology Data Exchange (ETDEWEB)

    Frisch, A.S. [Colorado State Univ., Fort Collins, CO (United States); Fairall, C.W.; Snider, J.B. [NOAA Environmental Technology Lab., Boulder, CO (United States); Lenshow, D.H.; Mayer, S.D. [National Center for Atmospheric Research, Boulder, CO (United States)

    1996-04-01

During the Atlantic Stratocumulus Transition Experiment (ASTEX) in June 1992, simultaneous measurements were made with a vertically pointing cloud-sensing radar and a microwave radiometer. The radar measurements are used to estimate stratus cloud drizzle and turbulence parameters. In addition, with the microwave radiometer measurements of reflectivity, we estimated the profiles of cloud liquid water and effective radius. We used radar data for computation of vertical profiles of various drizzle parameters such as droplet concentration, modal radius, and spread. A sample of these results is shown in Figure 1. In addition, in non-drizzle clouds, with the radar and radiometer we can estimate the vertical profiles of stratus cloud parameters such as liquid water concentration and effective radius. This is accomplished by assuming a droplet distribution with droplet number concentration and width constant with height.

  7. Cooperative Cloud Service Aware Mobile Internet Coverage Connectivity Guarantee Protocol Based on Sensor Opportunistic Coverage Mechanism

    Directory of Open Access Journals (Sweden)

    Qin Qin

    2015-01-01

Full Text Available In order to improve the Internet coverage ratio and provide a connectivity guarantee, we propose a coverage connectivity guarantee protocol for the mobile Internet based on a sensor opportunistic coverage mechanism and cooperative cloud services. In this scheme, based on the opportunistic covering rules, a network coverage algorithm with high reliability and real-time security is achieved by exploiting the opportunistic contacts of sensor nodes and mobile Internet nodes. Then, a cloud service business support platform is created based on Internet application service management capabilities and wireless sensor network communication service capabilities, which forms the architecture of the cloud support layer. A cooperative cloud service awareness model is proposed. Finally, we present the mobile Internet coverage connectivity guarantee protocol. Experimental results demonstrate that the proposed algorithm has excellent performance in terms of Internet security and stability, as well as coverage connectivity ability.

  8. A Dynamic Resource Scheduling Method Based on Fuzzy Control Theory in Cloud Environment

    Directory of Open Access Journals (Sweden)

    Zhijia Chen

    2015-01-01

Full Text Available Resources in a cloud environment are characterized by large scale, diversity, and heterogeneity. Moreover, user requirements for cloud computing resources are commonly characterized by uncertainty and imprecision. Hence, to improve the quality of cloud computing services, not only should traditional criteria such as cost and bandwidth be satisfied, but particular emphasis should also be laid on extended criteria such as system friendliness. This paper proposes a dynamic resource scheduling method based on fuzzy control theory. First, a resource requirements prediction model is established. Then the relationships between resource availability and resource requirements are derived. Afterwards, fuzzy control theory is adopted to realize a friendly match between user needs and resource availability. Results show that this approach improves resource scheduling efficiency and the quality of service (QoS) of cloud computing.
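A toy sketch of a Mamdani-style fuzzy controller of the kind this record alludes to, mapping predicted load and current resource availability to a scaling action; the membership functions and rule base are invented for illustration, not taken from the paper.

```python
import numpy as np

def ramp_up(x, a, b):
    """Membership rising from 0 at a to 1 at b ("high")."""
    return float(np.clip((x - a) / (b - a), 0.0, 1.0))

def ramp_down(x, a, b):
    """Membership falling from 1 at a to 0 at b ("low")."""
    return float(np.clip((b - x) / (b - a), 0.0, 1.0))

def fuzzy_scaling_decision(load, availability):
    """Both inputs normalised to 0..1; output is a scaling action in [-1, 1]."""
    load_low, load_high = ramp_down(load, 0.2, 0.7), ramp_up(load, 0.3, 0.8)
    avail_low, avail_high = ramp_down(availability, 0.2, 0.7), ramp_up(availability, 0.3, 0.8)
    # Rule base: (firing strength, crisp output the rule argues for)
    rules = [(min(load_high, avail_low),  +1.0),   # high load, scarce resources: scale up hard
             (min(load_high, avail_high), +0.4),   # high load, resources free: scale up mildly
             (min(load_low,  avail_high), -0.6),   # low load, resources free: scale down
             (min(load_low,  avail_low),   0.0)]   # low load, scarce resources: hold
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules) or 1e-9
    return num / den                               # weighted-average defuzzification

if __name__ == "__main__":
    for load, avail in [(0.9, 0.2), (0.9, 0.9), (0.1, 0.9), (0.5, 0.5)]:
        print(f"load={load:.1f} avail={avail:.1f} -> scaling action "
              f"{fuzzy_scaling_decision(load, avail):+.2f}")
```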

  9. Research on Digital Forensic Readiness Design in a Cloud Computing-Based Smart Work Environment

    Directory of Open Access Journals (Sweden)

    Sangho Park

    2018-04-01

Full Text Available Recently, the work environments of organizations have been transitioning into smart work environments by applying cloud computing technology to the existing work environment. The smart work environment is characterized by the ability to access information assets inside the company from outside the company through cloud computing technology, to share information without restrictions on location by using mobile terminals, and to provide a work environment where work can be conducted effectively in various locations and mobile environments. Thus, in the cloud computing-based smart work environment, security risks are changing: the leakage risk of an organization's information assets increases through mobile terminals, which are prone to loss and theft, and the hacking risk of wireless networks in mobile environments grows. Given these changes in security risk, the reactive digital forensic method, which investigates digital evidence after the occurrence of security incidents, appears to have reached its limits, which has led to a rise in the necessity of proactive digital forensic approaches wherein security incidents can be addressed preemptively. Accordingly, in this research, we design a digital forensic readiness model at the level of preemptive prevention by considering changes in the cloud computing-based smart work environment. First, we investigate previous research related to the cloud computing-based smart work environment and digital forensic readiness and analyze a total of 50 components of digital forensic readiness. In addition, through the analysis of the corresponding preceding research, we design seven detailed areas, namely, outside the organization environment, within the organization guideline, system information, terminal information, user information, usage information, and additional function. Then, we design a draft of the digital forensic readiness model in the cloud

  10. A cloud computing based 12-lead ECG telemedicine service

    Science.gov (United States)

    2012-01-01

    Background Due to the great variability of 12-lead ECG instruments and medical specialists’ interpretation skills, it remains a challenge to deliver rapid and accurate 12-lead ECG reports with senior cardiologists’ decision making support in emergency telecardiology. Methods We create a new cloud and pervasive computing based 12-lead Electrocardiography (ECG) service to realize ubiquitous 12-lead ECG tele-diagnosis. Results This developed service enables ECG to be transmitted and interpreted via mobile phones. That is, tele-consultation can take place while the patient is on the ambulance, between the onsite clinicians and the off-site senior cardiologists, or among hospitals. Most importantly, this developed service is convenient, efficient, and inexpensive. Conclusions This cloud computing based ECG tele-consultation service expands the traditional 12-lead ECG applications onto the collaboration of clinicians at different locations or among hospitals. In short, this service can greatly improve medical service quality and efficiency, especially for patients in rural areas. This service has been evaluated and proved to be useful by cardiologists in Taiwan. PMID:22838382

  11. A cloud computing based 12-lead ECG telemedicine service

    Directory of Open Access Journals (Sweden)

    Hsieh Jui-chien

    2012-07-01

Full Text Available Abstract Background Due to the great variability of 12-lead ECG instruments and medical specialists’ interpretation skills, it remains a challenge to deliver rapid and accurate 12-lead ECG reports with senior cardiologists’ decision making support in emergency telecardiology. Methods We create a new cloud and pervasive computing based 12-lead Electrocardiography (ECG) service to realize ubiquitous 12-lead ECG tele-diagnosis. Results This developed service enables ECG to be transmitted and interpreted via mobile phones. That is, tele-consultation can take place while the patient is on the ambulance, between the onsite clinicians and the off-site senior cardiologists, or among hospitals. Most importantly, this developed service is convenient, efficient, and inexpensive. Conclusions This cloud computing based ECG tele-consultation service expands the traditional 12-lead ECG applications onto the collaboration of clinicians at different locations or among hospitals. In short, this service can greatly improve medical service quality and efficiency, especially for patients in rural areas. This service has been evaluated and proved to be useful by cardiologists in Taiwan.

  12. HealthNode: Software Framework for Efficiently Designing and Developing Cloud-Based Healthcare Applications

    Directory of Open Access Journals (Sweden)

    Ho-Kyeong Ra

    2018-01-01

Full Text Available With the exponential improvement of software technology during the past decade, many efforts have been made to design remote and personalized healthcare applications. Many of these applications are built on mobile devices connected to the cloud. Although appealing, prototyping and validating the feasibility of an application-level idea is still challenging without a solid understanding of the cloud, mobile, and interconnectivity infrastructure. In this paper, we provide a solution to this by proposing a framework called HealthNode, which is a general-purpose framework for developing healthcare applications on cloud platforms using Node.js. To fully exploit the potential of Node.js when developing cloud applications, we focus on easing the implementation process. HealthNode presents an explicit guideline while supporting the features necessary to achieve quick and expandable cloud-based healthcare applications. A case study applying HealthNode to various real-world health applications suggests that HealthNode can express architectural structure effectively within an implementation and that the proposed platform can support system understanding and software evolution.

  13. Vertical microphysical profiles of convective clouds as a tool for obtaining aerosol cloud-mediated climate forcings

    Energy Technology Data Exchange (ETDEWEB)

    Rosenfeld, Daniel [Hebrew Univ. of Jerusalem (Israel)

    2015-12-23

Quantifying the aerosol/cloud-mediated radiative effect at a global scale requires simultaneous satellite retrievals of cloud condensation nuclei (CCN) concentrations and cloud base updraft velocities (Wb). Hitherto, the inability to do so has been a major cause of high uncertainty regarding anthropogenic aerosol/cloud-mediated radiative forcing. This can be addressed by the emerging capability of estimating CCN and Wb of boundary layer convective clouds from an operational polar orbiting weather satellite. Our methodology uses such clouds as an effective analog for CCN chambers. The cloud base supersaturation (S) is determined by Wb and the satellite-retrieved cloud base drop concentration (Ndb), which is the same as CCN(S). Developing and validating this methodology was possible thanks to the ASR/ARM measurements of CCN and vertical updraft profiles. Validation against ground-based CCN instruments at the ARM sites in Oklahoma, Manaus, and onboard a ship in the northeast Pacific showed a retrieval accuracy of ±25% to ±30% for individual satellite overpasses. The methodology is presently limited to non-raining boundary layer convective clouds of at least 1 km depth that are not obscured by upper-layer clouds, including semitransparent cirrus. The limitation to small solar backscattering angles (<25º) restricts the satellite coverage to ~25% of the world area in a single day. This methodology will likely allow overcoming the challenge of quantifying the aerosol indirect effect and facilitate a substantial reduction of the uncertainty in anthropogenic climate forcing.
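The retrieval logic can be summarized algebraically: with Ndb retrieved at cloud base and S estimated from Wb, a Twomey power law N = C·S^k anchors the CCN spectrum, which can then be evaluated at other supersaturations. The supersaturation parameterisation and all coefficients in the sketch below are placeholder assumptions, not the operational formulation.

```python
# Algebraic sketch of the "cloud base as CCN chamber" idea described above.
# The supersaturation scaling and its coefficients are invented placeholders.

def cloud_base_supersaturation(wb_ms, ndb_cm3, coeff=0.04, p=0.75, q=0.5):
    """Assumed illustrative scaling S[%] ~ coeff * Wb^p / Ndb^q * 100."""
    return coeff * wb_ms**p / ndb_cm3**q * 100.0

def ccn_spectrum_from_cloud_base(ndb_cm3, s_percent, k=0.7):
    """Anchor a Twomey spectrum N(S) = C * S^k at the diagnosed cloud-base point."""
    C = ndb_cm3 / s_percent**k
    return lambda s: C * s**k

if __name__ == "__main__":
    wb, ndb = 2.0, 400.0                   # m/s updraft, drops per cm^3 (examples)
    s_base = cloud_base_supersaturation(wb, ndb)
    ccn = ccn_spectrum_from_cloud_base(ndb, s_base)
    print(f"diagnosed S at cloud base: {s_base:.2f}%")
    for s in (0.2, 0.4, 0.8):
        print(f"CCN({s:.1f}%) ~ {ccn(s):.0f} cm^-3")
```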

  14. Global Software Development with Cloud Platforms

    Science.gov (United States)

    Yara, Pavan; Ramachandran, Ramaseshan; Balasubramanian, Gayathri; Muthuswamy, Karthik; Chandrasekar, Divya

Offshore and outsourced distributed software development models and processes are facing challenges, previously unknown, with respect to computing capacity, bandwidth, storage, security, complexity, reliability, and business uncertainty. Clouds promise to address these challenges by adopting recent advances in virtualization, parallel and distributed systems, utility computing, and software services. In this paper, we envision a cloud-based platform that addresses some of these core problems. We outline a generic cloud architecture, its design and our first implementation results for three cloud forms - a compute cloud, a storage cloud and a cloud-based software service - in the context of global distributed software development (GSD). Our "compute cloud" provides computational services such as continuous code integration and a compile server farm, the "storage cloud" offers storage (block or file-based) services with an on-line virtual storage service, whereas the on-line virtual labs represent a useful cloud service. We note some of the use cases for clouds in GSD and the lessons learned with our prototypes, and identify challenges that must be conquered before realizing the full business benefits. We believe that in the future, software practitioners will focus more on these cloud computing platforms and see clouds as a means of supporting an ecosystem of clients, developers and other key stakeholders.

  15. Towards a service centric contextualized vehicular cloud

    NARCIS (Netherlands)

    Hu, Xiping; Wang, Lei; Sheng, Zhengguo; TalebiFard, Peyman; Zhou, Li; Liu, Jia; Leung, Victor C.M.

    2014-01-01

    This paper proposes a service-centric contextualized vehicular (SCCV) cloud platform to facilitate the deployment and delivery of cloud-based mobile applications over vehicular networks. SCCV cloud employs a multi-tier architecture that consists of the network, mobile device, and cloud tiers. Based

  16. Using a New Event-Based Simulation Framework for Investigating Resource Provisioning in Clouds

    Directory of Open Access Journals (Sweden)

    Simon Ostermann

    2011-01-01

    Full Text Available Today, Cloud computing proposes an attractive alternative to building large-scale distributed computing environments by which resources are no longer hosted by the scientists' computational facilities, but leased from specialised data centres only when and for how long they are needed. This new class of Cloud resources raises new interesting research questions in the fields of resource management, scheduling, fault tolerance, or quality of service, requiring hundreds to thousands of experiments for finding valid solutions. To enable such research, a scalable simulation framework is typically required for early prototyping, extensive testing and validation of results before the real deployment is performed. The scope of this paper is twofold. In the first part we present GroudSim, a Grid and Cloud simulation toolkit for scientific computing based on a scalable simulation-independent discrete-event engine. GroudSim provides a comprehensive set of features for complex simulation scenarios from simple job executions on leased computing resources to file transfers, calculation of costs and background load on resources. Simulations can be parameterised and are easily extendable by probability distribution packages for failures which normally occur in complex distributed environments. Experimental results demonstrate the improved scalability of GroudSim compared to a related process-based simulation approach. In the second part, we show the use of the GroudSim simulator to analyse the problem of dynamic provisioning of Cloud resources to scientific workflows that do not benefit from sufficient Grid resources as required by their computational demands. We propose and study four strategies for provisioning and releasing Cloud resources that take into account the general leasing model encountered in today's commercial Cloud environments based on resource bulks, fuzzy descriptions and hourly payment intervals. We study the impact of our techniques to the
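
    The kind of engine the toolkit rests on can be illustrated with a minimal discrete-event core in Python; this is a generic sketch of the paradigm (an event queue processed in timestamp order), not GroudSim's actual API:

        import heapq
        from typing import Callable

        # Minimal discrete-event core in the spirit of engines like the one GroudSim
        # builds on (not GroudSim's API): events are (time, action) pairs processed in
        # timestamp order; each action may schedule further events.

        class EventEngine:
            def __init__(self) -> None:
                self._queue: list[tuple[float, int, Callable[[], None]]] = []
                self._counter = 0   # tie-breaker so heapq never compares callables
                self.now = 0.0

            def schedule(self, delay: float, action: Callable[[], None]) -> None:
                heapq.heappush(self._queue, (self.now + delay, self._counter, action))
                self._counter += 1

            def run(self) -> None:
                while self._queue:
                    self.now, _, action = heapq.heappop(self._queue)
                    action()

        if __name__ == "__main__":
            engine = EventEngine()

            def job_finished(job_id: int) -> None:
                print(f"t={engine.now:5.1f}s  job {job_id} finished on leased cloud resource")

            # Lease a resource and run two jobs with hypothetical runtimes.
            engine.schedule(10.0, lambda: job_finished(1))
            engine.schedule(25.0, lambda: job_finished(2))
            engine.run()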

  17. Intelligent Aggregation Based on Content Routing Scheme for Cloud Computing

    Directory of Open Access Journals (Sweden)

    Jiachen Xu

    2017-10-01

    Full Text Available Cloud computing has emerged as today's most exciting computing paradigm for providing services using a shared framework, which opens a new door for solving the problems of the explosive growth of digital resource demands and their corresponding convenience. With the exponential growth in the number of data types and in data size in so-called big data work, the backbone network is under great pressure because its transmission capacity grows more slowly than the data size; without an effective approach to this problem, the development of the network would be seriously hindered. In this paper, an Intelligent Aggregation based on Content Routing (IACR) scheme for cloud computing, which can reduce the amount of data in the network effectively and play a basic supporting role in the development of cloud computing, is first put forward. The main innovations in this paper are: (1) a framework for intelligent aggregation based on content routing is proposed, which can support aggregation-based content routing; (2) the proposed IACR scheme can effectively route high-aggregation-ratio data to the data center through the same routing path so as to effectively reduce the amount of data that the network transmits. The theoretical analyses and experimental results show that, compared with the previous original routing scheme, the IACR scheme can balance the load of the whole network, reduce the amount of data transmitted in the network by 41.8%, and reduce the transmission time by 31.6% in the same network with a more balanced network load.

  18. Considering polarization in MODIS-based cloud property retrievals by using a vector radiative transfer code

    International Nuclear Information System (INIS)

    Yi, Bingqi; Huang, Xin; Yang, Ping; Baum, Bryan A.; Kattawar, George W.

    2014-01-01

    In this study, a full-vector, adding–doubling radiative transfer model is used to investigate the influence of the polarization state on cloud property retrievals from Moderate Resolution Imaging Spectroradiometer (MODIS) satellite observations. Two sets of lookup tables (LUTs) are developed for the retrieval purposes, both of which provide water cloud and ice cloud reflectivity functions at two wavelengths in various sun-satellite viewing geometries. However, only one of the LUTs considers polarization. The MODIS reflectivity observations at 0.65 μm (band 1) and 2.13 μm (band 7) are used to infer the cloud optical thickness and particle effective diameter, respectively. Results indicate that the retrievals for both water cloud and ice cloud show considerable sensitivity to polarization. The retrieved water and ice cloud effective diameter and optical thickness differences can vary by as much as ±15% due to polarization state considerations. In particular, the polarization state has more influence on completely smooth ice particles than on severely roughened ice particles. - Highlights: • Impact of polarization on satellite-based retrieval of water/ice cloud properties is studied. • Inclusion of polarization can change water/ice optical thickness and effective diameter values by up to ±15%. • Influence of polarization on cloud property retrievals depends on sun-satellite viewing geometries

  19. Cloud Computing Governance Lifecycle

    Directory of Open Access Journals (Sweden)

    Soňa Karkošková

    2016-06-01

    Full Text Available Externally provisioned cloud services enable flexible and on-demand sourcing of IT resources. Cloud computing introduces new challenges, such as the need for business process redefinition, the establishment of specialized governance and management, new organizational structures and relationships with external providers, and the management of new types of risk arising from dependency on external providers. There is a general consensus that cloud computing brings many benefits in addition to these challenges, but it is unclear how to achieve them. Cloud computing governance helps to create business value by obtaining benefits from the use of cloud computing services while optimizing investment and risk. The challenge organizations face in governing cloud services is how to design and implement cloud computing governance so as to gain the expected benefits. This paper aims to provide guidance on the implementation activities of the proposed Cloud computing governance lifecycle from the cloud consumer perspective. The proposed model is based on the SOA Governance Framework and consists of a lifecycle for the implementation and continuous improvement of a cloud computing governance model.

  20. Criteria for the evaluation of a cloud-based hospital information system outsourcing provider.

    Science.gov (United States)

    Low, Chinyao; Hsueh Chen, Ya

    2012-12-01

    As cloud computing technology has proliferated rapidly worldwide, there has been a trend toward adopting cloud-based hospital information systems (CHISs). This study examines the critical criteria for selecting a CHIS outsourcing provider. The fuzzy Delphi method (FDM) is used to evaluate the primary indicators collected from 188 usable responses at a working hospital in Taiwan. Moreover, the fuzzy analytic hierarchy process (FAHP) is employed to calculate the weights of these criteria and to establish a fuzzy multi-criteria model of CHIS outsourcing provider selection based on input from 42 experts. The results indicate that the five most critical criteria for CHIS outsourcing provider selection are (1) system function, (2) service quality, (3) integration, (4) professionalism, and (5) economics. This study may contribute to understanding how cloud-based hospital systems can reinforce content design and offer a way to compete in the field by developing more appropriate systems.
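
    As a simplified illustration of the weighting step, crisp AHP derives criterion weights from the principal eigenvector of a pairwise comparison matrix; the fuzzy AHP used in the study replaces the crisp judgments below (which are hypothetical) with fuzzy numbers, but the weighting idea is the same:

        import numpy as np

        # Simplified (crisp) AHP weighting sketch; the study uses the *fuzzy* AHP, which
        # replaces these crisp judgments with fuzzy numbers. The comparison values are
        # hypothetical; the criteria names come from the abstract above.

        criteria = ["system function", "service quality", "integration", "professionalism", "economics"]

        A = np.array([
            [1,   2,   3,   4,   5],
            [1/2, 1,   2,   3,   4],
            [1/3, 1/2, 1,   2,   3],
            [1/4, 1/3, 1/2, 1,   2],
            [1/5, 1/4, 1/3, 1/2, 1],
        ])

        eigvals, eigvecs = np.linalg.eig(A)
        principal = np.argmax(eigvals.real)
        weights = np.abs(eigvecs[:, principal].real)
        weights /= weights.sum()

        for name, w in sorted(zip(criteria, weights), key=lambda x: -x[1]):
            print(f"{name:16s} weight = {w:.3f}")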

  1. A privacy authentication scheme based on cloud for medical environment.

    Science.gov (United States)

    Chen, Chin-Ling; Yang, Tsai-Tung; Chiang, Mao-Lun; Shih, Tzay-Farn

    2014-11-01

    With the rapid development of information technology, health care technologies have matured; electronic medical records, for example, can now be stored easily. However, how to make medical resources more conveniently accessible is currently a pressing issue. Although much of the literature discusses medical systems, these systems still face many security challenges, the most important of which is patients' privacy. Therefore, we propose a privacy authentication scheme based on a cloud environment. Our scheme exploits the characteristics of mobile devices, allowing people to use medical resources in the cloud environment to obtain medical advice conveniently. A digital signature is used to ensure the security of the medical information, which is certified by the medical department in our proposed scheme.

  2. Embracing the Cloud: Six Ways to Look at the Shift to Cloud Computing

    Science.gov (United States)

    Ullman, David F.; Haggerty, Blake

    2010-01-01

    Cloud computing is the latest paradigm shift for the delivery of IT services. Where previous paradigms (centralized, decentralized, distributed) were based on fairly straightforward approaches to technology and its management, cloud computing is radical in comparison. The literature on cloud computing, however, suffers from many divergent…

  3. Three-dimensional reconstruction of indoor whole elements based on mobile LiDAR point cloud data

    Science.gov (United States)

    Gong, Yuejian; Mao, Wenbo; Bi, Jiantao; Ji, Wei; He, Zhanjun

    2014-11-01

    Ground-based LiDAR is currently one of the most effective city modeling tools and has been widely used for the three-dimensional reconstruction of outdoor objects. For indoor objects, however, there are technical bottlenecks due to the lack of a GPS signal. In this paper, high-precision models of all indoor ancillary facilities were built from high-precision indoor point cloud data obtained by LiDAR from an advanced indoor mobile measuring system. The point cloud data we employed also contain a color attribute, extracted by fusion with CCD images; they therefore carry both spatial geometric features and spectral information, which can be used to construct object surfaces and to restore the color and texture of the geometric model. Three-dimensional reconstruction of the complete indoor scene was realized on the Autodesk CAD platform with the help of the PointSence plug-in. Specifically, Pointools Edit Pro was adopted to edit the point cloud, and different types of indoor point cloud data were then processed, including data format conversion, outline extraction, and texture mapping of the point cloud model. Finally, three-dimensional visualization of the real-world interior was completed. Experimental results showed that high-precision 3D point cloud data obtained by indoor mobile measuring equipment can be used for the 3D reconstruction of all indoor elements and that the methods proposed in this paper realize this reconstruction efficiently. Moreover, the modeling precision could be kept within 5 cm, which proved to be a satisfactory result.

  4. A Mobility Management Using Follow-Me Cloud-Cloudlet in Fog-Computing-Based RANs for Smart Cities.

    Science.gov (United States)

    Chen, Yuh-Shyan; Tsai, Yi-Ting

    2018-02-06

    Mobility management supporting location tracking and location-based services (LBS) is an important issue for smart cities, providing the means for the smooth transportation of people and goods. Mobility contributes to innovation in both public and private transportation infrastructures for smart cities. With the assistance of edge/fog computing, this paper presents a new mobility management scheme using the proposed follow-me cloud-cloudlet (FMCL) approach in fog-computing-based radio access networks (Fog-RANs) for smart cities. The follow-me cloud-cloudlet approach is an integration strategy of follow-me cloud (FMC) and follow-me edge (FME), also called cloudlet. Before a handover, a user equipment (UE) receives data transmitted from the original cloud through the original edge cloud. After the handover, the UE searches for a new cloud, called the migrated cloud, and a new edge cloud near the UE, called the migrated edge cloud; the remaining data are migrated from the original cloud to the migrated cloud and received through the new edge cloud. Existing FMC schemes do not provide VM migration between cloudlets to reduce transmission latency, and existing FME schemes do not provide service migration between data centers for the same purpose. Our proposed FMCL approach supports both VM migration between cloudlets and service migration between data centers, significantly reducing the transmission latency. The proposed FMCL-based mobility management aims to reduce the total transmission time by pre-scheduling and pre-storing some data packets in the cloudlet cache when the UE switches from the previous Fog-RAN to the serving Fog-RAN. To illustrate the performance achievement, the mathematical analysis and simulation results are examined in terms of the total transmission time, the
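
    The intended benefit of pre-storing packets in the cloudlet cache can be illustrated with a toy latency model in Python; the link rates and handover delay below are hypothetical and do not reproduce the paper's analysis:

        # Toy latency model (hypothetical numbers, not the paper's analysis): compare the
        # total transmission time when the remaining packets must be fetched from the
        # remote cloud after handover versus when a fraction of them was pre-stored in
        # the serving cloudlet's cache before the UE switched Fog-RANs.

        def total_time(data_mb: float, cached_fraction: float,
                       cloud_mbps: float = 20.0, cloudlet_mbps: float = 200.0,
                       handover_s: float = 0.5) -> float:
            cached = data_mb * cached_fraction
            remote = data_mb - cached
            return handover_s + cached * 8 / cloudlet_mbps + remote * 8 / cloud_mbps

        if __name__ == "__main__":
            for frac in (0.0, 0.5, 0.9):
                print(f"cached fraction {frac:.0%}: total time ~ {total_time(100.0, frac):.1f} s")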

  5. A Mobility Management Using Follow-Me Cloud-Cloudlet in Fog-Computing-Based RANs for Smart Cities

    Directory of Open Access Journals (Sweden)

    Yuh-Shyan Chen

    2018-02-01

    Full Text Available Mobility management supporting location tracking and location-based services (LBS) is an important issue for smart cities, providing the means for the smooth transportation of people and goods. Mobility contributes to innovation in both public and private transportation infrastructures for smart cities. With the assistance of edge/fog computing, this paper presents a new mobility management scheme using the proposed follow-me cloud-cloudlet (FMCL) approach in fog-computing-based radio access networks (Fog-RANs) for smart cities. The follow-me cloud-cloudlet approach is an integration strategy of follow-me cloud (FMC) and follow-me edge (FME), also called cloudlet. Before a handover, a user equipment (UE) receives data transmitted from the original cloud through the original edge cloud. After the handover, the UE searches for a new cloud, called the migrated cloud, and a new edge cloud near the UE, called the migrated edge cloud; the remaining data are migrated from the original cloud to the migrated cloud and received through the new edge cloud. Existing FMC schemes do not provide VM migration between cloudlets to reduce transmission latency, and existing FME schemes do not provide service migration between data centers for the same purpose. Our proposed FMCL approach supports both VM migration between cloudlets and service migration between data centers, significantly reducing the transmission latency. The proposed FMCL-based mobility management aims to reduce the total transmission time by pre-scheduling and pre-storing some data packets in the cloudlet cache when the UE switches from the previous Fog-RAN to the serving Fog-RAN. To illustrate the performance achievement, the mathematical analysis and simulation results are examined in terms of the total transmission time

  6. Web-based Tsunami Early Warning System with instant Tsunami Propagation Calculations in the GPU Cloud

    Science.gov (United States)

    Hammitzsch, M.; Spazier, J.; Reißland, S.

    2014-12-01

    Usually, tsunami early warning and mitigation systems (TWS or TEWS) are based on several software components deployed in a client-server based infrastructure. The vast majority of systems importantly include desktop-based clients with a graphical user interface (GUI) for the operators in early warning centers. However, in times of cloud computing and ubiquitous computing the use of concepts and paradigms, introduced by continuously evolving approaches in information and communications technology (ICT), have to be considered even for early warning systems (EWS). Based on the experiences and the knowledge gained in three research projects - 'German Indonesian Tsunami Early Warning System' (GITEWS), 'Distant Early Warning System' (DEWS), and 'Collaborative, Complex, and Critical Decision-Support in Evolving Crises' (TRIDEC) - new technologies are exploited to implement a cloud-based and web-based prototype to open up new prospects for EWS. This prototype, named 'TRIDEC Cloud', merges several complementary external and in-house cloud-based services into one platform for automated background computation with graphics processing units (GPU), for web-mapping of hazard specific geospatial data, and for serving relevant functionality to handle, share, and communicate threat specific information in a collaborative and distributed environment. The prototype in its current version addresses tsunami early warning and mitigation. The integration of GPU accelerated tsunami simulation computations have been an integral part of this prototype to foster early warning with on-demand tsunami predictions based on actual source parameters. However, the platform is meant for researchers around the world to make use of the cloud-based GPU computation to analyze other types of geohazards and natural hazards and react upon the computed situation picture with a web-based GUI in a web browser at remote sites. The current website is an early alpha version for demonstration purposes to give the

  7. A Novel Cloud-Based Platform for Implementation of Oblivious Power Routing for Clusters of Microgrids

    DEFF Research Database (Denmark)

    Broojeni, Kianoosh; Amini, M. Hadi; Nejadpak, Arash

    2016-01-01

    is verified by MATLAB simulation. We also present a comprehensive cloud-based platform for further implementation of the proposed algorithm on the OPAL-RT real-time digital simulation system. The communication paths between the microgrids and the cloud environment can be emulated by OMNeT++....

  8. The thin border between cloud and aerosol: Sensitivity of several ground based observation techniques

    Science.gov (United States)

    Calbó, Josep; Long, Charles N.; González, Josep-Abel; Augustine, John; McComiskey, Allison

    2017-11-01

    Cloud and aerosol are two manifestations of what is essentially the same physical phenomenon: a suspension of particles in the air. The differences between the two come from the different composition (e.g., a much higher amount of condensed water in the particles constituting a cloud) and/or particle size, and also from the different number of such particles (10-10,000 particles per cubic centimeter depending on conditions). However, there exist situations in which the distinction is far from obvious, and even when broken or scattered clouds are present in the sky, the borders between cloud and not cloud are not always well defined, a transition area that has been coined the "twilight zone". The current paper presents a discussion on the definition of cloud and aerosol, the need for distinguishing or for considering the continuum between the two, and suggests a quantification of the importance and frequency of such ambiguous situations, founded on several ground-based observing techniques. Specifically, sensitivity analyses are applied to sky camera images and broadband and spectral radiometric measurements taken at Girona (Spain) and Boulder (CO, USA). Results indicate that, at these sites, in more than 5% of the daytime hours the sky may be considered cloudless (but containing aerosols) or cloudy (with some kind of optically thin clouds) depending on the observing system and the thresholds applied. Similarly, at least 10% of the time the extension of scattered or broken clouds into clear areas is difficult to establish, and depends on where the limit is put between cloud and aerosol. These findings are relevant to both technical approaches for cloud screening and sky cover categorization algorithms and radiative transfer studies, given the different effect of clouds and aerosols (and the different treatment in models) on the Earth's radiation balance.
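
    One common sky-camera screening approach, and the kind of threshold sensitivity examined here, can be sketched as follows; the red/blue ratio thresholds and the synthetic image are illustrative only, not the processing actually applied at Girona and Boulder:

        import numpy as np

        # Hedged illustration of a common sky-camera cloud-screening idea (not the
        # paper's exact processing): pixels with a high red/blue ratio are labelled
        # "cloud", and sweeping the threshold shows how sensitive the inferred cloud
        # fraction is to where the cloud/aerosol limit is placed. The image is synthetic.

        rng = np.random.default_rng(0)
        red = rng.uniform(0.2, 1.0, size=(480, 640))
        blue = rng.uniform(0.4, 1.0, size=(480, 640))
        ratio = red / blue

        for threshold in (0.6, 0.7, 0.8, 0.9):
            cloud_fraction = float((ratio > threshold).mean())
            print(f"R/B threshold {threshold:.2f} -> cloud fraction {cloud_fraction:.2f}")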

  9. Automating NEURON Simulation Deployment in Cloud Resources.

    Science.gov (United States)

    Stockton, David B; Santamaria, Fidel

    2017-01-01

    Simulations in neuroscience are performed on local servers or High Performance Computing (HPC) facilities. Recently, cloud computing has emerged as a potential computational platform for neuroscience simulation. In this paper we compare and contrast HPC and cloud resources for scientific computation, then report how we deployed NEURON, a widely used simulator of neuronal activity, in three clouds: Chameleon Cloud, a hybrid private academic cloud for cloud technology research based on the OpenStack software; Rackspace, a public commercial cloud, also based on OpenStack; and Amazon Elastic Compute Cloud, based on Amazon's proprietary software. We describe the manual procedures and how to automate cloud operations. We describe how we extended our simulation automation software, NeuroManager (Stockton and Santamaria, Frontiers in Neuroinformatics, 2015), so that the user can recruit private cloud, public cloud, HPC, and local servers simultaneously with a simple common interface. We conclude by performing several studies in which we examine speedup, efficiency, total session time, and cost for sets of simulations of a published NEURON model.

  10. A Cloud-Based Car Parking Middleware for IoT-Based Smart Cities: Design and Implementation

    Directory of Open Access Journals (Sweden)

    Zhanlin Ji

    2014-11-01

    Full Text Available This paper presents the generic concept of using cloud-based intelligent car parking services in smart cities as an important application of the Internet of Things (IoT) paradigm. This type of services will become an integral part of a generic IoT operational platform for smart cities due to its pure business-oriented features. A high-level view of the proposed middleware is outlined and the corresponding operational platform is illustrated. To demonstrate the provision of car parking services, based on the proposed middleware, a cloud-based intelligent car parking system for use within a university campus is described along with details of its design, implementation, and operation. A number of software solutions, including Kafka/Storm/Hbase clusters, OSGi web applications with distributed NoSQL, a rule engine, and mobile applications, are proposed to provide ‘best’ car parking service experience to mobile users, following the Always Best Connected and best Served (ABC&S) paradigm.

  11. A Cloud-Based Car Parking Middleware for IoT-Based Smart Cities: Design and Implementation

    Science.gov (United States)

    Ji, Zhanlin; Ganchev, Ivan; O'Droma, Máirtín; Zhao, Li; Zhang, Xueji

    2014-01-01

    This paper presents the generic concept of using cloud-based intelligent car parking services in smart cities as an important application of the Internet of Things (IoT) paradigm. This type of services will become an integral part of a generic IoT operational platform for smart cities due to its pure business-oriented features. A high-level view of the proposed middleware is outlined and the corresponding operational platform is illustrated. To demonstrate the provision of car parking services, based on the proposed middleware, a cloud-based intelligent car parking system for use within a university campus is described along with details of its design, implementation, and operation. A number of software solutions, including Kafka/Storm/Hbase clusters, OSGi web applications with distributed NoSQL, a rule engine, and mobile applications, are proposed to provide ‘best’ car parking service experience to mobile users, following the Always Best Connected and best Served (ABC&S) paradigm. PMID:25429416

  12. A cloud-based car parking middleware for IoT-based smart cities: design and implementation.

    Science.gov (United States)

    Ji, Zhanlin; Ganchev, Ivan; O'Droma, Máirtín; Zhao, Li; Zhang, Xueji

    2014-11-25

    This paper presents the generic concept of using cloud-based intelligent car parking services in smart cities as an important application of the Internet of Things (IoT) paradigm. This type of services will become an integral part of a generic IoT operational platform for smart cities due to its pure business-oriented features. A high-level view of the proposed middleware is outlined and the corresponding operational platform is illustrated. To demonstrate the provision of car parking services, based on the proposed middleware, a cloud-based intelligent car parking system for use within a university campus is described along with details of its design, implementation, and operation. A number of software solutions, including Kafka/Storm/Hbase clusters, OSGi web applications with distributed NoSQL, a rule engine, and mobile applications, are proposed to provide 'best' car parking service experience to mobile users, following the Always Best Connected and best Served (ABC&S) paradigm.

  13. New Stereo Vision Digital Camera System for Simultaneous Measurement of Cloud Base Height and Atmospheric Visibility

    Science.gov (United States)

    Janeiro, F. M.; Carretas, F.; Palma, N.; Ramos, P. M.; Wagner, F.

    2013-12-01

    Clouds play an important role in many aspects of everyday life. They affect both the local weather as well as the global climate and are an important parameter in climate change studies. Cloud parameters are also important for weather prediction models which make use of actual measurements. It is thus important to have low-cost instrumentation that can be deployed in the field to measure those parameters. Such instruments should also be automated and robust since they may be deployed in remote places and be subject to adverse weather conditions. Although clouds are very important in environmental systems, cloud information is also essential for airplane safety when visual flight rules (VFR) are enforced, such as in most small aerodromes where it is not economically viable to install instruments for assisted flying. Under VFR there are strict limits on the height of the cloud base, cloud cover and atmospheric visibility that ensure the safety of the pilots and planes. Although instruments to measure those parameters are available on the market, their relatively high cost puts them out of reach of many local aerodromes. In this work we present a new prototype which has been recently developed and deployed in a local aerodrome as a proof of concept. It is composed of two digital cameras that capture photographs of the sky and allow the measurement of the cloud height from the parallax effect. The new development is a new camera geometry which allows the simultaneous measurement of cloud base height, wind speed at cloud base height and atmospheric visibility, which was not previously possible with only two cameras. The new orientation of the cameras comes at the cost of a more complex geometry to measure the cloud base height. The atmospheric visibility is calculated from the Lambert-Beer law after the measurement of the contrast between a set of dark objects and the background sky. The prototype includes the latest hardware developments that
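
    The two underlying measurement principles can be sketched under a simplified parallel-axis pinhole geometry (the prototype's actual geometry is more complex, as noted above); all numbers are hypothetical:

        import math

        # Simplified sketch under a plain pinhole, parallel-axis geometry (the prototype's
        # actual geometry is more complex, as the abstract notes). Numbers are hypothetical.

        def cloud_base_height(baseline_m: float, focal_px: float, disparity_px: float) -> float:
            """Height from parallax: h = B * f / d (similar triangles, cameras pointing up)."""
            return baseline_m * focal_px / disparity_px

        def visibility_koschmieder(contrast_measured: float, contrast_inherent: float,
                                   distance_m: float, threshold: float = 0.02) -> float:
            """Extinction from contrast attenuation C(d) = C0 * exp(-sigma * d), then
            meteorological visibility V = -ln(threshold) / sigma (~3.912 / sigma)."""
            sigma = -math.log(contrast_measured / contrast_inherent) / distance_m
            return -math.log(threshold) / sigma

        if __name__ == "__main__":
            print(f"cloud base ~ {cloud_base_height(100.0, 1500.0, 120.0):.0f} m")
            print(f"visibility ~ {visibility_koschmieder(0.45, 0.9, 2000.0) / 1000:.1f} km")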

  14. Research on Methods for Discovering and Selecting Cloud Infrastructure Services Based on Feature Modeling

    Directory of Open Access Journals (Sweden)

    Huamin Zhu

    2016-01-01

    Full Text Available Nowadays more and more cloud infrastructure service providers are offering large numbers of service instances which are a combination of diversified resources, such as computing, storage, and network. However, for cloud infrastructure services, the lack of a description standard and of systematic discovery and selection methods has made it difficult for users to discover and choose services. First, considering the highly configurable properties of a cloud infrastructure service, the feature model method is used to describe such a service. Second, based on this description, a systematic discovery and selection method for cloud infrastructure services is proposed. The automatic analysis techniques of the feature model are introduced to verify the model's validity and to perform the matching of the service and demand models. Finally, we determine the critical decision metrics and their corresponding measurement methods for cloud infrastructure services, where the subjective and objective weighting results are combined to determine the weights of the decision metrics. The best matching instances from various providers are then ranked by their comprehensive evaluations. Experimental results show that the proposed methods can effectively improve the accuracy and efficiency of cloud infrastructure service discovery and selection.

  15. 2.5D Multi-View Gait Recognition Based on Point Cloud Registration

    Science.gov (United States)

    Tang, Jin; Luo, Jian; Tjahjadi, Tardi; Gao, Yan

    2014-01-01

    This paper presents a method for modeling a 2.5-dimensional (2.5D) human body and extracting gait features to identify the human subject. To achieve view-invariant gait recognition, a multi-view synthesizing method based on point cloud registration (MVSM) to generate multi-view training galleries is proposed. The concept of a density- and curvature-based Color Gait Curvature Image is introduced to map 2.5D data onto a 2D space to enable data dimension reduction by discrete cosine transform and 2D principal component analysis. Gait recognition is achieved via a 2.5D view-invariant gait recognition method based on point cloud registration. Experimental results on the in-house database captured by a Microsoft Kinect camera show a significant performance gain when using MVSM. PMID:24686727

  16. Reducing Surface Clutter in Cloud Profiling Radar Data

    Science.gov (United States)

    Tanelli, Simone; Pak, Kyung; Durden, Stephen; Im, Eastwood

    2008-01-01

    An algorithm has been devised to reduce ground clutter in the data products of the CloudSat Cloud Profiling Radar (CPR), which is a nadir-looking radar instrument, in orbit around the Earth, that measures power backscattered by clouds as a function of distance from the instrument. Ground clutter contaminates the CPR data in the lowest 1 km of the atmospheric profile, heretofore making it impossible to use CPR data to satisfy the scientific interest in studying clouds and light rainfall at low altitude. The algorithm is based partly on the fact that the CloudSat orbit is such that the geodetic altitude of the CPR varies continuously over a range of approximately 25 km. As the geodetic altitude changes, the radar timing parameters are changed at intervals defined by flight software in order to keep the troposphere inside a data-collection time window. However, within each interval, the surface of the Earth continuously "scans through" (that is, it moves across) a few range bins of the data time window. For each radar profile, only a few samples [one for every range-bin increment (Δr = 240 m)] of the surface-clutter signature are available around the range bin in which the peak of surface return is observed, but samples in consecutive radar profiles are offset slightly (by amounts much less than Δr) with respect to each other according to the relative change in geodetic altitude. As a consequence, when the surface area under examination is homogeneous (e.g., an ocean surface), a sequence of consecutive radar profiles of that area contains samples of the surface response with a range resolution Δp much finer than the range-bin increment Δr. This oversampled surface-clutter signature can be estimated and subtracted, yielding a clutter reduction of more than 10 dB and a reduction of the contaminated altitude over ocean from about 1 km to about 0.5 km. The algorithm has been embedded in CloudSat L1B processing as of Release 04 (July 2007), and the estimated flat surface clutter is removed in L2B-GEOPROF product from the

  17. Cloud Computing and Its Applications in GIS

    Science.gov (United States)

    Kang, Cao

    2011-12-01

    Cloud computing is a novel computing paradigm that offers highly scalable and highly available distributed computing services. The objectives of this research are to: 1. analyze and understand cloud computing and its potential for GIS; 2. discover the feasibilities of migrating truly spatial GIS algorithms to distributed computing infrastructures; 3. explore a solution to host and serve large volumes of raster GIS data efficiently and speedily. These objectives thus form the basis for three professional articles. The first article is entitled "Cloud Computing and Its Applications in GIS". This paper introduces the concept, structure, and features of cloud computing. Features of cloud computing such as scalability, parallelization, and high availability make it a very capable computing paradigm. Unlike High Performance Computing (HPC), cloud computing uses inexpensive commodity computers. The uniform administration systems in cloud computing make it easier to use than GRID computing. Potential advantages of cloud-based GIS systems such as lower barrier to entry are consequently presented. Three cloud-based GIS system architectures are proposed: public cloud- based GIS systems, private cloud-based GIS systems and hybrid cloud-based GIS systems. Public cloud-based GIS systems provide the lowest entry barriers for users among these three architectures, but their advantages are offset by data security and privacy related issues. Private cloud-based GIS systems provide the best data protection, though they have the highest entry barriers. Hybrid cloud-based GIS systems provide a compromise between these extremes. The second article is entitled "A cloud computing algorithm for the calculation of Euclidian distance for raster GIS". Euclidean distance is a truly spatial GIS algorithm. Classical algorithms such as the pushbroom and growth ring techniques require computational propagation through the entire raster image, which makes it incompatible with the distributed nature
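
    For the second article, the quantity being distributed is the standard Euclidean distance raster: every cell is assigned its distance to the nearest source cell. A minimal single-tile sketch using SciPy, illustrative only and not the dissertation's cloud algorithm:

        import numpy as np
        from scipy.ndimage import distance_transform_edt

        # Illustrative only (not the distributed algorithm discussed above): the Euclidean
        # distance raster assigns every cell its distance to the nearest source cell.
        # Classical pushbroom/growth-ring methods obtain this by propagating values across
        # the whole image, which is what makes them awkward to distribute; SciPy computes
        # the exact transform here on a single, hypothetical tile.

        raster = np.ones((8, 8), dtype=bool)   # True = "not a source" cell
        raster[2, 3] = False                   # a source cell (e.g., a road or river cell)
        raster[6, 6] = False

        cell_size = 30.0                       # metres per cell, hypothetical
        distance_m = distance_transform_edt(raster, sampling=cell_size)
        print(np.round(distance_m, 1))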

  18. Literature Review of Cloud Based E-learning Adoption by Students: State of the Art and Direction for Future Work

    Science.gov (United States)

    Hassan Kayali, Mohammad; Safie, Nurhizam; Mukhtar, Muriati

    2016-11-01

    Cloud computing is a new paradigm shift in information technology. Most of the studies in the cloud are business related, while studies of cloud-based e-learning are few. The field is still in its infancy and researchers have used several adoption theories to discover the dimensions of this field. The purpose of this paper is to review and integrate the literature to understand the current situation of cloud-based e-learning adoption. A total of 312 articles were extracted from ScienceDirect, Emerald, and IEEE. Screening processes were applied to select only the articles related to cloud-based e-learning. A total of 231 were removed because they related to business organizations, and a further 63 articles were removed because they were technical articles. A total of 18 articles were included in this paper. A frequency analysis was conducted on the selected papers to identify the most frequent factors, theories, statistical software, respondents, and countries of the studies. The findings showed that usefulness and ease of use are the most frequent factors. TAM is the most prevalent adoption theory in the literature. The mean number of respondents in the reviewed studies is 377, and Malaysia is the most frequently studied country in terms of cloud-based e-learning. Studies of cloud-based e-learning are few and more empirical studies are needed.

  19. VMware vCloud director cookbook

    CERN Document Server

    Langenhan, Daniel

    2013-01-01

    VMware vCloud Director Cookbook adopts a cookbook-based approach. Packed with illustrations and programming examples, this book explains both simple and complex recipes in easy-to-understand language. "VMware vCloud Director Cookbook" is aimed at system administrators and technical architects moving from a virtualized environment to cloud environments. Familiarity with cloud computing platforms and some knowledge of virtualization and managing cloud environments is expected.

  20. Effect of CALIPSO Cloud Aerosol Discrimination (CAD) Confidence Levels on Observations of Aerosol Properties near Clouds

    Science.gov (United States)

    Yang, Weidong; Marshak, Alexander; Varnai, Tamas; Liu, Zhaoyan

    2012-01-01

    CALIPSO aerosol backscatter enhancement in the transition zone between clouds and clear sky areas is revisited with particular attention to effects of data selection based on the confidence level of cloud-aerosol discrimination (CAD). The results show that backscatter behavior in the transition zone strongly depends on the CAD confidence level. Higher confidence level data has a flatter backscatter far away from clouds and a much sharper increase near clouds (within 4 km), thus a smaller transition zone. For high confidence level data it is shown that the overall backscatter enhancement is more pronounced for small clear-air segments and horizontally larger clouds. The results suggest that data selection based on CAD reduces the possible effects of cloud contamination when studying aerosol properties in the vicinity of clouds.

  1. Clouds vertical properties over the Northern Hemisphere monsoon regions from CloudSat-CALIPSO measurements

    Science.gov (United States)

    Das, Subrata Kumar; Golhait, R. B.; Uma, K. N.

    2017-01-01

    The CloudSat spaceborne radar and Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) space-borne lidar measurements provide opportunities to understand the intriguing behavior of the vertical structure of monsoon clouds. The combined CloudSat-CALIPSO data products have been used for the summer season (June-August) of 2006-2010 to present the statistics of cloud macrophysical (such as cloud occurrence frequency, distribution of cloud top and base heights, geometrical thickness, and cloud types based on occurrence height) and microphysical (such as ice water content, ice water path, and ice effective radius) properties of the Northern Hemisphere (NH) monsoon regions. The monsoon regions considered in this work are the North American (NAM), North African (NAF), Indian (IND), East Asian (EAS), and Western North Pacific (WNP). The total cloud fraction over the IND (mostly multiple-layered cloud) appears to be higher than over the other monsoon regions. Three distinctive modes of cloud top height distribution are observed over all the monsoon regions. The high-level cloud fraction is comparatively high over the WNP and IND. The ice water content and ice water path over the IND are maximum compared to the other monsoon regions. We found that the ice water content shows little variation over the NAM, NAF, IND, and WNP compared with the macrophysical properties, which gives the impression that the regional differences in dynamic and thermodynamic properties primarily cause changes in cloud frequency or coverage and only secondarily in cloud ice properties. The background atmospheric dynamics have also been investigated using wind and relative humidity from the ERA-Interim reanalysis data, which helps in understanding the variability of the cloud properties over the different monsoon regions.

  2. A Cross-Entropy-Based Admission Control Optimization Approach for Heterogeneous Virtual Machine Placement in Public Clouds

    Directory of Open Access Journals (Sweden)

    Li Pan

    2016-03-01

    Full Text Available Virtualization technologies make it possible for cloud providers to consolidate multiple IaaS provisions into a single server in the form of virtual machines (VMs). Additionally, in order to fulfill the divergent service requirements from multiple users, a cloud provider needs to offer several types of VM instances, which are associated with varying configurations and performance, as well as different prices. In such a heterogeneous virtual machine placement process, one significant problem faced by a cloud provider is how to optimally accept and place multiple VM service requests into its cloud data centers to achieve revenue maximization. To address this issue, in this paper, we first formulate such a revenue maximization problem during VM admission control as a multidimensional knapsack problem, which is known to be NP-hard to solve. Then, we propose to use a cross-entropy-based optimization approach to address this revenue maximization problem, by obtaining a near-optimal eligible set for the provider to accept into its data centers, from the waiting VM service requests in the system. Finally, through extensive experiments and measurements in a simulated environment with the settings of VM instance classes derived from real-world cloud systems, we show that our proposed cross-entropy-based admission control optimization algorithm is efficient and effective in maximizing cloud providers’ revenue in a public cloud computing environment.
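
    A generic cross-entropy sketch for a 0/1 multidimensional knapsack conveys the flavor of the approach; the toy instance, sample sizes, and smoothing constant below are hypothetical and are not the paper's algorithm or settings:

        import numpy as np

        # Generic cross-entropy sketch for a 0/1 multidimensional knapsack (toy instance,
        # not the paper's algorithm): sample accept/reject vectors from independent
        # Bernoulli distributions, keep the elite (feasible, high-revenue) samples, and
        # move the Bernoulli probabilities toward them.

        rng = np.random.default_rng(1)
        n_requests = 40
        revenue = rng.uniform(1.0, 10.0, n_requests)
        demand = rng.uniform(1.0, 4.0, (2, n_requests))   # rows: CPU and RAM demands
        capacity = np.array([60.0, 65.0])                 # data-center CPU and RAM capacity

        def value(x: np.ndarray) -> float:
            """Total revenue of an accept/reject vector, -inf if a capacity is violated."""
            if np.any(demand @ x > capacity):
                return -np.inf
            return float(revenue @ x)

        p = np.full(n_requests, 0.5)                      # Bernoulli acceptance probabilities
        best_x, best_val = np.zeros(n_requests), 0.0
        for _ in range(60):
            samples = (rng.random((200, n_requests)) < p).astype(float)
            scores = np.array([value(s) for s in samples])
            if scores.max() > best_val:
                best_x, best_val = samples[scores.argmax()], scores.max()
            elite = samples[np.argsort(scores)[-20:]]     # top 10% of the samples
            p = 0.7 * elite.mean(axis=0) + 0.3 * p        # smoothed probability update

        print(f"accepted {int(best_x.sum())} of {n_requests} requests, revenue ~ {best_val:.1f}")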

  3. Implementation of a 4-tier Cloud-Based Architecture for Collaborative Health Care Delivery

    Directory of Open Access Journals (Sweden)

    N. A. Azeez

    2016-06-01

    Full Text Available Cloud services permit healthcare providers to ensure information handling and allow different service resources such as Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS) over the Internet, provided that security and information ownership concerns are attended to. Health Care Providers (HCPs) in Nigeria, however, have been confronted with various issues because of their mode of operations. Among these issues are improper methods of data storage and the unreliable nature of patient medical records. Added to these challenges are difficulty in accessing quality healthcare services, the high cost of medical services, and wrong diagnosis and treatment methodologies. Cloud computing has proven capable of providing an efficient and reliable means of securing medical information, and the use of data mining tools in this form of distributed system will go a long way toward achieving the objectives set out for this project. The aim of this research, therefore, is to implement a cloud-based architecture suitable for integrating healthcare delivery into the cloud to provide a productive mode of operation. The proposed architecture consists of four phases (4-Tier): a User Authentication and Access Control Engine (UAACE), which prevents unauthorized access to patient medical records and also utilizes standard encryption/decryption techniques to ensure the privacy of such records. The architecture likewise contains a Data Analysis and Pattern Prediction Unit (DAPPU), which gives valuable data that guides decision making through standard data mining procedures, as well as Cloud Service Providers (CSPs) and Health Care Providers (HCPs). The architecture, which has been implemented on CloudSim, has proved to be efficient and reliable based on the results obtained when compared with previous work.

  4. Cloud-edge mixing: Direct numerical simulation and observations in Indian Monsoon clouds

    Science.gov (United States)

    Kumar, Bipin; Bera, Sudarsan; Prabha, Thara V.; Grabowski, Wojciech W.

    2017-03-01

    A direct numerical simulation (DNS) with the decaying turbulence setup has been carried out to study cloud-edge mixing and its impact on the droplet size distribution (DSD) applying thermodynamic conditions observed in monsoon convective clouds over Indian subcontinent during the Cloud Aerosol Interaction and Precipitation Enhancement EXperiment (CAIPEEX). Evaporation at the cloud-edges initiates mixing at small scale and gradually introduces larger-scale fluctuations of the temperature, moisture, and vertical velocity due to droplet evaporation. Our focus is on early evolution of simulated fields that show intriguing similarities to the CAIPEEX cloud observations. A strong dilution at the cloud edge, accompanied by significant spatial variations of the droplet concentration, mean radius, and spectral width, are found in both the DNS and in observations. In DNS, fluctuations of the mean radius and spectral width come from the impact of small-scale turbulence on the motion and evaporation of inertial droplets. These fluctuations decrease with the increase of the volume over which DNS data are averaged, as one might expect. In cloud observations, these fluctuations also come from other processes, such as entrainment/mixing below the observation level, secondary CCN activation, or variations of CCN activation at the cloud base. Despite large differences in the spatial and temporal scales, the mixing diagram often used in entrainment/mixing studies with aircraft data is remarkably similar for both DNS and cloud observations. We argue that the similarity questions applicability of heuristic ideas based on mixing between two air parcels (that the mixing diagram is designed to properly represent) to the evolution of microphysical properties during turbulent mixing between a cloud and its environment.

  5. Cloud Infrastructure & Applications - CloudIA

    Science.gov (United States)

    Sulistio, Anthony; Reich, Christoph; Doelitzscher, Frank

    The idea behind Cloud Computing is to deliver Infrastructure-as-a-Service and Software-as-a-Service over the Internet on an easy pay-per-use business model. To harness the potential of Cloud Computing for e-Learning and research purposes, and for small- and medium-sized enterprises, the Hochschule Furtwangen University has established a new project, called Cloud Infrastructure & Applications (CloudIA). The CloudIA project is a market-oriented cloud infrastructure that leverages different virtualization technologies by supporting Service-Level Agreements for various service offerings. This paper describes the CloudIA project in detail and reports our early experiences in building a private cloud using an existing infrastructure.

  6. Section-Based Tree Species Identification Using Airborne LIDAR Point Cloud

    Science.gov (United States)

    Yao, C.; Zhang, X.; Liu, H.

    2017-09-01

    The application of LiDAR data in forestry initially focused on mapping forest communities, primarily for large-scale forest management and planning. With smaller-footprint, higher-sampling-density LiDAR data now available, detecting individual overstory trees, estimating crown parameters, and identifying tree species have been demonstrated to be practicable. This paper proposes a section-based protocol for tree species identification, taking the palm tree as an example. The section-based method detects objects through profiles taken along different directions, basically along the X-axis or Y-axis, and improves the utilization of spatial information to generate accurate results. First, the tree points are separated from man-made-object points by decision-tree-based rules, and a Crown Height Model (CHM) is created by subtracting the Digital Terrain Model (DTM) from the Digital Surface Model (DSM). Key points are then calculated and extracted to locate individual trees, and specific tree parameters related to species information, such as crown height, crown radius, and cross point, are estimated. Finally, with these parameters we are able to identify certain tree species. Compared to species information measured on the ground, the proportion of correctly identified trees across all plots reaches up to 90.65%. The identification results in this research demonstrate the ability to distinguish palm trees using a LiDAR point cloud. Furthermore, with more prior knowledge, the section-based method enables the process to classify trees into different classes.
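
    The first steps of the protocol (CHM = DSM - DTM, followed by key-point extraction to locate individual trees) can be sketched on synthetic grids as follows; the section-based profile analysis itself is not reproduced, and the 3 m height cut-off is hypothetical:

        import numpy as np
        from scipy.ndimage import maximum_filter

        # Sketch of the first steps described in the abstract, on synthetic grids (the
        # section-based profile analysis is not reproduced here): the height model is the
        # difference CHM = DSM - DTM, and candidate tree tops are cells that are the local
        # maximum of their neighbourhood and taller than a minimum height.

        rng = np.random.default_rng(2)
        dtm = rng.uniform(100.0, 102.0, (50, 50))             # bare-earth elevations [m]
        dsm = dtm + rng.uniform(0.0, 12.0, (50, 50))          # surface elevations [m]

        chm = dsm - dtm                                       # canopy/crown height model [m]
        neighbourhood_max = maximum_filter(chm, size=5)
        tree_tops = (chm == neighbourhood_max) & (chm > 3.0)  # 3 m cut-off, hypothetical

        rows, cols = np.nonzero(tree_tops)
        print(f"candidate tree tops: {rows.size}")
        print(f"tallest candidate: {chm[rows, cols].max():.1f} m")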

  7. Cloud4Psi: cloud computing for 3D protein structure similarity searching.

    Science.gov (United States)

    Mrozek, Dariusz; Małysiak-Mrozek, Bożena; Kłapciński, Artur

    2014-10-01

    Popular methods for 3D protein structure similarity searching, especially those that generate high-quality alignments such as Combinatorial Extension (CE) and Flexible structure Alignment by Chaining Aligned fragment pairs allowing Twists (FATCAT) are still time consuming. As a consequence, performing similarity searching against large repositories of structural data requires increased computational resources that are not always available. Cloud computing provides huge amounts of computational power that can be provisioned on a pay-as-you-go basis. We have developed the cloud-based system that allows scaling of the similarity searching process vertically and horizontally. Cloud4Psi (Cloud for Protein Similarity) was tested in the Microsoft Azure cloud environment and provided good, almost linearly proportional acceleration when scaled out onto many computational units. Cloud4Psi is available as Software as a Service for testing purposes at: http://cloud4psi.cloudapp.net/. For source code and software availability, please visit the Cloud4Psi project home page at http://zti.polsl.pl/dmrozek/science/cloud4psi.htm. © The Author 2014. Published by Oxford University Press.

  8. Registration of vehicle based panoramic image and LiDAR point cloud

    Science.gov (United States)

    Chen, Changjun; Cao, Liang; Xie, Hong; Zhuo, Xiangyu

    2013-10-01

    Higher-quality surface information can be obtained when data from optical images and LiDAR are integrated, because optical images and LiDAR point clouds have complementary characteristics that make them preferable in many applications. However, most previous work focuses on registering pinhole perspective cameras to 2D or 3D LiDAR data. In this paper, a method for the registration of a vehicle-based panoramic image and a LiDAR point cloud is proposed. Using the transformations among the panoramic image, the single CCD images, the laser scanner, and the Position and Orientation System (POS), along with the GPS/IMU data, precise co-registration between the panoramic image and the LiDAR point cloud in the world system is achieved. Results are presented for a real-world data set collected by a newly developed Mobile Mapping System (MMS) integrating a high-resolution panoramic camera, two laser scanners, and a POS.
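
    The geometric core of such a co-registration is a chain of rigid transforms followed by a panoramic projection; a sketch with placeholder calibration values (not the MMS's actual parameters) and a standard equirectangular model:

        import numpy as np

        # Hedged sketch (placeholder calibration values, not the MMS's actual parameters):
        # a LiDAR point is moved into the panoramic camera frame by a rigid transform and
        # then mapped to pixel coordinates with a standard equirectangular projection.

        def project_to_panorama(point_scanner: np.ndarray,
                                R: np.ndarray, t: np.ndarray,
                                width: int = 8192, height: int = 4096) -> tuple[float, float]:
            x, y, z = R @ point_scanner + t                # point in camera frame
            r = np.linalg.norm((x, y, z))
            azimuth = np.arctan2(y, x)                     # [-pi, pi]
            elevation = np.arcsin(z / r)                   # [-pi/2, pi/2]
            u = (azimuth + np.pi) / (2 * np.pi) * width
            v = (np.pi / 2 - elevation) / np.pi * height
            return float(u), float(v)

        if __name__ == "__main__":
            R = np.eye(3)                                  # scanner-to-camera rotation (placeholder)
            t = np.array([0.2, 0.0, -0.5])                 # lever arm in metres (placeholder)
            u, v = project_to_panorama(np.array([10.0, 5.0, 2.0]), R, t)
            print(f"pixel ~ ({u:.0f}, {v:.0f})")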

  9. Enhancing data utilization through adoption of cloud-based data architectures (Invited Paper 211869)

    Science.gov (United States)

    Kearns, E. J.

    2017-12-01

    A traditional approach to data distribution and utilization of open government data involves continuously moving those data from a central government location to each potential user, who would then utilize them on their local computer systems. An alternate approach would be to bring those users to the open government data, where users would also have access to computing and analytics capabilities that would support data utilization. NOAA's Big Data Project is exploring such an alternate approach through an experimental collaboration with Amazon Web Services, Google Cloud Platform, IBM, Microsoft Azure, and the Open Commons Consortium. As part of this ongoing experiment, NOAA is providing open data of interest which are freely hosted by the Big Data Project Collaborators, who provide a variety of cloud-based services and capabilities to enable utilization by data users. By the terms of the agreement, the Collaborators may charge for those value-added services and processing capacities to recover their costs to freely host the data and to generate profits if so desired. Initial results have shown sustained increases in data utilization from 2 to over 100 times previously-observed access patterns from traditional approaches. Significantly increased utilization speed as compared to the traditional approach has also been observed by NOAA data users who have volunteered their experiences on these cloud-based systems. The potential for implementing and sustaining the alternate cloud-based approach as part of a change in operational data utilization strategies will be discussed.

  10. Reliability Evaluation for the Surface to Air Missile Weapon Based on Cloud Model

    Directory of Open Access Journals (Sweden)

    Deng Jianjun

    2015-01-01

    Full Text Available Fuzziness and randomness are integrated by using the numerical characteristics Expected value, Entropy, and Hyper entropy. A cloud model adapted to reliability evaluation of the surface-to-air missile weapon is put forward. The cloud scale for the qualitative evaluation is constructed, and the quantitative and qualitative variables in the system reliability evaluation are put into correspondence. The practical calculation results show that it is more effective to analyze the reliability of the surface-to-air missile weapon in this way, and that the model expressed by cloud theory is more consistent with the human style of reasoning under uncertainty.
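
    The forward normal cloud generator that underlies such an evaluation is standard: each cloud drop is drawn with a per-drop entropy En' ~ N(En, He^2), a value x ~ N(Ex, En'^2), and a certainty degree exp(-(x - Ex)^2 / (2 En'^2)). A sketch with hypothetical reliability-score parameters:

        import numpy as np

        # Forward normal cloud generator (standard formulation; the Ex/En/He values below
        # are hypothetical reliability-score settings, not the paper's): each cloud drop
        # gets a per-drop entropy En' ~ N(En, He^2), a value x ~ N(Ex, En'^2), and a
        # certainty degree mu = exp(-(x - Ex)^2 / (2 En'^2)).

        def normal_cloud(ex: float, en: float, he: float, n: int, seed: int = 0):
            rng = np.random.default_rng(seed)
            en_prime = np.abs(rng.normal(en, he, n))   # per-drop entropy (kept positive)
            x = rng.normal(ex, en_prime)               # drop values
            mu = np.exp(-((x - ex) ** 2) / (2 * en_prime ** 2))
            return x, mu

        if __name__ == "__main__":
            # Hypothetical "reliability is good" concept on a 0-1 scale.
            x, mu = normal_cloud(ex=0.85, en=0.05, he=0.01, n=1000)
            print(f"mean drop value {x.mean():.3f}, mean certainty degree {mu.mean():.3f}")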

  11. Spectral Dependence of MODIS Cloud Droplet Effective Radius Retrievals for Marine Boundary Layer Clouds

    Science.gov (United States)

    Zhang, Zhibo; Platnick, Steven E.; Ackerman, Andrew S.; Cho, Hyoun-Myoung

    2014-01-01

    Low-level warm marine boundary layer (MBL) clouds cover large regions of Earth's surface. They have a significant role in Earth's radiative energy balance and hydrological cycle. Despite the fundamental role of low-level warm water clouds in climate, our understanding of these clouds is still limited. In particular, connections between their properties (e.g. cloud fraction, cloud water path, and cloud droplet size) and environmental factors such as aerosol loading and meteorological conditions continue to be uncertain or unknown. Modeling these clouds in climate models remains a challenging problem. As a result, the influence of aerosols on these clouds in the past and future, and the potential impacts of these clouds on global warming remain open questions leading to substantial uncertainty in climate projections. To improve our understanding of these clouds, we need continuous observations of cloud properties on both a global scale and over a long enough timescale for climate studies. At present, satellite-based remote sensing is the only means of providing such observations.

  12. A point cloud based pipeline for depth reconstruction from autostereoscopic sets

    Science.gov (United States)

    Niquin, Cédric; Prévost, Stéphanie; Remion, Yannick

    2010-02-01

    This is a three step pipeline to construct a 3D mesh of a scene from a set of N images, destined to be viewed on auto-stereoscopic displays. The first step matches the pixels to create a point cloud using a new algorithm based on graph-cuts. It exploits the data redundancy of the N images to ensure the geometric consistency of the scene and to reduce the graph complexity, in order to speed up the computation. It performs an accurate detection of occlusions and its results can then be used in applications like view synthesis. The second step slightly moves the points along the Z-axis to refine the point cloud. It uses a new cost including both occlusion positions and light variations deduced from the matching. The Z values are selected using a dynamic programming algorithm. This step finally generates a point cloud, which is fine enough for applications like augmented reality. From any of the two previously defined point clouds, the last step creates a colored mesh, which is a convenient data structure to be used in graphics APIs. It also generates N depth maps, allowing a comparison between the results of our method with those of other methods.

  13. Automatic atlas based electron density and structure contouring for MRI-based prostate radiation therapy on the cloud

    International Nuclear Information System (INIS)

    Dowling, J A; Burdett, N; Chandra, S; Rivest-Hénault, D; Ghose, S; Salvado, O; Fripp, J; Greer, P B; Sun, J; Parker, J; Pichler, P; Stanwell, P

    2014-01-01

    Our group have been developing methods for MRI-alone prostate cancer radiation therapy treatment planning. To assist with clinical validation of the workflow we are investigating a cloud platform solution for research purposes. Benefits of cloud computing can include increased scalability, performance and extensibility while reducing total cost of ownership. In this paper we demonstrate the generation of DICOM-RT directories containing an automatic average atlas based electron density image and fast pelvic organ contouring from whole pelvis MR scans.

  14. Automatic Atlas Based Electron Density and Structure Contouring for MRI-based Prostate Radiation Therapy on the Cloud

    Science.gov (United States)

    Dowling, J. A.; Burdett, N.; Greer, P. B.; Sun, J.; Parker, J.; Pichler, P.; Stanwell, P.; Chandra, S.; Rivest-Hénault, D.; Ghose, S.; Salvado, O.; Fripp, J.

    2014-03-01

    Our group have been developing methods for MRI-alone prostate cancer radiation therapy treatment planning. To assist with clinical validation of the workflow we are investigating a cloud platform solution for research purposes. Benefits of cloud computing can include increased scalability, performance and extensibility while reducing total cost of ownership. In this paper we demonstrate the generation of DICOM-RT directories containing an automatic average atlas based electron density image and fast pelvic organ contouring from whole pelvis MR scans.

  15. A cloud based tool for knowledge exchange on local scale flood risk.

    Science.gov (United States)

    Wilkinson, M E; Mackay, E; Quinn, P F; Stutter, M; Beven, K J; MacLeod, C J A; Macklin, M G; Elkhatib, Y; Percy, B; Vitolo, C; Haygarth, P M

    2015-09-15

    There is an emerging and urgent need for new approaches for the management of environmental challenges such as flood hazard in the broad context of sustainability. This requires a new way of working which bridges disciplines and organisations, and that breaks down science-culture boundaries. With this, there is growing recognition that the appropriate involvement of local communities in catchment management decisions can result in multiple benefits. However, new tools are required to connect organisations and communities. The growth of cloud based technologies offers a novel way to facilitate this process of exchange of information in environmental science and management; however, stakeholders need to be engaged with as part of the development process from the beginning rather than being presented with a final product at the end. Here we present the development of a pilot Local Environmental Virtual Observatory Flooding Tool. The aim was to develop a cloud based learning platform for stakeholders, bringing together fragmented data, models and visualisation tools that will enable these stakeholders to make scientifically informed environmental management decisions at the local scale. It has been developed by engaging with different stakeholder groups in three catchment case studies in the UK and a panel of national experts in relevant topic areas. However, these case study catchments are typical of many northern latitude catchments. The tool was designed to communicate flood risk in locally impacted communities whilst engaging with landowners/farmers about the risk of runoff from the farmed landscape. It has been developed iteratively to reflect the needs, interests and capabilities of a wide range of stakeholders. The pilot tool combines cloud based services, local catchment datasets, a hydrological model and bespoke visualisation tools to explore real time hydrometric data and the impact of flood risk caused by future land use changes. The novel aspects of the

  16. Cloud Type Classification (cldtype) Value-Added Product

    Energy Technology Data Exchange (ETDEWEB)

    Flynn, Donna [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)]; Shi, Yan [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)]; Lim, K-S [Korean Atomic Energy Research Inst., Daejeon (South Korea)]; Riihimaki, Laura [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)]

    2017-08-15

    The Cloud Type (cldtype) value-added product (VAP) provides an automated cloud type classification based on macrophysical quantities derived from vertically pointing lidar and radar. Up to 10 layers of clouds are classified into seven cloud types based on predetermined and site-specific thresholds of cloud top, base and thickness. Examples of thresholds for selected U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Climate Research Facility sites are provided in Tables 1 and 2. Inputs for the cldtype VAP include lidar and radar cloud boundaries obtained from the Active Remotely Sensed Cloud Location (ARSCL) and Surface Meteorological Systems (MET) data. Rain rates from MET are used to determine when radar signal attenuation precludes accurate cloud detection. Temporal resolution and vertical resolution for cldtype are 1 minute and 30 m respectively and match the resolution of ARSCL. The cldtype classification is an initial step for further categorization of clouds. It was developed for use by the Shallow Cumulus VAP to identify potential periods of interest to the LASSO model and is intended to find clouds of interest for a variety of users.
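
    The classification described above is rule-based, driven by thresholds on cloud base, top and thickness for each detected layer. The following is a minimal sketch of such a threshold classifier; the type names and threshold values are illustrative assumptions, not the predetermined, site-specific values used by the cldtype VAP.

        # Illustrative sketch of a threshold-based cloud-type classifier.
        # Type names and thresholds are assumptions for demonstration only;
        # the cldtype VAP uses predetermined, site-specific values.

        def classify_layer(base_km, top_km,
                           low_top_max=3.0, mid_top_max=7.0, thin_max=1.5):
            """Assign a coarse cloud type from layer base/top heights (km)."""
            thickness = top_km - base_km
            if top_km <= low_top_max:
                return "low cloud"
            if top_km <= mid_top_max:
                return "mid-level cloud"
            if thickness <= thin_max:
                return "cirrus/thin high cloud"
            return "deep/thick cloud"

        # Example: several layers from one 1-minute sample (up to 10 are allowed).
        layers = [(0.8, 1.6), (5.2, 6.0), (9.5, 10.2)]
        print([classify_layer(b, t) for b, t in layers])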

  17. Mesoscale meteorological model based on radioactive explosion cloud simulation

    International Nuclear Information System (INIS)

    Zheng Yi; Zhang Yan; Ying Chuntong

    2008-01-01

    In order to simulate the movement and concentration distribution of radioactive clouds from nuclear explosions and dirty bombs, the mesoscale meteorological model RAMS was used. Particle size, size-activity distribution and gravitational fallout in the cloud were considered. The results show that the model can simulate the 'mushroom' cloud of an explosion. Three-dimensional flow fields and radioactive concentration fields were obtained. (authors)

  18. A hybrid cloud read aligner based on MinHash and kmer voting that preserves privacy

    Science.gov (United States)

    Popic, Victoria; Batzoglou, Serafim

    2017-05-01

    Low-cost clouds can alleviate the compute and storage burden of the genome sequencing data explosion. However, moving personal genome data analysis to the cloud can raise serious privacy concerns. Here, we devise a method named Balaur, a privacy preserving read mapper for hybrid clouds based on locality sensitive hashing and kmer voting. Balaur can securely outsource a substantial fraction of the computation to the public cloud, while being highly competitive in accuracy and speed with non-private state-of-the-art read aligners on short read data. We also show that the method is significantly faster than the state of the art in long read mapping. Therefore, Balaur can enable institutions handling massive genomic data sets to shift part of their analysis to the cloud without sacrificing accuracy or exposing sensitive information to an untrusted third party.
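
    Balaur's core idea combines locality sensitive hashing of k-mers with k-mer voting. The following is a toy sketch of the MinHash step only; the k-mer length, number of hash functions and hashing scheme are arbitrary choices for illustration and do not reproduce Balaur's actual implementation or its privacy mechanisms.

        import hashlib

        def kmers(seq, k=16):
            return {seq[i:i + k] for i in range(len(seq) - k + 1)}

        def minhash(kmer_set, num_hashes=64):
            """MinHash signature: for each salted hash function, keep the minimum value."""
            sig = []
            for salt in range(num_hashes):
                sig.append(min(int(hashlib.sha1((str(salt) + km).encode()).hexdigest(), 16)
                               for km in kmer_set))
            return sig

        def similarity(sig_a, sig_b):
            """Fraction of matching minima estimates the Jaccard similarity of k-mer sets."""
            return sum(a == b for a, b in zip(sig_a, sig_b)) / len(sig_a)

        read = "ACGTACGTTAGCTAGCTAGGATCCAGT"
        ref_window = "ACGTACGTTAGCTAGCTAGGATCCAGTTTT"
        print(similarity(minhash(kmers(read)), minhash(kmers(ref_window))))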

  19. Molecular clouds without detectable CO

    International Nuclear Information System (INIS)

    Blitz, L.; Bazell, D.; Desert, F.X.

    1990-01-01

    The clouds identified by Desert, Bazell, and Boulanger (DBB clouds) in their search for high-latitude molecular clouds were observed in the CO (J = 1-0) line, but only 13 percent of the sample was detected. The remaining 87 percent are diffuse molecular clouds with CO abundances of about 10 to the -6th, a typical value for diffuse clouds. This hypothesis is shown to be consistent with Copernicus data. The DBB clouds are shown to be an essentially complete catalog of diffuse molecular clouds in the solar vicinity. The total molecular surface density in the vicinity of the sun is then only about 20 percent greater than the 1.3 solar masses/sq pc determined by Dame et al. (1987). Analysis of the CO detections indicates that there is a sharp threshold in extinction of 0.25 mag before CO is detectable and is derived from the IRAS I(100) micron threshold of 4 MJy/sr. This threshold is presumably where the CO abundance exhibits a sharp increase.

  20. Cloud Storage and Bioinformatics in a private cloud deployment: Lessons for Data Intensive research

    OpenAIRE

    Chang, Victor; Walters, Robert John; Wills, Gary

    2013-01-01

    This paper describes service portability for a private cloud deployment, including a detailed case study about Cloud Storage and bioinformatics services developed as part of the Cloud Computing Adoption Framework (CCAF). Our Cloud Storage design and deployment is based on Storage Area Network (SAN) technologies, details of which include functionalities, technical implementation, architecture and user support. Experiments for data services (backup automation, data recovery and data migration) ...

  1. Development and clinical study of mobile 12-lead electrocardiography based on cloud computing for cardiac emergency.

    Science.gov (United States)

    Fujita, Hideo; Uchimura, Yuji; Waki, Kayo; Omae, Koji; Takeuchi, Ichiro; Ohe, Kazuhiko

    2013-01-01

    To improve emergency services for accurate diagnosis of cardiac emergencies, we developed a low-cost mobile electrocardiography system, "Cloud Cardiology®", based upon cloud computing for prehospital diagnosis. It comprises a compact 12-lead ECG unit equipped with Bluetooth and an Android smartphone with an application for transmission. The cloud server enables ECGs to be shared simultaneously inside and outside the hospital. We evaluated the clinical effectiveness of this system by conducting a clinical trial with historical comparison in a rapid response car in real emergency service settings. We found that this system can shorten the onset-to-balloon time of patients with acute myocardial infarction, resulting in better clinical outcomes. We propose that cloud-computing-based simultaneous data sharing could be a powerful solution for emergency services in cardiology, given its significant clinical outcome.

  2. CONTINUOUSLY DEFORMATION MONITORING OF SUBWAY TUNNEL BASED ON TERRESTRIAL POINT CLOUDS

    Directory of Open Access Journals (Sweden)

    Z. Kang

    2012-07-01

    Full Text Available The deformation monitoring of subway tunnels is of extraordinary necessity. Therefore, a method for deformation monitoring based on terrestrial point clouds is proposed in this paper. First, the traditional adjacent-station registration is replaced by section-controlled registration, so that common control points can be used by each station and error accumulation is avoided within a section. Afterwards, the central axis of the subway tunnel is determined through the RANSAC (Random Sample Consensus) algorithm and curve fitting. Although of very high resolution, laser points are still discrete, and thus the vertical section is computed via quadric fitting of the vicinity of interest rather than fitting the whole tunnel model; the section is determined by the intersection line rotated about the central axis of the tunnel within a vertical plane. The extraction of the vertical section is then optimized using RANSAC in order to filter out noise. Based on the extracted vertical sections, the volume of tunnel deformation is estimated by comparing vertical sections extracted at the same position from different epochs of point clouds. Furthermore, the continuously extracted vertical sections are used to evaluate the convergent tendency of the tunnel. The proposed algorithms are verified using real datasets in terms of accuracy and computational efficiency. The fitting accuracy analysis shows that the maximum deviation between interpolated and real points is 1.5 mm and the minimum is 0.1 mm; the convergent tendency of the tunnel was detected by comparing adjacent fitting radii, with a maximum error of 6 mm and a minimum of 1 mm. The computation cost of vertical section extraction is within 3 seconds per section, which demonstrates high efficiency.

  3. Scenario-Based Digital Forensics Challenges in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Erik Miranda Lopez

    2016-10-01

    Full Text Available The aim of digital forensics is to extract information to answer the 5Ws (Why, When, Where, What, and Who) from the data extracted from the evidence. In order to achieve this, most digital forensic processes assume absolute control of digital evidence. However, in a cloud environment forensic investigation, this is not always possible. Additionally, the unique characteristics of cloud computing create new technical, legal and architectural challenges when conducting a forensic investigation. We propose a hypothetical scenario to uncover and explain the challenges forensic practitioners face during cloud investigations. Additionally, we also provide solutions to address the challenges. Our hypothetical case scenario has shown that, in the long run, better live forensic tools, development of new methods tailored for cloud investigations, and new procedures and standards are indeed needed. Furthermore, we have come to the conclusion that the biggest challenge in forensic investigations is not technical but legal.

  4. Analysis and Research on Spatial Data Storage Model Based on Cloud Computing Platform

    Science.gov (United States)

    Hu, Yong

    2017-12-01

    In this paper, the data processing and storage characteristics of cloud computing are analyzed and studied. On this basis, a cloud computing data storage model based on a BP neural network is proposed. In this model, the server cluster is chosen according to the different attributes of the data, yielding a spatial data storage model with a load-balancing function that offers practical feasibility and application advantages.

  5. Trust management in cloud services

    CERN Document Server

    Noor, Talal H; Bouguettaya, Athman

    2014-01-01

    This book describes the design and implementation of Cloud Armor, a novel approach for credibility-based trust management and automatic discovery of cloud services in distributed and highly dynamic environments. This book also helps cloud users to understand the difficulties of establishing trust in cloud computing and the best criteria for selecting a service cloud. The techniques have been validated by a prototype system implementation and experimental studies using a collection of real world trust feedbacks on cloud services. The authors present the design and implementation of a novel pro

  6. Making Cloud-based Systems Elasticity Testing Reproducible

    OpenAIRE

    Albonico , Michel; Mottu , Jean-Marie; Sunyé , Gerson; Alvares , Frederico

    2017-01-01

    International audience; Elastic cloud infrastructures vary computational resources at runtime (elasticity), which is error-prone. That makes testing under elasticity crucial for those systems. Such errors are detected by tests that should run deterministically many times throughout development. However, reproducing elasticity testing requires several features not supported natively by the main cloud providers, such as Amazon EC2. We identify three requirements that we c...

  7. Evaluation of Multilayer Cloud Detection Using a MODIS CO2-Slicing Algorithm With CALIPSO-CloudSat Measurements

    Science.gov (United States)

    Viudez-Mora, Antonio; Kato, Seiji

    2015-01-01

    This work evaluates the multilayer cloud (MCF) algorithm based on CO2-slicing techniques against CALIPSO-CloudSat (CLCS) measurements. The evaluation showed that the MCF underestimates the presence of multilayered clouds compared with CLCS and is restricted to cloud emissivities below 0.8 and cloud optical depths no larger than 0.3.

  8. Cloud properties derived from two lidars over the ARM SGP site

    Energy Technology Data Exchange (ETDEWEB)

    Dupont, Jean-Charles; Haeffelin, Martial; Morille, Y.; Comstock, Jennifer M.; Flynn, Connor J.; Long, Charles N.; Sivaraman, Chitra; Newsom, Rob K.

    2011-02-16

    Active remote sensors such as lidars or radars can be used with other data to quantify the cloud properties at regional scale and at global scale (Dupont et al., 2009). Relative to radar, lidar remote sensing is sensitive to very thin and high clouds but has a significant limitation, due to signal attenuation, in the ability to precisely quantify the properties of clouds with a cloud optical thickness larger than 3. In this study, 10 years of backscatter lidar signal data are analysed by a unique algorithm called STRucture of ATmosphere (STRAT, Morille et al., 2007). We apply the STRAT algorithm to data from both the collocated Micropulse lidar (MPL) and a Raman lidar (RL) at the Atmospheric Radiation Measurement (ARM) Southern Great Plains (SGP) site between 1998 and 2009. The raw backscatter lidar signal is processed and corrections for detector deadtime, afterpulse, and overlap are applied (Campbell et al.). The cloud properties for all levels of clouds are derived and distributions of cloud base height (CBH), top height (CTH), physical cloud thickness (CT), and optical thickness (COT) from local statistics are compared. The goal of this study is (1) to establish a climatology of macrophysical and optical properties for all levels of clouds observed over the ARM SGP site and (2) to estimate the discrepancies induced by the two remote sensing systems (pulse energy, sampling, resolution, etc.). Our first results tend to show that the MPLs, which are the primary ARM lidars, have a distinctly limited range where all of these cloud properties are detectable, especially cloud top and cloud thickness, but even actual cloud base, especially during the summer daytime period. According to the comparisons between RL and MPL, almost 50% of situations show a signal-to-noise ratio too low (smaller than 3) for the MPL to detect clouds higher than 7 km during the daytime period in summer. Consequently, the MPL-derived annual cycle of cirrus cloud base (top) altitude is

  9. Comparison of CERES-MODIS stratus cloud properties with ground-based measurements at the DOE ARM Southern Great Plains site

    Science.gov (United States)

    Dong, Xiquan; Minnis, Patrick; Xi, Baike; Sun-Mack, Sunny; Chen, Yan

    2008-02-01

    Overcast stratus cloud properties derived for the Clouds and the Earth's Radiant Energy System (CERES) project using Terra and Aqua Moderate Resolution Imaging Spectroradiometer (MODIS) data are compared with observations taken at the Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Southern Great Plains site from March 2000 through December 2004. Retrievals from ARM surface-based data were averaged over a 1-h interval centered at the time of each satellite overpass, and the CERES-MODIS cloud properties were averaged within a 30 km × 30 km box centered on the ARM SGP site. Two data sets were analyzed: all of the data (ALL), which include multilayered, single-layered, and slightly broken stratus decks and a subset, single-layered unbroken decks (SL). The CERES-MODIS effective cloud heights were determined from effective cloud temperature using a lapse rate method with the surface temperature specified as the 24-h mean surface air temperature. For SL stratus, they are, on average, within the ARM radar-lidar estimated cloud boundaries and are 0.534 ± 0.542 km and 0.108 ± 0.480 km lower than the cloud physical tops and centers, respectively, and are comparable for day and night observations. The mean differences and standard deviations are slightly larger for ALL data, but not statistically different to those of SL data. The MODIS-derived effective cloud temperatures are 2.7 ± 2.4 K less than the surface-observed SL cloud center temperatures with very high correlations (0.86-0.97). Variations in the height differences are mainly caused by uncertainties in the surface air temperatures, lapse rates, and cloud top height variability. The biases are mainly the result of the differences between effective and physical cloud top, which are governed by cloud liquid water content and viewing zenith angle, and the selected lapse rate, -7.1 K km-1. On the basis of a total of 43 samples, the means and standard deviations of the differences between the daytime
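
    The CERES-MODIS effective cloud heights discussed above are obtained by converting the retrieved effective cloud temperature into a height using the 24-h mean surface air temperature and a fixed lapse rate (-7.1 K km-1). The following is a minimal sketch of that conversion; the input temperatures are hypothetical example values, not numbers from the study.

        def effective_cloud_height(t_effective_K, t_surface_K, lapse_rate_K_per_km=-7.1):
            """Height (km) at which a lapse-rate profile anchored at the surface
            air temperature reaches the effective cloud temperature."""
            return (t_effective_K - t_surface_K) / lapse_rate_K_per_km

        # Example: a surface air temperature of 290 K and an effective cloud
        # temperature 8 K colder give an effective cloud height of about 1.1 km.
        print(f"{effective_cloud_height(282.0, 290.0):.2f} km")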

  10. Analysis of the Health Information and Communication System and Cloud Computing

    Directory of Open Access Journals (Sweden)

    Matija Varga

    2015-05-01

    Full Text Available This paper describes an analysis and shows its use in analysing strengths, weaknesses, opportunities and threats (risks within the health care system.The aim is further more to show strengths, weaknesses, opportunities and threats when using cloud computing in the health care system. Cloud computing in medicine is an integral part of telemedicine. Based on the information presented in this paper, employees may identify the advantages and disadvantages of using cloud computing. When introducing new information technologies in the health care business the implementers will encounter numerous problems, such as: the complexity of the existing and the new information system, the costs of maintaining and updating the software, the cost of implementing new modules,a way of protecting the existing data in the database and the data that will be collected in the diagnosis. Using the SWOT analysis this paper evaluates the feasibility and possibility of adopting cloud computing in the health sector to improve health services based on samples (examples from abroad. The intent of cloud computing in medicine is to send data of the patient to the doctor instead of the patient sending it himself/herself.

  11. MIN-CUT BASED SEGMENTATION OF AIRBORNE LIDAR POINT CLOUDS

    Directory of Open Access Journals (Sweden)

    S. Ural

    2012-07-01

    Full Text Available Introducing an organization to the unstructured point cloud before extracting information from airborne lidar data is common in many applications. Aggregating the points with similar features into segments in 3-D which comply with the nature of actual objects is affected by the neighborhood, scale, features and noise among other aspects. In this study, we present a min-cut based method for segmenting the point cloud. We first assess the neighborhood of each point in 3-D by investigating the local geometric and statistical properties of the candidates. Neighborhood selection is essential since point features are calculated within their local neighborhood. Following neighborhood determination, we calculate point features and determine the clusters in the feature space. We adapt a graph representation from image processing which is especially used in pixel labeling problems and establish it for the unstructured 3-D point clouds. The edges of the graph that connect the points with each other and the nodes representing feature clusters hold the smoothness costs in the spatial domain and the data costs in the feature domain. Smoothness costs ensure spatial coherence, while data costs control the consistency with the representative feature clusters. This graph representation formalizes the segmentation task as an energy minimization problem. It allows the implementation of an approximate solution by min-cuts for a global minimum of this NP-hard minimization problem in low-order polynomial time. We test our method with an airborne lidar point cloud acquired with a maximum planned post spacing of 1.4 m and a vertical accuracy of 10.5 cm (RMSE). We present the effects of neighborhood and feature determination on the segmentation results and assess the accuracy and efficiency of the implemented min-cut algorithm as well as its sensitivity to the parameters of the smoothness and data cost functions. We find that smoothness cost that only considers simple distance
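
    The abstract formalizes segmentation as an energy minimization over a graph whose nodes carry data costs (consistency with the representative feature clusters) and whose edges carry smoothness costs (spatial coherence). As a reminder of the standard graph-cut labeling energy this maps onto, written in its generic form rather than with the authors' exact cost functions:

        E(l) = \sum_{p \in P} D_p(l_p) + \lambda \sum_{(p,q) \in \mathcal{N}} V_{p,q}(l_p, l_q)

    where D_p(l_p) is the data cost of assigning point p to feature cluster l_p, V_{p,q} penalizes differing labels between spatially neighboring points p and q, \mathcal{N} is the spatial neighborhood, and \lambda balances the two terms; approximate minimization by min-cuts yields the segmentation in low-order polynomial time.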

  12. Formation of Massive Molecular Cloud Cores by Cloud-cloud Collision

    OpenAIRE

    Inoue, Tsuyoshi; Fukui, Yasuo

    2013-01-01

    Recent observations of molecular clouds around rich massive star clusters including NGC3603, Westerlund 2, and M20 revealed that the formation of massive stars could be triggered by a cloud-cloud collision. By using three-dimensional, isothermal, magnetohydrodynamics simulations with the effect of self-gravity, we demonstrate that massive, gravitationally unstable, molecular cloud cores are formed behind the strong shock waves induced by the cloud-cloud collision. We find that the massive mol...

  13. SOME CONSIDERATIONS ON CLOUD ACCOUNTING

    OpenAIRE

    Doina Pacurari; Elena Nechita

    2013-01-01

    Cloud technologies have developed intensively during the last years. Cloud computing allows the customers to interact with their data and applications at any time, from any location, while the providers host these resources. A client company may choose to run in the cloud a part of its business (sales by agents, payroll, etc.), or even the entire business. The company can get access to a large category of cloud-based software, including accounting software. Cloud solutions are especially reco...

  14. Cloud point extraction of palladium in water samples and alloy mixtures using new synthesized reagent with flame atomic absorption spectrometry (FAAS)

    International Nuclear Information System (INIS)

    Priya, B. Krishna; Subrahmanayam, P.; Suvardhan, K.; Kumar, K. Suresh; Rekha, D.; Rao, A. Venkata; Rao, G.C.; Chiranjeevi, P.

    2007-01-01

    The present paper outlines a novel, simple and sensitive method for the determination of palladium by flame atomic absorption spectrometry (FAAS) after separation and preconcentration by cloud point extraction (CPE). The cloud point methodology was successfully applied to palladium determination in water samples and alloys by using the new reagent 4-(2-naphthalenyl)thiozol-2yl azo chromotropic acid (NTACA) as chelating agent and Triton X-114 as nonionic surfactant. Parameters such as pH, the concentrations of the reagent and Triton X-114, equilibration temperature and centrifugation time were evaluated and optimized to enhance the sensitivity and extraction efficiency of the proposed method. The preconcentration factor was found to be 50-fold for 250 ml of water sample. Under optimum conditions the detection limit was found to be 0.067 ng ml-1 for palladium in various environmental matrices. The present method was applied to the determination of palladium in various water samples and alloys, and the results show good agreement with a reported method, with recoveries in the range of 96.7-99.4%.

  15. Keyword-based Ciphertext Search Algorithm under Cloud Storage

    Directory of Open Access Journals (Sweden)

    Ren Xunyi

    2016-01-01

    Full Text Available With the development of network storage services, cloud storage has the advantages of high scalability, low cost, unrestricted access and easy management. These advantages lead more and more small and medium enterprises to outsource large quantities of data to a third party, freeing them from the costs of construction and maintenance, so cloud storage has broad market prospects. However, many cloud storage service providers cannot guarantee data security, which results in leakage of user data and forces many users to fall back on traditional storage methods. This has become one of the important factors hindering the development of cloud storage. In this article, a keyword index is established by extracting keywords from the ciphertext data. The encrypted data and the encrypted index are then uploaded to the cloud server together. Users retrieve the related ciphertext by searching the encrypted index, which addresses the data leakage problem.
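
    The scheme described builds a keyword index on the client, encrypts both data and index, and lets users query the index without revealing keywords to the server. The following is a toy sketch of one common way to realize such a searchable index (HMAC-based search tokens over an inverted index); it illustrates the general idea only, is not the algorithm proposed in the paper, and omits the encryption of the documents themselves.

        import hmac, hashlib

        SECRET_KEY = b"user-held secret"   # assumed to stay on the client

        def token(keyword):
            """Deterministic search token; the server only ever sees these tokens."""
            return hmac.new(SECRET_KEY, keyword.lower().encode(), hashlib.sha256).hexdigest()

        def build_index(docs):
            """Client side: map keyword tokens to document identifiers."""
            index = {}
            for doc_id, text in docs.items():
                for word in set(text.lower().split()):
                    index.setdefault(token(word), []).append(doc_id)
            return index   # uploaded to the cloud alongside the encrypted documents

        def search(index, keyword):
            """Server side: look up the token without learning the keyword."""
            return index.get(token(keyword), [])

        docs = {"doc1": "quarterly sales report", "doc2": "payroll summary report"}
        index = build_index(docs)
        print(search(index, "report"))   # -> ['doc1', 'doc2']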

  16. Continuously deformation monitoring of subway tunnel based on terrestrial point clouds

    NARCIS (Netherlands)

    Kang, Z.; Tuo, L.; Zlatanova, S.

    2012-01-01

    The deformation monitoring of subway tunnel is of extraordinary necessity. Therefore, a method for deformation monitoring based on terrestrial point clouds is proposed in this paper. First, the traditional adjacent stations registration is replaced by sectioncontrolled registration, so that the

  17. Sensitive Data Protection Based on Intrusion Tolerance in Cloud Computing

    OpenAIRE

    Jingyu Wang; xuefeng Zheng; Dengliang Luo

    2011-01-01

    Service integration and on-demand supply arising from cloud computing can significantly improve the utilization of computing resources, reduce the power consumption per service, and effectively avoid errors in computing resources. However, cloud computing still faces the problem of intrusion tolerance of the cloud computing platform and of sensitive data in the new enterprise data center. In order to address the problem of intrusion tolerance of the cloud computing platform and sensitive data in...

  18. Cloud Computing. Technology Briefing. Number 1

    Science.gov (United States)

    Alberta Education, 2013

    2013-01-01

    Cloud computing is Internet-based computing in which shared resources, software and information are delivered as a service that computers or mobile devices can access on demand. Cloud computing is already used extensively in education. Free or low-cost cloud-based services are used daily by learners and educators to support learning, social…

  19. Cloud-Based Speech Technology for Assistive Technology Applications (CloudCAST).

    Science.gov (United States)

    Cunningham, Stuart; Green, Phil; Christensen, Heidi; Atria, José Joaquín; Coy, André; Malavasi, Massimiliano; Desideri, Lorenzo; Rudzicz, Frank

    2017-01-01

    The CloudCAST platform provides a series of speech recognition services that can be integrated into assistive technology applications. The platform and the services provided by the public API are described. Several exemplar applications have been developed to demonstrate the platform to potential developers and users.

  20. a Gross Error Elimination Method for Point Cloud Data Based on Kd-Tree

    Science.gov (United States)

    Kang, Q.; Huang, G.; Yang, S.

    2018-04-01

    Point cloud data are one of the most widely used data sources in the field of remote sensing. Key pre-processing steps for point cloud data focus on gross error elimination and quality control. Owing to the volume of point cloud data, existing gross error elimination methods require massive amounts of memory and time. This paper employs a new method that uses a Kd-tree to construct the index, a k-nearest neighbor algorithm to search it, and an appropriate threshold to judge whether a target point is an outlier. Experimental results show that the proposed algorithm helps delete gross errors in point cloud data while decreasing memory consumption and improving efficiency.

  1. A GROSS ERROR ELIMINATION METHOD FOR POINT CLOUD DATA BASED ON KD-TREE

    Directory of Open Access Journals (Sweden)

    Q. Kang

    2018-04-01

    Full Text Available Point cloud data are one of the most widely used data sources in the field of remote sensing. Key pre-processing steps for point cloud data focus on gross error elimination and quality control. Owing to the volume of point cloud data, existing gross error elimination methods require massive amounts of memory and time. This paper employs a new method that uses a Kd-tree to construct the index, a k-nearest neighbor algorithm to search it, and an appropriate threshold to judge whether a target point is an outlier. Experimental results show that the proposed algorithm helps delete gross errors in point cloud data while decreasing memory consumption and improving efficiency.
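
    The method builds a Kd-tree, queries the k nearest neighbors of each point, and flags points whose neighbor distances exceed a threshold as gross errors. The following is a compact sketch of that idea using SciPy's cKDTree; the value of k and the threshold rule (mean plus two standard deviations of the mean k-NN distance) are illustrative choices, not the settings used in the paper.

        import numpy as np
        from scipy.spatial import cKDTree

        def remove_gross_errors(points, k=8, n_sigma=2.0):
            """Flag points whose mean k-NN distance is anomalously large."""
            tree = cKDTree(points)
            # k + 1 because the nearest neighbor of each point is the point itself.
            dists, _ = tree.query(points, k=k + 1)
            mean_knn = dists[:, 1:].mean(axis=1)
            threshold = mean_knn.mean() + n_sigma * mean_knn.std()
            keep = mean_knn <= threshold
            return points[keep], points[~keep]

        rng = np.random.default_rng(0)
        cloud = rng.normal(size=(1000, 3))
        cloud[:5] += 20.0                      # inject a few gross errors
        clean, outliers = remove_gross_errors(cloud)
        print(len(clean), len(outliers))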

  2. Cloud point extraction-flame atomic absorption spectrometry for pre-concentration and determination of trace amounts of silver ions in water samples.

    Science.gov (United States)

    Yang, Xiupei; Jia, Zhihui; Yang, Xiaocui; Li, Gu; Liao, Xiangjun

    2017-03-01

    A cloud point extraction (CPE) method was used as a pre-concentration strategy prior to the determination of trace levels of silver in water by flame atomic absorption spectrometry (FAAS). The pre-concentration is based on the clouding phenomenon of the non-ionic surfactant Triton X-114, with the Ag(I)/diethyldithiocarbamate (DDTC) complexes solubilized in the micellar phase formed by the surfactant. When the temperature increases above the cloud point, the Ag(I)/DDTC complexes are extracted into the surfactant-rich phase. The factors affecting the extraction efficiency, including the pH of the aqueous solution, the concentration of DDTC, the amount of surfactant, and the incubation temperature and time, were investigated and optimized. Under the optimal experimental conditions, no interference was observed for the determination of 100 ng·mL-1 Ag+ in the presence of various cations below their maximum concentrations allowed in this method, for instance, 50 μg·mL-1 for both Zn2+ and Cu2+, 80 μg·mL-1 for Pb2+, 1000 μg·mL-1 for Mn2+, and 100 μg·mL-1 for both Cd2+ and Ni2+. The calibration curve was linear in the range of 1-500 ng·mL-1 with a limit of detection (LOD) of 0.3 ng·mL-1. The developed method was successfully applied to the determination of trace levels of silver in water samples such as river water and tap water.
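
    The reported figures (a linear range of 1-500 ng·mL-1 and an LOD of 0.3 ng·mL-1) follow from a linear calibration of instrument response against concentration. The following is a minimal sketch of how such a calibration and a 3σ-based detection limit are typically computed; the absorbance and blank values are made up for illustration and are not data from the study.

        import numpy as np

        # Hypothetical calibration data: concentrations (ng/mL) vs. absorbance.
        conc = np.array([1, 10, 50, 100, 250, 500], dtype=float)
        absorbance = np.array([0.004, 0.037, 0.183, 0.371, 0.92, 1.85])

        slope, intercept = np.polyfit(conc, absorbance, 1)

        # Detection limit from replicate blanks: LOD = 3 * sigma_blank / slope.
        blank_absorbance = np.array([0.0009, 0.0012, 0.0010, 0.0011, 0.0008])
        lod = 3 * blank_absorbance.std(ddof=1) / slope

        sample_absorbance = 0.245
        sample_conc = (sample_absorbance - intercept) / slope
        print(f"slope={slope:.4f}, LOD={lod:.2f} ng/mL, sample={sample_conc:.1f} ng/mL")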

  3. Education on the Cloud: Researching Student-Centered, Cloud-Based Learning Prospects in the Context of a European Network

    Science.gov (United States)

    Panoutsopoulos, Hercules; Donert, Karl; Papoutsis, Panos; Kotsanis, Ioannis

    2015-01-01

    During the last few years, ongoing developments in the technological field of Cloud computing have initiated discourse on the potential of the Cloud to be systematically exploited in educational contexts. Research interest has been stimulated by a range of advantages of Cloud technologies (e.g. adaptability, flexibility, scalability,…

  4. Personalized cloud-based bioinformatics services for research and education: use cases and the elasticHPC package.

    Science.gov (United States)

    El-Kalioby, Mohamed; Abouelhoda, Mohamed; Krüger, Jan; Giegerich, Robert; Sczyrba, Alexander; Wall, Dennis P; Tonellato, Peter

    2012-01-01

    Bioinformatics services have been traditionally provided in the form of a web-server that is hosted at institutional infrastructure and serves multiple users. This model, however, is not flexible enough to cope with the increasing number of users, increasing data size, and new requirements in terms of speed and availability of service. The advent of cloud computing suggests a new service model that provides an efficient solution to these problems, based on the concepts of "resources-on-demand" and "pay-as-you-go". However, cloud computing has not yet been introduced within bioinformatics servers due to the lack of usage scenarios and software layers that address the requirements of the bioinformatics domain. In this paper, we provide different use case scenarios for providing cloud computing based services, considering both the technical and financial aspects of the cloud computing service model. These scenarios are for individual users seeking computational power as well as bioinformatics service providers aiming at provision of personalized bioinformatics services to their users. We also present elasticHPC, a software package and a library that facilitates the use of high performance cloud computing resources in general and the implementation of the suggested bioinformatics scenarios in particular. Concrete examples that demonstrate the suggested use case scenarios with whole bioinformatics servers and major sequence analysis tools like BLAST are presented. Experimental results with large datasets are also included to show the advantages of the cloud model. Our use case scenarios and the elasticHPC package are steps towards the provision of cloud based bioinformatics services, which would help in overcoming the data challenge of recent biological research. All resources related to elasticHPC and its web-interface are available at http://www.elasticHPC.org.

  5. Automatic cloud coverage assessment of Formosat-2 image

    Science.gov (United States)

    Hsu, Kuo-Hsien

    2011-11-01

    The Formosat-2 satellite is equipped with a high-spatial-resolution (2 m ground sampling distance) remote sensing instrument. It has been operated on a daily-revisit mission orbit by the National Space Organization (NSPO) of Taiwan since May 21, 2004. NSPO also serves as one of the ground receiving stations, processing the received Formosat-2 images daily. The current cloud coverage assessment of Formosat-2 images for the NSPO Image Processing System generally consists of two major steps. First, an unsupervised K-means method is used to automatically estimate the cloud statistics of a Formosat-2 image. Second, the cloud coverage of the Formosat-2 image is estimated by manual examination. Clearly, a more accurate Automatic Cloud Coverage Assessment (ACCA) method increases the efficiency of the second step by providing a good prediction of the cloud statistics. In this paper, based mainly on the research results of Chang et al., Irish, and Gotoh, we propose a modified Formosat-2 ACCA method that considers both pre-processing and post-processing analysis. For the pre-processing analysis, the cloud statistics are determined by using unsupervised K-means classification, Sobel's method, Otsu's method, re-examination of non-cloudy pixels, and a cross-band filter method. A box-counting fractal method is used as a post-processing tool to double-check the results of the pre-processing analysis, increasing the efficiency of the manual examination.
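
    The pre-processing chain combines unsupervised K-means classification, Sobel and Otsu thresholding, and filtering to produce a cloud statistic. The following is a minimal sketch of the Otsu-thresholding step for estimating a cloud fraction from a single band; it covers only one ingredient of the full ACCA chain, uses scikit-image rather than the authors' processing system, and the synthetic scene is an assumption for demonstration.

        import numpy as np
        from skimage.filters import threshold_otsu

        def cloud_fraction(band):
            """Estimate cloud fraction by Otsu-thresholding a reflectance band.

            Bright pixels above the threshold are counted as cloud; a real ACCA
            chain adds edge detection, re-examination of non-cloudy pixels and
            cross-band filters on top of this simple step.
            """
            t = threshold_otsu(band)
            return float((band > t).mean())

        rng = np.random.default_rng(1)
        scene = rng.normal(0.10, 0.02, size=(256, 256))          # dark, cloud-free surface
        scene[:80, :80] = rng.normal(0.65, 0.05, size=(80, 80))  # bright "cloud" patch
        print(f"estimated cloud fraction: {cloud_fraction(scene):.2f}")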

  6. Optical and geometrical properties of cirrus clouds in Amazonia derived from 1 year of ground-based lidar measurements

    Science.gov (United States)

    Gouveia, Diego A.; Barja, Boris; Barbosa, Henrique M. J.; Seifert, Patric; Baars, Holger; Pauliquevis, Theotonio; Artaxo, Paulo

    2017-03-01

    Cirrus clouds cover a large fraction of tropical latitudes and play an important role in Earth's radiation budget. Their optical properties, altitude, vertical and horizontal coverage control their radiative forcing, and hence detailed cirrus measurements at different geographical locations are of utmost importance. Studies reporting cirrus properties over tropical rain forests like the Amazon, however, are scarce. Studies with satellite profilers do not give information on the diurnal cycle, and the satellite imagers do not report on the cloud vertical structure. At the same time, ground-based lidar studies are restricted to a few case studies. In this paper, we derive the first comprehensive statistics of optical and geometrical properties of upper-tropospheric cirrus clouds in Amazonia. We used 1 year (July 2011 to June 2012) of ground-based lidar atmospheric observations north of Manaus, Brazil. This dataset was processed by an automatic cloud detection and optical properties retrieval algorithm. Upper-tropospheric cirrus clouds were observed more frequently than reported previously for tropical regions. The frequency of occurrence was found to be as high as 88 % during the wet season and not lower than 50 % during the dry season. The diurnal cycle shows a minimum around local noon and maximum during late afternoon, associated with the diurnal cycle of precipitation. The mean values of cirrus cloud top and base heights, cloud thickness, and cloud optical depth were 14.3 ± 1.9 (SD) km, 12.9 ± 2.2 km, 1.4 ± 1.1 km, and 0.25 ± 0.46, respectively. Cirrus clouds were found at temperatures down to -90 °C. Frequently cirrus were observed within the tropical tropopause layer (TTL), which are likely associated to slow mesoscale uplifting or to the remnants of overshooting convection. The vertical distribution was not uniform, and thin and subvisible cirrus occurred more frequently closer to the tropopause. The mean lidar ratio was 23.3 ± 8.0 sr. However, for

  7. Towards trustworthy health platform cloud

    NARCIS (Netherlands)

    Deng, M.; Nalin, M.; Petkovic, M.; Baroni, I.; Marco, A.; Jonker, W.; Petkovic, M.

    2012-01-01

    To address today’s major concerns of health service providers regarding security, resilience and data protection when moving on the cloud, we propose an approach to build a trustworthy healthcare platform cloud, based on a trustworthy cloud infrastructure. This paper first highlights the main

  8. Macquarie Island Cloud and Radiation Experiment (MICRE) Science Plan

    Energy Technology Data Exchange (ETDEWEB)

    Marchand, RT [University of Washington]; Protat, A [Australian Bureau of Meteorology]; Alexander, SP [Australian Antarctic Division]

    2015-12-01

    Clouds over the Southern Ocean are poorly represented in present day reanalysis products and global climate model simulations. Errors in top-of-atmosphere (TOA) broadband radiative fluxes in this region are among the largest globally, with large implications for modeling both regional and global scale climate responses (e.g., Trenberth and Fasullo 2010, Ceppi et al. 2012). Recent analyses of model simulations suggest that model radiative errors in the Southern Ocean are due to a lack of low-level postfrontal clouds (including clouds well behind the front) and perhaps a lack of supercooled liquid water that contribute most to the model biases (Bodas-Salcedo et al. 2013, Huang et al. 2014). These assessments of model performance, as well as our knowledge of cloud and aerosol properties over the Southern Ocean, rely heavily on satellite data sets. Satellite data sets are incomplete in that the observations are not continuous (i.e., they are acquired only when the satellite passes nearby), generally do not sample the diurnal cycle, and view primarily the tops of cloud systems (especially for the passive instruments). This is especially problematic for retrievals of aerosol, low-cloud properties, and layers of supercooled water embedded within (rather than at the top of) clouds, as well as estimates of surface shortwave and longwave fluxes based on these properties.

  9. Influences of cloud heterogeneity on cirrus optical properties retrieved from the visible and near-infrared channels of MODIS/SEVIRI for flat and optically thick cirrus clouds

    International Nuclear Information System (INIS)

    Zhou, Yongbo; Sun, Xuejin; Zhang, Riwei; Zhang, Chuanliang; Li, Haoran; Zhou, Junhao; Li, Shaohui

    2017-01-01

    The influences of three-dimensional radiative effects and horizontal heterogeneity effects on the retrieval of cloud optical thickness (COT) and effective diameter (De) for cirrus clouds are explored by the SHDOM radiative transfer model. The stochastic cirrus clouds are generated by the Cloudgen model based on the Atmospheric Radiation Measurement program data. Incorporating a new ice cloud spectral model, we evaluate the retrieval errors for two solar zenith angles (SZAs) (30° and 60°), four solar azimuth angles (0°, 45°, 90°, and 180°), and two sensor settings (Moderate Resolution Imaging Spectrometer (MODIS) onboard Aqua and Spinning Enhanced Visible and Infrared Imager (SEVIRI) onboard METEOSAT-8). The domain-averaged relative error of COT (μ) ranges from −24.1 % to -1.0 % (SZA = 30°) and from −11.6 % to 3.3 % (SZA = 60°), with the uncertainty within 7.5 % to –12.5 % (SZA = 30°) and 20.0 % - 27.5 % (SZA = 60°). For the SZA of 60° only, the relative error and uncertainty are parameterized by the retrieved COT by linear functions, providing bases to correct the retrieved COT and estimate their uncertainties. Besides, De is overestimated by 0.7–15.0 μm on the domain average, with the corresponding uncertainty within 6.7–26.5 μm. The retrieval errors show no discernible dependence on solar azimuth angle due to the flat tops and full coverage of the cirrus samples. The results are valid only for the two samples and for the specific spatial resolution of the radiative transfer simulations. - Highlights: • The retrieved cloud optical properties for 3-D cirrus clouds are evaluated. • The cloud optical thickness and uncertainty could be corrected and estimated. • On the domain average, the effective diameter of ice crystal is overestimated. • The optical properties show non-obvious dependence on the solar azimuth angle.

  10. The importance of the sampling frequency in determining short-time-averaged irradiance and illuminance for rapidly changing cloud cover

    International Nuclear Information System (INIS)

    Delaunay, J.J.; Rommel, M.; Geisler, J.

    1994-01-01

    The sampling interval is an important parameter which must be chosen carefully if measurements of the direct, global, and diffuse irradiance or illuminance are carried out to determine their averages over a given period. Using measurements from a day with rapidly moving clouds, we investigated the influence of the sampling interval on the uncertainty of the calculated 15-min averages. We conclude, for this averaging period, that the sampling interval should not exceed 60 s and 10 s for measurement of the diffuse and global components respectively, to keep the influence of the sampling interval below 2%. For the direct component, even a 5 s sampling interval is too long to reach this level of influence on days with extremely quickly changing insolation conditions. (author)
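
    The finding is that coarser sampling of a rapidly fluctuating signal inflates the error of 15-minute averages. The toy sketch below illustrates how one can quantify this by subsampling a synthetic 1-s irradiance series at different intervals and comparing the resulting 15-min means against the full-resolution mean; the synthetic signal and the set of intervals are assumptions for demonstration, not the measured data used in the paper.

        import numpy as np

        rng = np.random.default_rng(42)
        seconds = 15 * 60                                  # one 15-min averaging period
        t = np.arange(seconds)

        # Synthetic global irradiance (W/m^2): clear-sky level with fast cloud-driven dips.
        irradiance = 800 - 400 * (rng.random(seconds) < 0.3) + rng.normal(0, 20, seconds)
        true_mean = irradiance.mean()

        for interval in (1, 5, 10, 60):                    # sampling intervals in seconds
            sampled_mean = irradiance[::interval].mean()
            err = 100 * abs(sampled_mean - true_mean) / true_mean
            print(f"{interval:3d} s sampling: error of 15-min mean = {err:.2f} %")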

  11. Cloud Based Big Data Infrastructure: Architectural Components and Automated Provisioning

    OpenAIRE

    Demchenko, Yuri; Turkmen, Fatih; Blanchet, Christophe; Loomis, Charles; Laat, Caees de

    2016-01-01

    This paper describes the general architecture and functional components of the cloud based Big Data Infrastructure (BDI). The proposed BDI architecture is based on the analysis of the emerging Big Data and data intensive technologies and supported by the definition of the Big Data Architecture Framework (BDAF) that defines the following components of the Big Data technologies: Big Data definition, Data Management including data lifecycle and data structures, Big Data Infrastructure (generical...

  12. Relationship between turbulence energy and density variance in the solar neighbourhood molecular clouds

    Science.gov (United States)

    Kainulainen, J.; Federrath, C.

    2017-11-01

    The relationship between turbulence energy and gas density variance is a fundamental prediction for turbulence-dominated media and is commonly used in analytic models of star formation. We determine this relationship for 15 molecular clouds in the solar neighbourhood. We use the line widths of the CO molecule as the probe of the turbulence energy (sonic Mach number, ℳs) and three-dimensional models to reconstruct the density probability distribution function (ρ-PDF) of the clouds, derived using near-infrared extinction and Herschel dust emission data, as the probe of the density variance (σs). We find no significant correlation between ℳs and σs among the studied clouds, but we cannot rule out a weak correlation either. In the context of turbulence-dominated gas, the range of the ℳs and σs values corresponds to the model predictions. The data cannot constrain whether the turbulence-driving parameter, b, and/or thermal-to-magnetic pressure ratio, β, vary among the sample clouds. Most clouds are not in agreement with field strengths stronger than given by β ≲ 0.05. A model with b²β/(β + 1) = 0.30 ± 0.06 provides an adequate fit to the cloud sample as a whole. Based on the average behaviour of the sample, we can rule out three regimes: (i) strong compression combined with a weak magnetic field (b ≳ 0.7 and β ≳ 3); (ii) weak compression (b ≲ 0.35); and (iii) a strong magnetic field (β ≲ 0.1). When we include independent magnetic field strength estimates in the analysis, the data rule out solenoidal driving (b < 0.4) for the majority of the solar neighbourhood clouds. However, most clouds have b parameters larger than unity, which indicates a discrepancy with the turbulence-dominated picture; we discuss the possible reasons for this.
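
    The fitted combination b²β/(β + 1) enters through the lognormal density-variance relation commonly used for magnetized, turbulence-dominated gas, quoted here in its standard form (which may differ in detail from the exact expression adopted in the paper):

        \sigma_s^2 = \ln\left(1 + b^2 \mathcal{M}_s^2 \frac{\beta}{\beta + 1}\right)

    where σs is the standard deviation of the logarithmic density, ℳs the sonic Mach number, b the turbulence-driving parameter (b ≈ 1/3 for purely solenoidal and b ≈ 1 for purely compressive driving), and β the thermal-to-magnetic pressure ratio.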

  13. Cloud-based services for your library a LITA guide

    CERN Document Server

    Mitchell, Erik T

    2013-01-01

    By exploring specific examples of cloud computing and virtualization, this book allows libraries considering cloud computing to start their exploration of these systems with a more informed perspective.

  14. Cloud-Coffee: implementation of a parallel consistency-based multiple alignment algorithm in the T-Coffee package and its benchmarking on the Amazon Elastic-Cloud.

    Science.gov (United States)

    Di Tommaso, Paolo; Orobitg, Miquel; Guirado, Fernando; Cores, Fernado; Espinosa, Toni; Notredame, Cedric

    2010-08-01

    We present the first parallel implementation of the T-Coffee consistency-based multiple aligner. We benchmark it on the Amazon Elastic Cloud (EC2) and show that the parallelization procedure is reasonably effective. We also conclude that for a web server with moderate usage (10K hits/month) the cloud provides a cost-effective alternative to in-house deployment. T-Coffee is a freeware open source package available from http://www.tcoffee.org/homepage.html

  15. Comparison of Cloud Properties from CALIPSO-CloudSat and Geostationary Satellite Data

    Science.gov (United States)

    Nguyen, L.; Minnis, P.; Chang, F.; Winker, D.; Sun-Mack, S.; Spangenberg, D.; Austin, R.

    2007-01-01

    Cloud properties are being derived in near-real time from geostationary satellite imager data for a variety of weather and climate applications and research. Assessment of the uncertainties in each of the derived cloud parameters is essential for confident use of the products. Determination of cloud amount, cloud top height, and cloud layering is especially important for using these real-time products for applications such as aircraft icing condition diagnosis and numerical weather prediction model assimilation. Furthermore, the distribution of clouds as a function of altitude has become a central component of efforts to evaluate climate model cloud simulations. Validation of those parameters has been difficult except over limited areas where ground-based active sensors, such as cloud radars or lidars, have been available on a regular basis. Retrievals of cloud properties are sensitive to the surface background, time of day, and the clouds themselves. Thus, it is essential to assess the geostationary satellite retrievals over a variety of locations. The availability of cloud radar data from CloudSat and lidar data from CALIPSO make it possible to perform those assessments over each geostationary domain at 0130 and 1330 LT. In this paper, CloudSat and CALIPSO data are matched with contemporaneous Geostationary Operational Environmental Satellite (GOES), Multi-functional Transport Satellite (MTSAT), and Meteosat-8 data. Unlike comparisons with cloud products derived from A-Train imagers, this study considers comparisons of nadir active sensor data with off-nadir retrievals. These matched data are used to determine the uncertainties in cloud-top heights and cloud amounts derived from the geostationary satellite data using the Clouds and the Earth's Radiant Energy System (CERES) cloud retrieval algorithms. The CERES multi-layer cloud detection method is also evaluated to determine its accuracy and limitations in the off-nadir mode. The results will be useful for

  16. Services Recommendation System based on Heterogeneous Network Analysis in Cloud Computing

    OpenAIRE

    Junping Dong; Qingyu Xiong; Junhao Wen; Peng Li

    2014-01-01

    Resources are provided mainly in the form of services in cloud computing. In the distributed environment of cloud computing, finding the needed services efficiently and accurately is the most urgent problem. In cloud computing, services are the intermediary of the cloud platform; they connect large numbers of service providers and requesters and form a complex heterogeneous network. Traditional recommendation systems only consider the functional and non-functi...

  17. Proposed Network Intrusion Detection System ‎In Cloud Environment Based on Back ‎Propagation Neural Network

    Directory of Open Access Journals (Sweden)

    Shawq Malik Mehibs

    2017-12-01

    Full Text Available Cloud computing is a distributed architecture providing computing facilities and storage resources as a service over the internet. This low-cost service fulfills the basic requirements of users. Because of the open nature of cloud computing and the services it introduces, intruders can impersonate legitimate users and misuse cloud resources and services. To detect intruders and suspicious activities in and around the cloud computing environment, an intrusion detection system is used to discover illegitimate users and suspicious actions by monitoring user activities on the network. This work proposes a back-propagation artificial neural network to construct a network intrusion detection system in the cloud environment. The proposed module was evaluated with the KDD99 dataset; the experimental results show a promising approach to detecting attacks with a high detection rate and a low false alarm rate.
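
    The proposal trains a back-propagation neural network on the KDD99 dataset to separate normal traffic from attacks. The following is a minimal sketch of such a classifier using scikit-learn's back-propagation-trained MLP; the synthetic features, network size and training settings are illustrative assumptions and do not reproduce the paper's configuration or its KDD99 preprocessing.

        import numpy as np
        from sklearn.neural_network import MLPClassifier
        from sklearn.model_selection import train_test_split

        # Stand-in for preprocessed KDD99-style records: numeric feature vectors
        # with a binary label (0 = normal connection, 1 = attack).
        rng = np.random.default_rng(0)
        X_normal = rng.normal(0.0, 1.0, size=(2000, 20))
        X_attack = rng.normal(1.5, 1.0, size=(2000, 20))
        X = np.vstack([X_normal, X_attack])
        y = np.array([0] * 2000 + [1] * 2000)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

        # Back-propagation multilayer perceptron.
        clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=300, random_state=0)
        clf.fit(X_tr, y_tr)

        detection_rate = clf.score(X_te[y_te == 1], y_te[y_te == 1])
        false_alarm_rate = 1.0 - clf.score(X_te[y_te == 0], y_te[y_te == 0])
        print(f"detection rate: {detection_rate:.3f}, false alarm rate: {false_alarm_rate:.3f}")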

  18. Satellite-based trends of solar radiation and cloud parameters in Europe

    Science.gov (United States)

    Pfeifroth, Uwe; Bojanowski, Jedrzej S.; Clerbaux, Nicolas; Manara, Veronica; Sanchez-Lorenzo, Arturo; Trentmann, Jörg; Walawender, Jakub P.; Hollmann, Rainer

    2018-04-01

    Solar radiation is the main driver of the Earth's climate. Measuring solar radiation and analysing its interaction with clouds are essential for the understanding of the climate system. The EUMETSAT Satellite Application Facility on Climate Monitoring (CM SAF) generates satellite-based, high-quality climate data records, with a focus on the energy balance and water cycle. Here, several of these data records are analyzed in a common framework to assess the consistency in trends and spatio-temporal variability of surface solar radiation, top-of-atmosphere reflected solar radiation and cloud fraction. This multi-parameter analysis focuses on Europe and covers the time period from 1992 to 2015. A high correlation between these three variables has been found over Europe. An overall consistency of the climate data records reveals an increase of surface solar radiation and a decrease in top-of-atmosphere reflected radiation. In addition, those trends are confirmed by negative trends in cloud cover. This consistency documents the high quality and stability of the CM SAF climate data records, which are mostly derived independently from each other. The results of this study indicate that one of the main reasons for the positive trend in surface solar radiation since the 1990s is a decrease in cloud coverage, even if an aerosol contribution cannot be completely ruled out.

  19. A Cloud-Based Internet of Things Platform for Ambient Assisted Living

    Science.gov (United States)

    Cubo, Javier; Nieto, Adrián; Pimentel, Ernesto

    2014-01-01

    A common feature of ambient intelligence is that many objects are inter-connected and act in unison, which is also a challenge in the Internet of Things. There has been a shift in research towards integrating both concepts, considering the Internet of Things as representing the future of computing and communications. However, the efficient combination and management of heterogeneous things or devices in the ambient intelligence domain is still a tedious task, and it presents crucial challenges. Therefore, to appropriately manage the inter-connection of diverse devices in these systems requires: (1) specifying and efficiently implementing the devices (e.g., as services); (2) handling and verifying their heterogeneity and composition; and (3) standardizing and managing their data, so as to tackle large numbers of systems together, avoiding standalone applications on local servers. To overcome these challenges, this paper proposes a platform to manage the integration and behavior-aware orchestration of heterogeneous devices as services, stored and accessed via the cloud, with the following contributions: (i) we describe a lightweight model to specify the behavior of devices, to determine the order of the sequence of exchanged messages during the composition of devices; (ii) we define a common architecture using a service-oriented standard environment, to integrate heterogeneous devices by means of their interfaces, via a gateway, and to orchestrate them according to their behavior; (iii) we design a framework based on cloud computing technology, connecting the gateway in charge of acquiring the data from the devices with a cloud platform, to remotely access and monitor the data at run-time and react to emergency situations; and (iv) we implement and generate a novel cloud-based IoT platform of behavior-aware devices as services for ambient intelligence systems, validating the whole approach in real scenarios related to a specific ambient assisted living application

  20. A cloud-based Internet of Things platform for ambient assisted living.

    Science.gov (United States)

    Cubo, Javier; Nieto, Adrián; Pimentel, Ernesto

    2014-08-04

    A common feature of ambient intelligence is that many objects are inter-connected and act in unison, which is also a challenge in the Internet of Things. There has been a shift in research towards integrating both concepts, considering the Internet of Things as representing the future of computing and communications. However, the efficient combination and management of heterogeneous things or devices in the ambient intelligence domain is still a tedious task, and it presents crucial challenges. Therefore, to appropriately manage the inter-connection of diverse devices in these systems requires: (1) specifying and efficiently implementing the devices (e.g., as services); (2) handling and verifying their heterogeneity and composition; and (3) standardizing and managing their data, so as to tackle large numbers of systems together, avoiding standalone applications on local servers. To overcome these challenges, this paper proposes a platform to manage the integration and behavior-aware orchestration of heterogeneous devices as services, stored and accessed via the cloud, with the following contributions: (i) we describe a lightweight model to specify the behavior of devices, to determine the order of the sequence of exchanged messages during the composition of devices; (ii) we define a common architecture using a service-oriented standard environment, to integrate heterogeneous devices by means of their interfaces, via a gateway, and to orchestrate them according to their behavior; (iii) we design a framework based on cloud computing technology, connecting the gateway in charge of acquiring the data from the devices with a cloud platform, to remotely access and monitor the data at run-time and react to emergency situations; and (iv) we implement and generate a novel cloud-based IoT platform of behavior-aware devices as services for ambient intelligence systems, validating the whole approach in real scenarios related to a specific ambient assisted living application.

  1. Clouds, Aerosol, and Precipitation in the Marine Boundary Layer: An ARM Mobile Facility Deployment

    Science.gov (United States)

    Wood, Robert; Wyant, Matthew; Bretherton, Christopher S.; Remillard, Jasmine; Kollias, Pavlos; Fletcher, Jennifer; Stemmler, Jayson; de Szoeke, Simone; Yuter, Sandra; Miller, Matthew

    2015-01-01

    Capsule: A 21-month deployment to Graciosa Island in the northeastern Atlantic Ocean is providing an unprecedented record of the clouds, aerosols and meteorology in a poorly-sampled remote marine environment. The Clouds, Aerosol, and Precipitation in the Marine Boundary Layer (CAP-MBL) deployment at Graciosa Island in the Azores generated a 21-month (April 2009-December 2010) comprehensive dataset documenting clouds, aerosols and precipitation using the Atmospheric Radiation Measurement (ARM) Mobile Facility (AMF). The scientific aim of the deployment is to gain improved understanding of the interactions of clouds, aerosols and precipitation in the marine boundary layer. Graciosa Island straddles the boundary between the subtropics and midlatitudes in the Northeast Atlantic Ocean, and consequently experiences a great diversity of meteorological and cloudiness conditions. Low clouds are the dominant cloud type, with stratocumulus and cumulus occurring regularly. Approximately half of all clouds contained precipitation detectable as radar echoes below the cloud base. Radar and satellite observations show that clouds with tops from 1-11 km contribute more or less equally to surface-measured precipitation at Graciosa. A wide range of aerosol conditions was sampled during the deployment, consistent with the diversity of sources as indicated by back-trajectory analysis. Preliminary findings suggest important two-way interactions between aerosols and clouds at Graciosa, with aerosols affecting light precipitation and cloud radiative properties while being controlled in part by precipitation scavenging. The data from Graciosa are being compared with short-range forecasts made by a variety of models. A pilot analysis with two climate and two weather forecast models shows that they reproduce the observed time-varying vertical structure of lower-tropospheric cloud fairly well, but the cloud-nucleating aerosol concentrations less well. The Graciosa site has been chosen to be a

  2. CLOUD TECHNOLOGY IN EDUCATION

    Directory of Open Access Journals (Sweden)

    Alexander N. Dukkardt

    2014-01-01

    Full Text Available This article is devoted to a review of the main features of cloud computing that can be used in education. Particular attention is paid to those learning and support tasks that can be greatly improved by the use of cloud services. Several ways to implement this approach are proposed, based on widely accepted models of providing cloud services. Nevertheless, the authors have not ignored the currently existing problems of cloud technologies, identifying the most dangerous risks and their impact on the core business processes of the university.

  3. A Physically Based Algorithm for Non-Blackbody Correction of Cloud-Top Temperature and Application to Convection Study

    Science.gov (United States)

    Wang, Chunpeng; Lou, Zhengzhao Johnny; Chen, Xiuhong; Zeng, Xiping; Tao, Wei-Kuo; Huang, Xianglei

    2014-01-01

    Cloud-top temperature (CTT) is an important parameter for convective clouds and is usually different from the 11-micrometer brightness temperature due to non-blackbody effects. This paper presents an algorithm for estimating convective CTT by using simultaneous passive [Moderate Resolution Imaging Spectroradiometer (MODIS)] and active [CloudSat and Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO)] measurements of clouds to correct for the non-blackbody effect. To do this, a weighting function of the MODIS 11-micrometer band is explicitly calculated by feeding cloud hydrometeor profiles from CloudSat and CALIPSO retrievals and temperature and humidity profiles based on ECMWF analyses into a radiative transfer model. Among 16 837 tropical deep convective clouds observed by CloudSat in 2008, the average effective emission level (EEL) of the 11-micrometer channel is located at an optical depth of approximately 0.72, with a standard deviation of 0.3. The distance between the EEL and the cloud-top height determined by CloudSat is shown to be related to a parameter called cloud-top fuzziness (CTF), defined as the vertical separation between -30 and 10 dBZ of CloudSat radar reflectivity. On the basis of these findings, a relationship is then developed between the CTF and the difference between the MODIS 11-micrometer brightness temperature and the physical CTT, the latter being the non-blackbody correction of CTT. The non-blackbody correction of CTT is applied to analyze convective cloud-top buoyancy. With this correction, about 70% of the convective cores observed by CloudSat in the height range of 6-10 km have positive buoyancy near cloud top, meaning the clouds are still growing vertically, although their final fate cannot be determined from snapshot observations.
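
    As a rough illustration of the effective-emission-level idea described above, the sketch below integrates an assumed 11-micrometer extinction profile downward from cloud top and returns the height at which the cumulative optical depth reaches ~0.72. The profile, function name and numbers are hypothetical and are not taken from the paper.

      import numpy as np

      def effective_emission_level(height_km, ext_11um_per_km, tau_eel=0.72):
          """Locate the effective emission level (EEL): the height at which the
          11-micrometer optical depth, integrated downward from cloud top,
          reaches tau_eel (~0.72 on average in the study above)."""
          h = np.asarray(height_km, dtype=float)       # descending order: cloud top first
          k = np.asarray(ext_11um_per_km, dtype=float)
          dz = -np.diff(h)                             # layer thicknesses (km), positive
          layer_tau = 0.5 * (k[:-1] + k[1:]) * dz      # trapezoidal layer optical depths
          tau_cum = np.concatenate([[0.0], np.cumsum(layer_tau)])
          if tau_cum[-1] < tau_eel:
              return None                              # cloud too thin to reach tau_eel
          # interpolate the height at which the cumulative optical depth equals tau_eel
          return float(np.interp(tau_eel, tau_cum, h))

      # Example: idealized anvil with extinction increasing toward cloud base
      heights = np.linspace(14.0, 8.0, 61)             # km, top to bottom
      extinction = np.linspace(0.05, 1.5, 61)          # km^-1 at 11 micrometers (assumed)
      print(effective_emission_level(heights, extinction))   # EEL a few hundred metres below top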

  4. Market-based autonomous resource and application management in private clouds

    KAUST Repository

    Costache, Stefania; Kortas, Samuel; Morin, Christine; Parlavantzas, Nikos

    2016-01-01

    High Performance Computing (HPC) clouds need to be efficiently shared between selfish tenants having applications with different resource requirements and Service Level Objectives (SLOs). The main difficulty lies in providing concurrent resource access to such tenants while maximizing the resource utilization. To overcome this challenge, we propose Merkat, a market-based SLO-driven cloud platform. Merkat relies on a market-based model specifically designed for on-demand fine-grain resource allocation to maximize resource utilization, and it uses a combination of currency distribution and dynamic resource pricing to ensure proper resource distribution among tenants. To meet the tenant’s SLO, Merkat uses autonomous controllers, which apply adaptation policies that: (i) dynamically tune the application’s provisioned CPU and memory per virtual machine in contention periods, or (ii) dynamically change the number of virtual machines. Our evaluation with simulation and on the Grid’5000 testbed shows that Merkat provides flexible support for different application types and SLOs and good tenant satisfaction compared to existing centralized systems, while the infrastructure resource utilization is improved.

  5. Market-based autonomous resource and application management in private clouds

    KAUST Repository

    Costache, Stefania

    2016-10-12

    High Performance Computing (HPC) clouds need to be efficiently shared between selfish tenants having applications with different resource requirements and Service Level Objectives (SLOs). The main difficulty lies in providing concurrent resource access to such tenants while maximizing the resource utilization. To overcome this challenge, we propose Merkat, a market-based SLO-driven cloud platform. Merkat relies on a market-based model specifically designed for on-demand fine-grain resource allocation to maximize resource utilization, and it uses a combination of currency distribution and dynamic resource pricing to ensure proper resource distribution among tenants. To meet the tenant’s SLO, Merkat uses autonomous controllers, which apply adaptation policies that: (i) dynamically tune the application’s provisioned CPU and memory per virtual machine in contention periods, or (ii) dynamically change the number of virtual machines. Our evaluation with simulation and on the Grid’5000 testbed shows that Merkat provides flexible support for different application types and SLOs and good tenant satisfaction compared to existing centralized systems, while the infrastructure resource utilization is improved.

  6. Approach for Text Classification Based on the Similarity Measurement between Normal Cloud Models

    Directory of Open Access Journals (Sweden)

    Jin Dai

    2014-01-01

    Full Text Available The similarity between objects is a core research topic in data mining. In order to reduce the interference caused by the uncertainty of natural language, a similarity measurement between normal cloud models is adopted for text classification research. On this basis, a novel text classifier based on cloud concept jumping up (CCJU-TC) is proposed. It can efficiently accomplish the conversion between qualitative concepts and quantitative data. Through the conversion from a text set to a text information table based on the VSM model, the qualitative concepts extracted from texts of the same category are jumped up into a whole category concept. According to the cloud similarity between the test text and each category concept, the test text is assigned to the most similar category. Comparison with other text classifiers over different feature selection sets fully shows that CCJU-TC not only adapts well to different text features, but also outperforms traditional classifiers.
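
    For illustration, the sketch below computes one commonly used similarity between two normal cloud models (Ex, En, He) as the overlap of their expectation curves; the exact measure used by CCJU-TC may differ, the hyper-entropy He is ignored in this simplified form, and all names and numbers are hypothetical.

      import numpy as np

      def cloud_similarity(c1, c2, n=2001):
          """Similarity between two normal cloud models c = (Ex, En, He), computed
          here as the overlap/union ratio of their expectation curves
          y = exp(-(x - Ex)^2 / (2 En^2)).  One common choice among several."""
          (ex1, en1, _), (ex2, en2, _) = c1, c2
          lo = min(ex1 - 4 * en1, ex2 - 4 * en2)
          hi = max(ex1 + 4 * en1, ex2 + 4 * en2)
          x = np.linspace(lo, hi, n)
          y1 = np.exp(-(x - ex1) ** 2 / (2 * en1 ** 2))
          y2 = np.exp(-(x - ex2) ** 2 / (2 * en2 ** 2))
          overlap = np.trapz(np.minimum(y1, y2), x)
          union = np.trapz(np.maximum(y1, y2), x)
          return overlap / union

      # Example: category concept vs. test-text concept on one feature dimension
      print(cloud_similarity((0.62, 0.08, 0.01), (0.55, 0.10, 0.02)))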

  7. Global cloud database from VIRS and MODIS for CERES

    Science.gov (United States)

    Minnis, Patrick; Young, David F.; Wielicki, Bruce A.; Sun-Mack, Sunny; Trepte, Qing Z.; Chen, Yan; Heck, Patrick W.; Dong, Xiquan

    2003-04-01

    The NASA CERES Project has developed a combined radiation and cloud property dataset using the CERES scanners and matched spectral data from high-resolution imagers, the Visible Infrared Scanner (VIRS) on the Tropical Rainfall Measuring Mission (TRMM) satellite and the Moderate Resolution Imaging Spectroradiometer (MODIS) on Terra and Aqua. The diurnal cycle can be well-characterized over most of the globe using the combinations of TRMM, Aqua, and Terra data. The cloud properties are derived from the imagers using state-of-the-art methods and include cloud fraction, height, optical depth, phase, effective particle size, emissivity, and ice or liquid water path. These cloud products are convolved into the matching CERES fields of view to provide simultaneous cloud and radiation data at an unprecedented accuracy. Results are available for at least 3 years of VIRS data and 1 year of Terra MODIS data. The various cloud products are compared with similar quantities from climatological sources and instantaneous active remote sensors. The cloud amounts are very similar to those from surface observer climatologies and are 6-7% less than those from a satellite-based climatology. Optical depths are 2-3 times smaller than those from the satellite climatology, but are within 5% of those from the surface remote sensing. Cloud droplet sizes and liquid water paths are within 10% of the surface results on average for stratus clouds. The VIRS and MODIS retrievals are very consistent with differences that usually can be explained by sampling, calibration, or resolution differences. The results should be extremely valuable for model validation and improvement and for improving our understanding of the relationship between clouds and the radiation budget.

  8. Tool-based Risk Assessment of Cloud Infrastructures as Socio-Technical Systems

    DEFF Research Database (Denmark)

    Nidd, Michael; Ivanova, Marieta Georgieva; Probst, Christian W.

    2015-01-01

    Assessing risk in cloud infrastructures is difficult. Typical cloud infrastructures contain potentially thousands of nodes that are highly interconnected and dynamic. Another important component is the set of human actors who get access to data and computing infrastructure. The cloud infrastructure...... exercise for cloud infrastructures using the socio-technical model developed in the TRESPASS project; after showing how to model typical components of a cloud infrastructure, we show how attacks are identified on this model and discuss their connection to risk assessment. The technical part of the model...... is extracted automatically from the configuration of the cloud infrastructure, which is especially important for systems so dynamic and complex....

  9. Polarized View of Supercooled Liquid Water Clouds

    Science.gov (United States)

    Alexandrov, Mikhail D.; Cairns, Brian; Van Diedenhoven, Bastiaan; Ackerman, Andrew S.; Wasilewski, Andrzej P.; McGill, Matthew J.; Yorks, John E.; Hlavka, Dennis L.; Platnick, Steven E.; Arnold, G. Thomas

    2016-01-01

    Supercooled liquid water (SLW) clouds, where liquid droplets exist at temperatures below 0 C, present a well-known aviation hazard through aircraft icing, in which SLW accretes on the airframe. SLW clouds are common over the Southern Ocean, and climate-induced changes in their occurrence are thought to constitute a strong cloud feedback on global climate. The two recent NASA field campaigns POlarimeter Definition EXperiment (PODEX, based in Palmdale, California, January-February 2013) and Studies of Emissions and Atmospheric Composition, Clouds and Climate Coupling by Regional Surveys (SEAC4RS, based in Houston, Texas, August-September 2013) provided a unique opportunity to observe SLW clouds from the high-altitude airborne platform of NASA's ER-2 aircraft. We present an analysis of measurements made by the Research Scanning Polarimeter (RSP) during these experiments, accompanied by correlative retrievals from other sensors. The RSP measures both polarized and total reflectance in 9 spectral channels with wavelengths ranging from 410 to 2250 nm. It is a scanning sensor taking samples at 0.8deg intervals within 60deg from nadir in both forward and backward directions. This unique angular resolution allows for characterization of liquid water droplet size using the rainbow structure observed in the polarized reflectances in the scattering angle range between 135deg and 165deg. Simple parametric fitting algorithms applied to the polarized reflectance provide retrievals of the droplet effective radius and variance assuming a prescribed size distribution shape (gamma distribution). In addition to this, we use a non-parametric method, the Rainbow Fourier Transform (RFT), which allows retrieval of the droplet size distribution without assuming a size distribution shape. We present an overview of the RSP campaign datasets available from the NASA GISS website, as well as two detailed examples of the retrievals. In these case studies we focus on cloud fields with spatial features

  10. An Intelligent and Secure Health Monitoring Scheme Using IoT Sensor Based on Cloud Computing

    Directory of Open Access Journals (Sweden)

    Jin-Xin Hu

    2017-01-01

    Full Text Available Internet of Things (IoT is the network of physical objects where information and communication technology connect multiple embedded devices to the Internet for collecting and exchanging data. An important advancement is the ability to connect such devices to large resource pools such as cloud. The integration of embedded devices and cloud servers offers wide applicability of IoT to many areas of our life. With the aging population increasing every day, embedded devices with cloud server can provide the elderly with more flexible service without the need to visit hospitals. Despite the advantages of the sensor-cloud model, it still has various security threats. Therefore, the design and integration of security issues, like authentication and data confidentiality for ensuring the elderly’s privacy, need to be taken into consideration. In this paper, an intelligent and secure health monitoring scheme using IoT sensor based on cloud computing and cryptography is proposed. The proposed scheme achieves authentication and provides essential security requirements.

  11. Trusted cloud computing

    CERN Document Server

    Krcmar, Helmut; Rumpe, Bernhard

    2014-01-01

    This book documents the scientific results of the projects related to the Trusted Cloud Program, covering fundamental aspects of trust, security, and quality of service for cloud-based services and applications. These results aim to allow trustworthy IT applications in the cloud by providing a reliable and secure technical and legal framework. In this domain, business models, legislative circumstances, technical possibilities, and realizable security are closely interwoven and thus are addressed jointly. The book is organized in four parts on "Security and Privacy", "Software Engineering and

  12. Aerosol Chemical Composition and its Effects on Cloud-Aerosol Interactions during the 2007 CHAPS Experiment

    Science.gov (United States)

    Lee, Y.; Alexander, L.; Newburn, M.; Jayne, J.; Hubbe, J.; Springston, S.; Senum, G.; Andrews, B.; Ogren, J.; Kleinman, L.; Daum, P.; Berg, L.; Berkowitz, C.

    2007-12-01

    The chemical composition of submicron aerosol particles was determined using an Aerodyne Time-of-Flight Aerosol Mass Spectrometer (AMS) outfitted on the DOE G-1 aircraft during the Cumulus Humilis Aerosol Processing Study (CHAPS) conducted in the Oklahoma City area in June 2007. The primary objective of CHAPS was to investigate the effects of urban emissions on cloud-aerosol interactions as a function of processing of the emissions. Aerosol composition was typically determined at three different altitudes: below, in, and above cloud, in both upwind and downwind regions of the urban area. Aerosols were sampled from an isokinetic inlet with an upper size cut-off of ~1.5 micrometers. During cloud passages, the AMS also sampled particles that were dried from cloud droplets collected using a counter-flow virtual impactor (CVI) sampler. The aerosol mass concentrations were typically below 10 micrograms per cubic meter, and were dominated by organics and sulfate. Ammonium was often less than required for complete neutralization of sulfate. Aerosol nitrate levels were very low. We noted that nitrate levels were significantly enhanced in cloud droplets compared to aerosols, most likely resulting from dissolution of gaseous nitric acid. Organic-to-sulfate ratios appeared to be lower in cloud droplets than in aerosols, suggesting that the cloud condensation nuclei properties of aerosol particles might be affected by the loading and nature of the organic components in aerosols. In-cloud formation of sulfate was considered unimportant because of the very low SO2 concentration in the region. A detailed examination of the sources of the aerosol organic components (based on hydrocarbons determined using a proton transfer reaction mass spectrometer) and their effects on cloud formation as a function of atmospheric processing (based on the degree of oxidation of the organic components) will be presented.

  13. RFID-based Electronic Identity Security Cloud Platform in Cyberspace

    OpenAIRE

    Bing Chen; Chengxiang Tan; Bo Jin; Xiang Zou; Yuebo Dai

    2012-01-01

    With the rapid development of networks, especially the Internet of Things, electronic identity administration in cyberspace is becoming more and more important. Personal identity management in cyberspace, associated with individuals in reality, has become a significant and urgent task for the further development of information construction in China. This paper therefore presents an RFID-based electronic identity security cloud platform in cyberspace to implement efficient security management of cyb...

  14. Size-density relations in dark clouds: Non-LTE effects

    International Nuclear Information System (INIS)

    Maloney, P.

    1986-01-01

    One of the major goals of molecular astronomy has been to understand the physics and dynamics of dense interstellar clouds. Because the interpretation of observations of giant molecular clouds is complicated by their very complex structure and the dynamical effects of star formation, a number of studies have concentrated on dark clouds. Leung, Kutner and Mead (1982) (hereafter LKM) and Myers (1983), in studies of CO and NH3 emission, concluded that dark clouds exhibit significant correlations between linewidth and cloud radius of the form Δv ∝ R^0.5 and between mean density and radius of the form n ∝ R^-1, as originally suggested by Larson (1981). This result suggests that these objects are in virial equilibrium. However, the mean densities inferred from the CO data of LKM are based on a local thermodynamic equilibrium (LTE) analysis of their 13CO data. At the very low mean densities inferred by LKM for the larger clouds in their samples, the assumption of LTE becomes very questionable. As most of the range in R in the density-size correlation comes from the clouds observed in CO, it seems worthwhile to examine how non-LTE effects influence the derived densities. Microturbulent models of inhomogeneous clouds of varying central concentration with the linewidth-size and mean density-size relations found by Myers show sub-thermal excitation of the 13CO line in the larger clouds, with the result that LTE analysis considerably underestimates the actual column density. A more general approach, which doesn't require detailed modeling of the clouds, is to consider whether the observed T_R*(13CO)/T_R*(12CO) ratios in the clouds studied by LKM are in the range where the LTE-derived optical depths may be seriously in error due to sub-thermal excitation of the 13CO molecule
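
    The LTE step referred to above can be illustrated with the standard line-ratio relation: assuming a common excitation temperature for both isotopologues and an optically thick 12CO line, the 13CO optical depth follows directly from the observed T_R*(13CO)/T_R*(12CO) ratio. The sketch below is a minimal rendering of that textbook relation, not the analysis of LKM.

      import math

      def tau_13co(ratio_13_to_12, tau_12co=None):
          """LTE optical depth of 13CO from the observed line ratio
          T_R*(13CO)/T_R*(12CO), assuming both lines share the same excitation
          temperature.  If the 12CO optical depth is not given, it is taken to
          be large (optically thick), the usual dark-cloud assumption."""
          if tau_12co is None:
              return -math.log(1.0 - ratio_13_to_12)   # ratio = 1 - exp(-tau13)
          # general case: ratio = (1 - exp(-tau13)) / (1 - exp(-tau12))
          return -math.log(1.0 - ratio_13_to_12 * (1.0 - math.exp(-tau_12co)))

      print(tau_13co(0.3))   # ~0.36: modest 13CO optical depth
      print(tau_13co(0.9))   # ~2.3: regime where sub-thermal excitation matters most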

  15. Enhancing student motivation using LectureTools: A cloud-based teaching and learning platform

    Directory of Open Access Journals (Sweden)

    P. H. Patrio Chiu

    2015-06-01

    Full Text Available A cloud-based teaching and learning platform, LectureTools, was piloted at City University of Hong Kong in the 2012-13 academic year. LectureTools is an online platform that provides a suite of cloud-based teaching and learning applications. It combines the functions of interactive presentation, a real-time student response system, student inquiry and online note-taking synchronised with the presentation slides into one cloud-based platform. A comprehensive study investigated the effectiveness of the platform for enhancing student motivation among graduate (n=158) and undergraduate (n=96) students. Both groups of students reported enhanced motivation when using LectureTools. The scores on all six learning motivation scales of the Motivated Strategies for Learning Questionnaire, a psychometric instrument based on the cognitive view of motivation, increased when students engaged with the tool in class. Those who used the tool scored significantly higher on intrinsic goal orientation than those who did not use the tool. The students’ quantitative feedback showed that they found the tool useful and that it improved their motivation. Qualitative feedback from the instructors indicated that the tool was useful for engaging passive students. They reported that the most useful function was the interactive online questions with real-time results, while the in-class student inquiry function was difficult to use in practice.

  16. Biometric technology authentication, biocryptography, and cloud-based architecture

    CERN Document Server

    Das, Ravi

    2014-01-01

    Most biometric books are either extraordinarily technical for technophiles or extremely elementary for the lay person. Striking a balance between the two, Biometric Technology: Authentication, Biocryptography, and Cloud-Based Architecture is ideal for business, IT, or security managers that are faced with the task of making purchasing, migration, or adoption decisions. It brings biometrics down to an understandable level, so that you can immediately begin to implement the concepts discussed.Exploring the technological and social implications of widespread biometric use, the book considers the

  17. Cloud Privacy Audit Framework: A Value-Based Design

    Science.gov (United States)

    Coss, David Lewis

    2013-01-01

    The rapid expansion of cloud technology provides enormous capacity, which allows for the collection, dissemination and re-identification of personal information. It is the cloud's resource capabilities such as these that fuel the concern for privacy. The impetus for these concerns is not too far removed from those expressed by Mason in 1986…

  18. Progress in Understanding the Impacts of 3-D Cloud Structure on MODIS Cloud Property Retrievals for Marine Boundary Layer Clouds

    Science.gov (United States)

    Zhang, Zhibo; Werner, Frank; Miller, Daniel; Platnick, Steven; Ackerman, Andrew; DiGirolamo, Larry; Meyer, Kerry; Marshak, Alexander; Wind, Galina; Zhao, Guangyu

    2016-01-01

    Theory: A novel framework based on 2-D Taylor expansion for quantifying the uncertainty in MODIS retrievals caused by sub-pixel reflectance inhomogeneity (Zhang et al. 2016). How cloud vertical structure influences MODIS LWP retrievals (Miller et al. 2016). Observation: Analysis of failed MODIS cloud property retrievals (Cho et al. 2015). Cloud property retrievals from 15m resolution ASTER observations (Werner et al. 2016). Modeling: LES-satellite observation simulator (Zhang et al. 2012, Miller et al. 2016).

  19. Cloud computing task scheduling strategy based on improved differential evolution algorithm

    Science.gov (United States)

    Ge, Junwei; He, Qian; Fang, Yiqiu

    2017-04-01

    In order to optimize the cloud computing task scheduling scheme, an improved differential evolution algorithm for cloud computing task scheduling is proposed. Firstly, a cloud computing task scheduling model is established and a fitness function is defined from it; the improved differential evolution algorithm then optimizes this fitness function, using a generation-dependent dynamic selection strategy and a dynamic mutation strategy to ensure both global and local search ability. A performance test was carried out on the CloudSim simulation platform, and the experimental results show that the improved differential evolution algorithm can reduce cloud computing task execution time and save user cost, achieving good optimal scheduling of cloud computing tasks.
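
    A minimal differential evolution loop for task-to-VM scheduling is sketched below to make the idea concrete. The decaying mutation factor only gestures at the paper's dynamic mutation and selection strategies, and all parameter values are illustrative.

      import numpy as np

      def de_schedule(task_lengths, vm_speeds, pop=30, gens=200, cr=0.9, seed=0):
          """Basic differential evolution for cloud task scheduling: each individual
          is a real vector decoded (by rounding) into a task-to-VM assignment;
          the fitness is the makespan of that assignment."""
          rng = np.random.default_rng(seed)
          n_tasks, n_vms = len(task_lengths), len(vm_speeds)

          def makespan(x):
              assign = np.clip(np.rint(x), 0, n_vms - 1).astype(int)
              finish = np.zeros(n_vms)
              for t, v in enumerate(assign):
                  finish[v] += task_lengths[t] / vm_speeds[v]
              return finish.max()

          X = rng.uniform(0, n_vms - 1, size=(pop, n_tasks))
          fit = np.array([makespan(x) for x in X])
          for g in range(gens):
              F = 0.9 - 0.5 * g / gens                  # decaying mutation factor (assumed)
              for i in range(pop):
                  a, b, c = rng.choice([j for j in range(pop) if j != i], 3, replace=False)
                  mutant = X[a] + F * (X[b] - X[c])
                  cross = rng.random(n_tasks) < cr
                  trial = np.where(cross, mutant, X[i])
                  f_trial = makespan(trial)
                  if f_trial <= fit[i]:                 # greedy one-to-one selection
                      X[i], fit[i] = trial, f_trial
          best = np.argmin(fit)
          return np.clip(np.rint(X[best]), 0, n_vms - 1).astype(int), fit[best]

      assignment, best_makespan = de_schedule([40, 25, 60, 10, 35, 50], [1.0, 2.0, 1.5])
      print(assignment, best_makespan)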

  20. A UAV-Based Fog Collector Design for Fine-Scale Aerobiological Sampling

    Science.gov (United States)

    Gentry, Diana; Guarro, Marcello; Demachkie, Isabella Siham; Stumfall, Isabel; Dahlgren, Robert P.

    2017-01-01

    Airborne microbes are found throughout the troposphere and into the stratosphere. Knowing how the activity of airborne microorganisms can alter water, carbon, and other geochemical cycles is vital to a full understanding of local and global ecosystems. Just as on the land or in the ocean, atmospheric regions vary in habitability; the underlying geochemical, climatic, and ecological dynamics must be characterized at different scales to be effectively modeled. Most aerobiological studies have focused on a high level: 'How high are airborne microbes found?' and 'How far can they travel?' Most fog and cloud water studies collect from stationary ground stations (point) or along flight transects (1D). To complement and provide context for this data, we have designed a UAV-based modified fog and cloud water collector to retrieve 4D-resolved samples for biological and chemical analysis. Our design uses a passive impacting collector hanging from a rigid rod suspended between two multi-rotor UAVs. The suspension design reduces the effect of turbulence and potential for contamination from the UAV downwash. The UAVs are currently modeled in a leader-follower configuration, taking advantage of recent advances in modular UAVs, UAV swarming, and flight planning. The collector itself is a hydrophobic mesh. Materials including Tyvek, PTFE, nylon, and polypropylene monofilament fabricated via laser cutting, CNC knife, or 3D printing were characterized for droplet collection efficiency using a benchtop atomizer and particle counter. Because the meshes can be easily and inexpensively fabricated, a set can be pre-sterilized and brought to the field for 'hot swapping' to decrease cross-contamination between flight sessions or use as negative controls. An onboard sensor and logging system records the time and location of each sample; when combined with flight tracking data, the samples can be resolved into a 4D volumetric map of the fog bank. Collected samples can be returned to the lab for

  1. Analysis of Observation Data of Earth-Rockfill Dam Based on Cloud Probability Distribution Density Algorithm

    Directory of Open Access Journals (Sweden)

    Han Liwei

    2014-07-01

    Full Text Available Monitoring data on an earth-rockfill dam constitute a form of spatial data. Such data include much uncertainty owing to limitations in the measurement information, material parameters, load, geometry size, initial conditions, boundary conditions and the calculation model. So the cloud probability density of the monitoring data must be addressed. In this paper, the cloud theory model was used to address the uncertain transition between the qualitative concept and the quantitative description. Then an improved algorithm for cloud probability distribution density, based on a backward cloud generator, was proposed. This was used to effectively convert parcels of accurate data into concepts which can be described by proper qualitative linguistic values. Such a qualitative description is expressed as the cloud numerical characteristics {Ex, En, He}, which represent the characteristics of all cloud drops. The algorithm was then applied to analyze the observation data of a piezometric tube in an earth-rockfill dam. Experimental results proved that the proposed algorithm is feasible: it reveals the changing regularity of the piezometric tube’s water level, and damage from seepage in the dam body can be detected.
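
    For reference, the widely cited backward cloud generator (without certainty degrees) estimates {Ex, En, He} from raw samples as sketched below; the paper's improved algorithm presumably refines this baseline, and the example data are synthetic.

      import numpy as np

      def backward_cloud(samples):
          """Backward cloud generator (no certainty degrees): estimate the cloud
          numerical characteristics {Ex, En, He} from raw monitoring data."""
          x = np.asarray(samples, dtype=float)
          ex = x.mean()                                        # expectation Ex
          en = np.sqrt(np.pi / 2.0) * np.abs(x - ex).mean()    # entropy En
          s2 = x.var(ddof=1)                                   # sample variance
          he = np.sqrt(max(s2 - en ** 2, 0.0))                 # hyper-entropy He
          return ex, en, he

      # Example: simulated piezometric-tube water levels (metres), purely illustrative
      rng = np.random.default_rng(1)
      levels = rng.normal(52.3, 0.4, size=200) + rng.normal(0, 0.05, size=200)
      print(backward_cloud(levels))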

  2. Reusability Framework for Cloud Computing

    OpenAIRE

    Singh, Sukhpal; Singh, Rishideep

    2012-01-01

    Cloud-based development is a challenging task for several software engineering projects, especially those which need development with reusability. Cloud computing is currently enabling new professional models for software development. Cloud computing is expected to be the next major trend in computing because of its speed of application deployment, shorter time to market, and lower cost of operation. Until Cloud Computing Reusability Model is considered a fundamen...

  3. The role of cloud-scale resolution on radiative properties of oceanic cumulus clouds

    International Nuclear Information System (INIS)

    Kassianov, Evgueni; Ackerman, Thomas; Kollias, Pavlos

    2005-01-01

    Both individual and combined effects of the horizontal and vertical variability of cumulus clouds on solar radiative transfer are investigated using a two-dimensional (x- and z-directions) cloud radar dataset. This high-resolution dataset of typical fair-weather marine cumulus is derived from ground-based 94GHz cloud radar observations. The domain-averaged (along x-direction) radiative properties are computed by a Monte Carlo method. It is shown that (i) different cloud-scale resolutions can be used for accurate calculations of the mean absorption, upward and downward fluxes; (ii) the resolution effects can depend strongly on the solar zenith angle; and (iii) a few cloud statistics can be successfully applied for calculating the averaged radiative properties

  4. Cloud-Based Perception and Control of Sensor Nets and Robot Swarms

    Science.gov (United States)

    2016-04-01

    A distributed stream processing framework provides the necessary API and infrastructure to develop and execute such applications in a cluster of computation ... streaming DDDAS applications based on challenges they present to the backend Cloud control system. [Figure 2: Parallel SLAM Application] ... state-of-the-art deep learning-based object detectors can recognize among hundreds of object classes and this capability would be very useful for mobile

  5. Do Clouds Compute? A Framework for Estimating the Value of Cloud Computing

    Science.gov (United States)

    Klems, Markus; Nimis, Jens; Tai, Stefan

    On-demand provisioning of scalable and reliable compute services, along with a cost model that charges consumers based on actual service usage, has been an objective in distributed computing research and industry for a while. Cloud Computing promises to deliver on this objective: consumers are able to rent infrastructure in the Cloud as needed, deploy applications and store data, and access them via Web protocols on a pay-per-use basis. The acceptance of Cloud Computing, however, depends on the ability for Cloud Computing providers and consumers to implement a model for business value co-creation. Therefore, a systematic approach to measure costs and benefits of Cloud Computing is needed. In this paper, we discuss the need for valuation of Cloud Computing, identify key components, and structure these components in a framework. The framework assists decision makers in estimating Cloud Computing costs and to compare these costs to conventional IT solutions. We demonstrate by means of representative use cases how our framework can be applied to real world scenarios.
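
    A toy version of the cost comparison such a framework structures might look like the sketch below; the functions and all figures are hypothetical and ignore many real cost components (licenses, migration, personnel, risk).

      def cloud_cost(hours_used, rate_per_hour, data_gb, egress_per_gb):
          """Pay-per-use cost: compute hours plus data egress (illustrative only)."""
          return hours_used * rate_per_hour + data_gb * egress_per_gb

      def on_premise_cost(capex, years, annual_opex):
          """Annualized conventional-IT cost: straight-line depreciation plus operations."""
          return capex / years + annual_opex

      # Hypothetical yearly figures for one workload at 30% utilization
      print(cloud_cost(hours_used=8760 * 0.3, rate_per_hour=0.12, data_gb=500, egress_per_gb=0.09))
      print(on_premise_cost(capex=25000, years=4, annual_opex=4000))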

  6. Cloud detection, classification and motion estimation using geostationary satellite imagery for cloud cover forecast

    International Nuclear Information System (INIS)

    Escrig, H.; Batlles, F.J.; Alonso, J.; Baena, F.M.; Bosch, J.L.; Salbidegoitia, I.B.; Burgaleta, J.I.

    2013-01-01

    Considering that clouds are the main cause of solar radiation blocking, short-term cloud forecasting can help power plant operation and therefore improve benefits. Cloud detection, classification and motion vector determination are key to forecasting sun obstruction by clouds. Geostationary satellites provide cloud information covering wide areas, allowing cloud forecasts to be performed for several hours in advance. The methodology developed and tested in this study is based on multispectral tests and binary cross-correlations followed by coherence and quality control tests over the resulting motion vectors. A monthly synthetic surface albedo image and a method to reject erroneous correlation vectors were developed. Cloud classification in terms of opacity and height of cloud top is also performed. A whole-sky camera has been used for validation, showing over 85% agreement between the camera and the satellite-derived cloud cover, whereas the error in motion vectors is below 15%. - Highlights: ► A methodology for detection, classification and movement of clouds is presented. ► METEOSAT satellite images are used to obtain a cloud mask. ► The prediction of cloudiness achieves 90% accuracy in overcast conditions. ► Results for partially covered sky conditions showed 75% accuracy. ► Motion vectors are estimated from the clouds with a success probability of 86%
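
    The motion-vector step can be illustrated with simple block matching by normalized cross-correlation between consecutive cloud images, sketched below. This is only a stand-in for the paper's binary cross-correlation and quality-control chain, with all names and numbers assumed.

      import numpy as np

      def motion_vector(prev, curr, y, x, block=16, search=8):
          """Estimate the displacement of the block centred at (y, x) between two
          consecutive cloud images by maximizing normalized cross-correlation over
          a small search window."""
          ref = prev[y:y + block, x:x + block].astype(float)
          ref = ref - ref.mean()
          best, best_dv = -np.inf, (0, 0)
          for dy in range(-search, search + 1):
              for dx in range(-search, search + 1):
                  cand = curr[y + dy:y + dy + block, x + dx:x + dx + block].astype(float)
                  if cand.shape != ref.shape:
                      continue                  # candidate block falls outside the image
                  cand = cand - cand.mean()
                  denom = np.sqrt((ref ** 2).sum() * (cand ** 2).sum())
                  if denom == 0:
                      continue
                  score = float((ref * cand).sum() / denom)
                  if score > best:
                      best, best_dv = score, (dy, dx)
          return best_dv, best                  # displacement in pixels and its correlation

      # Example with a synthetic cloud blob shifted by (2, 3) pixels
      prev = np.zeros((64, 64)); prev[20:30, 20:30] = 1.0
      curr = np.zeros((64, 64)); curr[22:32, 23:33] = 1.0
      print(motion_vector(prev, curr, 18, 18))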

  7. All-sky photogrammetry techniques to georeference a cloud field

    Science.gov (United States)

    Crispel, Pierre; Roberts, Gregory

    2018-01-01

    In this study, we present a novel method of identifying and geolocalizing cloud field elements from a portable all-sky camera stereo network based on the ground and oriented towards zenith. The methodology is mainly based on stereophotogrammetry, a 3-D reconstruction technique based on triangulation from corresponding stereo pixels in rectified images. In cases where clouds are horizontally separated, individual positions are identified with segmentation techniques based on hue filtering and contour detection algorithms. Macroscopic cloud field characteristics such as cloud layer base heights and velocity fields are also deduced. In addition, the methodology is tailored to the context of measurement campaigns, which impose simplicity of implementation, auto-calibration, and portability. Camera internal geometry models are established a priori in the laboratory and validated to ensure a certain accuracy in the peripheral parts of the all-sky image. Then, stereophotogrammetry with dense 3-D reconstruction is applied with cameras spaced 150 m apart for two validation cases. The first validation case is carried out with cumulus clouds having a cloud base height at 1500 m a.g.l. The second validation case is carried out with two cloud layers: a cumulus fractus layer with a base height at 1000 m a.g.l. and an altocumulus stratiformis layer with a base height of 2300 m a.g.l. Velocity fields at cloud base are computed by tracking rectangular image patterns through successive shots. The height uncertainty is estimated by comparison with a Vaisala CL31 ceilometer located on the site. The uncertainties in the horizontal coordinates and in the velocity field are theoretically quantified using the experimental uncertainties of the cloud base height and camera orientation. In the first cumulus case, segmentation of the image is performed to identify individual clouds in the cloud field and determine the horizontal positions of the cloud centers.
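
    The underlying triangulation can be reduced to a planar special case: two zenith-pointing cameras a known baseline apart see the same cloud element at different zenith angles, from which its height follows. The sketch below shows only this simplified geometry, not the dense stereo reconstruction or calibration used in the study; all numbers are illustrative.

      import math

      def cloud_height_from_stereo(baseline_m, zenith1_deg, zenith2_deg):
          """Triangulate the height of a cloud element seen by two zenith-pointing
          cameras separated by baseline_m along the x-axis.  Zenith angles are
          signed: positive when the element appears displaced toward +x."""
          t1 = math.tan(math.radians(zenith1_deg))   # from camera at x = 0
          t2 = math.tan(math.radians(zenith2_deg))   # from camera at x = baseline_m
          if t1 == t2:
              raise ValueError("no parallax: the rays are parallel")
          h = baseline_m / (t1 - t2)                 # height above the camera plane
          x = h * t1                                 # horizontal position of the element
          return h, x

      # Example: 150 m baseline, ~6 degrees of parallax -> roughly 1.1 km above the cameras
      print(cloud_height_from_stereo(150.0, 33.0, 27.0))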

  8. Cloud Service Platform: Hospital Information eXchange (HIX)

    OpenAIRE

    Fang Zhiyuan; Wei Li

    2013-01-01

    Health Information eXchange (HIX) is part of the Happiness Cloud Service Platform of Happiness Guangdong in Guangdong Province, China, based on an innovative cloud-based business model. This article illustrates the hospital health care business services system based on cloud computing. Major business functions of HIX include integrated mobile medical information services and mobile health information services. Key cloud service platform capabilities include appointment of HIX registration, d...

  9. Top-down and Bottom-up aerosol-cloud-closure: towards understanding sources of uncertainty in deriving cloud radiative flux

    Science.gov (United States)

    Sanchez, K.; Roberts, G.; Calmer, R.; Nicoll, K.; Hashimshoni, E.; Rosenfeld, D.; Ovadnevaite, J.; Preissler, J.; Ceburnis, D.; O'Dowd, C. D. D.; Russell, L. M.

    2017-12-01

    Top-down and bottom-up aerosol-cloud shortwave radiative flux closures were conducted at the Mace Head atmospheric research station in Galway, Ireland in August 2015. Instrument platforms include ground-based, unmanned aerial vehicles (UAV), and satellite measurements of aerosols, clouds and meteorological variables. The ground-based and airborne measurements of aerosol size distributions and cloud condensation nuclei (CCN) concentration were used to initiate a 1D microphysical aerosol-cloud parcel model (ACPM). UAVs were equipped for a specific science mission, with an optical particle counter for aerosol distribution profiles, a cloud sensor to measure cloud extinction, or a 5-hole probe for 3D wind vectors. These are the first UAV measurements at Mace Head. ACPM simulations are compared to in-situ cloud extinction measurements from UAVs to quantify closure in terms of cloud shortwave radiative flux. Two out of seven cases exhibit sub-adiabatic vertical temperature profiles within the cloud, which suggests that entrainment processes affect cloud microphysical properties and lead to an overestimate of simulated cloud shortwave radiative flux. Including an entrainment parameterization and explicitly calculating the entrainment fraction in the ACPM simulations both improved cloud-top radiative closure. Entrainment reduced the difference between simulated and observation-derived cloud-top shortwave radiative flux (δRF) by between 25 W m-2 and 60 W m-2. After accounting for entrainment, satellite-derived cloud droplet number concentrations (CDNC) were within 30% of simulated CDNC. In cases with a well-mixed boundary layer, δRF is no greater than 20 W m-2 after accounting for cloud-top entrainment, and up to 50 W m-2 when entrainment is not taken into account. In cases with a decoupled boundary layer, cloud microphysical properties are inconsistent with ground-based aerosol measurements, as expected, and δRF is as high as 88 W m-2, even high (> 30 W m-2) after

  10. Determination of gold nanoparticles in environmental water samples by second-order optical scattering using dithiotreitol-functionalized CdS quantum dots after cloud point extraction

    Energy Technology Data Exchange (ETDEWEB)

    Mandyla, Spyridoula P.; Tsogas, George Z.; Vlessidis, Athanasios G.; Giokas, Dimosthenis L., E-mail: dgiokas@cc.uoi.gr

    2017-02-05

    Highlights: • A new method has been developed to determine gold nanoparticles in water samples. • Extraction was achieved by cloud point extraction. • A nano-hybrid assembly between AuNPs and dithiol-coated quantum dots was formulated. • Detection was accomplished at pico-molar levels by second-order light scattering. • The method was selective against ionic gold and other nanoparticle species. - Abstract: This work presents a new method for the sensitive and selective determination of gold nanoparticles in water samples. The method combines a sample preparation and enrichment step based on cloud point extraction with a new detection motif that relies on the optical incoherent light scattering of a nano-hybrid assembly that is formed by hydrogen bond interactions between gold nanoparticles and dithiotreitol-functionalized CdS quantum dots. The experimental parameters affecting the extraction and detection of gold nanoparticles were optimized and evaluated for the analysis of gold nanoparticles of variable size and surface coating. The selectivity of the method against gold ions and other nanoparticle species was also evaluated under different conditions reminiscent of those usually found in natural water samples. The developed method was applied to the analysis of gold nanoparticles in natural waters and wastewater with satisfactory results in terms of sensitivity (detection limit at the low pmol L^-1 levels), recoveries (>80%) and reproducibility (<9%). Compared to other methods employing molecular spectrometry for metal nanoparticle analysis, the developed method offers improved sensitivity and is easy to operate, thus providing an additional tool for the monitoring and assessment of nanoparticle toxicity and hazards in the environment.

  11. Determination of gold nanoparticles in environmental water samples by second-order optical scattering using dithiotreitol-functionalized CdS quantum dots after cloud point extraction

    International Nuclear Information System (INIS)

    Mandyla, Spyridoula P.; Tsogas, George Z.; Vlessidis, Athanasios G.; Giokas, Dimosthenis L.

    2017-01-01

    Highlights: • A new method has been developed to determine gold nanoparticles in water samples. • Extraction was achieved by cloud point extraction. • A nano-hybrid assembly between AuNPs and dithiol-coated quantum dots was formulated. • Detection was accomplished at pico-molar levels by second-order light scattering. • The method was selective against ionic gold and other nanoparticle species. - Abstract: This work presents a new method for the sensitive and selective determination of gold nanoparticles in water samples. The method combines a sample preparation and enrichment step based on cloud point extraction with a new detection motif that relies on the optical incoherent light scattering of a nano-hybrid assembly that is formed by hydrogen bond interactions between gold nanoparticles and dithiotreitol-functionalized CdS quantum dots. The experimental parameters affecting the extraction and detection of gold nanoparticles were optimized and evaluated for the analysis of gold nanoparticles of variable size and surface coating. The selectivity of the method against gold ions and other nanoparticle species was also evaluated under different conditions reminiscent of those usually found in natural water samples. The developed method was applied to the analysis of gold nanoparticles in natural waters and wastewater with satisfactory results in terms of sensitivity (detection limit at the low pmol L^-1 levels), recoveries (>80%) and reproducibility (<9%). Compared to other methods employing molecular spectrometry for metal nanoparticle analysis, the developed method offers improved sensitivity and is easy to operate, thus providing an additional tool for the monitoring and assessment of nanoparticle toxicity and hazards in the environment.

  12. Aircraft-based investigation of Dynamics-Aerosol-Chemistry-Cloud Interactions in Southern West Africa

    Science.gov (United States)

    Flamant, Cyrille

    2017-04-01

    The EU-funded project DACCIWA (Dynamics-Aerosol-Chemistry-Cloud Interactions in West Africa, http://www.dacciwa.eu) is investigating the relationship between weather, climate and air pollution in southern West Africa. The air over the coastal region of West Africa is a unique mixture of natural and anthropogenic gases, liquids and particles, emitted in an environment in which multi-layer cloud decks frequently form. These exert a large influence on the local weather and climate, mainly due to their impact on radiation, the surface energy balance and thus the diurnal cycle of the atmospheric boundary layer. The main objective of the aircraft detachment was to build robust statistics of cloud properties in southern West Africa in different chemical landscapes, in order to investigate the physical processes involved in their life cycle in such a complex chemical environment. As part of the DACCIWA field campaigns, three European aircraft (the German DLR Falcon 20, the French SAFIRE ATR 42 and the British BAS Twin Otter) conducted a total of 50 research flights across Ivory Coast, Ghana, Togo, and Benin from 27 June to 16 July 2016, for a total of 155 flight hours, including hours sponsored through 3 EUFAR projects. The aircraft were used in different ways based on their strengths, but all three had comparable instrumentation with the capability to measure gas-phase chemistry, aerosols and clouds, thereby generating a rich dataset of atmospheric conditions across the region. Eight types of flight objectives were conducted to achieve the goals of DACCIWA: (i) stratus clouds, (ii) land-sea breeze clouds, (iii) mid-level clouds, (iv) biogenic emissions, (v) city emissions, (vi) flaring and ship emissions, (vii) dust and biomass burning aerosols, and (viii) air-sea interactions. An overview of the DACCIWA aircraft campaign as well as first highlights from the airborne observations will be presented.

  13. Moving ERP Systems to the Cloud - Data Security Issues

    Directory of Open Access Journals (Sweden)

    Pablo Saa

    2017-08-01

    Full Text Available This paper brings to light data security issues and concerns for organizations moving their Enterprise Resource Planning (ERP) systems to the cloud. Cloud computing has become the new trend in how organizations conduct business and has enabled them to innovate and compete in a dynamic environment through new and innovative business models. The growing popularity and success of the cloud has led to the emergence of cloud-based Software-as-a-Service (SaaS) ERP systems, a new alternative approach to traditional on-premise ERP systems. Cloud-based ERP has a myriad of benefits for organizations. However, infrastructure engineers need to address data security issues before moving their enterprise applications to the cloud. Cloud-based ERP raises specific concerns about the confidentiality and integrity of the data stored in the cloud. Such concerns that affect the adoption of cloud-based ERP depend on the size of the organization. Small to medium enterprises (SMEs) gain the maximum benefits from cloud-based ERP as many of the concerns around data security are not relevant to them. On the contrary, larger organizations are more cautious in moving their mission-critical enterprise applications to the cloud. A hybrid solution where organizations can choose to keep their sensitive applications on-premise while leveraging the benefits of the cloud is proposed in this paper as an effective solution that is gaining momentum and popularity for large organizations.

  14. Cloud-based processing of multi-spectral imaging data

    Science.gov (United States)

    Bernat, Amir S.; Bolton, Frank J.; Weiser, Reuven; Levitz, David

    2017-03-01

    Multispectral imaging holds great promise as a non-contact tool for the assessment of tissue composition. Performing multi-spectral imaging on a handheld mobile device would allow this technology, and with it knowledge, to be brought to low-resource settings to provide state-of-the-art classification of tissue health. This modality, however, produces considerably larger data sets than white light imaging and requires preliminary image analysis before it can be used. The data then need to be analyzed and logged without requiring too much of the system resources, computation time or battery use of the end-point device. Cloud environments were designed to address such problems by allowing end-point devices (smartphones) to offload computationally hard tasks. To this end, we present a method where a handheld device based around a smartphone captures a multi-spectral dataset in a movie file format (mp4), and we compare it to other image formats in size, noise and correctness. We present the cloud configuration used for segmenting the movie into frames that can later be used for further analysis.
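
    As a rough illustration of the frame-extraction step that such a cloud service might perform, the sketch below splits an uploaded mp4 into frames with OpenCV and reports a simple per-frame brightness statistic; the file name and all processing choices are hypothetical, and the real spectral analysis is not reproduced here.

      import cv2  # OpenCV, assumed available in the cloud worker image

      def extract_frames(mp4_path, every_nth=1):
          """Decode an mp4 capture into individual frames for later analysis.
          Returns a list of (frame_index, mean_intensity) as a stand-in for the
          actual spectral processing."""
          cap = cv2.VideoCapture(mp4_path)
          stats, idx = [], 0
          while True:
              ok, frame = cap.read()
              if not ok:
                  break                      # end of stream
              if idx % every_nth == 0:
                  gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
                  stats.append((idx, float(gray.mean())))
              idx += 1
          cap.release()
          return stats

      # Hypothetical usage on an uploaded capture:
      # print(extract_frames("capture_multispectral.mp4", every_nth=5)[:3])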

  15. Cloud solution for histopathological image analysis using region of interest based compression.

    Science.gov (United States)

    Kanakatte, Aparna; Subramanya, Rakshith; Delampady, Ashik; Nayak, Rajarama; Purushothaman, Balamuralidhar; Gubbi, Jayavardhana

    2017-07-01

    Recent technological gains have led to the adoption of innovative cloud-based solutions in the medical imaging field. Once a medical image is acquired, it can be viewed, modified, annotated and shared on many devices. This advancement is mainly due to the introduction of cloud computing in the medical domain. Tissue pathology images are complex and are normally collected at different focal lengths using a microscope. A single whole-slide image contains many multi-resolution images stored in a pyramidal structure, with the highest-resolution image at the base and the smallest thumbnail image at the top of the pyramid. The highest-resolution image is used for tissue pathology diagnosis and analysis. Transferring and storing such huge images is a big challenge. Compression is a very useful and effective technique to reduce the size of these images. As pathology images are used for diagnosis, no information can be lost during compression (lossless compression). A novel method of extracting the tissue region and applying lossless compression to this region and lossy compression to the empty regions is proposed in this paper. The resulting compression ratio, along with lossless compression of the tissue region, is in an acceptable range, allowing efficient storage and transmission to and from the cloud.
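
    A minimal sketch of the ROI idea follows, assuming a simple brightness threshold separates tissue from the empty slide background: tissue pixels are kept bit-exact while background pixels are coarsely quantized so a later entropy coder can compress them. The actual codec and segmentation in the paper differ; all names and numbers here are assumptions.

      import numpy as np

      def roi_compress(image, tissue_threshold=220, background_levels=8):
          """Keep pixels darker than tissue_threshold (tissue) exactly, and coarsely
          quantize the near-white background so it compresses well downstream."""
          img = np.asarray(image, dtype=np.uint8)
          tissue_mask = img.min(axis=-1) < tissue_threshold         # any channel dark enough
          step = 256 // background_levels
          quantized = (img // step) * step                           # lossy background
          out = np.where(tissue_mask[..., None], img, quantized)     # lossless tissue region
          return out.astype(np.uint8), tissue_mask

      # Example on a synthetic tile: dark "tissue" blob on a white slide background
      tile = np.full((64, 64, 3), 250, dtype=np.uint8)
      tile[20:40, 20:40] = (120, 60, 150)
      compressed, mask = roi_compress(tile)
      print(mask.sum(), np.array_equal(compressed[mask], tile[mask]))   # tissue kept bit-exact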

  16. BlueSky Cloud Framework: An E-Learning Framework Embracing Cloud Computing

    Science.gov (United States)

    Dong, Bo; Zheng, Qinghua; Qiao, Mu; Shu, Jian; Yang, Jie

    Currently, E-Learning has grown into a widely accepted way of learning. With the huge growth of users, services, education contents and resources, E-Learning systems are facing challenges of optimizing resource allocations, dealing with dynamic concurrency demands, handling rapid storage growth requirements and cost controlling. In this paper, an E-Learning framework based on cloud computing is presented, namely BlueSky cloud framework. Particularly, the architecture and core components of BlueSky cloud framework are introduced. In BlueSky cloud framework, physical machines are virtualized, and allocated on demand for E-Learning systems. Moreover, BlueSky cloud framework combines with traditional middleware functions (such as load balancing and data caching) to serve for E-Learning systems as a general architecture. It delivers reliable, scalable and cost-efficient services to E-Learning systems, and E-Learning organizations can establish systems through these services in a simple way. BlueSky cloud framework solves the challenges faced by E-Learning, and improves the performance, availability and scalability of E-Learning systems.

  17. CloudNeo: a cloud pipeline for identifying patient-specific tumor neoantigens.

    Science.gov (United States)

    Bais, Preeti; Namburi, Sandeep; Gatti, Daniel M; Zhang, Xinyu; Chuang, Jeffrey H

    2017-10-01

    We present CloudNeo, a cloud-based computational workflow for identifying patient-specific tumor neoantigens from next generation sequencing data. Tumor-specific mutant peptides can be detected by the immune system through their interactions with the human leukocyte antigen complex, and neoantigen presence has recently been shown to correlate with anti T-cell immunity and efficacy of checkpoint inhibitor therapy. However computing capabilities to identify neoantigens from genomic sequencing data are a limiting factor for understanding their role. This challenge has grown as cancer datasets become increasingly abundant, making them cumbersome to store and analyze on local servers. Our cloud-based pipeline provides scalable computation capabilities for neoantigen identification while eliminating the need to invest in local infrastructure for data transfer, storage or compute. The pipeline is a Common Workflow Language (CWL) implementation of human leukocyte antigen (HLA) typing using Polysolver or HLAminer combined with custom scripts for mutant peptide identification and NetMHCpan for neoantigen prediction. We have demonstrated the efficacy of these pipelines on Amazon cloud instances through the Seven Bridges Genomics implementation of the NCI Cancer Genomics Cloud, which provides graphical interfaces for running and editing, infrastructure for workflow sharing and version tracking, and access to TCGA data. The CWL implementation is at: https://github.com/TheJacksonLaboratory/CloudNeo. For users who have obtained licenses for all internal software, integrated versions in CWL and on the Seven Bridges Cancer Genomics Cloud platform (https://cgc.sbgenomics.com/, recommended version) can be obtained by contacting the authors. jeff.chuang@jax.org. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.

  18. Electric field measuring and display system. [for cloud formations

    Science.gov (United States)

    Wojtasinski, R. J.; Lovall, D. D. (Inventor)

    1974-01-01

    An apparatus is described for monitoring the electric fields of cloud formations within a particular area. It utilizes capacitor plates that are alternately shielded from the clouds for generating an alternating signal corresponding to the intensity of the electric field of the clouds. A synchronizing signal is produced for controlling sampling of the alternating signal. Such samplings are fed through a filter and converted by an analogue to digital converter into digital form and subsequently fed to a transmitter for transmission to the control station for recording.

  19. Cloud based, Open Source Software Application for Mitigating Herbicide Drift

    Science.gov (United States)

    Saraswat, D.; Scott, B.

    2014-12-01

    The spread of herbicide-resistant weeds has resulted in the need for clearly marked fields. In response to this need, the University of Arkansas Cooperative Extension Service launched a program named Flag the Technology in 2011. This program uses color-coded flags as a visual alert of the herbicide trait technology within a farm field. The flag-based program also helps avoid herbicide misapplication and prevents herbicide drift damage between fields with differing crop technologies. The program has been endorsed by the Southern Weed Science Society of America and is attracting interest from across the USA, Canada, and Australia. However, flags can be misplaced through mischief or lost to severe windstorms and thunderstorms. This presentation will discuss the design and development of a free, cloud-based application built with open-source technologies, called Flag the Technology Cloud (FTTCloud), that allows agricultural stakeholders to color-code their farm fields to indicate herbicide-resistant technologies. The software utilizes modern web development practices, widely used design technologies, and basic geographic information system (GIS) based interactive interfaces for representing, color-coding, searching, and visualizing fields. The application has also been made compatible with devices of different sizes (smartphones, tablets, desktops and laptops) for wider usability.
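
    A minimal sketch of one way such an application might represent a color-coded field (the property names and trait-to-color mapping below are assumptions, not FTTCloud's actual schema): each field becomes a GeoJSON polygon carrying its herbicide trait and flag color.

```python
# Illustrative GeoJSON record for a color-coded field; property names and the
# trait-to-color mapping are assumptions, not FTTCloud's actual schema.
import json

TRAIT_COLORS = {
    "Roundup Ready": "white",
    "LibertyLink": "green",
    "Conventional": "red",       # example mapping only
}

def field_feature(name: str, trait: str, ring: list) -> dict:
    """Build a GeoJSON Feature for one farm field with its flag color."""
    return {
        "type": "Feature",
        "properties": {"field": name, "trait": trait,
                       "flag_color": TRAIT_COLORS.get(trait, "unknown")},
        "geometry": {"type": "Polygon", "coordinates": [ring]},
    }

# Closed ring of lon/lat corners for a hypothetical field near Little Rock, AR.
ring = [[-92.20, 34.75], [-92.19, 34.75], [-92.19, 34.76], [-92.20, 34.76], [-92.20, 34.75]]
print(json.dumps(field_feature("North 40", "LibertyLink", ring), indent=2))
```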

  20. Comparison of monthly nighttime cloud fraction products from MODIS and AIRS and ground-based camera over Manila Observatory (14.64N, 121.07E)

    Science.gov (United States)

    Gacal, G. F. B.; Lagrosas, N.

    2017-12-01

    Cloud detection nowadays is achieved primarily with various sensors aboard satellites, including MODIS Aqua, MODIS Terra, and AIRS, whose products include nighttime cloud fraction. Ground-based instruments are, however, only secondary to these satellites when it comes to cloud detection. Nonetheless, ground-based instruments (e.g., lidars, ceilometers, and sky cameras) offer significant datasets on a particular region's cloud cover. For nighttime cloud detection, satellite-based instruments are used more reliably and prominently than ground-based ones; a ground-based instrument operated at night therefore ought to produce reliable scientific datasets. The objective of this study is to compare the results of a nighttime ground-based instrument (a sky camera) with those of MODIS Aqua and MODIS Terra. A Canon PowerShot A2300 is placed on top of Manila Observatory (14.64N, 121.07E) and configured to take images of the night sky at 5-min intervals. To detect cloudy pixels, the images are converted to grayscale and a thresholding technique is used to separate cloud from non-cloud pixels: if the pixel value is greater than 17, it is classified as cloud; otherwise, as non-cloud (Gacal et al., 2016). This algorithm is applied to data gathered from Oct 2015 to Oct 2016. A scatter plot of satellite cloud fraction over the area bounded by 14.2877N, 120.9869E and 14.7711N, 121.4539E against ground-measured cloud cover is used to find the monthly correlation. During the wet season (June-November), satellite nighttime cloud fraction versus ground-measured cloud cover yields acceptable R2 values (Aqua = 0.74, Terra = 0.71, AIRS = 0.76). During the dry season, however, poor R2 values are obtained (AIRS = 0.39, Aqua and Terra = 0.01). The high correlation during the wet season can be attributed to a high probability that the camera and satellite see the same clouds.
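
    A minimal sketch of the grayscale thresholding step described above; the >17 cutoff follows the abstract, while the file handling and the monthly averaging shown in the closing comment are illustrative.

```python
# Grayscale thresholding of a night-sky image into cloud / non-cloud pixels,
# following the >17 grayscale cutoff described in the abstract (illustrative code).
import numpy as np
from PIL import Image

def cloud_cover_fraction(image_path: str, threshold: int = 17) -> float:
    gray = np.asarray(Image.open(image_path).convert("L"), dtype=np.uint8)
    cloud_mask = gray > threshold          # True where the pixel is counted as cloud
    return float(cloud_mask.mean())        # fraction of cloudy pixels in the image

# Monthly cloud cover could then be the mean over all 5-minute images in a month:
# monthly_cover = np.mean([cloud_cover_fraction(p) for p in october_image_paths])
```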